U.S. patent application number 14/017735 was filed with the patent office on 2013-09-04 for visual image authentication and transaction authorization using non-determinism, and was published on 2015-03-05 as application publication number 2015/0067786 A1. The applicant listed for this patent is Michael Stephen Fiske. Invention is credited to Michael Stephen Fiske.

United States Patent Application 20150067786
Kind Code: A1
Fiske; Michael Stephen
March 5, 2015

VISUAL IMAGE AUTHENTICATION AND TRANSACTION AUTHORIZATION USING NON-DETERMINISM
Abstract
Methods and systems described herein perform a secure
transaction. A display presents images that are difficult for
malware to recognize but a person can recognize. In at least one
embodiment, a person communicates transaction information using
visual images received from the service provider system. In at
least one embodiment, a universal identifier is represented by
images recognizable by a person, but difficult for malware to
recognize. In some embodiments, methods and systems are provided
for determining whether to grant access, by generating and
displaying visual images on a screen that the user can recognize.
In an embodiment, a person presses one's finger(s) on the screen to
select images as a method for authenticating and protecting
communication from malware. In at least one embodiment, quantum
randomness helps unpredictably vary the image location, generate
noise in the image, or change the shape or texture of the
image.
Inventors: Fiske; Michael Stephen (San Francisco, CA)

Applicant:
  Name: Fiske; Michael Stephen
  City: San Francisco
  State: CA
  Country: US

Family ID: 52585211
Appl. No.: 14/017735
Filed: September 4, 2013
Current U.S. Class: 726/4
Current CPC Class: H04L 9/14 (20130101); G06Q 20/385 (20130101); G06F 2221/2117 (20130101); H04L 2463/102 (20130101); H04L 63/08 (20130101); G06F 2221/2133 (20130101); H04W 12/00522 (20190101); G06Q 20/32 (20130101); G06F 21/36 (20130101); H04L 2209/24 (20130101); G06F 3/0482 (20130101); H04W 12/06 (20130101); G06Q 20/38215 (20130101); G06Q 20/4014 (20130101); G06F 21/32 (20130101); G09C 5/00 (20130101)
Class at Publication: 726/4
International Class: H04L 29/06 (20060101)
Claims
1. A method of securing a transaction comprising: entering transaction information into a user system, the user system having a processor system with at least one processor, a communications interface, and a memory system; and the user selecting or entering transaction information using images received from a service provider system.
2. The method of claim 1 wherein some of said images represent
letters or numbers.
3. The method of claim 1 wherein at least one of said images is an
image of an animal.
4. The method of claim 1 wherein at least one of said images has
color or texture.
5. The method of claim 1 wherein one-time information is
communicated with said images.
6. The method of claim 1 wherein at least one of said images is at
least part of a logo.
7. The method of claim 1 wherein said service provider encrypts one
or more of said images before transmitting them to said user
system.
8. The method of claim 1 wherein a user looks at one or more images
to check that the service provider is valid.
9. The method of claim 1 wherein at least some of said images are
used as a universal identifier for said user.
10. The method of claim 1 wherein said service provider is a bank
or financial exchange.
11. The method of claim 1, wherein at least one said visual image
is of at least part of a human face that expresses a smile.
12. The method of claim 1 wherein noise is combined with said
images and the noise is generated using quantum randomness.
13. A method for determining whether to grant access to a secure
entity comprising: generating visual images and displaying said
images on a screen; and a user selecting said visual images from
said display screen; wherein said determining uses a processor
system having at least one processor, a communications interface,
and a memory system.
14. The method of claim 13 wherein said screen is a touch sensitive
screen and the user selects said images with his or her
fingers.
15. The method of claim 13 wherein the order of said visual images is
randomly permuted based on a non-deterministic process generated by
hardware.
16. The method of claim 15 wherein said hardware is part of the web
server.
17. The method of claim 13 wherein said visual images are randomly
generated by a web server and transmitted to a mobile phone or
PC.
18. The method of claim 13 wherein noise is combined with the
visual images and said noise is generated using quantum
randomness.
19. The method of claim 13 wherein at least one of said images is
an image of an animal.
20. The method of claim 13 wherein at least one of said images has
texture.
21. The method of claim 13 wherein at least one of said images has
color.
Description
CROSS-REFERENCE TO RELATED APPLICATIONS
[0001] This application incorporates herein by reference U.S.
Provisional Patent Application No. 61/698,675, entitled "No More
Passwords", filed Sep. 9, 2012.
FIELD OF THE INVENTION
[0002] This specification relates to security in computers, mobile
phones and other devices.
BACKGROUND
[0003] The subject matter discussed in the background section
should not be assumed to be prior art merely as a result of its
mention in the background section. Similarly, a problem mentioned
in the background section or associated with the subject matter of
the background section should not be assumed to have been
previously recognized in the prior art. The subject matter in the
background section merely represents different approaches, which in
and of themselves may also be inventions.
LIMITATIONS AND WEAKNESSES OF PRIOR ART
[0004] A shortcoming in the prior art, recognized by this
specification, is that there is a lack of a secure integration of
the identity of the user to the protection of the user's data and
the control of the user's computer. A critical part of the computer
instructions for an action or a transaction are usually executed on
the host domain machine (e.g., the user's computer). Some examples
of the user's computer are a MacBook Pro, a Dell desktop computer,
an iPhone, a BlackBerry or an Android phone. Currently, cryptography
keys are stored on the user's computer or a chip executing the
operating system, which is not secure. For example, when Bob's
computer communicates with Mary's computer, even when using
well-implemented Public Key Infrastructure (PKI), Bob's computer
can only be sure that it is communicating with Mary's computer. Bob
cannot be sure that he is communicating with Mary and vice versa.
Similarly, even Bob cannot be certain that the communications he
sends Mary are the same as the communications that Mary receives as
coming from him.
[0005] Sending a secure communication using Public Key
Infrastructure (PKI) from one user machine to another user machine
ensures communication between the user machines, but may not ensure
secure communication between the users of the machines. Continuing,
with the above example, as a result of the use of a Public Key
Infrastructure, although Mary may be reasonably sure that Mary's
machine is communicating with Bob's machine, Boris may be operating
one or more computers in Russia and may have remotely broken into
Bob's computer and may be using Bob's machine and pretending to be
Bob.
[0006] In the prior art, each computer cannot be assured of who
controls the other computer. For example, even when a user is
present, an intruder (e.g., a hacker) may be physically located
thousands of miles away, but is remotely logged onto the user's
machine and hijacking the user's intended action(s). Even the
Trusted Platform Module (TPM) has the fundamental cyber security
weakness of not knowing who controls the other computer with which
a user may be in communication, or who controls the computer that
contains the Trusted Platform Module. Not knowing which computer a
current computer is in communication with may
be a weakness that is significant when the operating system can
directly access the TPM. If the user's computer is compromised,
then the attacker can access the TPM. Another limitation and
weakness of the TPM is that there is no mechanism for binding the
identity of the user to the user's cryptography keys and other
confidential information that should be bound to the user's true
identity.
[0007] Another shortcoming of cyber security is that a secure link
is missing between the authentication of a valid user, and the
authorization of an action. The authorization of an action could be
the execution of a financial transaction from a user's bank
account, a stock trade in a user's brokerage account, the execution
of an important functionality on the electrical grid, or access to
important data on a private network such as SIPRnet (e.g.
WikiLeaks). The authorization of an action typically occurs through
the web browser since the web browser presents a convenient
interface for a person. However, the web browser is where the
important connection between authentication of a user and
authorization of an action may be broken. Existing systems have the
user authenticating the user's computer, and then the same user's
computer also authorizes (and may also execute) the action. Since
the user's computer can be hacked, the lack of a secure and direct
link between authenticating the user's computer and authorizing the
action may render the act of user verification irrelevant.
[0008] Part of the disconnect (vulnerability) between
authenticating the user and authorizing the user's action occurs,
because authentication (e.g., biometric authentication) is
typically and naively represented as an on/off switch. That is,
after the user has been authenticated and the initial transaction
approved, the remainder of the session is assumed to be secure and
all actions after authentication are assumed to be legitimate,
without performing any further checks. In the same way, if this
on/off implementation occurs in an untrusted computing environment,
then outstanding biometric algorithms and sensor(s) become
irrelevant because the biometric authentication can be circumvented
between the user authentication and the authorization or
confidentiality part of the security system.
[0009] The use of biometrics can be advantageous for security,
because biometrics offers a reliable method for verifying who (the
person) is that is actually initiating a transaction. However, even
with the use of biometrics, if the handling of the biometric
information, the storage of the biometric data, or the control of
actions based on a biometric verification is done on an unsecured
user's computer, the value of the biometrics may be greatly reduced
or nullified.
[0010] An additional aspect of the weakness of current
authentication and authorization processes (such as those using
biometrics) is that the action can be hijacked by executing a
Trojan attack on the user's computer, for example. A Trojan attack
is an attack in which the attacker pretends to be the user and/or
the other system with which the user is communicating. In other
words, a valid, authorized user cannot verify that the action he or
she is trying to execute is what is actually being executed,
because a third party may be masquerading as the other system.
[0011] An example of this weakness is the untrusted browser attack
used to divert money from a user's bank account. Mary's web browser
may display to her that she is about to send $500 to Bob's account,
but in reality her untrusted browser is configured to send $50,000
to a thief's bank account.
[0012] Since the web browser is executed on the user's computer,
the browser cannot be trusted even when using PKI and one-time
passcodes! A recent untrusted browser attack on the gold standard
of security, RSA SecurID, demonstrates this surprising fact. The
consequences of this particular cyberattack were that $447,000 was
stolen from a company bank account in a matter of minutes, even
though the valid user was using one-time passcodes to make the
transaction more secure. The details of this cyberattack are quoted below from an MIT Technology Review article entitled "Real-Time Hackers Foil Two-Factor Security," Sep. 18, 2009, which states: "In mid-July, an account manager at Ferma, a construction firm in Mountain View, Calif., logged into the company's bank account to pay bills, using a one-time password to make the transactions more secure. Yet the manager's computer had a hitchhiker. A forensic analysis performed later would reveal that an earlier visit to another website had allowed a malicious program to invade his computer. While the manager issued legitimate payments, the program initiated 27 transactions to various bank accounts, siphoning off $447,000 in a matter of minutes. 'They not only got into my system here, they were able to ascertain how much they could draw, so they drew the limit,' says Roy Ferrari, Ferma's president. The theft happened despite Ferma's use of a one-time password, a six-digit code issued by a small electronic device every 30 or 60 seconds. Online thieves have adapted to this additional security by creating special programs--real-time Trojan horses--that can issue transactions to a bank while the account holder is online, turning the one-time password into a weak link in the financial security chain. 'I think it's a broken model,' Ferrari says. Security experts say that banks and consumers alike need to adapt--that banks should offer their account holders more security and consumers should take more steps to stay secure, especially protecting the computers they use for financial transactions. 'We have to fundamentally rethink how customers interact with their banks online,' says Joe Stewart, director of malware (malicious software) research for security firm SecureWorks, in Atlanta, Ga. 'Putting all the issues with the technology aside, if [attackers] can run their code on your system they can do anything you can do on your computer. They can become you.'"
[0013] There is now widespread understanding, both in popular and
technical domains, of the theoretical and practical fragility of
online transaction security. The RSA SecurID.RTM. token is the
industry-leading technology for authenticating and securing
identity in online transactions. The recent attack and subsequent
breach of the RSA SecurID token (announced March 2011) has
highlighted the fundamental problems with current cybersecurity
solutions. Malware played a significant role in causing this
breach. Malicious software has many forms: virus, worm, Trojan
horse, spyware etc. all of which have the singular purpose of
undermining the security, confidentiality, integrity or
availability of computer systems. Recent uber malware is invisible.
It encrypts and camouflages itself using the same mathematical
techniques used by traditional, white hat cryptography. Eric Filiol, "Malicious Cryptology and Mathematics," Cryptography and Security in Computing (InTech, 2012), pp. 23-50. http://cdn.intechopen.com/pdfs/29700/InTechMalicious_cryptology_and_mathematics.pdf
[0014] Malware is able to phish passwords or hijack financial
transactions made via mobile devices or personal computers without
the user's knowledge. It is not necessary for malware to break the
cryptography of a device to compromise its security. Contemporary
computers and electronic devices are particularly susceptible to
malware attacks due to their processor architecture.
[0015] Specifically, the processors have a von Neumann
architecture, which executes only one computing instruction at a
time. As a consequence, malware has to corrupt or transform only a
single machine instruction to initiate execution of malignant code.
This is a deep vulnerability arising from current processor
architecture and it cannot be easily rectified. Only one legitimate
jump or branch instruction needs to be changed in a digital
computer program to start it executing malware. During machine
execution, after the von Neumann machine program has been hijacked
by malware, the anti-virus software that is supposed to check the
program might not be executed, may be disabled, or may never detect
the malware. The sequential execution of von Neumann machine
instructions hinders a digital computer program from protecting
itself.
[0016] A common malware technique is the so-called
"man-in-the-middle" attack. This attack is an active form of
eavesdropping in which the attacker makes independent connections
with the counterparties in a given transaction; by using
appropriate authentication the attacker controls the entire
transaction. The counterparties are unaware of the presence of the
attacker and assume they are transacting securely with each other.
Internet communications and financial transactions can be
intercepted and hijacked by malware (malicious software) performing
a "man-in-the-middle" attack. These attacks are not easy to detect
or prevent. In particular, the RSA SecurID breach demonstrated that
pseudo-random number generators (i.e., deterministic algorithms),
typically used in two-factor authentication solutions cannot
prevent "man-in-the-middle" attacks launched by malware.
[0017] Malware, however, has a significant weakness: malware is
poor at recognizing visual images since computer algorithms cannot
match the visual pattern recognition ability of the human brain.
Human beings have highly advanced visual pattern recognition
skills. The embodiments described here exploit this fundamental
weakness of malware.
[0018] A third fundamental shortcoming of current cybersecurity
solutions is the fact that static authentication factors, such as
passwords, PINs and biometrics, are entered directly into the user's
computer or stored on computers in a digital or binary format such
as ASCII code. This weakness makes static authentication factors
vulnerable to phishing attacks in the host domain or security
breaches in the network domain.
BRIEF DESCRIPTION OF THE DRAWINGS
[0019] In the following drawings like reference numbers are used to
refer to like elements. Although the following figures depict
various examples, the one or more implementations are not limited
to the examples depicted in the figures.
[0020] FIG. 1A shows a block diagram of an embodiment of a system
for executing secure transactions resistant to malware.
[0021] FIG. 1B shows a memory system that is a component of the
system shown in FIG. 1A.
[0022] FIG. 2A shows a block diagram of an embodiment of a service
provider system.
[0023] FIG. 2B shows a memory system that is a component of the
system in FIG. 2A.
[0024] FIG. 3 shows a flow diagram of a user setting up the system
to enable the execution of secure transactions.
[0025] FIG. 3A shows a flow diagram of an embodiment of step A of
executing a secure transaction.
[0026] FIG. 3B shows a flow diagram of an embodiment of step B of
executing a secure transaction.
[0027] FIG. 4 shows a collection of images that are parts or a
whole of a logo. Some of the images are rotated.
[0028] FIG. 5 shows a collection of images. One is part of a logo.
There are 26 visual images of the alphabet letters
"ABCDEFGHIJKLMNOPQRSTUVWXYZ". The word NAME is made up of a
collection of images with a doodle background texture.
[0029] FIG. 6 shows a collection of images. One is part of a logo.
Another is the word BANK with a simplicial and dotted texture
background. There are 26 visual images of the alphabet letters
"ABCDEFGHIJKLMNOPQRSTUVWXYZ".
[0030] FIG. 7 shows a collection of images. One is part of a logo.
Another is the word "ACCEPT" written with bubble texture on a
simplicial background texture. And a third is the word "ABORT"
written with bubble texture on a foliation background texture.
[0031] FIG. 8 shows a collection of images. One object is a
geometric image of a blue rectangle on top of a blue triangle which
is on top of a red blob. Just to the right is a rectangle with
vertical texture on top of a triangle with dotted texture on top of
a blob with simplicial texture.
[0032] FIG. 9 shows a collection of images illustrating different
textures, including vertical, horizontal, mixed, dotted, bubble,
simplicial and foliation textures.
[0033] FIG. 10 shows recipient account number 9568342710
represented by two distinct collection of images. In the top
representation, the number 3 is represented with a visual image
using a triangular texture. In the bottom representation the number
4 is represented with the letters "FOUR" using bubble texture to
write the letters. There is also part of a logo in the lower left
corner of FIG. 10.
[0034] FIG. 11 shows a collection of images representing a
universal identifier. A subset of these can be used for user
authentication.
[0035] FIG. 12a shows a user interface page for enrollment.
[0036] FIG. 12b shows a user interface page for enrollment that
displays different visual image categories.
[0037] FIG. 12c shows a user interface page for enrollment that
displays different images in the "sports" category. One image
represents "cycling". Another image represents "tennis". Another
image represents "skiing".
[0038] FIG. 13a shows a user interface page for user verification
or user login using visual images.
[0039] FIG. 13b shows a user interface page for user verification
that displays different visual images. One image displays an
elephant. Another image displays a car.
[0040] FIGS. 14a, 14b and 14c show the use of correlation to detect
and find the locations of features in an image.
[0041] FIG. 14a shows an image of the word "apple" in a handwritten
style font.
[0042] FIG. 14b shows an image, representing the letter "p" in a
handwritten style font.
[0043] FIG. 14c shows a correlation function image, indicating, via
the bright peaks, the presence and exact locations of the letter "p"
in the image in FIG. 14a.
[0044] FIGS. 15a, 15b and 15c show the use of special types of
noise to hinder the use of the correlation operation to find
features in an image.
[0045] FIG. 15a shows an image, representing the word "apple" in the
same handwritten style font as FIG. 14a, but with non-deterministic
noise added.
[0046] FIG. 15b shows a raw image, representing the letter "p" in a
handwritten style font.
[0047] FIG. 15c shows an unintelligible correlation function image,
indicating the inability to detect the locations of the "p" in the
image in FIG. 15a.
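The template matching illustrated in FIGS. 14a-14c, and the way added noise degrades it in FIGS. 15a-15c, can be sketched with normalized cross-correlation. The arrays below are small synthetic stand-ins for the handwritten "apple"/"p" images in the figures, not the actual figure data:

```python
import numpy as np

def ncc(image, template):
    """Normalized cross-correlation of a template against an image.
    Values near 1.0 mark positions where the template appears."""
    th, tw = template.shape
    t = template - template.mean()
    out = np.zeros((image.shape[0] - th + 1, image.shape[1] - tw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            w = image[i:i + th, j:j + tw]
            wz = w - w.mean()
            denom = np.sqrt((wz ** 2).sum() * (t ** 2).sum())
            out[i, j] = (wz * t).sum() / denom if denom > 0 else 0.0
    return out

# Synthetic "image" containing two copies of a small template
# (stand-ins for the word image and the letter "p" of FIGS. 14a-14b).
template = np.array([[0., 1., 0.],
                     [1., 1., 1.],
                     [0., 1., 0.]])
image = np.zeros((8, 12))
image[1:4, 1:4] = template
image[4:7, 8:11] = template

c = ncc(image, template)
peaks = np.argwhere(c > 0.99)   # bright peaks, as in FIG. 14c
print(peaks.tolist())           # [[1, 1], [4, 8]]

# Non-deterministic noise (as in FIG. 15a) degrades the match peak.
rng = np.random.default_rng(0)
cn = ncc(image + rng.normal(0.0, 1.0, image.shape), template)
print(cn[1, 1])                 # well below the clean peak of 1.0
```

The degraded peak is the effect FIG. 15c depicts: with enough injected noise, correlation no longer localizes the feature reliably.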
[0048] FIG. 16 shows a semiconductor device that is a
photodetector. This hardware device can detect the arrival of
single photons, which is an embodiment of quantum randomness.
[0049] FIG. 17 shows a device that receives a polarized photon and
sends it through a linear horizontal/vertical polarization analyzer
with a 50% chance of detecting a "0" or a "1". This hardware device can detect
the polarization of single photons, which is an embodiment of
quantum randomness. Under the device is a diagram representing a
photon that is circularly polarized. Under the device, on the
right, is a diagram representing a photon that is linearly
polarized.
[0050] FIG. 18 shows a random noise generator and a digital logic
circuit that captures and outputs this randomness. Below the
generator are the time delays between separate events that are
detected. In an embodiment, the random noise generator may be
implemented with a photodetector as shown in FIG. 16. In this
embodiment, the arrival times of photons enable quantum
randomness.
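The event-timing randomness suggested by FIG. 18 might be extracted as follows. The pair-comparison scheme below (a von Neumann-style debiaser over consecutive inter-arrival delays) is a common construction assumed here for illustration, not the circuit specified by FIG. 18, and the exponential delays merely simulate a photodetector:

```python
import random

def bits_from_delays(delays):
    """Turn inter-arrival delays into unbiased bits by comparing pairs:
    (shorter, longer) -> 0, (longer, shorter) -> 1, equal pairs dropped.
    Because i.i.d. pairs are exchangeable, 0 and 1 are equally likely
    even if the delay distribution itself is heavily skewed."""
    bits = []
    for d1, d2 in zip(delays[0::2], delays[1::2]):
        if d1 != d2:
            bits.append(0 if d1 < d2 else 1)
    return bits

# Simulated photon inter-arrival times: exponentially distributed,
# as for Poisson arrivals at a photodetector like that of FIG. 16.
random.seed(7)
delays = [random.expovariate(1.0) for _ in range(2000)]
bits = bits_from_delays(delays)
print(len(bits), sum(bits) / len(bits))  # ~0.5 fraction of ones
```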
DETAILED DESCRIPTION
[0051] Although the issues discussed in the background or elsewhere
may have motivated some of the subject matter disclosed below,
nonetheless, the embodiments disclosed below do not necessarily
solve all of the problems associated with the subject matter
discussed in the background or elsewhere. Some embodiments only
address one of the problems, and some embodiments do not solve any
of the problems associated with the subject matter discussed in the
background or elsewhere. In general, the word "embodiment" is used
to specify an optional feature and/or configuration.
[0052] A groundbreaking method for cybersecurity is described that
is more secure against modern malware, and provides a much better
user experience compared with passwords or hardware tokens such as
SecurID. No More Passwords uses visual images that are selected by
a user to create a set of "favorites" that can easily be recalled
and quickly selected by the user at login.
[0053] No More Passwords leverages the superior power of eye-brain
processing of humans versus machines to ensure that a human, and
not a bot or malware, is involved in a transaction or
communication.
[0054] Underlying the simplicity of this approach is a security
technology that includes: [0055] A.) A non-deterministic random
number generator hardware based on quantum physics. [0056] B.)
Noise modification of images using the random number generator.
[0057] C.) Visual Image morphing, positioning and reordering based
on the random number generator. [0058] D.) Transaction-dependent
passcodes.
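A minimal sketch of items B and C: perturbing pixel values with noise and reordering images, both driven by non-deterministic bytes. Python's `secrets` module (operating-system entropy) stands in here for the quantum hardware generator of item A, and all names and parameters are illustrative:

```python
import secrets

def permute(items):
    """Item C: a Fisher-Yates shuffle driven by non-deterministic
    bytes, so the image order is unpredictable to malware."""
    out = list(items)
    for i in range(len(out) - 1, 0, -1):
        j = secrets.randbelow(i + 1)
        out[i], out[j] = out[j], out[i]
    return out

def add_noise(pixels, amplitude=16):
    """Item B: offset each 8-bit pixel by a random amount in
    [-amplitude, +amplitude], clamped to the valid range."""
    return [min(255, max(0, p + secrets.randbelow(2 * amplitude + 1) - amplitude))
            for p in pixels]

icons = ["elephant", "car", "tennis", "cycling", "skiing"]
shuffled = permute(icons)       # same icons, unpredictable order
noisy = add_noise([0, 64, 128, 192, 255])
print(shuffled)
print(noisy)
```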
[0059] The application of all these methods in concert addresses
current cybersecurity issues, as well as anticipating other
possible approaches that hackers may attempt in the future, while
the flexibility of the approach supports the creation of advanced,
user-friendly user interface designs.
[0060] Malware, phishing scams and other various forms of hacking
and cybersecurity breaches have become a major issue today. The use
of passwords is inadequate, inefficient and problematic for users
and companies, and the problems with password use are increasing
steadily.
[0061] The invention(s) described herein uses the unique, innate
pattern recognition skills of humans to transform cybersecurity. It
advances online transaction security, which currently relies mainly
on the straightforward use of passwords or, in some cases, the
addition of other security enhancements that may provide some
improvement in security, but are still inadequate. These measures
typically increase the cost of the system while greatly reducing
the convenience to the user.
[0062] Malware resistant authentication and transaction
authorization is provided through the combined application of
various methods and embodiments. In an embodiment, this invention
can eliminate use of the alpha-numeric "password" such as
"34YUiklmn" or a sequence of ASCII symbols such as
"94Yzi2_e$mx&". The invention(s) herein also provides a basis
for a much-improved user interface and the overall user experience
around securing online transactions, access control, and the
protection of an individual's personal data and identity.
[0063] The invention(s) described herein use visual representations
(images) that are both personal and memorable to each individual
user. There is an enrollment process in which the user selects a
set of images from a group of categories representing the user's
"favorites." At verification (i.e., login time) the user is asked
to select some or all personal favorites from a set of
randomly-selected options as verification of both the user's
identity and the fact that the user is in fact a human instead of
an automated system that has hijacked the transaction flow. This
approach has a number of advantages in terms of convenience to the
user, while allowing anti-malware methods to be applied that
provide substantial anti-hacking capability.
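The verification step described above might be sketched as follows. The screen size, number of favorites shown per screen, and exact-match policy are assumptions for illustration, and `SystemRandom` (operating-system entropy) stands in for the non-deterministic hardware generator:

```python
import secrets

_rng = secrets.SystemRandom()  # OS entropy as a stand-in for hardware randomness

def build_challenge(favorites, decoy_pool, n_fav=2, n_total=9):
    """One verification screen: a few of the user's enrolled favorites
    mixed with randomly chosen incorrect options, shuffled together."""
    correct = _rng.sample(favorites, n_fav)
    decoys = _rng.sample([d for d in decoy_pool if d not in favorites],
                         n_total - n_fav)
    screen = correct + decoys
    _rng.shuffle(screen)
    return screen, set(correct)

def verify(selected, correct):
    """Grant access only if the user tapped exactly the correct images."""
    return set(selected) == correct

favorites = ["elephant", "tennis", "violin", "cycling", "skiing", "lion", "paris"]
pool = ["car", "train", "apple", "drum", "rome", "golf", "tiger",
        "boat", "piano", "rose", "moon", "kite"]
screen, correct = build_challenge(favorites, pool)
print(len(screen), verify(list(correct), correct))
```

Because the favorites and decoys are re-drawn and re-shuffled on every attempt, a replayed or scripted selection does not carry over between sessions.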
[0064] User Interface Design
[0065] The use of visual images to create a unique identity for a
user has many advantages: The system is not only highly secure and
resistant to various hacks and malware attacks, but is also
intuitive, easy to use and attractive to users. The core technology
behind an identity security system should support a user interface
(UI) that provides all of these benefits; there is sufficient
flexibility in the UI design and a range of security-enhancing
features that can be used together in various ways to allow the UI
design to be tailored to the needs of both the user and the device
(e.g., PC, iPhone, Android phone, iPad, tablet computer) on which
it is being used.
[0066] Since interaction with the user is a key part of the
technology, it is helpful to describe a UI design example for two
reasons: 1) to ensure that the technology is both effective and
easy to use; 2) to help explain how embodiments work. The UI should
be designed to run on the device(s) of choice within the intended
application and tested for intuitiveness, ease of use,
functionality, acceptance by and attractiveness to product
users.
[0067] The UI described in this section is intended as an example
design, and shows how it might be implemented on a mobile phone.
The example shown here is only meant to provide general clarity
about what can be done with this technology and to serve as a
high-level use case to describe the flow for creating and entering
a unique login identity for a user.
[0068] Enrollment.
[0069] To enroll, the user first initiates the enrollment. The
process will start with the launching of an application, or a
request to enroll within a running application on a particular
device such as a mobile phone, computer, terminal or website. In
the example here, and figures below, the device is a mobile phone
and the user starts the enrollment process by launching an app.
[0070] Once launched, the application starts enrollment by
displaying the first enrollment screen with a superimposed popup
window that provides brief instructions for enrollment and a box in
which the user is asked to enter a username. This is shown in FIG.
12a.
[0071] As soon as the username has been entered, the popup window
disappears, showing the first enrollment screen that provides a
list of categories for the user's "favorites", as shown in FIG.
12b. These categories can include almost anything, such as animals,
musical instruments, travel destinations, famous people, sports,
etc. In this example, a few items from the list of categories are
displayed on the screen, but the central portion of the screen with
the category icons can be scrolled up or down to show other
choices. The items that the user has currently chosen are shown as
small icons at the bottom of the screen. This part of the display
(and the header at the top) does not scroll, and provides a running
tally of the user's choices throughout the process. This also
serves as messaging to the user as to the progress of the
enrollment process.
[0072] In an embodiment, after a category has been selected, a
second screen appears showing specific items in the chosen
category. This screen is shown in FIG. 12c. The tally of choices is
carried over, and shown at the bottom of this screen as well.
Again, in this example, the central portion of the screen with the
icons can be scrolled up or down to display more than the nine
items shown on the screen at one time. The user can then select
his/her item from the available choices. Once a selection is made,
the item chosen appears in the running tally below, and the display
reverts back to the first enrollment screen which provides the
category choices again.
[0073] This process is repeated seven times in this example. The
number of choices required from the user for enrollment can be
changed, depending on the security level required, and an
acceptable enrollment process for a particular case. In general,
the fewer the choices required by the user, the less secure the
embodiment will be, but the trade off between security and ease of
use is important, and should be decided on a case-by-case
basis.
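The trade-off above between the number of enrolled choices and security can be made concrete with a small calculation; the screen size and favorite counts below are assumptions for illustration, not figures from the specification:

```python
def blind_guess_probability(n_screens, options_per_screen):
    """Chance that malware with no knowledge of the user's favorites
    picks the single correct image on every verification screen."""
    return (1.0 / options_per_screen) ** n_screens

# Seven enrolled favorites, each verified on a screen of nine options:
p = blind_guess_probability(7, 9)
print(f"1 in {round(1 / p):,}")   # 1 in 4,782,969

# Dropping to four favorites weakens this considerably:
print(f"1 in {round(1 / blind_guess_probability(4, 9)):,}")   # 1 in 6,561
```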
[0074] FIG. 12a shows page 1 of an enrollment user interface (UI),
which shows the popup window superimposed requesting entry of the
username. In an embodiment, for added security, the username is
obscured in a similar way as is typically accomplished with a
password field.
[0075] FIG. 12b shows page 1 of the enrollment UI after the
disappearance of the popup window, showing the scrollable
"favorites" category selections and the boxes at the bottom of the
screen where the running tally of the user's choices will be
displayed.
[0076] FIG. 12c shows enrollment page 2, showing nine of the
specific item choices available and allowing the user to scroll for
more. In this diagram, the user has selected "sports" as the category on
the previous page, and this is the third favorite, as indicated by
the running tally below showing the two previous choices, and the
note in the header that reads "selection 3."
[0077] Verification (Login)
[0078] While it may be acceptable for the enrollment process to
take a few minutes, and require the user to be guided through
multiple steps, verification should be as quick and simple as
possible. This is well-known in biometrics since biometric devices
usually require a series of steps to enroll. At verification,
however, the expectation of the user is that the use of the
technology will make verification of their identity not only more
secure, but much easier and faster. The same applies to these
implementations. Despite widespread identity theft and hacking,
many users are far more concerned with convenience than they are
about security.
[0079] In the example presented here, an enrolled user initiates
verification by launching an image-enabled app, or requesting login
to a local or remote system. Immediately, the verification screen
appears with a randomized group of choices for the user, and a
popup window superimposed that requests entry of the username, as
depicted in FIG. 13a. Once the username is entered, the popup
window disappears exposing the screen with the randomized choices
for the user to select. This is shown in FIG. 13b. The options
offered on a single screen contain at least one of the user's
"favorite" images that were selected at enrollment, but also
contain a number of other "incorrect" options that are selected
randomly from a large set of options. The central portion of the
screen with the icons of the selectable items can be scrolled up or
down to expose more choices. In this example, to pass verification,
the user chooses four of the seven favorite items that were chosen
at enrollment. Once all favorites have been correctly selected, the
screen disappears, and login proceeds. If the choices are
incorrect, the login process starts over again from the beginning.
For added security, there may be a limit placed on the number of
failed attempts a user can make in a login session.
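The verification flow just described can be sketched as follows. The function names, decoy counts, and lockout limit are illustrative assumptions, and Python's `secrets` module stands in for the quantum hardware randomness source an embodiment would use.

```python
import secrets

def build_challenge(favorites, decoy_pool, n_favorites=1, n_decoys=8):
    """Mix at least one enrolled favorite with randomly chosen decoys."""
    rng = secrets.SystemRandom()
    shown = rng.sample(sorted(favorites), n_favorites)
    shown += rng.sample(sorted(decoy_pool - favorites), n_decoys)
    rng.shuffle(shown)  # unpredictable placement on the screen
    return shown

def verify(selections, favorites, required=4, attempts=0, max_attempts=3):
    """Pass only if every selection is an enrolled favorite and at
    least `required` distinct favorites were chosen."""
    if attempts >= max_attempts:
        return False  # lock out after too many failed sessions
    chosen = set(selections)
    return chosen <= favorites and len(chosen) >= required

# Illustrative demo: seven enrolled favorites, four required to pass.
favorites = {"apple", "dog", "guitar", "ski", "rose", "car", "moon"}
pool = favorites | {"cat", "hat", "sun", "tree", "fish", "boat",
                    "star", "cup"}
screen = build_challenge(favorites, pool)
```

A production system would also randomize the images themselves, as described in the sections on noise and image modification.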
[0080] FIG. 13a shows a verification user interface (UI) page with
the popup window superimposed requesting entry of the username. In
an embodiment, the verification page is a single page to make
verification quick and simple. Also note again that for added
security, the username can be obscured in the same way as is
typically done with a password field.
[0081] FIG. 13b shows the verification user interface page after
the disappearance of the popup window showing the scrollable
"favorites" selections that have been randomly selected from a
large array of options. As indicated in the footer at the bottom of
the screen, the application is ready for the third favorite out of
a total of four required for verification. In other embodiments,
more than four favorites may be requested for a successful login.
In another embodiment, more than four favorites may be requested to
complete a financial transaction. In other embodiments, fewer than
four favorites may be requested for a successful login.
[0082] In embodiments, robust security is desired, but convenience
and a positive user experience are also important. There is
sometimes a tradeoff between security and
convenience for the user, and this tradeoff is fundamental to
security technology from the old-fashioned lock and key, to the
most modern and sophisticated security technology used today.
[0083] The security achieved depends on the number of favorites
required during enrollment, the number of favorites needed to
verify, any specific requirements on the order of the choices, and
the layout and presentation of the images themselves. For example,
if the user is required to select his/her favorite images in the
same order they were chosen at enrollment, this increases the
security greatly, yet the images remain much easier to remember
than a password, since people memorize and remember by association.
Each person has his own personal unique association, which makes
this a natural approach to a stronger, more effective security
system.
[0084] It is helpful to note that the technology and embodiments
have flexibility in this aspect, and that the choice of these
parameters can be adjusted, not only from one application to
another, but if desired, from one transaction to another. For
example, if in an embodiment a user has chosen seven items at
enrollment, he/she may be asked to select only four items to unlock
the phone interface, but when logging into a bank account, he/she
may be asked to enter all seven items. In an alternative
embodiment, the user may be requested to select 12 items instead of
7. This means that the technology can be adjusted "on the fly" to
accommodate varying security levels for different embodiments.
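This "on the fly" adjustment amounts to a per-action policy. The table below is a minimal sketch; the action names and counts are illustrative assumptions, not values from this specification.

```python
# Hypothetical policy table mapping an action to the number of
# favorites the user must select; values are illustrative only.
POLICY = {
    "unlock_phone": 4,    # lower-risk action: subset of favorites
    "bank_login": 7,      # higher-risk action: all enrolled favorites
    "wire_transfer": 12,  # stricter contexts may demand extra items
}

def required_favorites(action, default=4):
    """Return how many favorite selections this action requires."""
    return POLICY.get(action, default)
```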
[0085] In addition, as explained in a previous section, the use of
images, combined with image processing and a non-deterministic
random number generator, makes the UI and the
system secure against sophisticated malware and hacking methods.
The images shown in the UI diagrams above can be reordered, and the
options offered can be changed using the non-deterministic random
numbers on every screen during enrollment and verification. This
removes the possibility of malware or onlookers recognizing
patterns in what is being presented to the user, or following the
user's behavior. As explained above, to address security, the
images themselves are modified to prevent sophisticated malware
from running in the background to recognize the images directly by
means of computational pattern recognition. This can be
accomplished by again using the non-deterministic random number
generator to produce unpredictable parameters for the algorithms
that modify the images using special types of noise, or applying
rotation or translation to change the orientation or position of
the image on the screen, or distorting the images slightly to
change their shape. In fact, all of the above modifications can be
applied simultaneously, randomly to each image, differently on
every step in the enrollment or verification process, each time it
is used. The same can be done to the text on the screen in order to
make it unreadable by malware as well, if needed. Because the human
eye/brain system is so highly adept at recognizing images, these
modifications to the images can be made so that it is extremely
difficult for sophisticated malware to recognize what is happening
on the device, without spoiling the human user's experience.
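A minimal sketch of this per-image randomization follows, assuming the image is a NumPy bitmap. Here `np.random.default_rng` stands in for the non-deterministic hardware generator, and rotation and geometric distortion are omitted for brevity; the shift range and noise level are illustrative assumptions.

```python
import numpy as np

def randomize_image(img, rng, max_shift=5, noise_level=0.3):
    """Randomly translate a binary image and overlay unpredictable
    pixel-flip noise, per the modifications described above."""
    dy, dx = rng.integers(-max_shift, max_shift + 1, size=2)
    shifted = np.roll(img, (dy, dx), axis=(0, 1))   # random translation
    noise = rng.random(img.shape) < noise_level     # random pixel flips
    return np.logical_xor(shifted.astype(bool), noise).astype(img.dtype)

rng = np.random.default_rng()
img = np.zeros((32, 32), dtype=np.uint8)
img[12:20, 12:20] = 1                  # a simple square "glyph"
out = randomize_image(img, rng)
```

In an actual embodiment the parameters `dy`, `dx`, and the noise mask would be derived from the quantum random bits rather than a software generator.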
[0086] As stated above, the UI design presented here is an example
of how embodiments can be implemented. There are other UI
embodiments that use visual images for login and entry of
information in a non-digital or non-ASCII format. The intent is to
highlight the main components that make up this system, while
showing flexibility. The exact layout and features of the UI are up
to the designer of the product or system which uses the technology.
Depending on the details of the device, the application and the
security requirements, the user interface may be configured very
differently. On some systems, it may be best to guide users through
a series of separate screens instead of scrolling. If scrolling is
preferred, it can be done in one or two dimensions on the screen,
or perhaps using scroll wheels, similar to those used in the Apple
iPhone's date and time settings. In some cases, more category
options, or sub category options may be useful. During the
verification process, if preferred, the items can be categorized,
similar to the example for enrollment, and it may be desirable to
have all the choices displayed on a single screen, rather than
offering more items to choose via scrolling, in which case the
categories could be panelized on the screen.
[0087] The choice of the images used is also to be considered.
Simple binary images, such as those shown in the example of an
embodiment, may be used in some embodiments. Full-color images
could be used as well, depending on what sort of image processing
is preferred for security enhancements. The shape and size of the
images is flexible as well. The images chosen could even be opened
up to the user by providing a large database of downloadable
images, similar to the wide array of ringtones now available for
cell phones. There may be some restrictions on the properties of
the images used, however, again depending on the specifics of the
security needs, the device, and the user interface design, but
overall, it is extremely flexible.
[0088] Security Advantages
[0089] Given the dangers posed by malware, it is essential that
recipients of internet dataflow in a transaction can be assured
that the sender is human and the recipient (on the server side) is
the actual institution (e.g., a bank) and not malware posing as a
bank. The solution ensures a live human is reading, entering and
broadcasting information. A GUI based on special processed images
renders messages that are "unreadable" by machines or automated
processes. This robust security solution is web-server driven,
making it usable by personal computers, mobile devices, and any
device with a visual interface. Before describing the interface and
GUI, we discuss some security advantages.
[0090] Unpredictability
[0091] On the web server, the system uses one or more hardware devices that
utilize fundamental laws of physics to generate non-deterministic
random numbers. This is in contrast to the use of pseudo-random
number generators in RSA SecurID, for example, which are based on
deterministic algorithms. These unpredictable numbers are used for
three major purposes:
[0092] Unpredictable numbers are used to unpredictably place images
on the screen.
[0093] Unpredictable numbers are used to unpredictably change the
image shape.
[0094] Unpredictable numbers are used to add unpredictable noise to
images.
[0095] Given this unpredictability at multiple sites, the sequence
of images used for a login/authentication cannot be reproduced by a
digital computer program because the numbers are not generated by a
deterministic algorithm (i.e., a digital computer program).
Instead, quantum devices are used. In some embodiments, the quantum
devices utilize one or more photons being emitted from a device and
generating a random 0 or 1 based on the time at which the photon is
emitted.
[0096] A well-designed quantum device can generate numbers
satisfying the following two properties of quantum randomness:
there is no bias, and history has no effect on the next event.
[0097] There is no bias: A single outcome x_k of a bit sequence
(x_1 x_2 . . . ) generated by quantum randomness is unbiased:
P(x_k = 1) = P(x_k = 0) = 1/2.
[0098] History has no effect on the next event: Each outcome x_k is
independent of the history. No correlation exists between previous
or future outcomes. For each b_j ∈ {0, 1},
P(x_k = 1 | x_1 = b_1, . . . , x_(k-1) = b_(k-1)) = 1/2 and
P(x_k = 0 | x_1 = b_1, . . . , x_(k-1) = b_(k-1)) = 1/2.
[0099] Let Π = {(b_1 b_2 . . . ): b_k ∈ {0, 1}} be the space of
infinite sequences of 0's and 1's representing infinite quantum
random bit sequences. It can be shown that if a quantum device
producing the quantum randomness runs under ideal conditions to
infinity, then the resulting infinite sequence of 0's and 1's
(i.e., a sequence in Π) is incomputable.
In other words, no digital computer program (i.e., deterministic
algorithm) can reproduce this infinite sequence of 0's and 1's.
This incomputability of quantum random sequences is a useful
property of non-deterministic random numbers. The resulting
unpredictability incorporated into the image generation and
manipulation in the system can make the recognition of the visual
images a difficult artificial intelligence (AI) problem for
machines. This unpredictability can be applied in the noise
generation that is used to make visual images more difficult for
machine algorithms to recognize.
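The two properties above can be checked empirically on any candidate bit stream. In the sketch below the operating system's entropy source stands in for a quantum device, and the tolerances are illustrative: the fraction of 1s should be near 1/2 (no bias), and the fraction of 1s following a 1 should also be near 1/2 (history has no effect).

```python
import secrets

# Draw a stand-in bit stream; a real test would read bits from the
# quantum hardware device instead.
bits = [secrets.randbits(1) for _ in range(100_000)]

# No bias: empirical P(1) should be close to 1/2.
ones = sum(bits) / len(bits)

# History has no effect: empirical P(1 | previous bit was 1)
# should also be close to 1/2.
after_one = [b for prev, b in zip(bits, bits[1:]) if prev == 1]
cond = sum(after_one) / len(after_one)

print(f"P(1) ~ {ones:.3f}, P(1 | previous 1) ~ {cond:.3f}")
```

Passing such checks is necessary but not sufficient; a deterministic generator can also pass them, which is why the incomputability argument above matters.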
[0100] In an embodiment, a hardware device, as shown in FIG. 17,
detects the polarization of photons and uses this detection to
determine a quantum random 0 or 1. In an embodiment, the hardware
detector uses linearly polarized photons (light). In an embodiment,
the hardware detector uses circularly polarized photons (light). In
an embodiment, a quantum random 0 or 1 is generated by the
detection of a single photon. In an alternative embodiment, a
quantum random 0 or 1 is generated by the detection of more than
one photon.
[0101] In an embodiment, as shown in FIG. 18, a quantum random 0 or
1 is generated based on the relative timing of quantum events 0, 1
and 2. In FIG. 18, T_1 is the time elapsed between quantum event 0
and quantum event 1; T_2 is the time elapsed between quantum event
1 and quantum event 2. In an embodiment, if elapsed time T_1 is
greater than elapsed time T_2, then a quantum random 1 is
generated; if elapsed time T_1 is less than elapsed time T_2, then
a quantum random 0 is generated. In an alternative embodiment, if
elapsed time T_1 is greater than elapsed time T_2, then a quantum
random 0 is generated; if elapsed time T_1 is less than elapsed
time T_2, then a quantum random 1 is generated. In an embodiment,
events 0, 1, and 2
are the result of detecting a photon. In another embodiment, events
0, 1 and 2 are the result of detecting a photon that is
horizontally polarized.
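The T_1/T_2 comparison of FIG. 18 can be sketched as follows, with detector inter-event gaps simulated by random integers (a real embodiment would read these from the photon detector). Discarding equal gaps is an added assumption, since the specification does not define the output when T_1 = T_2.

```python
import secrets

def bits_from_timings(gaps):
    """Emit 1 if T_1 > T_2 and 0 if T_1 < T_2, pairing consecutive
    inter-event gaps; equal gaps are skipped (an assumption)."""
    out = []
    for t1, t2 in zip(gaps[0::2], gaps[1::2]):
        if t1 != t2:
            out.append(1 if t1 > t2 else 0)
    return out

# Simulated inter-event gaps standing in for detector timestamps.
gaps = [secrets.randbelow(1_000_000) for _ in range(2000)]
bits = bits_from_timings(gaps)
```

Because T_1 > T_2 and T_1 < T_2 are equally likely for identically distributed independent gaps, the comparison yields unbiased bits without calibrating the detector's absolute rate.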
[0102] In an embodiment, the detection of a photon may occur in a
semiconductor chip as shown in FIG. 16.
[0103] Noise
[0104] As the number, scope and value of transactions being
conducted via the Internet and through the use of mobile devices
increases, so do the incentives for hackers to apply ever greater
resources to their craft. At the same time, the available computing
power that can be applied by malware towards attacks of escalating
sophistication is increasing. Smart phones today have unprecedented
number crunching power; while this power can be used to create
clever security systems, it can also be harnessed by malware at any
node in the communication path to which malware can gain
access.
[0105] In embodiments, images help ensure that a human, not a
machine, is controlling the transaction or the communication
between the user and institution. This is based on the highly
developed ability of humans to recognize images. Although machine
vision is embryonic by comparison with the mature image recognition
abilities of the human eye-brain combination, it is possible for
machines to recognize images. In order to provide robust security
in anticipation of the possibility that sophisticated malware may
incorporate machine vision techniques to attack image-based
security systems, proprietary methods were developed to counteract
computational image recognition, and fully exploit innate human
pattern recognition abilities.
[0106] One widely used approach to computational pattern
recognition is the correlation operation. This is a direct
point-by-point mathematical "comparison" of two functions that can
be used not only to detect the presence of a feature in an image,
but to also find its location accurately. The continuous expression
that describes the non-normalized correlation operation C between
two real, one-dimensional functions A and B is:
C(t') = A(t) ⊛ B(t) = ∫_{-∞}^{+∞} A(t + t') B(t) dt
[0107] The ⊛ operator represents the correlation operation. In
discrete form, as implemented in a digital computer, the
correlation can be written as:
C(t') = Σ_t A(t + t') B(t)
[0108] This can be extended to two dimensions for use with images
as:
C(x', y') = Σ_y Σ_x A(x + x', y + y') B(x, y)
[0109] It can be further extended to find the rotational
orientation of one image with respect to the other as:
C(x', y', θ) = Σ_y Σ_x [R(θ) A](x + x', y + y') B(x, y)
where R(θ) is a rotation operator applied to A.
[0110] One reason the correlation operation is so powerful and
widely used is that the calculation of the correlation function can
be done efficiently using a fast Fourier transform (FFT).
Performing the correlation operation directly, point-by-point, can
be done very rapidly with modern computers for small images, but
the computational complexity increases as N², where N is the
number of data points in the image being cross-correlated (for
images of equal size). However, the correlation operation can be
calculated using FFTs as follows:
A ⊛ B = IFFT(FFT(A) × FFT(B))
[0111] Here, A and B are the two image arrays and "IFFT" represents
the inverse FFT operation. This computation scales with image size
much more slowly and increases as N*log(N). In addition, since the
FFT is so widely used for many data processing tasks, and FFTs are
a common component of most floating-point benchmark tests for
processors, many modern processors are designed with FFTs in mind
and some are even optimized for performing FFTs. Therefore, for
sufficiently large images, the use of FFTs to compare images is
efficient. However, as the complexity of the correlation increases,
for example if rotation is added, the computational load increases
quickly, making computational pattern recognition more
difficult.
[0112] If the images are small enough, the use of FFTs for doing
correlations becomes inefficient compared with direct correlation
because of the extra computations needed to perform the forward
FFTs and the inverse FFT. Nevertheless, image recognition using
correlation operations can be extremely effective given the power
of modern computers and the ability to choose between direct
correlation and FFT-based correlation depending on image size.
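As a concrete check of the equivalence described above, the sketch below computes the circular cross-correlation of two small images both directly and via FFTs; note the complex conjugate on the second transform, which circular cross-correlation of real images requires.

```python
import numpy as np

def direct_corr(a, b):
    """Direct circular cross-correlation:
    C(x', y') = sum_{x,y} A(x + x', y + y') B(x, y)."""
    n, m = a.shape
    c = np.empty_like(a, dtype=float)
    for xp in range(n):
        for yp in range(m):
            # np.roll shifts A so element [x, y] is A[x + xp, y + yp].
            c[xp, yp] = np.sum(np.roll(a, (-xp, -yp), axis=(0, 1)) * b)
    return c

def fft_corr(a, b):
    """Same correlation computed in O(N log N) via FFTs."""
    return np.real(np.fft.ifft2(np.fft.fft2(a) * np.conj(np.fft.fft2(b))))

rng = np.random.default_rng(0)
a = rng.random((16, 16))
b = rng.random((16, 16))
assert np.allclose(direct_corr(a, b), fft_corr(a, b))
```

The direct loop scales as N² in the number of pixels while the FFT route scales as N·log(N), matching the complexity discussion above.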
[0113] It is important that the system be resistant to hacking
through the use of correlation operations, and other computational
pattern recognition techniques. Consequently, techniques can be
applied to images to disrupt the use of correlation operations that
either recognize images or locate features within an image, yet the
image remains fully recognizable by a living human observer.
[0114] One of these techniques is the processing of the image using
a specialized noise structure to create a "noise modified image."
There are several different noise structures that can use the
non-deterministic random numbers generated by quantum physics-based
hardware. Having various noise structures further enhances the
security of the technique because the type of noise used to modify
the image can be varied.
[0115] An example of using the noise structure is demonstrated in
FIGS. 14 and 15 below. In the binarized (black or white pixels
only) image in FIG. 14a, both the presence and exact locations of
the letter "p" are found in the word "apple" using a correlation
operation. When the "noise modified image" is correlated with an
exact copy of the letters used in the base image, the result is
unintelligible noise as shown in FIG. 15c.
[0116] FIGS. 14a, 14b and 14c show the use of the correlation
operation to detect and find the locations of features in an image.
FIG. 14a shows an image containing the word "apple" in a
handwritten style font. FIG. 14b shows an image of the letter
"p" in the same font used in the image. FIG. 14c shows the
correlation function image showing the detection of the presence,
and exact locations of the letter "p" in the image in FIG. 14a,
indicated by the bright peaks in FIG. 14c.
[0117] FIGS. 15a, 15b and 15c show the addition of special types of
noise to defeat the use of the correlation operation to find
features in an image. FIG. 15a shows an image containing the same
word "apple" in the handwritten style font from FIG. 14a but with a
special type of noise added that enhances the contrast of the noise
over the letters versus the background, where the noise contrast is
reduced.
[0118] FIG. 15b shows the raw image of the letter "p" in the same
font used in the original image before the noise is added.
[0119] FIG. 15c shows the correlation function image, which is
unintelligible, indicating the inability to detect the presence or
locations of the letter "p" in the image in FIG. 15a.
[0120] In addition to the various noise structures that can be
used, other randomized mathematical transformations can be applied
to the images to make them even more difficult for machine
algorithms to hack. These transformations include (1) translation,
as in the figures above with the letters in the word "apple" being
shifted up and down randomly; (2) rotation; (3) various types of
morphing, including size and aspect ratio changes as well as both
linear and non-linear geometric distortion. All of these
transformations can be based on the non-deterministic random number
generator for maximum security. Several of these different
modifications can all be applied to a single image simultaneously,
making recognition by a machine nearly impossible. Again, the image
of the word "apple" in FIG. 15a is an example. Here, the letters
are distorted slightly in shape and size, their positions are
randomly altered, and the noise structure is applied.
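The effect shown in FIGS. 14 and 15 can be sketched numerically. In the sketch below, noise is applied heavily over the glyph and lightly over the background (mirroring the contrast structure described for FIG. 15a), and a simple peak-to-mean ratio of the correlation surface stands in for "detectability." The glyph, noise rates, and metric are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng()

img = np.zeros((64, 64))
img[20:44, 28:36] = 1.0                 # a simple vertical-bar "letter"

# Noise-modified image: heavy pixel flips over the letter, light
# noise over the background.
flip_fg = rng.random(img.shape) < 0.45
flip_bg = rng.random(img.shape) < 0.05
noisy = np.where(img > 0,
                 np.logical_xor(img.astype(bool), flip_fg),
                 flip_bg).astype(float)

def peak_to_mean(image, template):
    """Sharpness of the circular correlation peak (higher = easier
    for a machine to detect the template)."""
    c = np.real(np.fft.ifft2(np.fft.fft2(image) *
                             np.conj(np.fft.fft2(template))))
    return c.max() / (np.abs(c).mean() + 1e-12)

clean_score = peak_to_mean(img, img)    # clean image vs. its template
noisy_score = peak_to_mean(noisy, img)  # noise-modified image vs. template
```

The correlation peak for the noise-modified image is markedly weaker, while a human viewer would still read the underlying shape easily.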
[0121] These noise methods may be applied to number images (e.g.,
images of the numbers 0, 1, 2, 3, 4, 5, 6, 7, 8 or 9), images of
animals, images of sports items, face images, and other images of
favorites.
[0122] In some embodiments, security solutions are provided for
secure transactions against untrusted browser attacks and other
cyberattacks. In some embodiments, the solution(s) described in the
specification secure payment transactions. In other embodiments,
the solution(s) may secure access and use of private networks such
as Secret Internet Protocol Router Network (SIPRnet) or resources
on a public infrastructure such as the electrical grid.
The System
[0123] FIG. 1A shows an embodiment of a system 100 for providing
secure transactions. In an embodiment, system 100 may include user
system 101, and user system 101 may include secure area 102, secure
memory system 104, secure processor system 106, output system 108,
input system 110, sensor 111, communication system 112, memory
system 114, processor system 116, input/output system 118,
operating system 120, and network interface 122. System 100 may
also include network 124 and service provider system 126. In other
embodiments, system 100 may not have all of the elements or
features listed and/or may have other elements or features instead
of, or in addition to, those listed.
[0124] System 100 is a system within which a secure transaction
takes place (FIGS. 1A, 1B, 2A, 2B, 3, 3A, and 3B describe various
details of system 100 and various methods for using system 100). In
this specification the word system refers to any device or system
of devices that communicate with one another. User system 101 is
one that has a secure area that is dedicated for performing secure
transactions over a network. User system 101 may be a single device
or a combination of multiple devices. User system 101 may be a
portable device, personal computer, laptop, tablet computer,
handheld computer, mobile phone, or other network system, for
example (in this specification a network system is any device or
system that is capable of sending and/or receiving communications
via a network). In an embodiment, a secure area 102 may be provided
for performing secure transactions. In this specification,
authentication information refers to any form of information
used for authenticating a user. In an embodiment, within secure
area 102, authentication information, such as a biometric
authentication and/or another form of authentication is bound to
the authorization of an action. In other words, the authentication
information is in some way combined with the information for
performing the action, such as by being concatenated together and
then applying a hash function to the result of the concatenation.
In this specification, the words "action" and "transaction" may be
switched one with another to obtain different embodiments.
Throughout this specification, whenever information is disclosed as
being combined, the information may be concatenated, added together
(e.g., in a binary addition of the binary values of information),
be different inputs to the same function, and/or combined in
another manner.
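A minimal sketch of the concatenate-then-hash binding described above, using SHA-256; the byte strings are illustrative placeholders, not the specification's data formats. Because the authentication information and the action description are hashed together, neither can be altered or substituted without changing the digest.

```python
import hashlib

def bind(auth_info: bytes, action: bytes) -> str:
    """Bind authentication information to an action by concatenating
    the two and applying a one-way hash function."""
    return hashlib.sha256(auth_info + action).hexdigest()

# Illustrative values only.
token = bind(b"user-biometric-template", b"pay $25 to ACME")
tampered = bind(b"user-biometric-template", b"pay $2500 to ACME")
```

Any tampering with the transaction (here, the amount) produces an unrelated digest, so the verifier detects the substitution.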
[0125] A hash function, denoted by Φ, is a function that
accepts as its input argument an arbitrarily long string of bits
(or bytes) and produces a fixed-size output. In other words, a hash
function maps a variable-length message m to a fixed-size output,
Φ(m). Typical output sizes are 160 bits, 256 bits, or 512
bits, and can also be substantially larger.
[0126] An ideal hash function is a function Φ whose output is
uniformly distributed in the following way: Suppose the output size
of Φ is n bits. If the message m is chosen randomly, then for
each of the 2^n possible outputs z, the probability that
Φ(m) = z is 2^-n. In an embodiment, the hash functions that
are used are one-way. A one-way function Φ has the property
that given an output value z, it is computationally extremely
difficult to find a message m_z such that Φ(m_z) = z. In
other words, a one-way function Φ is a function that can be
easily computed, but whose inverse Φ^-1 is extremely
difficult to compute. Other types of one-way functions may be used
in place of a hash function.
[0127] Any of a number of hash functions may be used. One possible
hash function is SHA-1, designed by the National Security Agency
and standardized by NIST; its output size is 160 bits. Other
alternative hash functions conform to the standards SHA-256, which
produces output values of 256 bits, and SHA-512, which produces
output values of 512 bits. A hash function could also be one of the
SHA-3 candidates, such as BLAKE, Grøstl, JH, Keccak, or
Skein.
[0128] In an embodiment, secure area 102 may have its own secure
processor system and secure memory system, which are not accessible
by the rest of user system 101. Secure area 102 may be capable of
taking over and/or blocking access to other parts of user system
101.
[0129] Secure memory system 104 may be a dedicated memory for
securing transactions. In an embodiment, secure memory system 104
may not be accessed by the other processor systems of user system
101. Memory system 104 may include, for example, any one of, some
of, any combination of, or all of a long-term storage system, such
as a hard drive; a short-term storage system, such as random access
memory; a removable storage system, such as a floppy drive or a
removable drive; and/or flash memory. Memory system 104 may include
one or more machine-readable mediums that may store a variety of
different types of information. Secure memory system 104 may store
methods and information needed to perform the secure transaction,
user information, a method of generating a registration key, and
encryption/decryption code. Secure memory system 104 may include
one or more memory units that each write and/or read to one or more
machine readable media. The term machine-readable medium is used to
refer to any non-transient medium capable of carrying information that
is readable by a machine. One example of a machine-readable medium
is a computer-readable medium. Another example of a
machine-readable medium is paper having holes that are detected
that trigger different mechanical, electrical, and/or logic
responses. The content of secure memory 104 is discussed further in
FIG. 1B, below.
[0130] Secure processor system 106 may include one or more
processors. Secure processor system 106 may include any one of,
some of, any combination of, or all of multiple parallel
processors, a single processor, a system of processors having one
or more central processors and/or one or more specialized
processors dedicated to specific tasks. Secure processor system 106
implements the machine instructions stored in secure memory system
104. Secure processor system 106 may include one or more processors
that cannot be accessed by the main processor of user system 101.
For example, in an embodiment, none of the processors of secure
processor system 106 can be accessed by the main processor of
system 101. In an embodiment, the
operating system of user system 101 may have no access to secure
area 102, and in an embodiment, secure area 102 may be programmed
without benefit of an operating system, so that there is no
standard manner of programming secure area 102, which thwarts
hackers from sending read and/or write commands (or any other
commands) to secure area 102, because secure area does not use
standard read and write commands (and does not use any other
standard commands). As a consequence, providing secure area 102
addresses the weakness of biometric authentication and other
authentication methods.
[0131] Output system 108 may include any one of, some of, any
combination of, or all of a monitor system, a handheld display
system, a printer system, a speaker system, a connection or
interface system to a sound system, an interface system to
peripheral devices and/or a connection and/or interface system to a
computer system, intranet, and/or internet, for example. In an
embodiment, secure processor system 106 may be capable of taking
over and using any portion of and/or all of output system 108. In
an embodiment, a portion of the output system may be a dedicated
display system that may be accessed only by secure area 102. In an
embodiment, secure processor 106 may be capable of receiving input
from input system 110 and/or blocking access to output system 108
by the main processor system and/or other devices.
[0132] Input system 110 may include any one of, some of, any
combination of, or all of a biometric sensor 111, a keyboard
system, a touch sensitive screen, a tablet pen, a stylus, a mouse
system, a track ball system, a track pad system, buttons on a
handheld system, a scanner system, a microphone system, a
connection to a sound system, and/or a connection and/or interface
system to a computer system, intranet, and/or internet (e.g. IrDA,
USB). In an embodiment, biometric sensor 111 may be a finger print
scanner or a retinal scanner. In an embodiment, user system 101
stores the processed data from user information 104B during
registration. In an embodiment user system 101 retrieves user
information 104B and compares the scanned output of sensor 111 to
user information 104B to authenticate a user. In an embodiment
secure processor 106 may be capable of receiving input from input
system 110 and/or blocking access to input system 110 by the main
processor system and/or other devices. In at least one embodiment,
processor 116 may capture pressure (e.g., pressing fingers) events
on a touch sensitive screen or a mouse clicking corresponding to
something of interest (e.g., a visual image) on a PC display. FIG.
5 shows images of part of an icon, the word "NAME" and the letters
of the alphabet "ABCDEFGHIJKLMNOPQRSTUVWXYZ".
[0133] Communication system 112 communicatively links output system
108, input system 110, memory system 114, processor system 116,
and/or input/output system 118 to each other. Communications system
112 may include any one of, some of, any combination of, or all of
electrical cables, fiber optic cables, and/or means of sending
signals through air or water (e.g. wireless communications), or the
like. Some examples of means of sending signals through air and/or
water include systems for transmitting electromagnetic waves such
as infrared and/or radio waves and/or systems for sending sound
waves.
[0134] Memory system 114 may include, for example, any one of, some
of, any combination of, or all of a long-term storage system, such
as a hard drive; a short-term storage system, such as random access
memory; a removable storage system, such as a floppy drive or a
removable drive; and/or flash memory. Memory system 114 may include
one or more machine-readable mediums that may store a variety of
different types of information. Memory system 114 and memory system
104 may use the same type of memory units and/or machine-readable
media. Memory system 114 may also store the operating system of
user system 101 and/or a web browser (which may also be referred to
as an HTTP client). In an embodiment, memory system 114 may also
store instructions for input system 110 to read in biometric data
and send the biometric data to secure area 102.
[0135] Processor system 116 may include one or more processors.
Processor system 116 may include any one of, some of, any
combination of, or all of multiple parallel processors, a single
processor, a system of processors having one or more central
processors and/or one or more specialized processors dedicated to
specific tasks. Processor system 116 implements the machine
instructions stored in memory 114. In an embodiment, processor 116
does not have access to secure area 102. In at least one
embodiment, processor 116 may capture pressure (e.g., pressing
fingers) events on a touch sensitive screen or a mouse clicking
corresponding to something of interest (e.g., a visual image) on a
PC display.
[0136] In an embodiment, clicking on the red letter "R" (e.g., via
image entry 179 in FIG. 1B) shown at the bottom of the FIG. 6 would
have a similar effect to typing the letter "R" on the keyboard but
would make it more difficult for malware to know what the user is
entering.
[0137] In an alternative embodiment, processor 116 only
communicates with secure area 102 when secure area 102 authorizes
processor 116 to communicate with it. Secure area 102 may prevent
processor 116 from communicating with secure area 102 during the
secure area's execution of critical operations such as setup, key
generation, registration key generation, biometric authentication,
or decryption of transaction information.
[0138] Input/output system 118 may include devices that have the
dual function as input and output devices. For example,
input/output system 118 may include one or more touch sensitive
screens, which display an image and therefore are an output device
and accept input when the screens are pressed by a finger or
stylus, for example. In at least one embodiment, the user may see
visual images of letters on a screen as shown in FIG. 5. In FIG. 5,
pressing a finger over the letter "B" shown just below the word
NAME would indicate typing or entering the letter "B".
[0139] The touch sensitive screen may be sensitive to heat and/or
pressure. One or more of the input/output devices may be sensitive
to a voltage or current produced by a stylus, for example.
Input/output system 118 is optional, and may be used in addition to
or in place of output system 108 and/or input device 110. In an
embodiment, a portion of the input/output system 118 may be
dedicated to secure transactions providing access only to secure
area 102. In an embodiment, secure processor 106 may be capable of
receiving/sending input/output from/via input system 110 and/or
blocking access to input system 110 by the main processor system
and/or other devices. Restricting access to a portion of and/or all
of the input/output system 118 denies access to third party systems
trying to hijack the secure transaction.
[0140] Operating system 120 may be a set of machine instructions,
stored in memory system 114, that manages output system 108, input
system 110, memory system 114, input/output system 118, and
processor system 116. Operating system 120 may not have access to
secure area 102. Network interface 122 may be an interface that
connects user system 101 with the network. Network interface 122
may be part of input/output system 118.
[0141] Network 124 may be any network and/or combination of
networks of devices that communicate with one another (e.g., any
combination of the Internet, telephone networks, and/or mobile
phone networks). Service provider system 126 (which will be
discussed further in conjunction with FIG. 2A) may receive the
transactions. The recipient may be the final recipient or an
intermediary recipient of transactions.
[0142] Service provider system 126 may be a financial institution
or a recipient of a secure transaction. User system 101 may
interact with any of a variety of service provider systems, such as
service provider system 126, via a network 124, using a network
interface 122. Service provider system 126 may be a system of one
or more computers or another electronic device, and may be operated
by a person that grants a particular user access to its resources
or enables a particular event (e.g., a financial transaction, a
stock trade, or landing a plane at an airport, and so on).
[0143] Methods for securing transactions are disclosed in this
specification, which may be implemented using system 100. A
financial transaction may be an instance or embodiment of a
transaction. Further, a stock trade is one embodiment of a
financial transaction; a bank wire transfer is an embodiment of a
financial transaction and an online credit card payment is an
embodiment of a financial transaction. Any operation(s) that runs
in a trusted environment, which may be secure area 102, may be
treated as a secure transaction. In an embodiment, every secure
transaction may include one or more atomic operations, and the use
of the word transaction is generic to both financial transactions
and operations, including atomic operations, unless stated
otherwise. In this specification, the word transaction is also
generic to an individual operation or indivisible set of operations
that must succeed or fail atomically (i.e., as a complete unit that
cannot remain in an intermediate state). Operations that require
security include operations that make use of, or rely on, the
confidentiality, integrity, authenticity, authority, and/or
accountability of a system; such operations should be executed in a
trusted environment (e.g., in a secure area, such as secure area
102). Types of operations that
require security may be treated as secure transactions. Further, a
successful transaction other than logging information alters a
system (e.g., of service provider 126) from one known, good state
to another, while a failed transaction does not. To be sure that a
transaction results in a change of state only when the transaction
is successful--particularly in systems that handle simultaneous
actions--rollbacks, rollforwards, and deadlock handling mechanisms
may be employed to assure atomicity and system state integrity, so
that if there is an error in the transaction, the transaction does
not take effect or does not cause an unacceptable state to
occur.
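The all-or-nothing behavior just described can be sketched in a few lines. This is a minimal illustration with hypothetical account names and an in-memory snapshot rollback, not the patented system; a production system would also handle the concurrency, rollforward, and deadlock concerns noted above.

```python
# Illustrative sketch: a transfer that either completes fully or leaves the
# system in its prior known-good state. Account names/balances are hypothetical.

class TransactionError(Exception):
    pass

def transfer(accounts, sender, recipient, amount):
    # Snapshot the known-good state so a failed transaction leaves no trace.
    snapshot = dict(accounts)
    try:
        if accounts[sender] < amount:
            raise TransactionError("insufficient funds")
        accounts[sender] -= amount
        accounts[recipient] += amount   # both updates succeed, or neither does
    except Exception:
        accounts.clear()
        accounts.update(snapshot)       # rollback to the prior known-good state
        raise

accounts = {"alice": 100, "bob": 50}
transfer(accounts, "alice", "bob", 30)
assert accounts == {"alice": 70, "bob": 80}

try:
    transfer(accounts, "alice", "bob", 1000)   # fails: insufficient funds
except TransactionError:
    pass
assert accounts == {"alice": 70, "bob": 80}    # state unchanged after failure
```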
[0144] In at least one embodiment, a secure transaction assures the
following properties: [0145] A. Availability: Having timely and
reliable access to a transactional resource. [0146] B.
Confidentiality: Ensuring that transactional information is
accessible only to those authorized to use the transactional
information. [0147] C. Integrity: Ensuring that transactional
information is protected from unauthorized modification. [0148] D.
Authentication: Ensuring that transactional resources and users
accessing the transactional resources are correctly labeled
(identified). [0149] E. Authorization: Ensuring that only
authorized users have access rights to transactional resources.
[0150] F. Accounting: Ensuring that a transaction cannot be
repudiated. Any operation that handles or provides access to data
deemed too sensitive for an untrusted environment (e.g., any
private data) may be treated as a secure transaction to ensure that
information leakage does not occur.
[0151] In at least one embodiment, these functionalities may be
processed using a mobile phone. Some examples of a mobile phone are
an Android phone, the iPhone and the Blackberry. In at least one
embodiment, a secure chip or secure part of the chip may reside in
a personal computer. In at least one embodiment involving a mobile
phone or computer, a secure chip may be temporarily or permanently
disconnected from the rest of the system so that the operating
system 120 does not have access to critical information entered
into and received (e.g., read or heard) from the secure area's user
interface. In at least one embodiment, this critical information
may be authentication information, such as a collection of images,
biometric information, passwords, passcodes, PINS, other kinds of
authentication factors, transaction information, and/or other user
credentials.
[0152] In at least one embodiment in which user system 101 is a
portable device, the portable device may have a user interface with
a keyboard and mouse, or a display screen that is sensitive to the
placement of fingers, enabling the user to select buttons, images,
letters, numbers, or symbols. In at least one embodiment, the screen
may be used to select one or more images. As an example, FIG. 7
shows the choice of selecting "ACCEPT" or "ABORT" using images. The
selection is captured by image entry 179 shown in FIG. 1B. At least
one embodiment may enable the user to enter transaction information
using this keyboard and mouse or the display screen.
[0153] Portable embodiments of user system 101 enable users to
execute secure transactions in remote places such as inside a jet,
on a golf course, inside a moving automobile, from a hotel room, in
a satellite, at a military gate, and/or other isolated places.
[0154] In at least one embodiment, a person may be requested to
choose their favorite food and he or she may select an apple
image--via the user interface--as user verification. In another
instance at a later time, a transaction may require a person to
select one or more images (i.e., a collection of images) from a
display screen. Example images could be a picture or photo of an
orange, a train, a specific pattern such as a peace sign or a
diagram or a logo, a Mercedes car, a house, a candle, the Golden
Gate bridge or a pen.
[0155] FIG. 4 shows images of parts of a logo. FIG. 9 shows
different texture images: horizontal, vertical, triangular, mixed,
dotted, bubble, simplicial, and foliation. At the bottom of FIG. 8
is a geometric image of a blue rectangle on top of a blue triangle
which is on top of a red blob. Just to the right in FIG. 8 is a
rectangle with vertical texture on top of a triangle with dotted
texture on top of a blob with simplicial texture. FIG. 5 shows
images of the alphabet letters: "ABCDEFGHIJKLMNOPQRSTUVWXYZ". In at
least one embodiment, during setup the person may add his or her
own images using image acquisition 173, which are then used for
user verification during the transaction. When images are a part of
the user verification process, a display screen may be used, which
may call image display 177.
[0156] Although some embodiments of user system 101 below may be
described as using collections of visual images as a user's
universal identifier or as user authentication, other items or a
combination of these items may be used for verifying the identity
of the person such as face prints, iris scans, finger veins, DNA,
toe prints, palm prints, handprints, voice prints and/or
footprints. Anywhere the expression "biometric prints" occurs, any
of the specific types of biometrics listed above may be substituted
to obtain specific embodiments. In terms of what a person
knows, the authentication items may be PINs, passwords, sequences,
collections of images that are easy to remember, and/or even
psychometrics. In an embodiment, the item used to verify the person
may be any item that is unique. In an embodiment, the item(s) used
to verify the person may be one or more items that as a combination
are difficult for malware to fabricate, guess, find by trial and
error, and/or compute. In an embodiment, the item(s) used to verify
the person are uniquely associated with this person. In an
embodiment, the item used to verify the person has an unpredictable
element.
[0157] In at least one embodiment, there is a secure area 102 that
may be a specialized part of the chip (e.g., a microprocessor),
where the operating system 120 and web browser software do not have
access to this specialized part of the chip. In at least one
embodiment, a specialized part of the chip may be able to turn off
the operating system 120's access to presses of the buttons or a
screen of a mobile phone (or other computing device), preventing
malware and key or screen logging software from intercepting a PIN
or the selection of an image. In at least one embodiment, a
specialized part of the chip may be able to temporarily disconnect
the rest of the chip's access to the screen (e.g., by preventing
the execution of the operating system 120 and web browser). In at
least one embodiment, part of the display screen may be permanently
disconnected from the part of the chip (e.g., from the
microprocessor of the chip) that executes the operating system 120
and web browser. In at least one embodiment, a part of the chip may
only have access to the biometric sensor, while the rest of the
chip--executing the operating system 120 and web browser--is
permanently disconnected from the biometric sensor.
[0158] In at least one embodiment, there is a secure area, such as
secure area 102, that executes a biometric acquisition and/or
storage of cryptography keys and other user credentials, which may
be created from the biometric prints, created from unpredictable
physical processes in secure area 102, or created from a
combination of the biometric prints and unpredictable processes. In
at least one embodiment, photons may be produced by the hardware as
a part of the unpredictable process. In at least one embodiment,
the unpredictable process may be produced by a specialized circuit
in the secure area.
[0159] In yet another embodiment of the invention, biometric prints
and/or unpredictable information from unpredictable physical
process are used to generate one or more keys in the secure area
102. The secure area 102 may include embedded software. In at least
one embodiment, the embedded software is on a chip with a physical
barrier around the chip to hinder reverse engineering of the chip,
and/or hinder access to keys, transaction information, and/or
possibly other user credentials.
[0160] By executing software from service provider system 126, the
visual images selected using image entry 179 are less susceptible
to theft, as they can be displayed on the screen in a form that is
not easily recognizable or captured by malware. Because they are
difficult for malware to recognize or apprehend, they can be
presented by image display 177 in a less secure part of the system,
such as operating system 120 running a web browser. Each
of the above embodiments may be used separately from one another in
combination with any of the other embodiments. All of the
embodiments of this specification may be used together or
separately.
Secure Area in a Device or a Chip
[0161] To provide additional security, some embodiments may use a
secure area 102 that may be part of user system 101 or a special
part of the chip that is able to acquire biometric prints, store
authentication information, and/or authenticate the newly acquired
items. The authentication information may include templates of
biometric prints, images, pins, and/or passwords. The secure area
may also be a part of the device where critical transaction
information may be entered or verified on a display that the secure
area only has access to. In at least one embodiment, the host
computer (domain) and the network have no access to the transaction
information, no access to the keys, no access to biometrics, and/or
no access to other critical user credentials (the transaction
information, the keys, the biometrics, and/or other critical user
credentials may be the contained and processed by the secure
area).
Payment Transaction Information
[0162] In this specification, transaction information refers to one
or more items of information that describe the transaction. For a
payment transaction, one item of transaction information may be the
name of the person or entity sending the money. Another item of
transaction information may be the name of the person or entity
receiving the money. Another item of transaction information, may
be the date or time of day. Another item of transaction information
may be the sending person's (or entity's) account number. Another
item of transaction information may be the receiving person's (or
entity's) bank account number. FIG. 10 shows a recipient account
number 9568342710 with a collection of visual images.
[0163] The sending person or entity is the person or entity that
sends a message that is part of the transaction and the receiving
person or entity (recipient) is the person or entity that receives
the message that is part of the transaction. Another item of
transaction information may be the sending person's (or entity's)
routing number. Another item of transaction information may be the
receiving person's (or entity's) routing number. Another item of
transaction information may be the amount of money that may be
expressed in dollars, euros, yen, francs, deutschmarks, yuan, or
another currency.
Setup
[0164] During setup, one or more images may be acquired by using
image acquisition 173 in user system 101. These one or more images
may serve as a user's universal identifier or provide a method to
authenticate the user. An example of one or more images that may
serve as a universal identifier is shown in FIG. 11. In at least
one embodiment, no images are stored in user system 101. In at
least one embodiment, these images are acquired and encrypted by
image encrypt/decrypt 175 and transmitted to service provider
system 126. During setup, in at least one embodiment, a background
texture generated by image generator 238 in service provider system
126 may be selected by the user. FIG. 9 shows some
examples of textures.
[0165] In at least one embodiment, a symbol, letter, number and/or
image texture may be selected or generated. As an example, FIG. 8
shows the word "SIMPLICIAL" written with a bubble texture using a
simplicial background texture. In at least one embodiment, a
unique icon or image may be chosen or generated by user system 101,
the user, and/or service provider system 126.
[0166] In at least one embodiment, the user makes sure that a
recognizable image generated by image generator 238, known only to
service provider system 126, appears on the user interface. FIG. 4
shows an example of different parts of a unique logo that may serve
this purpose. The use of a recognizable image in different forms
helps hinder malware from capturing important setup information and
helps assure that the user is communicating with the appropriate
service provider system 126.
[0167] During setup, in at least one embodiment, some initial
transaction information is provided to service provider system 126.
This transaction information may include the user's name, the
user's bank account number and bank. In at least one embodiment,
some of this transaction information provided via image entry 179
to service provider system 126, may be provided by using images
(i.e., acquired with image acquisition 173) that are difficult for
malware to capture or apprehend.
[0168] In at least one embodiment, during setup one or more
biometric prints may be acquired, and one or more unique
registration keys and cryptography keys may be generated from the
one or more of the biometric prints (items) or generated from an
unpredictable physical process or both. In at least one embodiment,
the unpredictable physical process may come from a hardware chip or
hardware circuit that uses photons as a part of the unpredictable
process to create the cryptography keys. During authentication, if
the acquired biometric print is an acceptable match, then a
sequence of transaction steps that make up the complete transaction
may be initiated.
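A minimal sketch of this setup-time key generation, assuming SHA-256 as the hash: `os.urandom` stands in for the photon-based hardware source described above, and the derivation labels and field sizes are assumptions, not taken from the specification.

```python
# Illustrative sketch: deriving a registration key and a cryptography key by
# hashing a biometric template together with unpredictable bits. os.urandom
# is a stand-in for the photon-based hardware entropy source in the text.
import hashlib
import hmac
import os

def derive_keys(biometric_template: bytes) -> tuple[bytes, bytes]:
    hardware_entropy = os.urandom(32)   # stand-in for the unpredictable process
    # Mix the two sources; neither alone determines the seed.
    seed = hmac.new(hardware_entropy, biometric_template, hashlib.sha256).digest()
    registration_key = hashlib.sha256(b"registration" + seed).digest()
    cryptography_key = hashlib.sha256(b"encryption" + seed).digest()
    return registration_key, cryptography_key

r, k = derive_keys(b"example-minutiae-template")
assert len(r) == 32 and len(k) == 32 and r != k
```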
[0169] In embodiments using a secure area, the software that secure
area 102 executes may be embedded in secure memory 104. In an
embodiment, there is no operating system on the device or on secure
area 102 of user system 101. In an alternative embodiment, there is
an operating system. The secure biometric print device has a number
of components, which are described later. The security of the
secure area 102 may be enhanced by any one of, any combination of,
or all of (1) the use of embedded software, (2) the lack of an
operating system, and (3) the secure area being at least part of a
self-contained device not connected to a computer or the internet.
For example, the unit that includes the secure area may contain its
own processor. In an embodiment, the secure area may not have any
of these security enhancing features. The biometric sensor enables
user system 101 to read biometric prints. The biometric sensor may
include a fingerprint area sensor or a fingerprint sweep sensor,
for example. In at least one embodiment, the biometric sensor may
contain an optical sensor that may acquire one or more types of
biometrics. In at least one embodiment, the biometric sensor may be
a microphone or other kind of sensor that receives acoustic
information, such as a person's voice. In at least one embodiment,
the sensor may be a device that acquires DNA or RNA. In an
embodiment, secure processor system 106 may execute the software
instructions, such as acquiring a biometric print from the sensor,
matching an acquired biometric print against a stored biometric
print, sending communication and control commands to a display,
and/or encrypting the registration key and transmitting the
registration key to the administrator when the user and
administrator are not in the same physical location. By including
processor system 106 in secure area 102, the security is enhanced,
because the external processor is given fewer chances to inspect
contents of secure area 102. Alternatively, secure area 102 may
store software instructions that are run by secure processor system
106. Processor system 106 performs the biometric print acquisition,
and/or the encryption or decryption. Alternatively, a specialized
logic circuit is built that carries out the functions that the
software causes the processors to perform, such as driving sensor
111 (which may be an acquisition unit, such as a biometric
sensor).
[0170] Secure memory system 104 may contain non-volatile memory in
addition to volatile memory. Non-volatile memory enables the device
to permanently store information for generating cryptography keys
(encryption or decryption). In another embodiment, secure memory
system 104 may include memory on secure processor system 106. In
another embodiment, the sensor or input system 110 and secure
processor system 106 may be integrated into a single chip.
Alternatively, in another embodiment, the sensor in input system
110 and secure processor system 106 may be two separate chips.
Content of Memory in Secure Area
[0171] FIG. 1B shows an embodiment of a block diagram of the
contents of memory system 104 of FIG. 1A. Memory system 104 may
include instructions 152, which in turn may include a setup routine
154, an authentication of user routine 156, a secure transaction
routine 158, having an initial request routine 160, a service
provider authentication routine 162, and a completion of
transaction routine 164. Instructions 152 (of memory 104) may also
include registration key generator 166, drivers 168, controller
169, generate cryptography key 170, perturb cryptography key 174,
hash functions 178, perturbing functions 180, and user interface
181. Memory system 104 may also store data 182, which may include
biometric template T 184, registration key R 186, current
cryptography key K 188 and transaction information S 192. In other
embodiments, memory system 104 may not have all of the elements or
features listed and/or may have other elements or features instead
of, or in addition to, those listed.
[0172] Instructions 152 may include machine instructions
implemented by processor 106. Setup routine 154 is a routine that
handles the setting up of the user system 101, so that user system
101 may be used for performing secure transactions. Setup routine
154 may collect a new user's biometric print, and apply a hash
function to the biometric print (and/or to other user information)
to generate a registration key R. In at least one embodiment, there
may be specialized hardware in the secure area to help create
unpredictability used for the generation of cryptography key(s),
seed(s), and/or registration key(s). Alternatively, a registration
key, seed, or cryptography key may be generated by applying the
hash function to the raw biometric print data, for example.
Similarly, setup routine 154 may apply a hash function to
authentication information such as a biometric print, to hardware
noise produced by a phototransistor, and/or to other user
information, or to a combination of these, to generate an initial
cryptography key.
The setup routine 154 may also send the registration key and/or the
cryptography key to the service provider system 126. In another
embodiment, the registration key R and/or the initial cryptography
key may be received from service provider 126.
[0173] Authentication of user routine 156 may authenticate the user
each time the user attempts to use user system 101. This routine
may call image acquisition 173 to acquire a collection of images for
user authentication. For example, user system 101 may include a
biometric sensor (e.g., as sensor 111) that scans the user's
biometric print, reduces the biometric print to a template, and
matches the newly derived biometric template to a stored template
(which was obtained by setup routine 154). Then, if the stored
template and the newly derived template match, the user is allowed
to use user system 101.
[0174] In an alternative embodiment, a biometric print acquired may
be directly matched with a stored template. Alternatively or
additionally, authentication of user routine 156 may require the
user to enter a password. If the password received and the password
stored match, the user is allowed to use user system 101.
[0175] Secure transaction routine 158 is a routine that implements
the secure transaction. The initial request routine 160 is a first
phase of secure transaction routine 158. One purpose of initial
request routine 160 is to receive a selection of images, known to
the user and acting as user authentication, that are difficult for
malware to recognize or apprehend, together with transaction
information entered and represented as images that are likewise
difficult for malware to recognize or apprehend. The transaction
information is encrypted with the cryptography key. The encrypted
transaction information and encrypted user authentication--both
represented as images before encryption--are sent to the service
provider. During initial request routine 160, the cryptography key
may be perturbed to obtain a new cryptography key. In an
alternative embodiment, the cryptography key is not changed each
time.
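The encryption step of initial request routine 160 can be sketched with standard-library primitives only. This is a minimal illustration using a SHA-256 counter-mode keystream; a deployed system would use an authenticated cipher, and the key material here is an assumption (the recipient account number is the one shown in FIG. 10).

```python
# Illustrative sketch: encrypting transaction information with the current
# cryptography key using a SHA-256-derived keystream (stdlib only; not a
# substitute for a vetted authenticated cipher such as AES-GCM).
import hashlib

def keystream_xor(key: bytes, data: bytes) -> bytes:
    # Generate keystream blocks SHA-256(key || counter) and XOR with the data.
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        out.extend(hashlib.sha256(key + counter.to_bytes(8, "big")).digest())
        counter += 1
    return bytes(b ^ s for b, s in zip(data, out))

key = hashlib.sha256(b"current cryptography key K").digest()   # assumed key material
plaintext = b"pay recipient account 9568342710"
ciphertext = keystream_xor(key, plaintext)
assert keystream_xor(key, ciphertext) == plaintext   # XOR keystream is symmetric
```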
[0176] Service provider authentication routine 162 authenticates
the information provided by the service provider. The collection of
images, representing the user's universal identifier or user
authentication, sent by service provider 126 to system 101 in reply
to initial request routine 160 may be authenticated by service
provider authentication routine 162.
[0177] Drivers 168 may include drivers for controlling input and
output devices, such as the keyboard, a monitor, a pointing device
(e.g., a mouse and/or a touch pad), a biometric print sensor (for
collecting biometric prints). Controller 169 may include one or
more machine instructions for taking control of the keypad, monitor
and/or network interface, so the transaction may be performed
securely, without fear of the processor system 116 compromising
security as a result of being taken over by malware sent from
another machine.
[0178] Generate cryptography key 170 are machine instructions that
generate a new cryptography key (e.g., by applying a function). In
at least one embodiment, the cryptography key is not updated after
the initial step. Perturb cryptography key 174 perturbs the current
cryptography key to thereby generate the next cryptography key.
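One way to realize perturb cryptography key 174, sketched under the assumption that the perturbing function is a one-way hash (SHA-256), is a hash chain in which each key is derived from its predecessor; because the function is one-way, a key captured later does not reveal earlier keys. The prefix label and initial key material are assumptions.

```python
# Illustrative sketch: a hash-chain perturbation of the cryptography key.
import hashlib

def perturb(key: bytes) -> bytes:
    # One-way step: the next key is SHA-256 of a label plus the current key.
    return hashlib.sha256(b"perturb" + key).digest()

k0 = hashlib.sha256(b"initial key material").digest()
k1 = perturb(k0)   # key after the first transaction step
k2 = perturb(k1)   # key after the second transaction step
assert k0 != k1 and k1 != k2
assert len(k2) == 32
```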
[0179] Image acquisition 173 are machine instructions that acquire
images. Image encrypt/decrypt 175 are machine instructions that
encrypt or decrypt one or more images. In at least one embodiment,
these images are encrypted before being sent to service provider
system 126. In at least one embodiment, encrypted images are
received from service provider system 126 and decrypted by image
encrypt/decrypt 175 before they are displayed to the user with
image display 177. Image display 177 are machine instructions that
display one or
more images to the user, utilizing user interface 181. In at least
one embodiment, images are displayed on a screen of a mobile phone
or PC. Image entry 179 are machine instructions that determine
which image a user has selected with his or her finger on a touch
sensitive screen or has selected with a mouse.
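The selection step of image entry 179 can be illustrated as a simple hit test mapping a touch or click coordinate to the image whose bounding box contains it. The ACCEPT/ABORT labels come from FIG. 7, but the screen coordinates and layout structure are assumptions made for the sketch.

```python
# Illustrative sketch: determine which displayed image a touch/click selects.
# layout maps an image name to its (left, top, width, height) in screen pixels.

def select_image(touch_x, touch_y, layout):
    for name, (left, top, width, height) in layout.items():
        if left <= touch_x < left + width and top <= touch_y < top + height:
            return name
    return None   # touch landed outside every image

layout = {"ACCEPT": (20, 400, 120, 60), "ABORT": (180, 400, 120, 60)}
assert select_image(50, 430, layout) == "ACCEPT"
assert select_image(200, 430, layout) == "ABORT"
assert select_image(0, 0, layout) is None
```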
[0180] Hash functions 178 may be one or more one-way functions,
which may be used by generate registration key 166 for generating a
registration key from a biometric print and/or other user
information. Those hash function(s) of hash functions 178 that are
used by initial request 160, authentication of service provider
routine 162, and completion of transaction routine 164 may be the
same as one another or different from one another.
[0181] Perturbing functions 180 may include one or more perturbing
functions, which may be used by perturb cryptography key 174.
Different perturbing functions of perturbing functions 180 may be
used during each initial request 160, authentication of service
provider routine 162, and/or completion of transaction routine 164.
In this specification, anytime a hash function or a perturbing
function is mentioned, any other function may be substituted (e.g.,
any perturbing function may be replaced with a hash function, and
any hash function may be replaced with a perturbing function) to
obtain another embodiment. Optionally, any
perturbing function and/or hash function mentioned in this
specification may be a one way function.
User Interface
[0182] User interface 181 provides a page, a web browser or another
method of displaying and entering information so that the user
interface may provide one or more of the following functionalities,
labeled with the letters A-F.
[0183] A. The user may view the transaction information being sent.
B. The user may enter instructions for sending transaction
information. C. The user may receive information about whether or
not the user authentication was valid. D. The user may enter or
generate one or more images known by the user and/or enter another
biometric print or another type of user authentication such as a
PIN. E. The user may determine the current state in the transaction
process. F. The user may read directions or enter information for
the next step in the transaction process.
Data and Keys
[0184] Data 182 may include any data that is needed for
implementing any of the routines stored in memory 104. Biometric
template T 184 may include templates, such as minutiae and/or other
information characterizing biometric prints of users, which may be
used to authenticate the user each time the user would like to use
secure area 102 and/or system 101. Registration key R 186 may be
generated by applying a hash function to a collection of images
selected or generated by the user, biometric print(s) and/or
information derived from an unpredictable physical process. In one
embodiment, the unpredictable physical process may use one or more
phototransistors, each of which senses photons.
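The derivation of registration key R 186 described above can be sketched as follows. This is an illustrative assumption, not the specification's exact construction: SHA-256 stands in for the hash function, and the image byte strings, biometric template, and entropy source are hypothetical placeholders.

```python
import hashlib
import os

def derive_registration_key(image_blobs, biometric_template=b"", entropy=None):
    """Hash user-selected images, biometric data, and unpredictable
    physical entropy into a 256-bit registration key."""
    if entropy is None:
        entropy = os.urandom(32)  # stand-in for phototransistor noise
    h = hashlib.sha256()
    # Sort so the key does not depend on the order in which the images
    # were chosen, since order is not important for the identifier.
    for blob in sorted(image_blobs):
        h.update(hashlib.sha256(blob).digest())
    h.update(biometric_template)
    h.update(entropy)
    return h.digest()

R = derive_registration_key([b"train", b"golden-gate", b"orange"])
assert len(R) == 32
```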
[0185] Current cryptography key K 188 is the current cryptography
key, which may be stored long enough for the next cryptography key
to be generated from the current cryptography key. Transaction
information S 192 may include information about a transaction that
the user would like to perform.
Service Provider System
[0186] FIG. 2A shows a block diagram of an embodiment of a service
provider system 200 in a system for securing transactions against
cyber attacks. In an embodiment, service provider system 200 may
include output system 202, input system 204, memory system 206,
processor system 208, communication system 212, and input/output
system 214. In other embodiments, the service provider system 200
may not have all the components and/or may have other embodiments
in addition to or instead of the components listed above.
[0187] Service provider system 200 may be a financial institution
or any other system such as a power plant, a power grid, or a
nuclear plant or any other system requiring secure access. In an
embodiment, service provider system 200 may be an embodiment of
service provider system 126. Any place in this specification where
service provider 126 is mentioned service provider 200 may be
substituted. Any place in this specification where service provider
200 is mentioned service provider 126 may be substituted. Service
provider system 200 may include one or more webservers,
applications servers, and/or databases, which may be part of a
financial institution, for example.
[0188] Output system 202 may include any one of, some of, any
combination of, or all of a monitor system, a handheld display
system, a printer system, a speaker system, a connection or
interface system to a sound system, an interface system to
peripheral devices and/or a connection and/or interface system to a
computer system, intranet, and/or internet, for example.
[0189] Input system 204 may include any one of, some of, any
combination of, or all of a keyboard system, a touch sensitive
screen, a tablet pen, a stylus, a mouse system, a track ball
system, a track pad system, buttons on a handheld system, a scanner
system, a microphone system, a connection to a sound system, and/or
a connection and/or interface system to a computer system,
intranet, and/or internet (e.g. IrDA, USB).
[0190] Memory system 206 may include, for example, any
one of, some of, any combination of, or all of a long term storage
system, such as a hard drive; a short term storage system, such as
random access memory; a removable storage system, such as a floppy
drive or a removable drive; and/or flash memory. Memory system 206
may include one or more machine-readable mediums that may store a
variety of different types of information. The term
machine-readable medium is used to refer to any medium capable of
carrying information that is readable by a machine. One example of
a machine-readable medium is a computer-readable medium. Another
example of a machine-readable medium is paper having holes that are
detected that trigger different mechanical, electrical, and/or
logic responses. Memory 206 may include encryption/decryption code
and algorithms for authenticating transaction information, for example
(memory 206 is discussed further in conjunction with FIG. 2B).
[0191] Processor system 208 executes the secure transactions on
system 200. Processor system 208 may include any one of, some of,
any combination of, or all of multiple parallel processors, a
single processor, a system of processors having one or more central
processors and/or one or more specialized processors dedicated to
specific tasks. In an embodiment, processor system 208 may include
a network interface to connect system 200 to user system 101 via
network 124. In an embodiment, processor 208 may execute encryption
and decryption algorithms, with which the transaction information
was encrypted. In an embodiment, processor 208 may decrypt secure
messages from user system 101 and/or encrypt messages sent to user
system 101.
[0192] Communication system 212 communicatively links output system
202, input system 204, memory system 206, processor system 208,
and/or input/output system 214 to each other. Communications system
212 may include any one of, some of, any combination of, or all of
electrical cables, fiber optic cables, and/or means of sending
signals through air or water (e.g. wireless communications), or the
like. Some examples of means of sending signals through air and/or
water include systems for transmitting electromagnetic waves such
as infrared and/or radio waves and/or systems for sending sound
waves. In an embodiment, memory system 206 may store instructions for
system 200 to receive authenticated secure transaction information
from user system 101.
[0193] Input/output system 214 may include devices that have the
dual function as input and output devices. For example,
input/output system 214 may include one or more touch sensitive
screens, which display an image and therefore are an output device
and accept input when the screens are pressed by a finger or
stylus, for example. The touch sensitive screen may be sensitive to
heat and/or pressure. One or more of the input/output devices may
be sensitive to a voltage or current produced by a stylus, for
example. Input/output system 214 is optional, and may be used in
addition to or in place of output system 202 and/or input system
204.
[0194] FIG. 2B shows an embodiment of a block diagram of the
contents of memory system 206 of FIG. 2A. Memory system 206 may
include instructions 220, which in turn may include a setup routine
222, an authentication of user routine 224, a request for
authentication routine 226, completion of transaction routine 228,
generate registration key 230, generate cryptography key 232, hash
functions 242, and perturbing functions 244. Memory system 206 may
also store data 245, which may include registration key R 246,
current cryptography key K 248, and transaction information S 252.
In other embodiments, memory system 206 may not have all of the
elements or features listed and/or may have other elements or
features instead of, or in addition to, those listed.
[0195] Setup routine 222 is a routine that handles the setting up
of the service provider system 200, so that service provider system
200 may be used for performing secure transactions. Setup routine
222 may receive a registration key from the user system, which in
turn may be used for generating the initial cryptography key.
[0196] In an alternative embodiment, the user may send the
biometric print or template of the biometric print to service
provider system 200, and service provider system 200 may generate
the registration key from the biometric print in the same manner
that user system 101 generates the registration key from the
template of the biometric print or from the biometric print and/or
information obtained from an unpredictable physical process (e.g.,
by setup routine 222 applying a hash function to the biometric
print and/or information derived from an unpredictable physical
process).
[0197] In another embodiment, the user may visit the location of
the service provider, where the service provider may acquire a
collection of images known to the user, which is used by service
provider system 200 for at least partially creating the initial
cryptography key.
[0198] Generate cryptography key 232 are machine instructions that
generate a new cryptography key from (e.g., by applying a function,
such as a perturbing function to) a prior cryptography key.
Generate cryptography key 232 may be the same routine as generate
cryptography key 170 except that generate cryptography key 232 is
implemented at service provider 200 and generate cryptography key
170 is implemented at user system 101.
[0199] Perturb cryptography key 236 may be the same as perturb
cryptography key 174, and perturb cryptography key 236 perturbs the
current cryptography key to thereby generate the next cryptography
key.
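A minimal sketch of this key-perturbation chain, assuming SHA-256 as the perturbing function (the specification leaves the function abstract):

```python
import hashlib

def perturb(key: bytes) -> bytes:
    # One-way step: the next key is derived from the current key,
    # but the current key cannot be recovered from the next key.
    return hashlib.sha256(key).digest()

k0 = b"\x00" * 32      # current cryptography key
k1 = perturb(k0)       # next cryptography key
k2 = perturb(k1)
assert len({k0, k1, k2}) == 3  # each step yields a distinct key
```

Because both sides apply the same one-way function, the user system and the service provider system can advance their keys in lockstep without ever transmitting a key.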
[0200] Hash functions 242 may be the same as hash functions 178.
Hash functions 242 may be one-way functions, which may be used by
generate cryptography keys routine 230. Optionally, hash functions
242 may include a different function for generate cryptography keys
230. Those hash function(s) of hash functions 242 that are used by
authentication of user routine 224, request for authentication
routine 226, and completion of transaction routine 228 may be the
same as one another or different from one another.
[0201] Different perturbing functions of perturbing functions 244
may be used during each of authentication of user routine 224,
request for authentication routine 226, and completion of
transaction routine 228. Although perturbing functions 244 and hash
functions 242 are indicated as separate storage areas from
perturb cryptography key 236, the perturbing functions may just be
stored as part of the code for perturb cryptography key 236.
[0202] Data 245 may include any data that is needed for
implementing any of the routines stored in memory 206. Registration
key R 246 may be the same as registration key R 186 and may be
generated by applying a hash function to a collection of images
selected or generated by the user and/or biometric print(s) and/or
information from an unpredictable physical process.
[0203] Current cryptography key K 248 may be the same as current
cryptography key 188, and may be the current cryptography key,
which may be stored long enough for the next cryptography key to be
generated from the current cryptography key.
[0204] Transaction information S 252 may be the same as transaction
information S 192, and may include information about a transaction that the user
would like to perform. Transaction information S 252 may be
received from user system 101 and may be used to perform a
transaction at service provider system 200 on behalf of user system
101.
Setup of User System
[0205] FIG. 3 shows a flowchart of an embodiment of setting up user
system 101 for securing transactions. This user system method may
be the setup performed by user system 101 before enabling a user to
execute secure transactions with a bank, financial institution or
financial exchange.
[0206] In step 302, a sequence or collection of visual images that
are easy to remember is obtained from the user. In an embodiment,
some visual images may be an image of an animal, an image of a car,
an image of a house, an image of a place, an image of a person's
name, an image of all or part of a bank logo. In at least one
embodiment, this collection of universal images may act as a
universal identifier for the user. As an example, the universal
identifier for that particular user may be composed of the
following 7 images where order is not important: a train, the
Golden Gate bridge, pink sparkle shoes, chocolate ice cream in a
waffle cone, one of the Wells Fargo stagecoach horses, an orange,
and a visual image of the name Haley. An example of such a visual
image of a name is shown in FIG. 11.
The universal identifier may use a particular background texture or
pattern that is determined by the user or service provider system
during setup. FIG. 9 shows examples of different textures. The
visual image of Haley in FIG. 11 is represented with a bubble
texture against a foliation background texture.
[0207] In an embodiment, the universal identifier may be requested
from the user as user authentication. In an alternative embodiment,
user authentication may involve a subset of the images of the
universal identifier or a different set of visual images.
[0208] In an alternative embodiment, biometric print information
may be obtained from the user from a biometric sensor 111 in input
system 110 in order to establish a method of user authentication.
The user setup method may also collect other setup information,
such as a Personal Identification Number (PIN), or a password. The
setup data that was collected may be denoted as T.
[0209] In step 304, the universal identifier and user
authentication information are encrypted and transmitted to the
service provider system. In at least one embodiment, this
information is encrypted as visual images and then sent back to the
service provider system. In at least one embodiment, a
Diffie-Hellman key exchange is used to establish keys to encrypt
the universal identifier and user authentication information.
[0210] In step 306, the user service provider receives the
encrypted universal identifier and user authentication information,
decrypts them, and stores them.
[0211] In step 308, the user's account is initialized with the user
service provider and enabled for executing transactions.
Diffie-Hellman Key Exchange
[0212] A Diffie-Hellman key exchange is a key exchange method where
two parties (Alice and Bob) that have no prior knowledge of each
other jointly establish a shared secret key over an unsecure
communications channel. Before the Diffie-Hellman key exchange is
described it is helpful to review the mathematical notion of a
group. A group G is a set with a binary operation *, (g*g is
denoted as g.sup.2; g*g*g*g*g is denoted as g.sup.5), such that the
following four properties hold: [0213] (i.) The binary operation *
is closed on G. In other words, a*b lies in G for all elements a
and b in G. [0214] (ii.) The binary operation * is associative on
G. a*(b*c)=(a*b)*c for all elements a, b, and c in G. [0215] (iii.)
There is a unique identity element e in G: a*e=e*a=a. [0216] (iv.)
Each element a in G has a unique inverse denoted as a.sup.-1:
a*a.sup.-1=a.sup.-1*a=e.
[0217] The integers { . . . , -2, -1, 0, 1, 2, . . . } with respect
to the binary operation + are an example of an infinite group. 0 is
the identity element. For example, the inverse of 5 is -5 and the
inverse of -107 is 107.
[0218] The set of permutations on n elements {1, 2, . . . , n},
denoted as S.sub.n, is an example of a finite group with n!
elements where the binary operation is function composition. Each
element of S.sub.n is a function p:{1, 2, . . . , n}.fwdarw.{1, 2,
. . . , n} that is 1 to 1 and onto. In this context, p is called a
permutation. The identity permutation e is the identity element in
S.sub.n, where e(k)=k for each k in {1, 2, . . . , n}.
[0219] If H is a non-empty subset of a group G and H is a group
with respect to the binary group operation * of G, then H is called
a subgroup of G. H is a proper subgroup of G if H is not equal to G
(i.e., H is a proper subset of G). G is a cyclic group if G has no
proper nontrivial subgroups.
[0220] The integers modulo n (i.e., Z.sub.n={[0], [1], . . . [n-1]})
are an example of a finite group with respect to addition modulo n.
If n=5, [4]+[4]=[3] in Z.sub.5 because 5 divides (4+4)-3.
Similarly, [3]+[4]=[2] in Z.sub.5. Observe that Z.sub.5 is a cyclic
group because 5 is a prime number. When p is a prime number, Z.sub.p
is a cyclic group containing p elements {[0], [1], . . . [p-1]}. [1]
is called a generating element for cyclic group Z.sub.p since
[1].sup.m=[m] where m is a natural number such that 0<m.ltoreq.
p-1 and [1].sup.p=[0]. This multiplicative notation works as follows:
[1].sup.2=[1]+[1]; [1].sup.3=[1]+[1]+[1]; and so on. This
multiplicative notation (i.e. using superscripts) is used in the
description of the Diffie-Hellman key exchange protocol described
below.
[0221] There are an infinite number of cyclic groups and an
infinite number of these cyclic groups are extremely large. The
notion of extremely large means the following: if 2.sup.1024 is
considered to be an extremely large number based on the computing
power of current computers, then there are still an infinite number
of finite cyclic groups with each cyclic group containing more than
2.sup.1024 elements.
[0222] Steps 1, 2, 3, 4, and 5 describe the Diffie-Hellman key
exchange. [0223] 1. Alice and Bob agree on an extremely large,
finite, cyclic group G and a generating element g in G. (Alice and
Bob sometimes agree on finite, cyclic group G and element g long
before the rest of the key exchange protocol; g is assumed to be
known by all attackers.) The group G is written multiplicatively as
explained previously. [0224] 2. Alice picks a random natural number
a and sends g.sup.a to Bob. [0225] 3. Bob picks a random natural
number b and sends g.sup.b to Alice. [0226] 4. Alice computes
(g.sup.b).sup.a. [0227] 5. Bob computes (g.sup.a).sup.b.
[0228] Both Alice and Bob are now in possession of the group
element g.sup.ab, which can serve as the shared secret key. The
values of (g.sup.b).sup.a and (g.sup.a).sup.b are the same because
(g.sup.b).sup.a=g.sup.ab=(g.sup.a).sup.b in G.
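Steps 1 through 5 can be sketched concretely using Python's built-in modular exponentiation. The small Mersenne prime and the choice g=3 are illustrative assumptions; a real deployment would use a vetted group with far more than 2.sup.1024 elements.

```python
import secrets

p = 2**61 - 1                     # Mersenne prime (toy-sized modulus)
g = 3                             # assumed generating element

a = secrets.randbelow(p - 2) + 1  # Alice's secret exponent
b = secrets.randbelow(p - 2) + 1  # Bob's secret exponent

A = pow(g, a, p)                  # step 2: Alice sends g^a
B = pow(g, b, p)                  # step 3: Bob sends g^b

shared_alice = pow(B, a, p)       # step 4: (g^b)^a
shared_bob = pow(A, b, p)         # step 5: (g^a)^b
assert shared_alice == shared_bob  # both hold g^(ab)
```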
[0229] Alice can encrypt a message m as mg.sup.ab and send
mg.sup.ab to Bob. Bob knows |G|, b, and g.sup.a. A result from
group theory implies that the order of every element of a group
divides the number of elements in the group, denoted as |G|. This
means x.sup.|G|=1 for all x in G where 1 is the identity element in
G. Bob calculates (g.sup.a).sup.|G|-b=(g.sup.|G|).sup.a
g.sup.-ab=g.sup.-ab=(g.sup.ab).sup.-1. After Bob receives the encrypted
message mg.sup.ab from Alice, then Bob applies (g.sup.ab).sup.-1
and decrypts the encrypted message by computing
mg.sup.ab(g.sup.ab).sup.-1=m.
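The decryption identity in paragraph [0229] can be checked numerically. The modulus p=101 and the exponents below are illustrative choices; for the nonzero residues modulo a prime p, the group order |G| is p-1.

```python
p = 101                    # prime modulus; |G| = p - 1 for nonzero residues
g, a, b, m = 2, 17, 29, 55

cipher = (m * pow(g, a * b, p)) % p           # Alice sends m * g^(ab)
inverse = pow(pow(g, a, p), (p - 1) - b, p)   # Bob: (g^a)^(|G| - b)
# (g^a)^(|G|-b) = (g^|G|)^a * g^(-ab) = g^(-ab), the inverse of g^(ab)
assert (cipher * inverse) % p == m
```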
[0230] The user and the service provider 126 agree upon a common
key for the registration key. The user then encrypts one of the
common keys with the registration key. The service provider 126
encrypts the common key with other information, which may be
information specific to the user or a random number, for example.
Then the user sends the encrypted common key (that was encrypted by
the user with the registration key) to the service provider 126, and
the service provider 126 sends the encrypted common key that the
service provider 126 encrypted to the user. Next, the user encrypts
the encrypted common key that was received from the service
provider 126 with the registration key, and the service provider
126 encrypts the encrypted common key received from the user (which
was encrypted with the registration key) with the same information
that was used to encrypt the original copy of the common key of the
service provider 126. Thus, both the user and the service provider
126 will now have the common encrypted key derived from the
registration key supplied by the user and the information supplied
by the service provider 126. The resulting encrypted common key may
be used as the registration key (instead of the original
registration key).
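The exchange in paragraph [0230] depends on the two parties' encryption steps commuting, so that encrypting in either order yields the same doubly-encrypted value. A toy sketch using XOR (chosen only because XOR visibly commutes; the specification does not mandate any particular cipher, and the key strings are hypothetical):

```python
def xor_encrypt(data: bytes, key: bytes) -> bytes:
    # XOR each byte with the key; encrypting twice in either order
    # produces the same result because XOR is commutative.
    return bytes(d ^ k for d, k in zip(data, key))

common = b"common-key-bytes"        # 16-byte toy common key
user_key = b"registration-key"      # stands in for the user's key
provider_key = b"provider-secret!"  # stands in for the provider's info

once_user = xor_encrypt(common, user_key)
once_provider = xor_encrypt(common, provider_key)

# Each side encrypts the other's ciphertext with its own key.
both1 = xor_encrypt(once_user, provider_key)
both2 = xor_encrypt(once_provider, user_key)
assert both1 == both2   # both sides hold the same doubly-encrypted key
```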
[0231] Optionally, the user system 101 and the service provider 126
may also agree upon a common key for the cryptography key. The
common key of the cryptography key and registration key may be the
same as one another or different. The user system 101 then encrypts
one of the common keys with the cryptography key. The server
encrypts the common key with other information, which may be
information specific to the user or a random number, for example (as
was done for the registration key). Then the user system 101 sends
the encrypted common key (that was encrypted by the user with the
cryptography key) to the service provider 126, and the service
provider 126 sends the encrypted common key (which was encrypted by
service provider 126) to the user. Next, the user encrypts the
encrypted common key that was received from the service provider
126 with the cryptography key, and the service provider 126
encrypts the encrypted common key received from the user (which
was already encrypted with the cryptography key by the user) with
the same information that was used to encrypt the original copy of
the common key of the service provider 126. Thus, both the user
and the service provider 126 will now have the common key encrypted
by the cryptography key supplied by the user and the information
supplied by the service provider 126. The resulting encrypted
common key may be used as the cryptography key (instead of the
original cryptography key).
[0232] In other embodiments, the secure transmission may use
elliptic curve cryptography which is similar to the Diffie-Hellman
exchange described previously. In other embodiments, the secure
transmission of cryptography key(s) K may use a camera that reads a
proprietary pattern from the user's display of the device after
setup is complete. In an embodiment, the user's display is the
screen of a mobile phone.
[0233] In at least one embodiment, the registration key R may be
given to the administrator in the same physical place, such as at a
bank, or the registration key may be mailed or electronically
transmitted to the administrator if setup is accomplished remotely.
In some applications, the registration key may be encrypted first
and then electronically transmitted or sent by mail. The service
provider system 126 uses the registration key R (that service
provider system 126 received) to compute the cryptography key K as
K=.PHI..sup.j(R), where j.gtoreq.0, and stores cryptography key K
for a particular user in secure area 102. The number j in the
operator .PHI..sup.j( ) is the number of times that the operator
.PHI.( ) is applied to R.
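The computation K=.PHI..sup.j(R) can be sketched as iterating a function j times, with SHA-256 standing in for the abstract operator .PHI.( ):

```python
import hashlib

def phi(key: bytes) -> bytes:
    return hashlib.sha256(key).digest()  # stand-in for the operator PHI

def compute_K(R: bytes, j: int) -> bytes:
    # Apply PHI to R exactly j times; j = 0 returns R itself.
    K = R
    for _ in range(j):
        K = phi(K)
    return K

R = b"\x01" * 32
assert compute_K(R, 0) == R
assert compute_K(R, 2) == phi(phi(R))
```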
Transaction Information for Exchanges
[0234] For a payment transaction, one item may be the name of the
person or entity sending the money. In at least one embodiment, the
transaction may be a stock trade. In these embodiments, the stock
account number may be part of the transaction information. In at
least one embodiment, the ticker symbol of the stock--for example,
GOOG--being bought or sold may be part of the transaction
information (or the name of a commodity or other item being
purchased). The number of shares may be part of the transaction
information. The price per share (or unit price) at which the
person wishes to buy or sell the shares may be an item of the
transaction information. If the stock purchase (or sale) is a limit
order, then an indication that the stock purchase is a limit order
may be an item of the transaction information. If the stock
purchase (or sale) is a market order, then an indication that the
purchase is a market order may be an item of the transaction
information. The name of the stock account (e.g. Ameritrade,
Charles Schwab, etc.) or broker may also be an item of the
transaction information.
Securely Executing a Financial Transaction
[0235] In at least one embodiment, there are transaction steps A
and B, which are executed to successfully complete a transaction.
In at least one embodiment, there are transaction steps A, B, and
C, which are executed to successfully complete a transaction. FIG.
3A shows a flow chart of transaction step A.
[0236] TRANSACTION STEP A. In at least one embodiment, the person
looks for one or more logos or visual images that help the person make
sure that he or she is communicating to the appropriate user's
bank, financial institution or other service provider system. In an
embodiment, the person learns or creates this image that verifies
the service provider system during setup. When a transaction is
requested by the person, the user selects a collection or sequence of
visual images that are easy to remember, and/or presents a
biometric print match and/or a password or PIN, that are acquired
by user system 101. This is referred to as user authentication. The
person (user) securely enters transaction information by selecting
or choosing visual images that are difficult for malware to read or
recognize. [0237] Step A.1 The person verifies in the web browser or
visual display that he or she is communicating with the appropriate
bank, financial institution or other service provider system.
[0238] Step A.2 The person enters their user authentication
information as a collection of visual images, a PIN or password or
a biometric print. [0239] Step A.3 The person enters a one-time
sequence of letters, and/or a one-time sequence of numbers or a
one-time sequence of images or a combination that is unique for
this transaction and difficult for malware to guess. [0240] Step
A.4 The person selects and enters transaction information into user
system 101. [0241] Step A.5 Transaction information is encrypted
with key K denoted as E(, K). User authentication information is
encrypted as E(, K). One-time information information is encrypted
as E(, K) and are then sent to service provider system.
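Step A.5's encryption E(., K) might be sketched, purely for illustration, with a hash-counter keystream XORed over the plaintext. This is not an authenticated cipher and is not what the specification prescribes; a production system would use a vetted scheme such as AES-GCM, and the sample transaction string is hypothetical.

```python
import hashlib

def E(plaintext: bytes, K: bytes) -> bytes:
    # Derive a keystream block per 32-byte chunk from the key and a
    # counter, then XOR it over the plaintext (decryption is identical).
    out = bytearray()
    for i in range(0, len(plaintext), 32):
        block = hashlib.sha256(K + i.to_bytes(8, "big")).digest()
        out.extend(c ^ b for c, b in zip(plaintext[i:i + 32], block))
    return bytes(out)

K = b"\x02" * 32
S = b"wire $100 to account 12345"   # hypothetical transaction information
assert E(E(S, K), K) == S           # applying E twice recovers the plaintext
```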
[0242] There are many different methods for transmitting the
encrypted user authentication information, encrypted one-time
information, and encrypted transaction information to the
administrator (bank) at service provider system 126. In one method,
the user may wirelessly transmit the encrypted transaction
information via a mobile phone to service provider system 126. In
another method, the
user may submit or enter a collection of images and encrypted
transaction information to the web browser of user system 101 and
use the Internet for transmission to the administrator (bank) at
service provider system 126. In many other methods, the user may
submit the user authentication and encrypted transaction
information by some other electronic means, such as a fax machine
or an ATM.
[0243] In at least one embodiment, the current time .tau..sub.1 is
determined and provided as transaction information. The current
time .tau..sub.1 may be rounded to the nearest minute, for example.
Optionally, the sender and receiver may compute the difference in
time between the clock of the sender and the clock of the receiver
prior to sending a message in case the two clocks are not
sufficiently synchronized. In other embodiments, the time may be
rounded to the nearest 5 minutes, the nearest 10 minutes, or the
nearest hour, for example. Here the reference time is GMT time. For
example, if the exact time is 19:05 and 45 seconds GMT, then
.tau..sub.1 is set to 19:06 GMT. If the time is not correct or
is too delayed from the original time, then the transaction may be
aborted.
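The rounding of .tau..sub.1 described above can be sketched as follows, reproducing the 19:05:45 GMT to 19:06 example:

```python
from datetime import datetime, timedelta, timezone

def round_to_minute(t: datetime) -> datetime:
    # Round to the nearest minute: 30 seconds or more rounds up.
    if t.second >= 30:
        t += timedelta(minutes=1)
    return t.replace(second=0, microsecond=0)

t = datetime(2013, 9, 4, 19, 5, 45, tzinfo=timezone.utc)  # 19:05:45 GMT
assert round_to_minute(t).strftime("%H:%M") == "19:06"
```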
[0244] TRANSACTION STEP B. The administrator (bank or financial
institution) receives at service provider system 126 the encrypted
transaction information, encrypted one-time information and
encrypted user authentication information. FIG. 3B shows a flow
chart of transaction step B.
[0245] Step B.1 The service provider system decrypts the user
authentication information and checks that it is valid. If it is
not valid, then the transaction is aborted. If the user
authentication information is valid, then service provider system
126 goes to step B.2.
[0246] Step B.2 The service provider decrypts the encrypted
one-time information and checks that the user was able to correctly
recognize the one-time information from the user's screen or web
browser. If the one-time information decrypted by the service
provider system does not match the one-time information that was
displayed, then the transaction is aborted. If it matches, then
service provider system 126 goes to step B.3.
[0247] In at least one embodiment, the one-time information is
displayed on the user's screen in a way that is difficult to
recognize or apprehend by malware but recognizable by a person.
[0248] Step B.3 The encrypted transaction information is decrypted
and the transaction is executed.
Alternative Embodiment Transaction Steps C. and D.
[0249] TRANSACTION STEP C. The service provider system translates
the transaction information into a new collection of visual images
that represent the same transaction information. The service
provider system encrypts this new visual representation of the
transaction information with key K and sends it back to the user
system. The user system receives the encrypted representation and
decrypts it, and the user checks that it matches the original
transaction information. If it does not match, then the user may
abort the transaction.
[0250] Transaction Step D.
[0251] If the visual representation matches the original
transaction information submitted by the user, then the user sends
a message to the service provider to complete the transaction.
There are a number of methods to implement transaction step D.
implement transaction step D.
[0252] In at least one embodiment, the cryptography key K may be
updated on both sides, the updated key being denoted as .gamma.(K).
Then the transaction information, encrypted as E(S, .gamma.(K)) or
E(S, K), is sent from the administrator (bank) back to the user.
User Interface
[0253] In at least one embodiment, the user interface may be
implemented with a web browser in a personal computer or in a
mobile phone. User input such as selecting letters, numbers or
other input items may be accomplished with fingers on the glass
screen of an iPhone or Android phone. For a PC, the letters, numbers
or other input items may be entered with a mouse by selecting the
appropriate letters as shown in FIG. 5 or 6. In at least one
embodiment, the display screen may be rendered with a glass screen
in a mobile phone such as an Android phone or iPhone. In other
embodiments, the display screen may use an LCD. In at least one
embodiment, some or all of the financial institution members of
SWIFT may be stored in terms of patterns or images in the memory of
the service provider system. In at least one embodiment, the user
may use his or her fingers to scroll on the screen and select one
of the banks to make a transaction with. In at least one
embodiment, the user may use a mouse to scroll on the display of
the personal computer.
[0254] In at least one embodiment, the user may be an employee of
the bank. In at least one embodiment, the device may be used to
securely execute wire transfers between two banks. In at least one
embodiment, visual images of letters that are difficult for
malware to read may be displayed as a keyboard to be used by a
person to enter a password or transaction information as shown in
FIGS. 5 and 6. In at least one embodiment, the display may enable
the user to verify that the transaction information is correct or
has not been tampered with by malware before executing the
transaction.
Extensions and Alternatives
[0255] Each embodiment disclosed herein may be used or otherwise
combined with any of the other embodiments disclosed. Any element
of any embodiment may be used in any embodiment. At least one
embodiment of this specification includes all of the embodiments
being used together except for those that are mutually
exclusive.
[0256] Although the invention has been described with reference to
specific embodiments, it will be understood by those skilled in the
art that various changes may be made and equivalents may be
substituted for elements thereof without departing from the true
spirit and scope of the invention. In addition, modifications may
be made without departing from the essential teachings of the
invention.
* * * * *