Publication number: US 20030101348 A1
Publication type: Application
Application number: US 10/194,959
Publication date: May 29, 2003
Filing date: Jul 12, 2002
Priority date: Jul 12, 2001
Also published as: EP1573426A2, EP1573426A4, US7197168, US7751595, US20030115475, US20030115490, US20030126448, US20070274575, WO2003007121A2, WO2003007121A3, WO2003007121B1, WO2003007125A2, WO2003007125A3, WO2003007125A9, WO2003007127A2, WO2003007127A3, WO2003007127A9
Inventors: Anthony Russo, Peter McCoy, Mark Howell
Original Assignee: Russo Anthony P., McCoy Peter A., Howell Mark J.
Method and system for determining confidence in a digital transaction
US 20030101348 A1
Abstract
The present invention provides systems and methods utilizing tokens to assign or mitigate risk. A software token is provided that associates a secret—such as a private key or password—with risk factors involved in protecting that secret from illicit access. The token may include an indication or calculation of the “overall risk of compromise” (OROC), generally represented as an overall trust metric, associated with the secret. This token can then be used to inform system operators or third parties of the confidence of a given system transaction that depends on the secret. A third party can then take whatever actions it deems appropriate according to the estimated risk. For example, in one embodiment of the present invention, the risk factor is used to deny a transaction if the risk is deemed too great—that is, if the risk factor is greater than a predetermined (or sufficient) value.
Claims (19)
We claim:
1. A transaction confidence token for use in a secure communication system, said token comprising:
an envelope comprising
transaction information; and
a trust metric; and
a seal comprising a digital signature of said envelope.
2. A token according to claim 1, wherein said envelope further comprises a time stamp.
3. A token according to claim 1, wherein said transaction information includes information selected from the group consisting of a web site address, a web session identifier, a monetary or exchange value, an order number, an SKU number, and a credit card number, and combinations thereof.
4. A token according to claim 1, wherein said trust metric is an overall trust metric indicating a combined confidence level for enrollment, storage, transmission, and authentication processes employed for authentication of a transaction.
5. A token according to claim 1, wherein said trust metric comprises a storage trust metric indicating a confidence level for a storage process associated with authentication of a transaction.
6. A token according to claim 1, wherein said trust metric comprises a transmission trust metric indicating a confidence level for a transmission process associated with authentication of a transaction.
7. A token according to claim 1, wherein said trust metric comprises an authentication trust metric indicating a confidence level for an authentication process associated with authentication of a transaction.
8. A token according to claim 1, wherein said trust metric comprises an enrollment trust metric indicating a confidence level for an enrollment process associated with authentication of a transaction.
9. A token according to claim 1, wherein said trust metric comprises an overall trust metric and said envelope further comprises at least one metric chosen from the group consisting of an enrollment trust metric, a storage trust metric, a transmission trust metric, an authentication trust metric, and combinations thereof.
10. A token according to claim 1, wherein said digital signature is signed with a private key.
11. A method for assuring a secure transaction comprising:
receiving a transaction confidence token comprising a trust metric associated with said transaction;
determining if said trust metric indicates a sufficient trust level; and
processing said transaction if said trust metric indicates or exceeds said sufficient trust level.
12. A method according to claim 11, further comprising:
requiring a mitigating factor if said trust metric indicates less than said sufficient trust level.
13. A method according to claim 12, wherein said mitigating factor is chosen based on said trust metric.
14. A method according to claim 12, wherein said mitigating factor is chosen from the group consisting of a fee, a waiting period, an authentication procedure, and combinations thereof.
15. A method according to claim 12, further comprising:
processing said transaction after receiving said mitigating factor.
16. A method according to claim 11, further comprising:
constructing a transaction confidence token comprising said trust metric; and
transmitting said transaction confidence token to a server.
17. A method for assuring a secure transaction comprising:
receiving a transaction confidence token comprising a trust metric associated with said transaction;
determining if said trust metric indicates an acceptable risk level; and
processing said transaction if said trust metric indicates or is less than said acceptable risk level.
18. A method according to claim 17, further comprising:
requiring a mitigating factor if said trust metric indicates greater than said acceptable risk level.
19. A method according to claim 18, further comprising:
processing said transaction after receiving said mitigating factor.
Description
RELATED APPLICATIONS

[0001] This application further relates to the following co-pending applications:

[0002] U.S. application Ser. No. ______, filed ______, entitled “BIOMETRICALLY ENHANCED DIGITAL CERTIFICATES AND SYSTEM AND METHOD FOR MAKING AND USING” (Attorney Docket No. A-70596/RMA/JML);

[0003] U.S. application Ser. No. ______, filed ______, entitled “SECURE NETWORK AND NETWORKED DEVICES USING BIOMETRICS” (Attorney Docket No. A70595/RMA/JML); and

[0004] U.S. application Ser. No. ______, filed ______, entitled “METHOD AND SYSTEM FOR BIOMETRIC IMAGE ASSEMBLY FROM MULTIPLE PARTIAL BIOMETRIC FRAME SCANS” (Attorney Docket No. A-70591/RMA/JML); all of which are hereby incorporated by reference.

[0005] This application claims the benefit under 35 U.S.C. §119 and/or 35 U.S.C. §120 of the filing date of: U.S. Provisional Application Serial No. 60/305,120, filed Jul. 12, 2001, which is hereby incorporated by reference, and entitled SYSTEM, METHOD, DEVICE AND COMPUTER PROGRAM FOR NON-REPUDIATED WIRELESS TRANSACTIONS; U.S. patent application Ser. No. 10/099,554, filed Mar. 13, 2002 and entitled SYSTEM, METHOD, AND OPERATING MODEL FOR MOBILE WIRELESS NETWORK-BASED TRANSACTION AUTHENTICATION AND NON-REPUDIATION; and U.S. patent application Ser. No. 10/099,558, filed Mar. 13, 2002 and entitled FINGERPRINT BIOMETRIC CAPTURE DEVICE AND METHOD WITH INTEGRATED ON-CHIP DATA BUFFERING; each of which is incorporated by reference herein.

FIELD OF THE INVENTION

[0006] The present invention relates generally to the field of methods, computer programs and computer program products, devices, and systems for encryption systems, especially public key infrastructure (PKI) systems, and also to the field of biometrics, especially but not limited to biometrics such as human fingerprints and human voiceprints.

BACKGROUND OF THE INVENTION

[0007] The security and integrity of information systems depends in part on authentication of individual users—accurately and reliably determining the identity of a user attempting to use the system. Once a user is authenticated, a system is then able to authorize the user to retrieve certain information or perform certain actions appropriate to the system's understanding of the user's identity. Examples of such actions include downloading a document, completing a financial transaction, or digitally signing a purchase.

[0008] Numerous methods have been developed for authenticating users. Generally, as will be understood by those skilled in the art, authentication methods are grouped into three categories, also called authentication factors: (1) something you know—a secret such as a password or a PIN or other information; (2) something you have—such as a smartcard, the key to a mechanical lock, an ID badge, or other physical object; and (3) something you are—a measure of a person such as a fingerprint or voiceprint. Each method has advantages and disadvantages including those relating to ways that a system may be fooled into accepting a normally unauthorized user in cases where, for example, a password has been guessed or a key has been stolen.

[0009] The third category above—referred to herein as ‘something you are’ authentication methods—are the subject of the biometrics field. Biometric identification is used to verify the identity of a person by measuring selected features of some physical characteristic and comparing those measurements with those filed for the person in a reference database or stored in a token (such as a smartcard) carried by the person. Physical characteristics that are used today include fingerprints, voiceprints, hand geometry, the pattern of blood vessels on the wrist or on the retina of the eye, the topography of the iris of the eye, facial patterns, and the dynamics of writing a signature or typing on a keyboard. Biometric identification methods are widely used today for securing physical access to buildings and securing data networks and personal computers.

[0010] A secure system is based upon either a mutually-shared secret or a private key of a public-private key pair. During the enrollment process, the secret is first selected or created, then agreed upon and stored for later use. There are generally four major sources of risk associated with the secret being compromised: (1) the secret can be guessed by an unauthorized user; (2) the secret was observed by an unauthorized user during creation or subsequent transmission; (3) the stored secret can be retrieved and employed by an unauthorized user after creation; and/or (4) the stored secret was issued to the wrong party.

[0011] Each of the above broad categories has its own specific risk factors depending on the type of secret, where and how it is stored and how it is created. For example, the risk of guessing is dependent on a variety of factors including, but not limited to, the type of secret (for example, a password, a private PKI key, a symmetric key, or the like), the length of the secret (for example, number of characters in the password or number of bits in the private key), and the randomness of the secret (for example, an entropy calculation plus, in the case of a password, whether the password matches a dictionary word). The risk of observation during transmission is dependent on factors including, but not limited to: whether it was transmitted at all (generally, there is no transmission of the secret in PKI); what type of encryption was used, if any, during transmission; and the network used for the transmission (for example, whether it was transmitted using a telephone, an internet, a private network, or other network or communication link or channel).

[0012] The risk of a stored secret being illicitly retrieved is dependent on factors including, but not limited to: the number of devices where instances of the secret are stored (for example, a secret may be stored on a user's PC as well as in a system database); the storage medium used for each stored instance (hard disk, paper notes, smart card, portable memory device such as a flash memory card, PKCS-11 token (as discussed further in “PKCS #11 v2.11: Cryptographic Token Interface Standard” published June 2001 by RSA Laboratories, hereby incorporated by reference), or the like); whether the secret is stored in plain text or encrypted; if stored encrypted, the risk associated with the encryption key used; what kind of biometrics are used, if any, to restrict access to the storage medium; the security of the passphrase used, if any, to retrieve the secret; the security of the biometric system(s) used, if any, to retrieve the secret; the security of the physical token used, if any, to retrieve the secret—for example, if a token is used, the security of that token is dependent upon whether someone else has had access to it, or whether it has been lost or stolen; what combinations of passphrase, biometric, and token are required, if any; and the security of the enrolled biometric template.

[0013] The risk associated with the secret being issued to the wrong person is dependent on factors including, but not limited to: the specific method or methods used to verify the user's identity prior to issuing the secret; the degree of human interaction, if any, involved in the verification process (i.e., whether it is supervised and verified by a trained human being); what specific biometric system or systems, if any, are used to aid verification; which government agencies (such as for example the FBI, Secret Service, or other agency), if any, aid in the verification process; and which trusted documents, if any, were required for verification (for example a bank statement, social security number, passport, or the like).

[0014] Systems used for e-commerce, online banking and other financially related areas rely on security to prevent unauthorized users from accessing services for monetary gain. For example, well-designed systems try to prevent would-be buyers from purchasing goods and services with someone else's credit card by requiring a PIN or a password.

[0015] More generally, the security and integrity of information systems depends primarily on keeping data confidential so that only authorized users may see or act against the data, and assuring the integrity of data so that the data cannot be changed or tampered with undetected. The field of cryptography provides well-known tools for assuring confidentiality and integrity using encryption techniques such as ciphers and hash algorithms.

[0016] One widely known and implemented body of these tools, and procedures and practices for their use, is called Public Key Infrastructure (PKI). PKI gets its name from its use of a class of cryptographic algorithm called a public key algorithm. As is widely known to those versed in the cryptographic field, a public key algorithm is a cryptographic algorithm that operates using two different but mathematically-related keys, a public key that may be shared with any party and a private key which must be kept secret, such that (for most such algorithms) data encrypted with the public key may only be decrypted with the private key, and vice-versa. PKI standards are well known; X.509, for example, is described in Housley, R., “Internet X.509 Public Key Infrastructure Certificate and CRL Profile,” RFC 2459, January 1999, and ITU-T Recommendation X.509 (1997 E): Information Technology—Open Systems Interconnection—The Directory: Authentication Framework, June 1997, both of which are hereby incorporated by reference.
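
The two-key relationship described above can be worked through with the classic small-number RSA example (p=61, q=53). This is purely a toy sketch of the public-key property, not anything from the patent itself; real keys are thousands of bits long.

```python
# Toy RSA illustration of the public/private key relationship.
# Small textbook primes; wholly insecure, for illustration only.
p, q = 61, 53
n = p * q                  # modulus, shared by both keys
phi = (p - 1) * (q - 1)    # Euler's totient of n
e = 17                     # public exponent
d = pow(e, -1, phi)        # private exponent: modular inverse of e mod phi

message = 65
ciphertext = pow(message, e, n)    # "encrypt" with the public key
recovered = pow(ciphertext, d, n)  # "decrypt" with the private key
print(recovered == message)        # the round trip restores the message

# And vice-versa: "sign" with the private key, verify with the public key.
signature = pow(message, d, n)
print(pow(signature, e, n) == message)
```

The symmetry in the last two lines is what digital signatures exploit: only the private-key holder can produce the signature, but anyone with the public key can verify it.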

[0017] These standards provide powerful mechanisms for safe and private storage and transmission of confidential data so that it remains hidden from unauthorized parties. The standards provide for digital signatures, which provide the receiving party of some data with an assurance of the identity of the transmitting party. PKI standards further provide for digital certificates, which provide a tamper-resistant, portable record of the association of a public key with a person's or organization's name, attested to and signed by a trusted party, thus presenting a form of unique, irrefutable digital identity or credential for that person or organization. PKI standards also provide other useful and powerful mechanisms that can contribute to the security and integrity of information systems.

[0018] PKI is widely used in commercial and non-commercial systems, both over the Internet and in more closed or local applications. Most web browsers, for example, use PKI and PKI-based standards to interoperate with web servers when high security is desired, as when a user specifies a credit card number for payment while placing an online order. The proliferation of electronic commerce has led many jurisdictions around the world to begin to develop legal standards with the intended result that a correctly constituted digital signature would be every bit as legally binding as a handwritten signature is today.

[0019] PKI provides powerful mechanisms, but it has weaknesses. One way for digital identities to be compromised is for an impostor to somehow get a copy of the private key that is associated with the public key embedded in a certificate, thus invalidating an assumption that only the person or organization to which the certificate is issued has access to the (secret) private key. Anyone with both the certificate (which is meant to be public information, freely exchanged with anyone) and the associated private key (which is meant to be secret) can impersonate someone else and compromise the security and integrity of an information system dependent on the valid use of a certificate and associated private key.

[0020] Most systems, therefore, secure the private key such that the user must authenticate before the private key can be used for any task. Many such systems require a password (“something you know”) or a smartcard (“something you have”), or both. Some systems provide additional security by putting the private key on a smartcard that is resistant to tampering or copying. Other systems may also employ biometrics (“something you are”) to ensure that the person using the private key is in fact the true owner of the certificate.

[0021] However, smart cards may be lost, damaged, or stolen. Passwords may be forgotten or guessed. Biometric systems can be fooled. These concerns are part of what is called in the field “the last-meter problem”: the problem of making sure that an otherwise secure system isn't compromised by a failure to correctly authenticate the person using (and usually physically adjacent to) some part of the system. The last-meter problem can present opportunities for impostors in PKI systems. Mathematically, the theoretical probability of a PKI system being fooled or otherwise compromised is extremely low (much less than 1 in a billion, for instance). However, once the last-meter problem is taken into account, the security of such a system is greatly reduced, as the last-meter problem becomes the weakest link in an otherwise very secure chain.

[0022] Today's PKI systems do not take into account the risk associated with the last-meter problem when assessing the trust level to associate with users of such systems.

[0023] Accordingly, it is an object of the present invention to provide an indication of the security of a given transaction.

SUMMARY

[0024] In a first embodiment, the invention provides a transaction confidence token for use in a secure communication system, comprising an envelope and a seal. The envelope comprises transaction information and a trust metric. The seal contains a digital signature of the envelope. In preferred embodiments, the envelope further includes a timestamp. In some embodiments, the transaction information contained in the envelope includes a web site address, a web session identifier, a monetary or exchange value, an order number, an SKU number, a credit card number, or any combinations thereof.

[0025] In one embodiment, the trust metric within the envelope is an overall trust metric indicating a combined confidence level for enrollment, storage, transmission, and authentication processes employed for authentication of a transaction.

[0026] In another embodiment, the trust metric comprises a storage trust metric indicating a confidence level for a storage process associated with authentication of a transaction. In yet another embodiment, the trust metric comprises a transmission trust metric indicating a confidence level for a transmission process associated with authentication of a transaction. In still another embodiment, the trust metric comprises an authentication trust metric indicating a confidence level for an authentication process associated with authentication of a transaction. In a further embodiment, the trust metric comprises an enrollment trust metric indicating a confidence level for an enrollment process associated with authentication of a transaction. In other embodiments, a plurality of trust metrics are provided in the envelope. In one embodiment, a first trust metric comprises an overall trust metric and at least a second trust metric is provided chosen from the group consisting of an enrollment trust metric, a storage trust metric, a transmission trust metric, an authentication trust metric, and combinations thereof.

[0027] In some embodiments, the digital signature contained in the seal is signed with a private key.

[0028] The present invention further provides methods for assuring a secure transaction. In one embodiment, a method for assuring a secure transaction comprises receiving a transaction confidence token comprising a trust metric associated with the transaction, determining if the trust metric indicates a sufficient trust level; and processing the transaction if the trust metric indicates or exceeds said sufficient trust level.

[0029] In some embodiments, a method further comprises requiring a mitigating factor if said trust metric indicates less than said sufficient trust level. The mitigating factor may be chosen based on the trust metric. The mitigating factor may be chosen from the group consisting of a fee, a waiting period, an authentication procedure, and combinations thereof.

[0030] In yet other embodiments, the method further comprises processing the transaction after receiving a mitigating factor.

[0031] In other embodiments, the method further comprises constructing a transaction confidence token comprising a trust metric, and transmitting said transaction confidence token to a server.

[0032] In other embodiments, a method for assuring a secure transaction comprises receiving a transaction confidence token comprising a trust metric associated with said transaction, determining if said trust metric indicates an acceptable risk level; and processing said transaction if said trust metric indicates or is less than said acceptable risk level.

[0033] In some embodiments, the method further comprises requiring a mitigating factor if said trust metric indicates greater than said acceptable risk level. In still other embodiments, the method further includes processing said transaction after receiving said mitigating factor.

BRIEF DESCRIPTION OF THE DRAWINGS

[0034] The present invention may be better understood, and its features and advantages made apparent to those skilled in the art by referencing the accompanying drawings.

[0035] FIG. 1 is a schematic representation of one embodiment of a transaction confidence token according to an embodiment of the present invention.

[0036] FIG. 2 is a schematic flowchart showing a method of processing a transaction according to an embodiment of the present invention.

[0037] FIG. 3 is a schematic flowchart outlining a process for using a transaction confidence token according to an embodiment of the present invention.

DETAILED DESCRIPTION OF THE EMBODIMENTS

[0038] Many systems today, especially those that use PKI, involve transactions that depend on keeping a secret protected from use by third parties. If there is any risk that the secret is compromised, then that risk is propagated to the provider of the transaction itself. For example, if an internet-based system allows a user to purchase an item by entering any valid credit card number, then the risk to the credit card company or merchant related to an unauthorized purchase is dependent on how well that credit card number can be kept secret, for example, how well the parties to whom the secret is made available are authenticated.

[0039] This invention introduces the concept of a software token that associates a secret—such as a private key or password—with risk factors involved in protecting that secret from illicit access. Furthermore, in preferred embodiments of the present invention, the token includes an indication or calculation of the “overall risk of compromise” (OROC), generally represented as an overall trust metric, associated with the secret. In some embodiments of the present invention, the token also includes a calculation of individual risk factor probabilities used to determine the OROC, or overall trust metric. This token can then be used to inform system operators or third parties of the confidence of a given system transaction that depends on the secret. A third party can then take whatever actions it deems appropriate according to the estimated risk. For example, in one embodiment of the present invention, the risk factor is used to deny a transaction if the risk is deemed too great—that is, if the risk factor is greater than a predetermined (or sufficient) value. In another embodiment, the risk factor is used to charge the user a fee in an effort to mitigate the risk, or, where some fee is already charged to the user for the transaction, to charge different fees according to the assessed risk or trust level. The fee may be a flat fee charged to all transactions having less than a sufficient trust level or the fee may vary according to the trust level indicated by the token.
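
The deny/process/fee decision described above can be sketched as follows. The trust levels, the predetermined threshold, and the fee schedule are hypothetical values chosen for illustration; the patent leaves all of them to the implementer.

```python
# Illustrative sketch of a risk-based transaction decision.
# TrustLevel, SUFFICIENT_LEVEL, and RISK_FEE are assumed names, not from the patent.
from enum import IntEnum

class TrustLevel(IntEnum):
    LOW = 0
    MEDIUM = 1
    HIGH = 2

SUFFICIENT_LEVEL = TrustLevel.HIGH  # the predetermined "sufficient" trust value
RISK_FEE = {TrustLevel.LOW: 2.50, TrustLevel.MEDIUM: 1.00}  # mitigation fees

def decide(trust_metric: TrustLevel) -> tuple[str, float]:
    """Return (action, fee) for a transaction given its overall trust metric."""
    if trust_metric >= SUFFICIENT_LEVEL:
        return ("process", 0.0)  # trust meets or exceeds the threshold
    # Below threshold: mitigate by charging a fee scaled to the assessed risk.
    return ("process_with_fee", RISK_FEE[trust_metric])

print(decide(TrustLevel.HIGH))
print(decide(TrustLevel.LOW))
```

A deny-only policy would simply reject any transaction below `SUFFICIENT_LEVEL` instead of attaching a fee; both variants are contemplated in the text.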

[0040] Most authentication systems are geared to answer the question of whether the party trying to use the system is the party it claims to be with either a yes or no, even though the authentication method or methods employed are imperfect. The present invention provides a mechanism for a system to add an estimate of risk or confidence on that yes or no answer, and for other systems to use that confidence information to their advantage. In one embodiment, it also provides a mechanism for documenting the party's identity so as to provide a non-repudiation mechanism for the transaction.

[0041] That is, the present invention provides systems utilizing tokens to assign or mitigate risk.

[0042]FIG. 1 depicts a schematic representation of transaction confidence token 100 according to one embodiment of the present invention.

[0043] A token, such as token 100, is created using available information regarding risk factors, examples of which are discussed above. Token 100 can be in the form of a separate packet of stored data associated with the secret, integrated either with the secret itself or, in the case of PKI, with the associated digital certificate.

[0044] The present invention provides transaction confidence tokens comprising at least one trust metric. As used herein, ‘trust metric’ generally refers to a measure of a risk factor. Examples of typical risk factors are discussed above. In one embodiment, token 100 comprises information on at least one risk factor discussed above. In another embodiment, token 100 comprises an overall risk-of-compromise (OROC) value, or overall trust metric 110, which may take one or more risk factors into consideration. In a preferred embodiment, token 100 is created and stored in a database during both enrollment and subsequent transactions, and includes all the fields shown in FIG. 1. In other embodiments, only a subset of the fields shown in FIG. 1 is present.

[0045] In one embodiment of the present invention, trust metrics, such as overall trust metric 110, are given by an absolute probability ranging from 0.0 to 1.0, calculated using a weighted Bayesian equation. Other ranges and equations for calculating trust metrics may also or alternatively be employed. In preferred embodiments of the present invention, trust metrics are given by an arbitrary mapping of risk information to three categories—low, medium, and high. Any number of categories may alternatively be used, with each category represented by a unique indicator. The risk information may alternatively be provided by a continuous range of values rather than in discrete categories. Overall trust metric 110 represents a weighted combination of individual risk probabilities of a plurality of risk factors. In a preferred embodiment, a system uses token 100 to deny or accept a transaction. In other embodiments, a system charges a fee, or imposes another mitigation factor—such as a waiting period or another required authentication—based on risk information contained in token 100.
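
A weighted combination of individual risk probabilities, as described above, can be sketched as follows. The weights, example probabilities, and category cut-offs are illustrative assumptions; the patent specifies only that the range is 0.0 to 1.0 and that the category mapping is arbitrary.

```python
# Sketch of an overall trust metric as a weighted average of per-process
# trust probabilities, with an assumed mapping onto three categories.

def overall_trust(metrics: dict[str, float], weights: dict[str, float]) -> float:
    """Weighted average of trust probabilities, each in [0.0, 1.0]."""
    total_weight = sum(weights[k] for k in metrics)
    return sum(metrics[k] * weights[k] for k in metrics) / total_weight

def to_category(prob: float) -> str:
    """Map a continuous trust probability onto low/medium/high (assumed cut-offs)."""
    if prob < 0.4:
        return "low"
    if prob < 0.75:
        return "medium"
    return "high"

# Example per-process probabilities and weights (hypothetical values).
metrics = {"enrollment": 0.9, "storage": 0.6, "transmission": 0.95, "authentication": 0.8}
weights = {"enrollment": 1.0, "storage": 2.0, "transmission": 1.0, "authentication": 2.0}

p = overall_trust(metrics, weights)
print(round(p, 3), to_category(p))
```

A weighted Bayesian update, as the preferred embodiment mentions, would combine the factors multiplicatively rather than additively; the simple weighted average here only illustrates the "weighted combination" idea.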

[0046] Accordingly, transaction confidence token 100 (FIG. 1) is composed of two data structures: envelope 120 and seal 130. Envelope 120 comprises transaction contents, or transaction information 140, and at least one trust metric, although a plurality of trust metrics are shown in FIG. 1. Further, envelope 120 comprises timestamp 150. In one embodiment, transaction information 140 represents a complete record of a transaction—including, as appropriate, account numbers, web session identifiers, monetary or exchange values, item quantities, an SKU number, an order number, a credit card number, a web URL or address, or other data describing the user's authenticated request. In other embodiments, transaction information 140 comprises only some of the above information associated with a transaction. In a preferred embodiment, transaction information 140 comprises only a transaction identifier or reference string such as a web session identifier as is often used in web applications. In an alternative embodiment, the transaction information 140 field comprises a complete transaction confidence token, which may in turn (i.e., recursively) contain another transaction confidence token in its transaction contents field, without particular limit. This embodiment allows for multiple parties to attest to a transaction and attach their own confidence to the transaction as it is processed by each of a number of systems in series. The innermost transaction confidence token corresponds to the original transaction when it is first authenticated and signed by the originating party. Timestamp 150 generally comprises a string indicating a date and time at which the authentication event which is the subject of the transaction confidence token took place. Generally, any time indicator is appropriate for timestamp 150. In a preferred embodiment, timestamp 150 is expressed in Coordinated Universal Time (UTC).
Overall trust metric 110 indicates a degree of overall confidence in a transaction. In one embodiment, overall trust metric 110 provides a degree of confidence in enrollment, storage, transmission, and authentication processes employed for authentication of a transaction. Overall trust metric 110 can be defined according to the specifics of the application contemplated, but in a preferred embodiment, there are three possible values corresponding to low, medium, and high confidence. In a preferred embodiment, low security refers to a password authentication against a 4-digit numeric PIN stored in non-secure storage. Medium security refers to a fingerprint authentication or strong password (alphanumeric, mixed case, greater than 8 characters) against a secret in non-secure storage, and high security is attributed to a fingerprint authentication or strong password against a secret in secure storage such as a smart card. Generally, any number of trust categories can be assigned among any authentication processes.
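
The envelope-and-seal structure of token 100 can be sketched minimally as follows. JSON serialization is an assumption, and an HMAC stands in for the seal's private-key digital signature; a real implementation would use an asymmetric signature (e.g., RSA or ECDSA) so that third parties can verify the seal with a public key.

```python
# Hedged sketch of a transaction confidence token: an envelope (transaction
# information, trust metric, UTC timestamp) plus a seal over the envelope.
# The HMAC below is a stand-in for an asymmetric digital signature.
import hashlib
import hmac
import json
from datetime import datetime, timezone

SIGNING_KEY = b"demo-secret"  # stand-in for the signer's private key

def make_token(transaction_info: dict, overall_trust: str) -> dict:
    envelope = {
        "transaction_info": transaction_info,    # e.g. session id, order number
        "overall_trust_metric": overall_trust,   # "low" | "medium" | "high"
        "timestamp": datetime.now(timezone.utc).isoformat(),  # UTC, per the text
    }
    payload = json.dumps(envelope, sort_keys=True).encode()
    seal = hmac.new(SIGNING_KEY, payload, hashlib.sha256).hexdigest()
    return {"envelope": envelope, "seal": seal}

def verify_token(token: dict) -> bool:
    """Recompute the seal over the envelope and compare; tampering breaks it."""
    payload = json.dumps(token["envelope"], sort_keys=True).encode()
    expected = hmac.new(SIGNING_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, token["seal"])

token = make_token({"order_number": "A-1001", "session": "xyz"}, "high")
print(verify_token(token))  # True
```

The recursive embodiment described above would simply place one such token dictionary inside another token's `transaction_info`, each party sealing the envelope it received.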

[0047] Envelope 120 may comprise metrics related to measures of individual aspects of an authentication process. That is, envelope 120 may comprise some or all of the following optional fields: (1) Enrollment Trust Metric 160, (2) Storage Trust Metric 170, (3) Transmission Trust Metric 180, and (4) Authentication Trust Metric 190.

[0048] Enrollment Trust Metric 160 indicates a degree of confidence in security of an enrollment or personalization process under which a secret was issued to an authenticating party. Enrollment trust metric 160 can be defined according to specifics of the application employed. In a preferred embodiment, there are three possible values corresponding to low, medium, and high confidence. In one embodiment, a low confidence enrollment trust metric is assigned to self-enrollment where little or no manual verification of user identity is carried out; a medium confidence enrollment trust metric is assigned to online verification using a “weak secret” such as a credit card number, which may be independently verified to match the enrollee's name by the credit card issuer; and a high confidence enrollment trust metric is assigned in an enrollment situation where the user's identity is verified—using trusted documents such as a passport, driver's license, or the like—by a human being who works for the enrollment agency or represents another predetermined organization.

[0049] Storage Trust Metric 170 indicates a degree of confidence in the security of a method of storage used to store a secret. Storage Trust Metric 170 can be defined according to the specifics of the application employed. In a preferred embodiment, there are three possible values corresponding to low, medium, and high confidence. Here, in one embodiment, a storage trust metric indicating a low confidence level is assigned to storage of a secret in unencrypted form on a hard disk or FLASH memory of a PC or other computing device; a storage trust metric indicating a medium confidence level is assigned to storage of a secret in encrypted form on a hard drive or FLASH memory of a PC or other computing device and protected with a PIN or password; and a storage trust metric indicating a high confidence level is assigned to storage of a secret in secure storage, such as that of a smart card, and protected with a PIN or password.

[0050] Transmission Trust Metric 180 indicates a degree of confidence in the security of a method of transmission, if any, of a secret. Transmission trust metric 180 can be defined according to the specifics of the application employed, but in a preferred embodiment, there are three possible values corresponding to low, medium, and high confidence. In one embodiment, a transmission trust metric indicating a low confidence level is assigned to transmission of a secret in unencrypted form via the Internet or a local computer network; a transmission trust metric indicating a medium confidence level is assigned to transmission of a secret in encrypted form using SSL or TLS (as known in the art and described further in Dierks, T., and Allen, C., “The TLS Protocol Version 1.0,” RFC 2246, January 1999, hereby incorporated by reference) or another common standard of network encryption; and a transmission trust metric indicating a high confidence level applies to transmission of a secret via armored car using a certified carrier such as, for example, Brink's®, Inc.

[0051] Authentication Trust Metric 190 indicates a degree of confidence in the security of a method of authentication for a particular transaction. Authentication Trust Metric 190 can be defined according to the specifics of the application employed, but in a preferred embodiment, there are three possible values corresponding to low, medium, and high confidence. Accordingly, in one embodiment, an authentication trust metric indicating a low confidence level is assigned to authentication using a PIN or password (“something you know”); an authentication trust metric indicating a medium confidence level is assigned to authentication using a physical token such as a PKCS-11 standard device or smart card (“something you have”); and an authentication trust metric indicating a high confidence level is assigned to authentication requiring use of a biometric such as a fingerprint, voiceprint, or face recognition (“something you are”).

[0052] In other embodiments, greater or fewer trust levels are provided. In still other embodiments a continuous range of trust metric values is provided. In some embodiments, more than one type of procedure, device, or method may be assigned an identical trust metric value. For example, in some embodiments both encrypted and unencrypted storage of a secret on a hard disk receive a trust metric indicating a low trust level, while secure storage of a secret—for example on a smart card protected with a PIN—receives a trust metric indicating a high trust level. Although in preferred embodiments, trust metrics provide an indication of security based on measurable risk factors, in other embodiments trust metric values are not constrained by theoretical security weaknesses. For example, a particular storage method or enrollment procedure may be assigned a stronger or weaker trust metric based on a preferred or encouraged method for performing those functions.
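
The low/medium/high assignments of paragraphs [0048] through [0051] lend themselves to simple lookup tables. The category names below are paraphrases of the embodiments above; the table form and the conservative low default are assumptions made for this sketch.

```python
# Illustrative trust assignments following the preferred embodiments above.
LOW, MEDIUM, HIGH = "low", "medium", "high"

ENROLLMENT_TRUST = {
    "self_enrollment": LOW,        # little or no manual identity verification
    "weak_secret_online": MEDIUM,  # e.g. credit card number verified by issuer
    "in_person_documents": HIGH,   # passport or license checked by an agent
}

STORAGE_TRUST = {
    "plaintext_disk": LOW,            # unencrypted on hard disk or FLASH
    "encrypted_disk_with_pin": MEDIUM,
    "smart_card_with_pin": HIGH,      # secure storage, PIN-protected
}

AUTHENTICATION_TRUST = {
    "pin_or_password": LOW,   # something you know
    "physical_token": MEDIUM, # something you have
    "biometric": HIGH,        # something you are
}

def trust_for(table, method, default=LOW):
    """Look up a method's trust level, defaulting conservatively to low."""
    return table.get(method, default)
```

As paragraph [0052] notes, nothing fixes the number of levels or the assignments themselves; a deployment could collapse or expand these tables freely.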

[0053] Seal 130 of transaction confidence token 100 is a string of bytes containing a digital signature of envelope 120, signed in a preferred embodiment with the private key of the authenticating party or system.

[0054] In a preferred embodiment, envelope 120 and seal 130 are constructed according to PKCS #7 (for a detailed description of the standard see, for example, RSA Laboratories, PKCS #7: Cryptographic Message Syntax Standard, Version 1.5, November 1993, hereby incorporated by reference), using that standard's signed-data content type such that envelope 120 is embodied in a content information (contentInfo) field and seal 130 is embodied in a signer information (signerInfos) field. Note that PKCS #7 also allows for the recursion of transaction information field 140 of envelope 120 in a transaction confidence token.
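
The sealing operation amounts to signing a deterministic serialization of the envelope. The sketch below substitutes an HMAC for the PKCS #7 private-key signature the patent describes, since a full signed-data implementation requires a cryptographic library; the shape of the operation, serialize then sign then verify, is the same.

```python
import hashlib
import hmac
import json

def seal_envelope(envelope: dict, key: bytes) -> bytes:
    """Serialize the envelope deterministically and sign it.
    HMAC-SHA256 stands in here for the PKCS #7 signature."""
    canonical = json.dumps(envelope, sort_keys=True).encode("utf-8")
    return hmac.new(key, canonical, hashlib.sha256).digest()

def verify_seal(envelope: dict, seal: bytes, key: bytes) -> bool:
    """Recompute the seal and compare in constant time."""
    return hmac.compare_digest(seal_envelope(envelope, key), seal)

key = b"demo-signing-key"  # stand-in for the authenticating party's key
env = {"transaction_info": "order-1234", "overall_trust": "high",
       "timestamp": "2002-07-12T14:30:00Z"}
seal = seal_envelope(env, key)

assert verify_seal(env, seal, key)
# Any alteration of the envelope after sealing invalidates the token:
tampered = dict(env, overall_trust="low")
assert not verify_seal(tampered, seal, key)
```

The tamper check is the point of the seal: a party cannot quietly raise a token's trust metric after it has been signed.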

[0055] In another embodiment, envelope 120 and seal 130 are constructed according to the XML Signature Syntax and Processing Recommendation—for a detailed description of the standard see, for example Eastlake 3rd, D., Reagle, J., and Solo, D., “(Extensible Markup Language) XML-Signature Syntax and Processing,” RFC 3275, March 2002, incorporated herein by reference.

[0056] Other encodings or structures of a transaction confidence token are also possible.

[0057] The present invention further provides systems and methods for using a transaction confidence token. For example, when a requester (client) or server initiates a transaction requiring authentication, such as in step 200 in FIG. 2, server 210 requests the authentication and an associated transaction confidence token. In other embodiments, no specific request is made by server 210. During the course of the transaction, requester 220 allows access to a secret, step 230, such as a private encryption key, using an authentication method such as a biometric match. Numerous devices and methods exist for securing a secret, including those described in U.S. application Ser. No. ______, filed ______, entitled “Secure Network And Networked Devices Using Biometrics” (Attorney Docket No. A-70595/RMA/JML), incorporated herein by reference. Requester 220 generates, step 240, the contents of the requested transaction, such as the quantity and SKUs of item(s) to be purchased, in a form suitable to be encoded in the transaction confidence token's transaction contents, or transaction information, field, such as transaction information 140 in token 100 depicted in FIG. 1.

[0058] Requester 220 determines at least one trust metric, step 250, as described above and encodes the at least one trust metric in the transaction confidence token. Requester 220 signs the transaction confidence token in step 260. Server 210 receives the transaction confidence token associated with a transaction request in step 270. Server 210 then adjusts its confidence level in the transaction, step 280, based on whether the signature is valid, and takes action appropriate to the confidence level, completing the transaction in step 290.
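
Steps 240 through 290 can be traced end to end in a few lines. The sketch below uses a shared-key HMAC as a stand-in for the asymmetric signature the patent contemplates; the step numbers in the comments refer to FIG. 2, and the key and field names are illustrative.

```python
import hashlib
import hmac
import json

KEY = b"requester-signing-key"  # stand-in for the requester's private key

# Requester side: generate transaction contents (step 240), determine a
# trust metric (step 250), and sign the resulting envelope (step 260).
envelope = {
    "transaction_info": {"sku": "ABC-123", "quantity": 2},
    "overall_trust": "medium",
    "timestamp": "2002-07-12T14:30:00Z",
}
payload = json.dumps(envelope, sort_keys=True).encode("utf-8")
token = {"envelope": envelope,
         "seal": hmac.new(KEY, payload, hashlib.sha256).hexdigest()}

# Server side: receive the token (step 270), check the seal, and adjust
# confidence in the transaction accordingly (step 280).
received = json.dumps(token["envelope"], sort_keys=True).encode("utf-8")
expected = hmac.new(KEY, received, hashlib.sha256).hexdigest()
if hmac.compare_digest(expected, token["seal"]):
    confidence = token["envelope"]["overall_trust"]
else:
    confidence = None  # invalid seal: assume no confidence
```

With a valid seal the server's confidence is simply the metric the requester attested to; an invalid seal yields no confidence at all, matching step 280.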

[0059] The present invention further provides methods for a server to act on a transaction confidence token. FIG. 3 provides a schematic overview of an embodiment of such a method according to the present invention. A server receives a transaction confidence token, step 300, and verifies the signature of the transaction confidence token, step 310, using, for example, the public key of the originator of the transaction confidence token. If the signature verification fails, indicating that the transaction confidence token was not created by the purported originating party, was altered after its creation, or is otherwise invalid, then the system may discard the token, step 320, and assume no confidence in the authenticity of the associated transaction. In a preferred embodiment, a transaction confidence token with an invalid signature is discarded, and the associated transaction request is discarded or rolled back according to appropriate exception-handling practice for the application employing methods of the present invention.

[0060] If the signature verification succeeds, then the server determines its confidence in the transaction, step 330, calculated from one or more trust metric fields in the transaction confidence token.

[0061] If another server, or a plurality of servers, is to participate in the transaction, step 340, the original receiving server may construct a new transaction confidence token and embed the current transaction confidence token within it, optionally asserting its own degree of confidence in the transaction, step 350. The server then transmits, step 360, the new transaction confidence token (comprising the embedded first transaction confidence token) to the other participating server(s), step 370.
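
The wrapping step can be sketched by letting each server build a new token whose transaction contents are the token it received. The hash-based "seal" below is merely a stand-in for a real digital signature, and the field names are illustrative.

```python
import hashlib
import json

def make_token(transaction_info, overall_trust, signer_id):
    """Build a token sketch; the 'seal' here is a plain hash standing
    in for a real digital signature by signer_id."""
    envelope = {"transaction_info": transaction_info,
                "overall_trust": overall_trust,
                "signer": signer_id}
    canonical = json.dumps(envelope, sort_keys=True).encode("utf-8")
    return {"envelope": envelope,
            "seal": hashlib.sha256(canonical).hexdigest()}

# The originating party authenticates and signs the original transaction...
inner = make_token("web-session-8f3a", "high", "originator")
# ...and a downstream server embeds the token it received as its own
# transaction contents, attesting its own confidence as it passes through.
outer = make_token(inner, "medium", "gateway-server")

# The innermost token still records the original authentication event.
assert outer["envelope"]["transaction_info"]["envelope"]["signer"] == "originator"
```

Because each layer is sealed over the layer beneath it, later parties can neither drop nor alter an earlier attestation without invalidating the outer seals.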

[0062] The server may then perform its own processing of the transaction request, employing the confidence it has determined, step 370. For example, if a trust metric within the transaction confidence token indicates or exceeds a predetermined sufficient trust level, the server processes the transaction. However, in one embodiment, if a trust metric does not indicate a minimum sufficient trust level, the server rejects the transaction. If a trust metric indicates a minimum sufficient trust level but less than the sufficient trust level, the server may require a mitigating factor. For example, the server may require an additional authentication procedure, a fee, or a waiting period in an effort to mitigate risk associated with a predetermined range of trust metric values.
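
The three-way policy in this paragraph (process, reject, or require mitigation) reduces to two threshold comparisons. The threshold values below are illustrative assumptions, not values fixed by the patent.

```python
from enum import IntEnum

class Trust(IntEnum):
    LOW = 1
    MEDIUM = 2
    HIGH = 3

def decide(metric: Trust,
           minimum: Trust = Trust.MEDIUM,
           sufficient: Trust = Trust.HIGH) -> str:
    """Three-way policy over a trust metric."""
    if metric >= sufficient:
        return "process"            # trust is sufficient: complete it
    if metric < minimum:
        return "reject"             # below the minimum: refuse it
    # Between the two thresholds: an additional authentication step,
    # a fee, or a waiting period can mitigate the residual risk.
    return "mitigate"

assert decide(Trust.HIGH) == "process"
assert decide(Trust.MEDIUM) == "mitigate"
assert decide(Trust.LOW) == "reject"
```

A deployment could equally set `minimum == sufficient`, collapsing the policy to a plain accept-or-reject rule.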

[0063] Although embodiments of the present invention discussed above generally refer to ‘confidence levels’ or ‘trust levels’, with increasing trust metric values associated with increasing trust or confidence in a transaction, in other embodiments trust metrics are assigned and evaluated with respect to risk. That is, risk is generally the opposite of trust, and trust metrics may be assigned such that increasing trust metric values correspond to increasing risk associated with a transaction. In these embodiments, less secure situations receive a higher trust metric value. For example, in some embodiments of the present invention, a confidence level between 0.0 and 1.0 is calculated. A corresponding risk level in this embodiment is given generally by 1 − (confidence level).

[0064] Thus, in another embodiment, if a trust metric within the transaction confidence token indicates or exceeds a predetermined maximum risk level, the risk is determined to be too great, and the server rejects the transaction. However, if a trust metric indicates less than the maximum risk level but greater than an acceptable risk level, the server may require a mitigating factor before processing the transaction. For example, the server may require an additional authentication procedure, a fee, or a waiting period in an effort to mitigate risk associated with a predetermined range of trust metric values. If a trust metric indicates less than the acceptable risk level, the server processes the transaction.
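
The risk-based policy is the mirror image of the trust-based one, using risk = 1 − (confidence level) as described in paragraph [0063]. The acceptable and maximum risk values below are illustrative assumptions.

```python
def risk_level(confidence: float) -> float:
    """Risk is the complement of confidence on a 0.0-1.0 scale."""
    return 1.0 - confidence

def decide_by_risk(confidence: float,
                   acceptable: float = 0.2,
                   maximum: float = 0.5) -> str:
    """Reject above the maximum risk, mitigate in the middle band,
    and process when the risk is acceptable."""
    risk = risk_level(confidence)
    if risk >= maximum:
        return "reject"
    if risk > acceptable:
        return "mitigate"
    return "process"
```

High confidence (say 0.9) yields a risk near 0.1 and the transaction is processed; confidence of 0.6 yields a risk of 0.4 in the mitigation band; confidence of 0.4 yields a risk of 0.6 and rejection.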

[0065] Having described several methods and procedures, it will be appreciated that the invention may advantageously implement the methods and procedures described herein on a general purpose or special purpose computing device, such as a device having a processor for executing computer program code instructions and a memory coupled to the processor for storing data and/or commands. It will be appreciated that the computing device may be a single computer or a plurality of networked computers and that the several procedures associated with implementing the methods and procedures described herein may be implemented on one or a plurality of computing devices. In some embodiments the inventive procedures and methods are implemented on standard server-client network infrastructures with the inventive features added on top of such infrastructure or compatible therewith.

[0066] The foregoing descriptions of specific embodiments and best mode of the present invention have been presented for purposes of illustration and description. They are not intended to be exhaustive or to limit the invention to the precise forms disclosed, and obviously many modifications and variations are possible in light of the above teaching. The embodiments were chosen and described in order to best explain the principles of the invention and its practical application, to thereby enable others skilled in the art to best utilize the invention and various embodiments with various modifications as are suited to the particular use contemplated. It is intended that the scope of the invention be defined by the claims appended hereto and their equivalents.
