Publication number: US 20030101348 A1
Publication type: Application
Application number: US 10/194,959
Publication date: May 29, 2003
Filing date: Jul 12, 2002
Priority date: Jul 12, 2001
Also published as: EP1573426A2, EP1573426A4, US7197168, US7751595, US20030115475, US20030115490, US20030126448, US20070274575, WO2003007121A2, WO2003007121A3, WO2003007121B1, WO2003007125A2, WO2003007125A3, WO2003007125A9, WO2003007127A2, WO2003007127A3, WO2003007127A9
Inventors: Anthony Russo, Peter McCoy, Mark Howell
Original Assignee: Russo Anthony P., McCoy Peter A., Howell Mark J.
Method and system for determining confidence in a digital transaction
US 20030101348 A1
Abstract
The present invention provides systems and methods utilizing tokens to assign or mitigate risk. A software token is provided that associates a secret—such as a private key or password—with risk factors involved in protecting that secret from illicit access. The token may include an indication or calculation of the “overall risk of compromise” (OROC), generally represented as an overall trust metric, associated with the secret. This token can then be used to inform system operators or third parties of the confidence of a given system transaction that depends on the secret. A third party can then take whatever actions it deems appropriate according to the estimated risk. For example, in one embodiment of the present invention, the risk factor is used to deny a transaction if the risk is deemed too great—that is, if the risk factor is greater than a predetermined (or sufficient) value.
Claims(19)
We claim:
1. A transaction confidence token for use in a secure communication system, said token comprising:
an envelope comprising
transaction information; and
a trust metric; and
a seal comprising a digital signature of said envelope.
2. A token according to claim 1, wherein said envelope further comprises a time stamp.
3. A token according to claim 1, wherein said transaction information includes information selected from the group consisting of a web site address, a web session identifier, a monetary or exchange value, an order number, an SKU number, and a credit card number, and combinations thereof.
4. A token according to claim 1, wherein said trust metric is an overall trust metric indicating a combined confidence level for enrollment, storage, transmission, and authentication processes employed for authentication of a transaction.
5. A token according to claim 1, wherein said trust metric comprises a storage trust metric indicating a confidence level for a storage process associated with authentication of a transaction.
6. A token according to claim 1, wherein said trust metric comprises a transmission trust metric indicating a confidence level for a transmission process associated with authentication of a transaction.
7. A token according to claim 1, wherein said trust metric comprises an authentication trust metric indicating a confidence level for an authentication process associated with authentication of a transaction.
8. A token according to claim 1, wherein said trust metric comprises an enrollment trust metric indicating a confidence level for an enrollment process associated with authentication of a transaction.
9. A token according to claim 1, wherein said trust metric comprises an overall trust metric and said envelope further comprises at least one metric chosen from the group consisting of an enrollment trust metric, a storage trust metric, a transmission trust metric, an authentication trust metric, and combinations thereof.
10. A token according to claim 1, wherein said digital signature is signed with a private key.
11. A method for assuring a secure transaction comprising:
receiving a transaction confidence token comprising a trust metric associated with said transaction;
determining if said trust metric indicates a sufficient trust level; and
processing said transaction if said trust metric indicates or exceeds said sufficient trust level.
12. A method according to claim 11, further comprising:
requiring a mitigating factor if said trust metric indicates less than said sufficient trust level.
13. A method according to claim 12, wherein said mitigating factor is chosen based on said trust metric.
14. A method according to claim 12, wherein said mitigating factor is chosen from the group consisting of a fee, a waiting period, an authentication procedure, and combinations thereof.
15. A method according to claim 12, further comprising:
processing said transaction after receiving said mitigating factor.
16. A method according to claim 11, further comprising:
constructing a transaction confidence token comprising said trust metric; and
transmitting said transaction confidence token to a server.
17. A method for assuring a secure transaction comprising:
receiving a transaction confidence token comprising a trust metric associated with said transaction;
determining if said trust metric indicates an acceptable risk level; and
processing said transaction if said trust metric indicates or is less than said acceptable risk level.
18. A method according to claim 17, further comprising:
requiring a mitigating factor if said trust metric indicates greater than said acceptable risk level.
19. A method according to claim 18, further comprising:
processing said transaction after receiving said mitigating factor.
Description
    RELATED APPLICATIONS
  • [0001]
    This application further relates to the following co-pending applications:
  • [0002]
    U.S. application Ser. No. ______, filed ______, entitled “BIOMETRICALLY ENHANCED DIGITAL CERTIFICATES AND SYSTEM AND METHOD FOR MAKING AND USING” (Attorney Docket No. A-70596/RMA/JML);
  • [0003]
    U.S. application Ser. No. ______, filed ______, entitled “SECURE NETWORK AND NETWORKED DEVICES USING BIOMETRICS” (Attorney Docket No. A70595/RMA/JML); and
  • [0004]
    U.S. application Ser. No. ______, filed ______, entitled “METHOD AND SYSTEM FOR BIOMETRIC IMAGE ASSEMBLY FROM MULTIPLE PARTIAL BIOMETRIC FRAME SCANS” (Attorney Docket No. A-70591/RMA/JML); all of which are hereby incorporated by reference.
  • [0005]
    This application claims the benefit under 35 U.S.C. §119 and/or 35 U.S.C. §120 of the filing date of: U.S. Provisional Application Serial No. 60/305,120, filed Jul. 12, 2001, which is hereby incorporated by reference, and entitled SYSTEM, METHOD, DEVICE AND COMPUTER PROGRAM FOR NON-REPUDIATED WIRELESS TRANSACTIONS; U.S. patent application Ser. No. 10/099,554, filed Mar. 13, 2002, and entitled SYSTEM, METHOD, AND OPERATING MODEL FOR MOBILE WIRELESS NETWORK-BASED TRANSACTION AUTHENTICATION AND NON-REPUDIATION; and U.S. patent application Ser. No. 10/099,558, filed Mar. 13, 2002, and entitled FINGERPRINT BIOMETRIC CAPTURE DEVICE AND METHOD WITH INTEGRATED ON-CHIP DATA BUFFERING; each of which is incorporated by reference herein.
  • FIELD OF THE INVENTION
  • [0006]
    The present invention relates generally to the field of methods, computer programs and computer program products, devices, and systems for encryption systems, especially public key infrastructure (PKI) systems, and also to the field of biometrics, especially but not limited to biometrics such as human fingerprints and human voiceprints.
  • BACKGROUND OF THE INVENTION
  • [0007]
    The security and integrity of information systems depends in part on authentication of individual users—accurately and reliably determining the identity of a user attempting to use the system. Once a user is authenticated, a system is then able to authorize the user to retrieve certain information or perform certain actions appropriate to the system's understanding of the user's identity. Examples of such actions include downloading a document, completing a financial transaction, or digitally signing a purchase.
  • [0008]
    Numerous methods have been developed for authenticating users. Generally, as will be understood by those skilled in the art, authentication methods are grouped into three categories, also called authentication factors: (1) something you know—a secret such as a password or a PIN or other information; (2) something you have—such as a smartcard, the key to a mechanical lock, an ID badge, or other physical object; and (3) something you are—a measure of a person such as a fingerprint or voiceprint. Each method has advantages and disadvantages including those relating to ways that a system may be fooled into accepting a normally unauthorized user in cases where, for example, a password has been guessed or a key has been stolen.
  • [0009]
    The third category above—referred to herein as ‘something you are’ authentication methods—are the subject of the biometrics field. Biometric identification is used to verify the identity of a person by measuring selected features of some physical characteristic and comparing those measurements with those filed for the person in a reference database or stored in a token (such as a smartcard) carried by the person. Physical characteristics that are used today include fingerprints, voiceprints, hand geometry, the pattern of blood vessels on the wrist or on the retina of the eye, the topography of the iris of the eye, facial patterns, and the dynamics of writing a signature or typing on a keyboard. Biometric identification methods are widely used today for securing physical access to buildings and securing data networks and personal computers.
  • [0010]
    A secure system is based upon either a mutually-shared secret or a private key of a public-private key pair. During the enrollment process, the secret is first selected or created, then agreed upon and stored for later use. There are generally four major sources of risk associated with the secret being compromised: (1) the secret can be guessed by an unauthorized user; (2) the secret was observed by an unauthorized user during creation or subsequent transmission; (3) the stored secret can be retrieved and employed by an unauthorized user after creation; and/or (4) the stored secret was issued to the wrong party.
  • [0011]
    Each of the above broad categories has its own specific risk factors depending on the type of secret, where and how it is stored and how it is created. For example, the risk of guessing is dependent on a variety of factors including, but not limited to, the type of secret (for example, a password, a private PKI key, a symmetric key, or the like), the length of the secret (for example, number of characters in the password or number of bits in the private key), and the randomness of the secret (for example, an entropy calculation plus, in the case of a password, whether the password matches a dictionary word). The risk of observation during transmission is dependent on factors including, but not limited to: whether it was transmitted at all (generally, there is no transmission of the secret in PKI); what type of encryption was used, if any, during transmission; and the network used for the transmission (for example, whether it was transmitted using a telephone, an internet, a private network, or other network or communication link or channel).
  • [0012]
    The risk of a stored secret being illicitly retrieved is dependent on factors including, but not limited to: the number of devices where instances of the secret are stored (for example, a secret may be stored on a user's PC as well as in a system database); the storage medium used for each stored instance (hard disk, paper notes, smart card, portable memory device such as a flash memory card, PKCS-11 token (as discussed further in "PKCS #11 v2.11: Cryptographic Token Interface Standard," published June 2001 by RSA Laboratories, hereby incorporated by reference), or the like); whether the secret is stored in plain text or encrypted; if stored encrypted, the risk associated with the encryption key used; what kind of biometrics are used, if any, to restrict access to the storage medium; the security of the passphrase used, if any, to retrieve the secret; the security of the biometric system(s) used, if any, to retrieve the secret; the security of the physical token used, if any, to retrieve the secret—for example, if a token is used, the security of that token is dependent upon whether someone else has had access to it, or if it has been lost or stolen; what combinations of passphrase, biometric, and token are required, if any; and the security of the enrolled biometric template.
  • [0013]
    The risk associated with the secret being issued to the wrong person is dependent on factors including, but not limited to: the specific method or methods used to verify the user's identity prior to issuing the secret; the degree of human interaction, if any, involved in the verification process (i.e., whether it is supervised and verified by a trained human being); what specific biometric system or systems, if any, are used to aid verification; which government agencies (such as, for example, the FBI, Secret Service, or other agency), if any, aid in the verification process; and which trusted documents, if any, were required for verification (for example, a bank statement, social security number, passport, or the like).
  • [0014]
    Systems used for e-commerce, online banking and other financially related areas rely on security to prevent unauthorized users from accessing services for monetary gain. For example, well-designed systems try to prevent would-be buyers from purchasing goods and services with someone else's credit card by requiring a PIN or a password.
  • [0015]
    More generally, the security and integrity of information systems depends primarily on keeping data confidential so that only authorized users may see or act against the data, and on assuring the integrity of data so that the data cannot be changed or tampered with undetected. The field of cryptography provides well-known tools for assuring confidentiality and integrity using encryption techniques such as ciphers and hash algorithms.
  • [0016]
    One widely known and implemented body of these tools, and of procedures and practices for their use, is called Public Key Infrastructure (PKI). PKI gets its name from its use of a class of cryptographic algorithm called a public key algorithm. As is widely known to those versed in the cryptographic field, a public key algorithm is a cryptographic algorithm that operates using two different but mathematically related keys: a public key that may be shared with any party and a private key that must be kept secret, such that (for most such algorithms) data encrypted with the public key may only be decrypted with the private key, and vice versa. PKI standards are well known; X.509, for example, is described in Housley, R., "Internet X.509 Public Key Infrastructure Certificate and CRL Profile," RFC 2459, January 1999, and in ITU-T Recommendation X.509 (1997 E): Information Technology—Open Systems Interconnection—The Directory: Authentication Framework, June 1997, both of which are hereby incorporated by reference.
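The public/private key relationship described above can be demonstrated with toy RSA arithmetic. The tiny primes below are for illustration only and are hopelessly insecure; real PKI keys are 2048 bits or more.

```python
# Toy RSA: illustrates that data transformed with one key of the pair
# can only be recovered with the other. Never use primes this small.
p, q = 61, 53
n = p * q                  # public modulus (part of both keys)
phi = (p - 1) * (q - 1)    # Euler's totient of n
e = 17                     # public exponent, chosen coprime with phi
d = pow(e, -1, phi)        # private exponent: modular inverse of e mod phi

def encrypt(message):
    """Transform with the public key (e, n); only d can undo this."""
    return pow(message, e, n)

def decrypt(ciphertext):
    """Recover with the private key (d, n)."""
    return pow(ciphertext, d, n)
```

The "vice versa" property in the text, recovering with the public key what was transformed with the private key, is the mathematical basis of the digital signatures discussed next.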
  • [0017]
    These standards provide powerful mechanisms for safe and private storage and transmission of confidential data so that it remains hidden from unauthorized parties. The standards provide for digital signatures, which provide the receiving party of some data with an assurance of the identity of the transmitting party. PKI standards further provide for digital certificates, which provide a tamper-resistant, portable record of the association of a public key with a person's or organization's name, attested to and signed by a trusted party, thus presenting a form of unique, irrefutable digital identity or credential for that person or organization. PKI standards also provide other useful and powerful mechanisms that can contribute to the security and integrity of information systems.
  • [0018]
    PKI is widely used in commercial and non-commercial systems, both over the Internet and in more closed or local applications. Most web browsers, for example, use PKI and PKI-based standards to interoperate with web servers when high security is desired, as when a user specifies a credit card number for payment while placing an online order. The proliferation of electronic commerce has led many jurisdictions around the world to begin to develop legal standards with the intended result that a correctly constituted digital signature would be every bit as legally binding as a handwritten signature is today.
  • [0019]
    PKI provides powerful mechanisms, but it has weaknesses. One way for digital identities to be compromised is for an impostor to somehow get a copy of the private key that is associated with the public key embedded in a certificate, thus invalidating an assumption that only the person or organization to which the certificate is issued has access to the (secret) private key. Anyone with both the certificate (which is meant to be public information, freely exchanged with anyone) and the associated private key (which is meant to be secret) can impersonate someone else and compromise the security and integrity of an information system dependent on the valid use of a certificate and associated private key.
  • [0020]
    Most systems, therefore, secure the private key such that the user must authenticate before the private key can be used for any task. Many such systems require a password (“something you know”) or a smartcard (“something you have”), or both. Some systems provide additional security by putting the private key on a smartcard that is resistant to tampering or copying. Other systems may also employ biometrics (“something you are”) to ensure that the person using the private key is in fact the true owner of the certificate.
  • [0021]
    However, smart cards may be lost, damaged, or stolen. Passwords may be forgotten or guessed. Biometric systems can be fooled. These concerns are part of what is called in the field “the last-meter problem”: the problem of making sure that an otherwise secure system isn't compromised by a failure to correctly authenticate the person using (and usually physically adjacent to) some part of the system. The last-meter problem can present opportunities for impostors in PKI systems. Mathematically, the theoretical probability of a PKI system being fooled or otherwise compromised is extremely low (much less than 1 in a billion, for instance). However, once the “last-meter problem” is taken into account, the security of such a system is greatly reduced, as the “last-meter problem” becomes the weakest link in an otherwise very secure chain.
  • [0022]
    Today's PKI systems do not take into account the risk associated with the “last-meter problem” when assessing the trust level to associate with users of such systems.
  • [0023]
    Accordingly, it is an object of the present invention to provide an indication of the security of a given transaction.
  • SUMMARY
  • [0024]
    In a first embodiment, the invention provides a transaction confidence token for use in a secure communication system, comprising an envelope and a seal. The envelope comprises transaction information and a trust metric. The seal contains a digital signature of the envelope. In preferred embodiments, the envelope further includes a timestamp. In some embodiments, the transaction information contained in the envelope includes a web site address, a web session identifier, a monetary or exchange value, an order number, an SKU number, a credit card number, or any combinations thereof.
  • [0025]
    In one embodiment, the trust metric within the envelope is an overall trust metric indicating a combined confidence level for enrollment, storage, transmission, and authentication processes employed for authentication of a transaction.
  • [0026]
    In another embodiment, the trust metric comprises a storage trust metric indicating a confidence level for a storage process associated with authentication of a transaction. In yet another embodiment, the trust metric comprises a transmission trust metric indicating a confidence level for a transmission process associated with authentication of a transaction. In still another embodiment, the trust metric comprises an authentication trust metric indicating a confidence level for an authentication process associated with authentication of a transaction. In a further embodiment, the trust metric comprises an enrollment trust metric indicating a confidence level for an enrollment process associated with authentication of a transaction. In other embodiments, a plurality of trust metrics are provided in the envelope. In one embodiment, a first trust metric comprises an overall trust metric and at least a second trust metric is provided chosen from the group consisting of an enrollment trust metric, a storage trust metric, a transmission trust metric, an authentication trust metric, and combinations thereof.
  • [0027]
    In some embodiments, the digital signature contained in the seal is signed with a private key.
  • [0028]
    The present invention further provides methods for assuring a secure transaction. In one embodiment, a method for assuring a secure transaction comprises receiving a transaction confidence token comprising a trust metric associated with the transaction, determining if the trust metric indicates a sufficient trust level, and processing the transaction if the trust metric indicates or exceeds said sufficient trust level.
  • [0029]
    In some embodiments, a method further comprises requiring a mitigating factor if said trust metric indicates less than said sufficient trust level. The mitigating factor may be chosen based on the trust metric. The mitigating factor may be chosen from the group consisting of a fee, a waiting period, an authentication procedure, and combinations thereof.
  • [0030]
    In yet other embodiments, the method further comprises processing the transaction after receiving a mitigating factor.
  • [0031]
    In other embodiments, the method further comprises constructing a transaction confidence token comprising a trust metric, and transmitting said transaction confidence token to a server.
  • [0032]
    In other embodiments, a method for assuring a secure transaction comprises receiving a transaction confidence token comprising a trust metric associated with said transaction, determining if said trust metric indicates an acceptable risk level, and processing said transaction if said trust metric indicates or is less than said acceptable risk level.
  • [0033]
    In some embodiments, the method further comprises requiring a mitigating factor if said trust metric indicates greater than said acceptable risk level. In still other embodiments, the method further includes processing said transaction after receiving said mitigating factor.
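The decision flow summarized above, accept the transaction, require a mitigating factor, or refuse, can be sketched as follows. The three-level ordering and the particular mitigation choices (fee, re-authentication) are illustrative assumptions; the invention does not fix these values.

```python
# Illustrative ordering of trust levels (assumed, not from the specification).
LEVELS = {"low": 0, "medium": 1, "high": 2}

def process_transaction(trust_metric, required="high"):
    """Accept, require mitigation, or deny based on the token's trust metric."""
    gap = LEVELS[required] - LEVELS[trust_metric]
    if gap <= 0:
        return "processed"            # metric meets or exceeds the required level
    if gap == 1:
        # Hypothetical mitigating factors per the text: a fee, a waiting
        # period, or an additional authentication procedure.
        return "mitigate: fee or re-authentication"
    return "denied"
```

A server receiving a token could call `process_transaction(token_metric)` and, on the "mitigate" branch, complete the transaction only after the mitigating factor is received.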
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • [0034]
    The present invention may be better understood, and its features and advantages made apparent to those skilled in the art by referencing the accompanying drawings.
  • [0035]
    FIG. 1 is a schematic representation of a transaction confidence token according to one embodiment of the present invention.
  • [0036]
    FIG. 2 is a schematic flowchart showing a method of processing a transaction according to an embodiment of the present invention.
  • [0037]
    FIG. 3 is a schematic flowchart outlining a process for using a transaction confidence token according to an embodiment of the present invention.
  • DETAILED DESCRIPTION OF THE EMBODIMENTS
  • [0038]
    Many systems today, especially those that use PKI, involve transactions that depend on keeping a secret protected from use by third parties. If there is any risk that the secret is compromised, then that risk is propagated to the provider of the transaction itself. For example, if an internet-based system allows a user to purchase an item by entering any valid credit card number, then the risk to the credit card company or merchant related to an unauthorized purchase is dependent on how well that credit card number can be kept secret, for example, on how well the parties to whom the secret is made available are authenticated.
  • [0039]
    This invention introduces the concept of a software token that associates a secret—such as a private key or password—with risk factors involved in protecting that secret from illicit access. Furthermore, in preferred embodiments of the present invention, the token includes an indication or calculation of the “overall risk of compromise” (OROC), generally represented as an overall trust metric, associated with the secret. In some embodiments of the present invention, the token also includes a calculation of the individual risk factor probabilities used to determine the OROC, or overall trust metric. This token can then be used to inform system operators or third parties of the confidence of a given system transaction that depends on the secret. A third party can then take whatever actions it deems appropriate according to the estimated risk. For example, in one embodiment of the present invention, the risk factor is used to deny a transaction if the risk is deemed too great—that is, if the risk factor is greater than a predetermined (or sufficient) value. In another embodiment, the risk factor is used to charge the user a fee in an effort to mitigate the risk or, where some fee is already charged to the user for the transaction, to charge different fees according to the assessed risk or trust level. The fee may be a flat fee charged for all transactions having less than a sufficient trust level, or the fee may vary according to the trust level indicated by the token.
  • [0040]
    Most authentication systems are geared to answer the question of whether the party trying to use the system is the party it claims to be with either a yes or no, even though the authentication method or methods employed are imperfect. The present invention provides a mechanism for a system to add an estimate of risk or confidence on that yes or no answer, and for other systems to use that confidence information to their advantage. In one embodiment, it also provides a mechanism for documenting the party's identity so as to provide a non-repudiation mechanism for the transaction.
  • [0041]
    That is, the present invention provides systems utilizing tokens to assign or mitigate risk.
  • [0042]
    FIG. 1 depicts a schematic representation of transaction confidence token 100 according to one embodiment of the present invention.
  • [0043]
    A token, such as token 100, is created using available information regarding risk factors, examples of which are discussed above. Token 100 can be in the form of a separate packet of stored data associated with the secret, integrated either with the secret itself or, in the case of PKI, with the associated digital certificate.
  • [0044]
    The present invention provides transaction confidence tokens comprising at least one trust metric. As used herein, ‘trust metric’ generally refers to a measure of a risk factor. Examples of typical risk factors are discussed above. In one embodiment, token 100 comprises information on at least one risk factor discussed above. In another embodiment, token 100 comprises an overall risk-of-compromise (OROC) value, or overall trust metric 110, which may take one or more risk factors into consideration. In a preferred embodiment, token 100 is created and stored in a database during both enrollment and subsequent transactions, and includes all the fields shown in FIG. 1. In other embodiments, only a subset of the fields shown in FIG. 1 is present.
  • [0045]
    In one embodiment of the present invention, trust metrics, such as overall trust metric 110, are given by an absolute probability ranging from 0.0 to 1.0, calculated using a weighted Bayesian equation. Other ranges and equations for calculating trust metrics may also or alternatively be employed. In preferred embodiments of the present invention, trust metrics are given by an arbitrary mapping of risk information to three categories—low, medium, and high. Any number of categories may alternatively be used, with each category represented by a unique indicator. The risk information may alternatively be provided by a continuous range of values rather than in discrete categories. Overall trust metric 110 represents a weighted combination of individual risk probabilities of a plurality of risk factors. In a preferred embodiment, a system uses token 100 to deny or accept a transaction. In other embodiments, a system charges a fee, or imposes another mitigation factor—such as a waiting period or another required authentication—based on risk information contained in token 100.
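The calculation described above, a weighted combination of individual risk probabilities mapped onto categories, can be sketched as follows. The weights and category thresholds are invented for illustration; the patent does not fix specific values or name a particular weighting equation beyond "weighted Bayesian."

```python
def overall_trust(metrics, weights):
    """Weighted combination of per-factor trust probabilities in [0.0, 1.0]."""
    total = sum(weights.values())
    return sum(metrics[name] * weights[name] for name in metrics) / total

def to_category(trust):
    """Map a continuous trust value onto three categories.
    Thresholds here are assumptions, not values from the specification."""
    if trust >= 0.8:
        return "high"
    if trust >= 0.5:
        return "medium"
    return "low"
```

For example, per-factor probabilities for the enrollment, storage, transmission, and authentication processes, combined with equal weights, yield a single overall value suitable for the three-category mapping described in the text.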
  • [0046]
    Accordingly, transaction confidence token 100 (FIG. 1) is composed of two data structures: envelope 120 and seal 130. Envelope 120 comprises transaction contents, or transaction information 140, and at least one trust metric, although a plurality of trust metrics are shown in FIG. 1. Further, envelope 120 comprises timestamp 150. In a preferred embodiment, transaction information 140 represents a complete record of a transaction—including, as appropriate, account numbers, web session identifiers, monetary or exchange values, item quantities, an SKU number, an order number, a credit card number, a web URL or address, or other data describing the user's authenticated request. In other embodiments, transaction information 140 comprises only some of the above information associated with a transaction. In another embodiment, transaction information 140 comprises only a transaction identifier or reference string, such as a web session identifier as is often used in web applications. In an alternative embodiment, the transaction information 140 field comprises a complete transaction confidence token, which may in turn (i.e., recursively) contain another transaction confidence token in its transaction contents field, without particular limit. This embodiment allows multiple parties to attest to a transaction and attach their own confidence to the transaction as it is processed by each of a number of systems in series. The innermost transaction confidence token corresponds to the original transaction when it is first authenticated and signed by the originating party. Timestamp 150 generally comprises a string indicating the date and time at which the authentication event that is the subject of the transaction confidence token took place. Generally, any time indicator is appropriate for timestamp 150. In a preferred embodiment, timestamp 150 is expressed in Coordinated Universal Time (UTC).
Overall trust metric 110 indicates a degree of overall confidence in a transaction. In one embodiment, overall trust metric 110 provides a degree of confidence in enrollment, storage, transmission, and authentication processes employed for authentication of a transaction. Overall trust metric 110 can be defined according to the specifics of the application contemplated, but in a preferred embodiment, there are three possible values corresponding to low, medium, and high confidence. In a preferred embodiment, low security refers to a password authentication against a 4-digit numeric PIN stored in non-secure storage. Medium security refers to a fingerprint authentication or strong password (alphanumeric, mixed case, greater than 8 characters) against a secret in non-secure storage, and high security is attributed to a fingerprint authentication or strong password against a secret in secure storage such as a smart card. Generally, any number of trust categories can be assigned among any authentication processes.
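The envelope-and-seal structure described above can be sketched as follows. To keep the example self-contained, an HMAC stands in for the private-key digital signature; a real implementation would sign the envelope with an asymmetric private key per the PKI discussion earlier, and the field names are illustrative.

```python
import hashlib
import hmac
import json
from datetime import datetime, timezone

def make_token(transaction_info, trust_metric, key):
    """Build a transaction confidence token: an envelope plus a seal
    computed over the envelope's serialized contents."""
    envelope = {
        "transaction_information": transaction_info,  # e.g. session id, order number
        "overall_trust_metric": trust_metric,         # e.g. "low" / "medium" / "high"
        "timestamp": datetime.now(timezone.utc).isoformat(),  # UTC, per the text
    }
    payload = json.dumps(envelope, sort_keys=True).encode()
    seal = hmac.new(key, payload, hashlib.sha256).hexdigest()  # stand-in signature
    return {"envelope": envelope, "seal": seal}

def verify_token(token, key):
    """Recompute the seal over the envelope and compare in constant time."""
    payload = json.dumps(token["envelope"], sort_keys=True).encode()
    expected = hmac.new(key, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, token["seal"])
```

Any tampering with the envelope, such as raising the trust metric after signing, invalidates the seal, which is the tamper-evidence property the seal provides.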
  • [0047]
    Envelope 120 may comprise metrics related to measures of individual aspects of an authentication process. That is, envelope 120 may comprise some or all of the following optional fields: (1) Enrollment Trust Metric 160, (2) Storage Trust Metric 170, (3) Transmission Trust Metric 180, and (4) Authentication Trust Metric 190.
  • [0048]
    Enrollment Trust Metric 160 indicates a degree of confidence in security of an enrollment or personalization process under which a secret was issued to an authenticating party. Enrollment trust metric 160 can be defined according to specifics of the application employed. In a preferred embodiment, there are three possible values corresponding to low, medium, and high confidence. In one embodiment, a low confidence enrollment trust metric is assigned to self-enrollment where little or no manual verification of user identity is carried out; a medium confidence enrollment trust metric is assigned to online verification using a “weak secret” such as a credit card number, which may be independently verified to match the enrollee's name by the credit card issuer; and a high confidence enrollment trust metric is assigned in an enrollment situation where the user's identity is verified—using trusted documents such as a passport, driver's license, or the like—by a human being who works for the enrollment agency or represents another predetermined organization.
  • [0049]
    Storage Trust Metric 170 indicates a degree of confidence in the security of a method of storage used to store a secret. Storage Trust Metric 170 can be defined according to the specifics of the application employed. In a preferred embodiment, there are three possible values corresponding to low, medium, and high confidence. Here, in one embodiment, a storage trust metric indicating a low confidence level is assigned to storage of a secret in unencrypted form on a hard disk or FLASH memory of a PC or other computing device; a storage trust metric indicating a medium confidence level is assigned to storage of a secret in encrypted form on a hard drive or FLASH memory of a PC or other computing device and protected with a PIN or password; and a storage trust metric indicating a high confidence level is assigned to storage of a secret in secure storage, such as that of a smart card, and protected with a PIN or password.
  • [0050]
    Transmission Trust Metric 180 indicates a degree of confidence in the security of the method of transmission, if any, of a secret. Transmission Trust Metric 180 can be defined according to the specifics of the application employed, but in a preferred embodiment, there are three possible values corresponding to low, medium, and high confidence. In one embodiment, a transmission trust metric indicating a low confidence level is assigned to transmission of a secret in unencrypted form via the internet or a local computer network; a transmission trust metric indicating a medium confidence level is assigned to transmission of a secret in encrypted form using SSL or TLS (as known in the art and described further in Dierks, T., and Allen, C., “The TLS Protocol Version 1.0,” RFC 2246, January 1999, hereby incorporated by reference) or another common standard of network encryption; and a transmission trust metric indicating a high confidence level applies to transmission of a secret via armored car using a certified carrier such as, for example, Brink's®, Inc.
  • [0051]
    Authentication Trust Metric 190 indicates a degree of confidence in the security of the method of authentication for a particular transaction. Authentication Trust Metric 190 can be defined according to the specifics of the application employed, but in a preferred embodiment, there are three possible values corresponding to low, medium, and high confidence. Accordingly, in one embodiment, an authentication trust metric indicating a low confidence level is assigned to authentication using a PIN or password (“something you know”); an authentication trust metric indicating a medium confidence level is assigned to authentication using a physical token such as a PKCS-11 standard device or smart card (“something you have”); and an authentication trust metric indicating a high confidence level is assigned to authentication requiring use of a biometric such as fingerprint, voiceprint, or face recognition (“something you are”).
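The four per-aspect assignments above can be summarized as lookup tables. The key names are illustrative assumptions; the level assignments follow the example embodiments in the text:

```python
LOW, MEDIUM, HIGH = 0, 1, 2

ENROLLMENT_TRUST = {
    "self_enrollment": LOW,        # little or no manual identity check
    "weak_secret_online": MEDIUM,  # e.g. credit-card number verification
    "in_person_documents": HIGH,   # passport/driver's license checked by a person
}
STORAGE_TRUST = {
    "plaintext_disk_or_flash": LOW,
    "encrypted_disk_with_pin": MEDIUM,
    "smart_card_with_pin": HIGH,
}
TRANSMISSION_TRUST = {
    "cleartext_network": LOW,
    "ssl_or_tls": MEDIUM,
    "armored_car": HIGH,
}
AUTHENTICATION_TRUST = {
    "pin_or_password": LOW,    # something you know
    "physical_token": MEDIUM,  # something you have
    "biometric": HIGH,         # something you are
}
```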
  • [0052]
    In other embodiments, greater or fewer trust levels are provided. In still other embodiments a continuous range of trust metric values is provided. In some embodiments, more than one type of procedure, device, or method may be assigned an identical trust metric value. For example, in some embodiments both encrypted and unencrypted storage of a secret on a hard disk receive a trust metric indicating a low trust level, while secure storage of a secret—for example on a smart card protected with a PIN—receives a trust metric indicating a high trust level. Although in preferred embodiments, trust metrics provide an indication of security based on measurable risk factors, in other embodiments trust metric values are not constrained by theoretical security weaknesses. For example, a particular storage method or enrollment procedure may be assigned a stronger or weaker trust metric based on a preferred or encouraged method for performing those functions.
  • [0053]
    Seal 130 of transaction confidence token 100 is a string of bytes containing a digital signature of envelope 120, signed in a preferred embodiment with the private key of the authenticating party or system.
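A runnable sketch of producing seal 130: the envelope is serialized deterministically and signed. HMAC-SHA256 is used here purely as a symmetric stand-in for the private-key digital signature the text describes; a real implementation would use, e.g., an RSA or ECDSA signature with the authenticating party's private key.

```python
import hashlib
import hmac
import json

def seal_envelope(envelope: dict, signing_key: bytes) -> str:
    # Canonical serialization so signer and verifier hash identical bytes.
    payload = json.dumps(envelope, sort_keys=True).encode()
    # Stand-in for signing with the authenticating party's private key.
    return hmac.new(signing_key, payload, hashlib.sha256).hexdigest()
```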
  • [0054]
    In a preferred embodiment, envelope 120 and seal 130 are constructed according to PKCS #7 (for a detailed description of the standard, see, for example, RSA Laboratories, PKCS #7: Cryptographic Message Syntax Standard, Version 1.5, November 1993, hereby incorporated by reference), using that standard's signed-data content type such that envelope 120 is embodied in a content information (contentInfo) field and seal 130 is embodied in a signer information (signerInfos) field. Note that PKCS #7 also allows for the recursion of transaction information field 140 of envelope 120 in a transaction confidence token.
  • [0055]
    In another embodiment, envelope 120 and seal 130 are constructed according to the XML Signature Syntax and Processing Recommendation—for a detailed description of the standard see, for example Eastlake 3rd, D., Reagle, J., and Solo, D., “(Extensible Markup Language) XML-Signature Syntax and Processing,” RFC 3275, March 2002, incorporated herein by reference.
  • [0056]
    Other encodings or structures of a transaction confidence token are also possible.
  • [0057]
    The present invention further provides systems and methods for using a transaction confidence token. For example, when a requester (client) or server initiates a transaction requiring authentication, such as in step 200 in FIG. 2, server 210 requests the authentication and an associated transaction confidence token. In other embodiments, no specific request is made by server 210. During the course of the transaction, Requester 220 allows access to a secret, step 230, such as a private encryption key, using an authentication method, such as a biometric match. Numerous devices and methods exist for securing a secret, including those described in U.S. application Ser. No. ______, filed ______, entitled “Secure Network And Networked Devices Using Biometrics” (Attorney Docket No. A-70595/RMA/JML), incorporated herein by reference. Requester 220 generates, step 240, the contents of a requested transaction, such as the quantity and SKUs of item(s) to be purchased, in a form suitable to be encoded in the transaction confidence token's transaction contents, or transaction information, field, such as transaction information 140 in token 100 depicted in FIG. 1.
  • [0058]
    Requester 220 determines at least one trust metric, step 250, as described above and encodes the at least one trust metric in the transaction confidence token. Requester 220 signs the transaction confidence token in step 260. Server 210 receives the transaction confidence token associated with the transaction request in step 270. Server 210 then adjusts its confidence level in the transaction, step 280, based on whether the signature is valid, and takes action appropriate to the confidence level, completing the transaction in step 290.
  • [0059]
    The present invention further provides methods for a server to act on a transaction confidence token. FIG. 3 provides a schematic overview of an embodiment of such a method according to the present invention. A server receives a transaction confidence token, step 300, and verifies the signature of the transaction confidence token, step 310, using, for example, the public key of the originator of the transaction confidence token. If the signature verification fails, indicating that the transaction confidence token was not created by the purported originating party, was altered after its creation, or is otherwise invalid, then the system may discard the token, step 320, and assume no confidence in the authenticity of the associated transaction. In a preferred embodiment, a transaction confidence token with an invalid signature is discarded and the associated transaction request is discarded or rolled back according to appropriate exception handling practice for the application employing methods of the present invention.
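Steps 300 through 330 might look like the following sketch, again with HMAC-SHA256 standing in for public-key signature verification; a real server would verify the seal against the originator's public key:

```python
import hashlib
import hmac
import json

def verify_token(token: dict, key: bytes) -> bool:
    # Recompute the seal over the envelope and compare in constant time.
    payload = json.dumps(token["envelope"], sort_keys=True).encode()
    expected = hmac.new(key, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, token["seal"])

def receive_token(token: dict, key: bytes):
    if not verify_token(token, key):
        return None  # step 320: discard; assume no confidence
    return token["envelope"]  # step 330: go on to determine confidence
```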
  • [0060]
    If the signature verification succeeds, then the server determines its confidence in the transaction, step 330, calculated from one or more trust metric fields in the transaction confidence token.
  • [0061]
    If another server, or a plurality of servers, is to participate in the transaction, step 340, the original receiving server may construct a new transaction confidence token and embed the current transaction confidence token within it, optionally asserting its own degree of confidence in the transaction, step 350. The server then transmits, step 360, the new transaction confidence token (comprising the embedded first transaction confidence token) to the other participating server(s), step 370.
  • [0062]
    The Server may then do its own processing of the transaction request employing the confidence it has determined, step 370. For example, if a trust metric within the transaction confidence token indicates or exceeds a predetermined sufficient trust level, the server processes the transaction. However, in one embodiment, if a trust metric does not indicate a minimum sufficient trust level, the server rejects the transaction. If a trust metric indicates a minimum sufficient trust level but less than a sufficient trust level, the server may require a mitigating factor. For example, the server may require an additional authentication procedure, a fee, or a waiting period in an effort to mitigate risk associated with a predetermined range of trust metric values.
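The three-way policy just described (process, mitigate, or reject) can be sketched as follows; the two threshold values are assumptions for illustration:

```python
MINIMUM_TRUST = 1     # below this level: reject outright
SUFFICIENT_TRUST = 2  # at or above this level: process without further steps

def decide(trust_metric: int) -> str:
    if trust_metric >= SUFFICIENT_TRUST:
        return "process"
    if trust_metric >= MINIMUM_TRUST:
        # e.g. require additional authentication, a fee, or a waiting period
        return "mitigate"
    return "reject"
```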
  • [0063]
    Although embodiments of the present invention discussed above generally refer to ‘confidence levels’ or ‘trust levels’, with increasing trust metric values associated with increasing trust or confidence in a transaction, in other embodiments trust metrics are assigned and evaluated with respect to risk. That is, risk is generally the opposite of trust, and trust metrics may be assigned such that increasing trust metric values correspond to an increasing risk associated with a transaction. In these embodiments, less secure situations would receive a higher trust metric value. For example, in some embodiments of the present invention, a confidence level between 0.0 and 1.0 is calculated. A corresponding risk level in this embodiment is given generally by 1 − (confidence level).
  • [0064]
    That is, in another embodiment, if a trust metric within the transaction confidence token indicates or exceeds a predetermined maximum risk level, the risk is determined to be too great, and the server rejects the transaction. However, if a trust metric indicates less than the maximum risk level but greater than an acceptable risk level, the server may require a mitigating factor before processing the transaction. For example, the server may require an additional authentication procedure, a fee, or a waiting period in an effort to mitigate risk associated with a predetermined range of trust metric values. If a trust metric indicates less than the acceptable risk level, the server will process the transaction.
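A sketch of the same policy expressed in terms of risk rather than trust, using the complement relation given above; both thresholds are illustrative assumptions:

```python
def decide_by_risk(confidence: float, acceptable_risk: float = 0.2,
                   maximum_risk: float = 0.5) -> str:
    risk = 1.0 - confidence  # risk is the complement of confidence
    if risk >= maximum_risk:
        return "reject"      # risk deemed too great
    if risk > acceptable_risk:
        return "mitigate"    # require a mitigating factor first
    return "process"
```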
  • [0065]
    Having described several methods and procedures, it will be appreciated that the invention may advantageously implement the methods and procedures described herein on a general purpose or special purpose computing device, such as a device having a processor for executing computer program code instructions and a memory coupled to the processor for storing data and/or commands. It will be appreciated that the computing device may be a single computer or a plurality of networked computers and that the several procedures associated with implementing the methods and procedures described herein may be implemented on one or a plurality of computing devices. In some embodiments the inventive procedures and methods are implemented on standard server-client network infrastructures with the inventive features added on top of such infrastructure or compatible therewith.
  • [0066]
    The foregoing descriptions of specific embodiments and best mode of the present invention have been presented for purposes of illustration and description. They are not intended to be exhaustive or to limit the invention to the precise forms disclosed, and obviously many modifications and variations are possible in light of the above teaching. The embodiments were chosen and described in order to best explain the principles of the invention and its practical application, to thereby enable others skilled in the art to best utilize the invention and various embodiments with various modifications as are suited to the particular use contemplated. It is intended that the scope of the invention be defined by the claims appended hereto and their equivalents.
Classifications
U.S. Classification: 713/185
International Classification: G06Q10/02, G06Q30/06, G06Q20/04, G06Q20/40, H04N1/387, G06K9/00
Cooperative Classification: H04L9/3231, H04L2209/08, H04L2209/805, G06Q20/04, G06K9/00026, G06Q30/06, G06Q20/4016, G06Q10/02
European Classification: G06Q30/06, G06Q10/02, G06Q20/04, G06Q20/4016, G06K9/00A1C, H04L9/32T
Legal Events
Date | Code | Event | Description
Feb 3, 2003 | AS | Assignment | Owner name: I-CONTROL SECURITY, INC., CALIFORNIA. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:RUSSO, ANTHONY P.;MCCOY, PETER A.;HOWELL, MARK J.;REEL/FRAME:013713/0333;SIGNING DATES FROM 20021205 TO 20021219
Apr 28, 2004 | AS | Assignment | Owner name: I-CONTROL SECURITY, INC., CALIFORNIA. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:I-CONTROL TRANSACTIONS, INC.;REEL/FRAME:015264/0686. Effective date: 20021112
Nov 19, 2004 | AS | Assignment | Owner name: ATRUA TECHNOLOGIES, INC., CALIFORNIA. Free format text: CHANGE OF NAME;ASSIGNOR:I-CONTROL SECURITY, INC.;REEL/FRAME:015393/0534. Effective date: 20030908