US20070100622A1 - Adaptation method for inter-person biometrics variability
- Publication number: US20070100622A1
- Authority: US (United States)
- Prior art keywords: correction factor, score, identity, standard deviation, feature
- Legal status: Granted (the legal status is an assumption and is not a legal conclusion)
Classifications
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10L—SPEECH ANALYSIS OR SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING; SPEECH OR AUDIO CODING OR DECODING
- G10L17/00—Speaker identification or verification
- G10L17/06—Decision making techniques; Pattern matching strategies
Definitions
- Embodiments described herein relate generally to speech recognition and more particularly relate to speaker verification.
- Biometrics is the science and technology of measuring and statistically analyzing biological data.
- a biometric is a measurable, physical characteristic or personal behavioral trait used to recognize the identity, or verify the claimed identity, of an enrollee.
- biometrics statistically measure certain human anatomical and physiological traits that are unique to an individual. Examples of biometrics include fingerprints, retinal scans, hand recognition, signature recognition, and speaker recognition.
- Verification is a process of verifying the user is who they claim to be.
- a goal of verification is to determine if the user is the authentic enrolled user or an impostor.
- verification generally includes five stages: capturing input; filtering unwanted input such as noise; transforming the input to extract a set of feature vectors; generating a statistical representation of the feature vectors; and performing a comparison against information previously gathered during an enrollment procedure.
- Speaker verification systems (also known as voice verification systems) attempt to match the voice of a speaker whose identity is undergoing verification with a known voice. Speaker verification systems help to provide a means for ensuring secure access by using speech utterances.
- When seeking access through a speaker recognition and/or speaker verification system, a claimant provides a verbal submission of a word or phrase, or simply a sample of the claimant speaking a randomly selected word or phrase.
- An authentic claimant is one whose utterance matches known characteristics associated with the claimed identity.
- a claimant typically provides a speech sample or speech utterance that is scored against a model corresponding to the claimant's claimed identity and a claimant score is then computed to confirm that the claimant is in fact the claimed identity.
- a feature may be extracted from a biometric sample captured from a claimant claiming an identity.
- the extracted feature may be compared to a template associated with the identity to determine the similarity between the extracted feature and the template with the similarity between them being represented by a score.
- a determination may be made as to whether the identity has a correction factor associated therewith. If the identity is determined to have a correction factor associated therewith, then the score may be modified using the correction factor. The score may then be compared to a threshold to determine whether to accept the claimant as the identity.
- the biometric sample may comprise a speech sample spoken by the claimant.
- the correction factor may be derived from a biometric sample captured from the claimed identity. The correction factor may also be retrieved from a correction factor data store. To modify the score, the correction factor may be added to the score. In one embodiment, the correction factor may have a negative value. In another embodiment, the claimant may be rejected if the score of the claimant is determined to exceed the threshold.
- a feature may be extracted from a biometric sample captured from the subject requesting enrollment and a standard deviation for the feature may then be calculated.
- a determination may then be made as to whether the standard deviation of the feature is greater than the standard deviation of a centroid of a density function. If the standard deviation of the feature is greater than the standard deviation of the centroid, then a correction factor for the subject may be derived based on a trend line of the density function.
- the correction factor may be stored in a correction factor data store.
- the density function may comprise a match score vs. standard deviation density function of a sample population.
- the correction factor is further derived from the difference between a mean score obtained from the centroid and a score for the subject derived from the trend line.
- a determination may also be made as to whether the standard deviation of the feature exceeds a threshold value. If the standard deviation of the feature is determined to exceed the threshold value, then the threshold value may be used in place of the standard deviation of the feature to derive the correction factor for the subject.
- FIG. 1 is a density function graph for an exemplary population where match scores are plotted against standard deviations of each sample in the exemplary population in accordance with an illustrative embodiment
- FIG. 2 is a density function graph of an exemplary population with various trend lines applied thereto in accordance with an illustrative embodiment
- FIG. 3 is a flowchart of a process for training a biometric system in accordance with an exemplary embodiment
- FIG. 4 is a flowchart of a process for deriving a correction factor in accordance with an exemplary embodiment
- FIG. 5 is a flowchart of a verification process in which a correction factor may be applied in accordance with an exemplary embodiment
- FIG. 6 is a density function graph illustrating an effect on decision making of the application of a correction factor to a match score in a biometric verification system in accordance with an exemplary embodiment
- FIG. 7 is a schematic diagram of an illustrative hardware environment in accordance with an exemplary embodiment.
- For most biometrics, a small percentage of people exhibit biometric features that are unreliable for use in biometric verification or identification. For example, older people often tend to have very light fingerprints that are unsuitable for a fingerprint-based biometric system. As another example, a person may have a medical condition that makes their voice unstable and therefore unsuitable for a speech-based biometric system. Embodiments described herein may be implemented for helping to adapt a biometric system to such users without compromising the overall accuracy of the biometric system's decision algorithm.
- biometric feature vectors derived from a biometrics enrollment process can be used to determine the variance in an enrollee's (i.e., a user) biometrics.
- a relationship between a user's biometric features (i.e., feature vectors) and the reliability of a given biometric system can be used to identify users/enrollees whose features are unreliable for the given biometric system.
- the biometric match-score of an enrollee with unreliable features may be assigned a small correction to help allow subsequent correct verification of the user in question. These corrections can be computed during enrollment.
- Such a mechanism helps to afford the reliable use of a biometric system by users that would otherwise be unreliable for the system. Since embodiments of the adaptation process may be implemented so that they affect only the verification match scores of enrollees with unreliable biometric features, the overall behavior of a biometric system may remain unaffected for those enrollees having biometric features that are more reliable. Further, these adjustments to a biometric system for unreliable users can be minimal.
- the embodiments described herein may involve one or more of the following phases or processes: (1) offline training; (2) computation of a correction factor; and (3) application of the correction factor.
- the various aspects and features of these phases will now be described in the following exemplary embodiments. While the exemplary embodiments described herein are described in the context of a voice or speech based biometric system, one of ordinary skill in the art should be able to implement embodiments using other biometrics (e.g., fingerprints, iris, etc.).
- FIG. 1 is a density function graph 100 for an exemplary population, where match scores between feature vectors and reference templates are plotted against the standard deviation of each sample in the population.
- the exemplary density function graph 100 shown in FIG. 1 is based on speech data in a voice biometrics application implementation.
- the x-axis 102 of the graph 100 represents standard deviation values and the y-axis 104 represents match score values.
- the smaller the match score, the better the match between the feature vector and the reference template.
- Each point (e.g., point 106 ) in the density function graph 100 of FIG. 1 represents the plotting of a match score to standard deviation of a feature vector of a given biometric sample.
- the match score of a given sample tends to increase as the standard deviation (SD) of the sample increases.
- FIG. 2 is a density function graph 200 for an exemplary population where match scores between feature vectors and reference templates of an exemplary population are plotted against standard deviations of each sample in the population.
- the standard deviation values are represented along the x-axis 202 and the match score values are represented along the y-axis 204 .
- trend lines 206 , 208 , 210 have been plotted for the density function.
- These trend lines include trend lines for linear, quadratic and cubic equations (i.e., a linear trend line 206 , a quadratic trend line 208 and a cubic trend line 210 ) to provide data approximations of the density function.
- a simple line equation (such as e.g., trend line 206 ) can be used to represent the trend followed in a match score versus standard deviation density function graph.
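The patent does not specify a fitting procedure; as an illustrative sketch (assuming ordinary least-squares polynomial fitting on synthetic data, with NumPy's `polyfit`), trend lines of the three degrees mentioned above could be fitted to (standard deviation, match score) points like so:

```python
import numpy as np

# Synthetic (standard deviation, match score) pairs: match scores trend
# upward with standard deviation, as in the density function graphs of
# FIGS. 1-2. The underlying line y = 2x + 1 is an arbitrary assumption.
rng = np.random.default_rng(0)
sd = rng.uniform(0.5, 3.0, 200)
score = 2.0 * sd + 1.0 + rng.normal(0.0, 0.3, 200)

# Fit linear, quadratic and cubic trend lines (cf. trend lines 206-210).
trends = {deg: np.polyfit(sd, score, deg) for deg in (1, 2, 3)}

# A fitted trend line maps a standard deviation to an "expected" match
# score, which is what the correction-factor derivation later relies on.
linear = np.poly1d(trends[1])
expected_at_2 = linear(2.0)
```

For this data a simple line equation already captures the trend, which is consistent with the observation above that a linear representation can suffice.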
- FIG. 3 is a flowchart of a training process 300 for a biometric system in accordance with one embodiment.
- This training process 300 may be performed offline.
- biometric data for a test set of valid users may be obtained.
- the biometric data for each user in the test set may then be subjected to the following iterative set of operations (operations 306 - 314 ).
- a feature vector comprising one or more feature coefficients may be extracted from the biometric sample of a user.
- the standard deviation of each feature coefficient may then be calculated from which a mean of standard deviation for the feature vector may be calculated.
- the user may then be enrolled in the biometric system in operation 310 .
- In operation 314, a verification match score (also referred to as a verification score) for the user may be calculated from a verification sample of the user (and the return path to 304 may then be taken).
- In operation 316, a scatter graph (i.e., a density function graph) of the verification scores calculated in operation 314 may be generated using the standard deviations calculated in operation 312.
- A centroid for the distribution area of the plotted points may be calculated in operation 318 and stored in a centroid data store 320 .
- the centroid may be represented in terms of its x- and y-axis values (i.e., a centroid standard deviation value and a centroid match score value).
- In operation 322, a trend line of the density function generated in operation 316 may then be derived.
- the trend line generated in operation 322 may then be stored in a line representation data store 324 (e.g., a database).
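The training phase above can be sketched as follows. This is a minimal illustration, not the patent's implementation: a linear trend line is assumed, the data stores are modeled as a plain dictionary, and the function name is hypothetical.

```python
import numpy as np

def train_density_model(sds, scores):
    """Offline training sketch (cf. FIG. 3): from the per-user standard
    deviations and verification match scores of a test set, compute the
    centroid of the density function and a linear trend-line
    representation of it."""
    sds = np.asarray(sds, dtype=float)
    scores = np.asarray(scores, dtype=float)
    # Centroid of the plotted points, stored as (cx, cy): a centroid
    # standard deviation value and a centroid match score value.
    centroid = (sds.mean(), scores.mean())
    # Linear trend line (slope, intercept) approximating the density function.
    slope, intercept = np.polyfit(sds, scores, 1)
    return {"centroid": centroid, "trend_line": (slope, intercept)}

model = train_density_model([1.0, 2.0, 3.0, 4.0], [2.1, 3.9, 6.0, 8.0])
```

The returned dictionary stands in for the centroid data store 320 and the line representation data store 324.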
- FIG. 4 is a flowchart of a process 400 for computing such a correction factor in accordance with an exemplary embodiment.
- a biometric sample (i.e., feature data) may be captured from an end user (i.e., a potential enrollee), and a feature vector is obtained from the biometric sample.
- the standard deviation of each coefficient of the feature vector may be calculated and used to derive a mean of standard deviation for the feature vector (also referred to as the standard deviation of the feature vector) in operation 406 .
- information about a centroid of a pre-calculated density function of match scores to standard deviations may be obtained from a centroid data store 410 (e.g., a database).
- the information about the centroid may include a centroid match score value (e.g., cy of centroid (cx, cy)) and a centroid standard deviation value (e.g., cx of centroid (cx, cy)).
- the standard deviation of the feature vector obtained in operation 406 may be compared to the centroid's standard deviation (i.e., cx).
- If the standard deviation of the feature vector is less than the standard deviation value of the centroid (i.e., to the left of the centroid in the density function graph), then no correction of the match score for the subject may be deemed needed in the biometric system and the subject may be enrolled into the biometric system in operation 412 (via the “Yes” path from decision 408 ).
- Otherwise, the “No” path from decision 408 may be followed in order to calculate a correction factor for the potential enrollee.
- In one embodiment, the standard deviation of the feature vector may first be compared to a predefined maximum value and, if larger, replaced by that maximum value. The maximum value may, for example, correspond to a standard deviation value on the right side of the score versus standard deviation density function graph at which approximately 5% to 20% (and preferably about 10%) of the total number of subjects (i.e., users) are located to the right of the value. Setting a maximum value for the standard deviation may be carried out in order to help avoid spurious corrections due to outliers with very large standard deviation values.
- the value of the standard deviation of the feature vector computed in operation 406 may then be used to derive a correction factor for the feature vector.
- the correction factor may be derived by applying the standard deviation value (or modified standard deviation value) to a line representation algorithm of the centroid of the pre-calculated density function of match scores to standard deviations.
- the line representation algorithm used in operation 416 may be obtained from a centroid line representation data store 418 that, in one embodiment, can comprise at least a portion of the centroid data store 410 .
- the value output from operation 416 may be used as the correction factor for the given feature vector and stored in a correction factor data store 420 .
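The correction-factor derivation of FIG. 4 can be sketched as follows. The function and parameter names are hypothetical, and a linear trend-line representation is assumed (the patent also contemplates quadratic and cubic forms).

```python
def derive_correction_factor(sd, centroid, trend_line, max_sd):
    """Sketch of the correction-factor derivation (cf. FIG. 4).

    sd         -- standard deviation of the enrollee's feature vector
    centroid   -- (cx, cy) of the pre-calculated density function
    trend_line -- (slope, intercept) of its linear representation
    max_sd     -- maximum standard deviation value, used to avoid
                  spurious corrections due to outliers
    """
    cx, cy = centroid
    if sd < cx:
        return None                    # left of the centroid: no correction
    sd = min(sd, max_sd)               # cap very large standard deviations
    slope, intercept = trend_line
    expected_score = slope * sd + intercept  # "expected" match score
    # Centroid mean score minus the expected score; this is negative
    # whenever the trend line predicts a score above the centroid score,
    # so adding the factor at verification time lowers the match score.
    return cy - expected_score

factor = derive_correction_factor(3.0, (2.0, 4.0), (2.0, 0.0), 4.0)
```

With the illustrative numbers above, the trend line predicts a score of 6.0 at the enrollee's standard deviation of 3.0, while the centroid score is 4.0, giving a negative correction factor.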
- the match score of a user may be corrected by applying the correction factor, depending on the reliability of the particular user's biometric.
- the user's corrected score can serve as a match score adapted to the characteristics of the user's own voice or other biometric.
- the correction value itself may be determined during the enrollment process.
- the application of a correction factor to a user's match score may be accomplished according to a process such as that set forth in FIG. 5 .
- FIG. 5 is a flowchart of a biometric verification process 500 in which a correction factor may be applied in accordance with an exemplary embodiment.
- feature vectors are obtained from biometric data input by a user claiming an identity (i.e., a “claimant”).
- the match score (modified by a correction factor where one exists) may then be passed to a biometrics decision module in operation 510 for deciding whether to accept or reject the claimant as the claimed identity.
- In the density function graph 600 of FIG. 6, each point (e.g., point 602 ) represents a claimant in an exemplary biometric verification system, with each point indicating the match score and standard deviation of the given claimant.
- a horizontal boundary line 604 extending across the density function graph 600 represents an illustrative threshold match score of the exemplary biometric verification system.
- the value of the threshold match score (and thus the location of the boundary line 604 on the density function graph 600 ) is dependent on the specific biometric verification system and may be set to meet the needs of the particular implementation.
- the boundary line 604 divides the density function graph into upper and lower areas 606 , 608 .
- claimants whose points are located in the upper area 606 are designated as imposters (i.e., an imposter zone) while claimants having points located in the lower area 608 are designated as valid subjects (i.e., a valid user zone).
- Without a correction, claimant “1” would always be rejected (i.e., treated as an imposter) by the exemplary biometric system.
- For claimant “2,” small variations in the voice sample could cause the match score to become larger and cross over the boundary line into the upper area; as a result, the exemplary biometric system would frequently reject claimant “2” as an imposter.
- By implementing the exemplary biometric verification system so that it can apply a correction factor to the match scores of claimants near the boundary line, such as claimants “1” and “2,” these claimants can be accepted as valid subjects by the biometric verification system.
- the application of a correction factor to the match scores of claimants “1” and “2” is represented by the downwards arrows. As represented by these arrows, the application of a correction factor to the match scores of claimants “1” and “2” effectively shifts the match scores downwards below the boundary line so that both claimants “1” and “2” would be more likely to be accepted as valid subjects by the biometric verification system.
- FIG. 7 illustrates an exemplary hardware configuration of a computer 700 having a central processing unit 702 , such as a microprocessor, and a number of other units interconnected via a system bus 704 .
- the computer 700 shown in FIG. 7 includes a Random Access Memory (RAM) 706 ; Read Only Memory (ROM) 708 ; an I/O adapter 710 for connecting peripheral devices such as, for example, disk storage units 712 and printers 714 to the bus 704 ; a user interface adapter 716 for connecting various user interface devices such as, for example, a keyboard 718 , a mouse 720 , a speaker 722 , a microphone 724 , and/or other user interface devices such as a touch screen or a digital camera to the bus 704 ; a communication adapter 726 for connecting the computer 700 to a communication network 728 (e.g., a data processing network); and a display adapter 730 for connecting the bus 704 to a display device 732 .
- the computer may utilize an operating system such as, for example, a Microsoft Windows operating system (O/S), a Macintosh O/S, a Linux O/S and/or a UNIX O/S.
- Embodiments of the present invention may also be implemented using computer program languages such as, for example, ActiveX, Java, C, and the C++ language and utilize object oriented programming methodology. Any such resulting program, having computer-readable code, may be embodied or provided within one or more computer-readable media, thereby making a computer program product (i.e., an article of manufacture).
- the computer readable media may be, for instance, a fixed (hard) drive, diskette, optical disk, magnetic tape, semiconductor memory such as read-only memory (ROM), etc., or any transmitting/receiving medium such as the Internet or other communication network or link.
- the article of manufacture containing the computer code may be made and/or used by executing the code directly from one medium, by copying the code from one medium to another medium, or by transmitting the code over a network.
- a biometric sample may be captured from a subject requesting enrollment into a biometric system.
- at least one feature vector may be extracted with the feature vector comprising one or more coefficients.
- a standard deviation may be calculated for each of the coefficients of the feature vector.
- a mean standard deviation may be calculated from the coefficient standard deviations to represent the standard deviation for the feature vector.
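The standard-deviation computation in the preceding steps can be illustrated as follows. The feature data is hypothetical: it assumes the biometric sample yields one feature vector per frame, so each coefficient has a distribution across the sample.

```python
import numpy as np

# Hypothetical feature data: each row is a feature vector extracted from
# one frame of the biometric sample; columns are the feature coefficients.
frames = np.array([
    [1.0, 10.0],
    [2.0, 12.0],
    [3.0, 14.0],
])

# Standard deviation of each coefficient across the sample...
coeff_sd = frames.std(axis=0)
# ...and their mean, taken as the standard deviation of the feature vector.
feature_sd = coeff_sd.mean()
```

It is this single `feature_sd` value that gets compared against the centroid's standard deviation in the next step.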
- a determination may be made as to whether the value of the calculated standard deviation of the feature vector is greater than the value of a standard deviation represented by a centroid of a match score vs. standard deviation density function of, for example, a sample population. If the standard deviation of the feature vector is determined to be greater than the standard deviation of the centroid of the density function, then a correction factor may be derived for the subject based on a linear representation of a trend line of the density function and the standard deviation of the feature vector.
- the correction factor may be derived from the difference between the value of a match score represented by the centroid of the match score vs. standard deviation density function and a match score for the subject derived from the trend line of the density function.
- the derivation of the correction factor may be performed by calculating an “expected” match score for the feature vector of the subject from the trend line of the density function and the standard deviation of the feature vector of the subject.
- the value of the “expected” match score may be subtracted from the value of a match score represented by the centroid of the match score vs. standard deviation density function with the output difference comprising the correction factor.
- the derivation of the correction factor may also include making a determination as to whether the standard deviation of the feature vector exceeds a maximum threshold value for standard deviation values (e.g., a predefined maximum value), and if it does, then replacing the standard deviation of the feature vector with the maximum standard deviation threshold value.
- the maximum standard deviation threshold value may then be used instead of the originally calculated standard deviation of the feature vector with the trend line to derive the correction factor for the subject.
- the derived correction factor may be stored in a correction factor data store in a memory and/or memory device.
- a biometric capturing component may be used to capture a biometric sample from a claimant who claims a particular identity.
- the biometric sample may comprise a speech sample (i.e., a vocal sample) made by the claimant.
- the captured biometric sample may then be passed to an extraction component that can extract at least one feature vector from the captured biometric sample.
- the extracted feature may then be compared to a pre-generated reference template associated with the claimed identity (i.e., the reference template of an enrolled subject) to determine the degree (i.e., amount) of similarity between the extracted feature and the reference template.
- the degree of similarity may be represented by a match score that is output as a result of the comparison.
- this comparison may be carried out by a comparison component coupled to the extraction component.
- a determination may be made as to whether the claimed identity has a correction factor associated therewith. This determination may be accomplished by searching a correction factor data store in which correction factors of enrolled subjects are stored.
- the correction factor data store may reside in a memory and/or a memory device. If a correction factor for the claimed identity is found during this search, then it may be retrieved from the correction factor data store and used to modify the generated match score and thereby derive a modified match score.
- the modification of the match score may be performed by adding the correction factor to the match score.
- the correction factor may have a negative value so that the match score is lowered in value by the addition of the correction factor.
- a decision component may then compare either the modified match score (if the claimed identity is determined to have a correction factor) or the unmodified match score (if the claimed identity is determined not to have a correction factor) to a decision threshold value to determine whether to accept the claimant as the claimed identity. If the value of the match score/modified match score of the claimant exceeds the decision threshold value, then the claimant may be rejected (i.e., classified as an imposter) by the biometric verification system.
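The verification-time decision above could be sketched as follows. The names are hypothetical and the correction factor data store is modeled as a plain dictionary; this is an illustration of the decision logic, not the patent's implementation.

```python
def verify_claimant(identity, match_score, correction_store, threshold):
    """Decision sketch (cf. FIG. 5): lower match scores indicate better
    matches, and a claimant whose (possibly corrected) score exceeds the
    decision threshold is rejected as an imposter."""
    # Apply the stored correction factor, if the claimed identity has one.
    correction = correction_store.get(identity)
    if correction is not None:
        match_score = match_score + correction  # typically negative
    return match_score <= threshold             # True: accept, False: reject

store = {"alice": -1.5}   # hypothetical correction factor data store
alice_ok = verify_claimant("alice", 5.0, store, 4.0)  # corrected to 3.5
bob_ok = verify_claimant("bob", 5.0, store, 4.0)      # no correction
```

The example shows the intended effect: an enrollee with a stored (negative) correction factor is accepted at a raw score that would reject a claimant without one.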
- the computer readable media may be, for instance, a fixed drive (e.g., a hard drive), diskette, optical disk, magnetic tape, semiconductor memory such as for example, read-only memory (ROM), flash-type memory, etc., and/or any transmitting/receiving medium such as the Internet and/or other communication network or link.
- An article of manufacture containing the computer code may be made and/or used by executing the code directly from one medium, by copying the code from one medium to another medium, and/or by transmitting the code over a network.
- one of ordinary skill in the art of computer science may be able to combine the software created as described with appropriate general purpose or special purpose computer hardware to create a computer system or computer sub-system embodying embodiments or portions thereof described herein.
Abstract
Description
- Embodiments described herein relate generally to speech recognition and more particularly relate to speaker verification.
- Biometrics is the science and technology of measuring and statistically analyzing biological data. A biometric is a measurable, physical characteristic or personal behavioral trait used to recognize the identity, or verify the claimed identity, of an enrollee. In general, biometrics statistically measure certain human anatomical and physiological traits that are unique to an individual. Examples of biometrics include fingerprints, retinal scans, hand recognition, signature recognition, and speaker recognition.
- Verification (also known as authentication) is a process of verifying the user is who they claim to be. A goal of verification is to determine if the user is the authentic enrolled user or an impostor. Generally, verification includes four stages: capturing input; filtering unwanted input such as noise; transforming the input to extract a set of feature vectors; generating a statistical representation of the feature vector; and performing a comparison against information previously gathered during an enrollment procedure.
- Speaker verification systems (also known as voice verification systems) attempt to match a voice of a speaker whose identity is undergoing verification with a known voice. Speaker verification systems help to provide a means for ensuring secure access by using speech utterances. Verbal submission of a word or phrase or simply a sample of an individual speaker's speaking of a randomly selected word or phrase are provided by a claimant when seeking access to pass through a speaker recognition and/or speaker verification system. An authentic claimant is one whose utterance matches known characteristics associated with the claimed identity.
- To train a speaker verification system, a claimant typically provides a speech sample or speech utterance that is scored against a model corresponding to the claimant's claimed identity and a claimant score is then computed to confirm that the claimant is in fact the claimed identity.
- There exist groups of users that have unstable or unreliable biometric data that cause biometric systems to falsely reject them. These users with unstable or unreliable biometric data may be referred to as “goats.” Implementation of biometric systems capable of providing increased acceptance rates for such users could be advantageous.
- Embodiments of a system and method for verifying an identity of a claimant are described. In accordance with one embodiment, a feature may be extracted from a biometric sample captured from a claimant claiming an identity. The extracted feature may be compared to a template associated with the identity to determine the similarity between the extracted feature and the template with the similarity between them being represented by a score. A determination may be made to determine whether the identity has a correction factor associated therewith. If the identity is determined to have a correction factor associated therewith, then the score may be modified using the correction factor. The score may then be compared to a threshold to determine whether to accept the claimant as the identity.
- In another embodiment, the biometric sample may comprise a speech sample spoken by the claimant. In one embodiment, the correction factor may be derived from a biometric sample captured from the claimed identity. The correction factor may also be retrieved from a correction factor data store. To modify the score, the correction factor may be added to the score. In one embodiment, the correction factor may have a negative value. In another embodiment, the claimant may be rejected if the score of the claimant is determined to exceed the threshold.
- In accordance with a further embodiment, during enrollment of a subject in a biometric verification system, a feature may be extracted from a biometric sample captured from the subject requesting enrollment and a standard deviation for the feature may then be calculated. A determination may then be performed to determining whether the standard deviation of the feature is greater than a standard deviation of a centroid of a density function. If the standard deviation of the feature is greater than the standard deviation of the centroid, then a correction factor for the subject may be derived based on a trend line of the density function.
- In one embodiment, the correction factor may be stored in correction data store. In another embodiment, the density function may comprise a match score vs. standard deviation density function of a sample population. In a further embodiment, the correction factor is further derived from the difference between a mean score obtained from the centroid and a score for the subject derived from the trend line. As a further option, a determination may be performed to determining whether the standard deviation of the feature exceeds a threshold value. If the standard deviation of the feature is determined to exceed the threshold value, then the threshold value may be used in place of the standard deviation of the feature to derive the correction factor for the subject.
-
FIG. 1 is a density function graph for an exemplary population where match scores are plotted against standard deviations of each sample in the exemplary population in accordance with an illustrative embodiment; -
FIG. 2 is a density function graph of an exemplary population with various trend lines applied thereto in accordance with an illustrative embodiment; -
FIG. 3 is flowchart of a process for training a biometric system in accordance with an exemplary embodiment; -
FIG. 4 is a flowchart of a process for deriving a correction factor in accordance with an exemplary embodiment; -
FIG. 5 is a flowchart of a verification process in which a correction factor may be applied in accordance with an exemplary embodiment; -
FIG. 6 is a density function graph illustrating an effect on decision making of the application of a correction factor to a match score in a biometric verification system in accordance with an exemplary embodiment; and -
FIG. 7 is a schematic diagram of an illustrative hardware environment in accordance with an exemplary embodiment. - For most biometrics, a small percentage of people exhibit biometric features that are unreliable for use in biometric verification or identification. For example, older people often tend to have very light fingerprints that are unsuitable for a fingerprint based biometric system. As another example, a person may have a medical condition that makes their voice unstable and therefore unsuitable for a speech based biometric system. Embodiments described herein may be implemented to help adapt a biometric system to such users without compromising the overall accuracy of the biometric system's decision algorithm.
- More particularly, biometric feature vectors derived from a biometrics enrollment process can be used to determine the variance in an enrollee's (i.e., a user's) biometrics. During enrollment, a relationship between a user's biometric features (i.e., feature vectors) and the reliability of a given biometric system can be used to identify such users/enrollees that have features that are unreliable for the given biometric system. During verification, the biometric match-score of an enrollee with unreliable features may be assigned a small correction to help allow subsequent correct verification of the user in question. These corrections can be computed during enrollment.
- Such a mechanism helps to afford the reliable use of a biometrics system by users that would otherwise be unreliable for the biometric system. Since embodiments of the adaptation process may be implemented so that they only affect the verification match scores of enrollees with unreliable biometric features, the overall behavior of a biometric system may remain unaffected for those enrollees having biometric features that are more reliable. Further, these adjustments to a biometric system for unreliable users can be minimal.
- In general, the embodiments described herein may involve one or more of the following phases or processes: (1) offline training; (2) computation of a correction factor; and (3) application of the correction factor. The various aspects and features of these phases will now be described in the following exemplary embodiments. While the exemplary embodiments described herein are described in the context of a voice or speech based biometric system, one of ordinary skill in the art should be able to implement embodiments using other biometrics (e.g., fingerprints, iris, etc.).
- In offline training, a relationship between a person's biometric features and the reliability of a biometrics identification system may be developed.
FIG. 1 is a density function graph 100 for an exemplary population where match scores between feature vectors and reference templates of an exemplary population are plotted against standard deviations of each sample in the population. The exemplary density function graph 100 shown in FIG. 1 is based on speech data in a voice biometrics application implementation. The x-axis 102 of the graph 100 represents standard deviation values and the y-axis 104 represents match score values. In this example, the smaller the match score, the better the match between the feature vector and the reference template. Each point (e.g., point 106) in the density function graph 100 of FIG. 1 represents the plotting of a match score to standard deviation of a feature vector of a given biometric sample. - Based on the distribution of points in the graph of
FIG. 1, the following observations can be made: (1) the data is clustered towards a center that represents a centroid of the distribution area; and (2) the match score of a given sample tends to increase as the standard deviation (SD) of the sample increases. As a result, it can be inferred from the graph 100 shown in FIG. 1 that users with smaller standard deviations may have smaller match scores (i.e., better matches between their feature vectors and reference template). -
FIG. 2 is a density function graph 200 for an exemplary population where match scores between feature vectors and reference templates of an exemplary population are plotted against standard deviations of each sample in the population. As in the graph 100 shown in FIG. 1, in this graph 200, the standard deviation values are represented along the x-axis 202 and the match score values are represented along the y-axis 204. - In
FIG. 2, a variety of trend lines have been applied (e.g., a linear trend line 206, a quadratic trend line 208 and a cubic trend line 210) to provide data approximations of the density function. As can be seen in this exemplary implementation, all three approximations are fairly straight. Therefore, in one embodiment, a simple line equation (such as, e.g., trend line 206) can be used to represent the trend followed in a match score versus standard deviation density function graph. -
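One way to reproduce this fitting step is with least-squares polynomial fits of increasing degree. The sketch below assumes NumPy is available; the population values are invented for illustration and merely stand in for the kind of score-vs-standard-deviation data shown in FIG. 2.

```python
import numpy as np

# Hypothetical population data: per-user feature standard deviations (x)
# and verification match scores (y); lower scores indicate better matches.
sdevs = np.array([0.8, 1.0, 1.2, 1.5, 1.9, 2.3, 2.8])
scores = np.array([1.9, 2.0, 2.2, 2.4, 2.6, 2.9, 3.2])

# Fit linear, quadratic, and cubic trend lines to the score-vs-sdev data.
linear = np.polyfit(sdevs, scores, 1)      # coefficients [m, c]
quadratic = np.polyfit(sdevs, scores, 2)
cubic = np.polyfit(sdevs, scores, 3)

# When all three approximations come out nearly straight, the simple
# line equation score = m*sdev + c suffices to represent the trend.
m, c = linear
print(f"score = {m:.2f}*sdev + {c:.2f}")
```

If the higher-degree coefficients are negligible, only the linear form needs to be persisted for later use.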
FIG. 3 is a flowchart of a training process 300 for a biometric system in accordance with one embodiment. This training process 300 may be performed offline. In operation 302, biometric data for a test set of valid users may be obtained. As shown by the "No" path of decision 304, the biometric data for each user in the test set may then be subjected to the following iterative set of operations (operations 306-314). - In
operation 306, a feature vector comprising one or more feature coefficients may be extracted from the biometric sample of a user. In operation 308, the standard deviation of each feature coefficient may then be calculated, from which a mean of standard deviation for the feature vector may be calculated. The user may then be enrolled in the biometric system in operation 310. In operation 312, a verification match score (also referred to as a verification score) for the user may be calculated from a verification sample of the user (and the return path to decision 304 may then be taken). - After the biometric data from the last user in the set has been subject to operations 306-314, the "Yes" path of
decision 304 may be followed to operation 316. In operation 316, a scatter graph (i.e., density function graph) of the standard deviations versus the verification match scores may be generated using the standard deviations calculated in operation 314 and the verification match scores calculated in operation 312. - From the density function graph, a centroid for the distribution area of the plotted points may be calculated in
operation 318 and stored in a centroid data store 320. The centroid may be represented in terms of its x- and y-axis values (i.e., a centroid standard deviation value and a centroid match score value). - In
operation 322, a trend line of the density function generated in operation 316 may then be derived. As shown in FIG. 3, the derived trend line may comprise a linear trend line that is a linear approximation of the data and have the form: score = m*sdev + c. The trend line generated in operation 322 may then be stored in a line representation data store 324 (e.g., a database). - During enrollment, the trend line relationship developed for a density function graph may be used to compute a correction factor that may be used in a biometric verification process.
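The tail of the training process (operations 316 through 322) might be sketched as follows, with Python dictionaries standing in for the centroid and line representation data stores; all numeric values are invented for illustration.

```python
from statistics import mean

# Standard deviations and verification match scores collected for the
# training population (illustrative values).
sdevs = [0.9, 1.1, 1.4, 1.8, 2.2, 2.6]
scores = [2.0, 2.1, 2.3, 2.6, 2.8, 3.1]

# Operation 318: centroid (cx, cy) of the distribution area, taken here
# as the mean standard deviation and the mean match score.
cx, cy = mean(sdevs), mean(scores)
centroid_data_store = {"cx": cx, "cy": cy}

# Operation 322: least-squares trend line score = m*sdev + c.
m = sum((x - cx) * (y - cy) for x, y in zip(sdevs, scores)) / \
    sum((x - cx) ** 2 for x in sdevs)
c = cy - m * cx
line_representation_data_store = {"m": m, "c": c}

print(centroid_data_store, line_representation_data_store)
```

Note that a least-squares line always passes through the centroid, which is convenient here: points on the trend line to the right of the centroid correspond to expected scores above the centroid score.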
FIG. 4 is a flowchart of a process 400 for computing such a correction factor in accordance with an exemplary embodiment. In operation 402, a biometric sample (i.e., feature data) of an end user (i.e., a potential enrollee) may be captured. In operation 404, a feature vector is obtained from the biometric sample. The standard deviation of each coefficient of the feature vector may be calculated and used to derive a mean of standard deviation for the feature vector (also referred to as the standard deviation of the feature vector) in operation 406. - With reference to
decision 408, information about a centroid of a pre-calculated density function of match scores to standard deviations may be obtained from a centroid data store 410 (e.g., a database). The information about the centroid may include a centroid match score value (e.g., cy of centroid (cx, cy)) and a centroid standard deviation value (e.g., cx of centroid (cx, cy)). In decision 408, the standard deviation of the feature vector obtained in operation 406 may be compared to the centroid's standard deviation (i.e., cx). If the standard deviation of the feature vector is less than the standard deviation value of the centroid (i.e., to the left of the centroid in the density function graph), then no correction of the match score for the subject may be deemed needed in the biometric system and the subject may be enrolled into the biometric system in operation 412 (via the "Yes" path from decision 408). - If, on the other hand, the standard deviation of the feature vector obtained in
operation 406 is greater than the standard deviation value of the centroid (i.e., to the right of the centroid in the density function graph), then the "No" path from decision 408 may be followed in order to calculate a correction factor for the potential enrollee. - Following the "No" path, a determination may be made in
operation 414 as to whether the value of the standard deviation of the feature vector calculated in operation 406 exceeds a maximum value. If the standard deviation is determined to exceed the maximum value, then the standard deviation value for the feature vector may be set (i.e., reduced or bound) to the maximum value in operation 414.
- In
operation 416, the value of the standard deviation of the feature vector computed in operation 406 (or modified value of the standard deviation per operation 414) may then be used to derive a correction factor for the feature vector. In one embodiment, the correction factor may be derived by applying the standard deviation value (or modified standard deviation value) to a line representation algorithm of the centroid of the pre-calculated density function of match scores to standard deviations. As previously described, in one embodiment, the line representation algorithm may be linear (e.g., a linear trend line), for example, and be represented as: score=m*sdev+c, where m is the slope of the line, “sdev” is the standard deviation and “c” is a constant defined at the intersection of the line to the match score axis of the density function graph. The line representation algorithm used inoperation 416 may be obtained from a centroid linerepresentation data store 418 that, in one embodiment, can comprise at least a portion of thecentroid data store 410. The value output fromoperation 416 may be used as the correction factor for the given feature vector and stored in a correctionfactor data store 420. In one exemplary embodiment, the value of the correction factor may be derived from the algorithm: corr=cy−score, where “corr” is the correction factor, “cy” is the centroid match score (i.e., the match score of centroid (cx, cy)) and “score” is the line representation. - Thus, in accordance with an exemplary embodiment, the following illustrative pseudocode sets forth an illustrative process for deriving a correction score:
score = 0.63 * sdev + 1.4; corr = yc - score; where: "score" is the line representation of the centroid of the density function; "sdev" is the standard deviation of the feature vector or the maximum standard deviation value if the standard deviation of the feature vector exceeds the maximum standard deviation value; "corr" is the correction factor; and "yc" is a centroid match score value (e.g., cy of centroid (cx, cy)). - During verification, the match score of a user may be adjusted with the applicable correction factor depending on the reliability of the particular user's biometric. This way, the user's corrected score can serve as a match score adapted to the characteristics of the user's own voice or other biometric. As previously described with reference to
FIG. 4, the correction value itself may be determined during the enrollment process. The application of a correction factor to a user's match score may be accomplished according to a process such as that set forth in FIG. 5. -
FIG. 5 is a flowchart of a biometric verification process 500 in which a correction factor may be applied in accordance with an exemplary embodiment. In operation 502, feature vectors are obtained from biometric data input by a user claiming an identity (i.e., a "claimant"). In operation 504, a match score (or "verification score") may be calculated using the obtained feature vectors and, for example, a reference template associated with the claimed identity. If the claimed identity has a correction factor associated with it, then in operation 506, the correction factor (also referred to as "Corr") may be obtained from a correction factor data store 508 and added to the match score to obtain an adjusted match score for the claimant (e.g., adjusted match score = match score + correction factor). The adjusted match score may then be used by a biometrics decision module in operation 510 for deciding whether to accept or reject the claimant as the claimed identity. - The effect on decision making in a biometric verification system as a result of the application of a correction factor to a match score can be explained with reference to the exemplary
density function graph 600 of FIG. 6. In the density function graph 600 shown in FIG. 6, each point (e.g., point 602) represents a claimant in an exemplary biometric verification system with each point indicating the match score and standard deviation of the given claimant. -
horizontal boundary line 604 extending across thedensity function graph 600 represents an illustrative threshold match score of the exemplary biometric verification system. The value of threshold match score (and thus the location of theboundary line 604 on the density function graph 600) is dependent of the specific biometric verification system and may be set to meet the needs of the particular implementation. Theboundary line 604 divides the density function graph into upper andlower areas upper area 606 are designated as imposters (i.e., an imposter zone) while claimants having points located in thelower area 608 are designated as valid subjects (i.e., a valid user zone). - As shown in
FIG. 6, two claimants represented by the circled points are located close to the boundary line. Normally, claimant "1" would always be rejected (i.e., classified as an imposter) by the exemplary biometric system. With respect to claimant "2," small variations in the voice sample of claimant "2" could cause the match score for claimant "2" to become larger and cross over the boundary line into the upper area and, as a result, the exemplary biometric system would frequently reject claimant "2" as an imposter. - By implementing the exemplary biometric verification system so that it can apply a correction factor to the match scores of claimants near the boundary line such as claimants "1" and "2," these claimants can be accepted as valid subjects by the biometric verification system. The application of a correction factor to the match scores of claimants "1" and "2" is represented by the downwards arrows. As represented by these arrows, the application of a correction factor effectively shifts the match scores downwards below the boundary line so that both claimants "1" and "2" would be more likely to be accepted as valid subjects by the biometric verification system.
-
FIG. 7 illustrates an exemplary hardware configuration of a computer 700 having a central processing unit 702, such as a microprocessor, and a number of other units interconnected via a system bus 704. The computer 700 shown in FIG. 7 includes a Random Access Memory (RAM) 706, Read Only Memory (ROM) 708, an I/O adapter 710 for connecting peripheral devices such as, for example, disk storage units 712 and printers 714 to the bus 704, a user interface adapter 716 for connecting various user interface devices such as, for example, a keyboard 718, a mouse 720, a speaker 722, a microphone 724, and/or other user interface devices such as a touch screen or a digital camera to the bus 704, a communication adapter 726 for connecting the computer 700 to a communication network 728 (e.g., a data processing network) and a display adapter 730 for connecting the bus 704 to a display device 732. The computer may utilize an operating system such as, for example, a Microsoft Windows operating system (O/S), a Macintosh O/S, a Linux O/S and/or a UNIX O/S. Those of ordinary skill in the art will appreciate that embodiments may also be implemented on platforms and operating systems other than those mentioned. One of ordinary skill in the art will also be able to combine software with appropriate general purpose or special purpose computer hardware to create a computer system or computer sub-system for implementing various embodiments described herein. - Embodiments of the present invention may also be implemented using computer program languages such as, for example, ActiveX, Java, C, and the C++ language and utilize object oriented programming methodology. Any such resulting program, having computer-readable code, may be embodied or provided within one or more computer-readable media, thereby making a computer program product (i.e., an article of manufacture).
The computer readable media may be, for instance, a fixed (hard) drive, diskette, optical disk, magnetic tape, semiconductor memory such as read-only memory (ROM), etc., or any transmitting/receiving medium such as the Internet or other communication network or link. The article of manufacture containing the computer code may be made and/or used by executing the code directly from one medium, by copying the code from one medium to another medium, or by transmitting the code over a network.
- In accordance with the foregoing, embodiments of a process for enrolling a subject in a biometric verification system may be implemented. In one implementation, a biometric sample may be captured from a subject requesting enrollment into a biometric system. From the captured biometric sample, at least one feature vector may be extracted with the feature vector comprising one or more coefficients. A standard deviation may be calculated for each of the coefficients of the feature vector. As an option, a mean standard deviation may be calculated from the coefficient standard deviations to represent the standard deviation for the feature vector.
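The standard deviation computation just described (a standard deviation per coefficient, then their mean) could look like the following sketch; the sample matrix is invented, and population standard deviation is an assumption since the patent does not specify the estimator.

```python
from statistics import mean, pstdev

# Hypothetical enrollment sample: each inner list is one observation
# (e.g., a frame of a speech sample), each column a feature coefficient.
samples = [
    [1.10, 0.40, -0.20],
    [0.90, 0.50, -0.10],
    [1.30, 0.30, -0.30],
    [1.00, 0.60,  0.00],
]

# Standard deviation of each coefficient across the observations.
coeff_sdevs = [pstdev(column) for column in zip(*samples)]

# The mean of the per-coefficient standard deviations serves as the
# standard deviation of the feature vector as a whole.
feature_sdev = mean(coeff_sdevs)
print(feature_sdev)
```

The resulting `feature_sdev` is the single scalar compared against the centroid's standard deviation in the next step.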
- Next, a determination may be made as to whether the value of the calculated standard deviation of the feature vector is greater than the value of a standard deviation represented by a centroid of a match score vs. standard deviation density function of, for example, a sample population. If the standard deviation of the feature vector is determined to be greater than the standard deviation of the centroid of the density function, then a correction factor may be derived for the subject based on a linear representation of a trend line of the density function and the standard deviation of the feature vector.
- The correction factor may be derived from the difference between the value of a match score represented by the centroid of the match score vs. standard deviation density function and a match score for the subject derived from the trend line of the density function. In other words, the derivation of the correction factor may be performed by calculating an “expected” match score for the feature vector of the subject from the trend line of the density function and the standard deviation of the feature vector of the subject. The value of the “expected” match score may be subtracted from the value of a match score represented by the centroid of the match score vs. standard deviation density function with the output difference comprising the correction factor. The derivation of the correction factor may also include making a determination as to whether the standard deviation of the feature vector exceeds a maximum threshold value for standard deviation values (e.g., a predefined maximum value), and if it does, then replacing the standard deviation of the feature vector with the maximum standard deviation threshold value. In such a situation (i.e., when the standard deviation of the feature vector exceeds the maximum standard deviation threshold value), the maximum standard deviation threshold value may then be used instead of the originally calculated standard deviation of the feature vector with the trend line to derive the correction factor for the subject. The derived correction factor may be stored in a correction factor data store in a memory and/or memory device.
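A minimal sketch of this correction-factor derivation, assuming a linear trend line; the trend coefficients, centroid score, and maximum standard deviation threshold below are invented values (the 0.63 and 1.4 echo the earlier illustrative pseudocode).

```python
# Assumed constants for illustration; real values would come from the
# centroid and line representation data stores built during training.
M, C = 0.63, 1.4      # trend line: expected score = M*sdev + C
CY = 2.4              # centroid match score value (cy)
MAX_SDEV = 2.5        # maximum standard deviation threshold

def correction_factor(feature_sdev: float) -> float:
    """corr = cy - expected score, with the standard deviation bounded."""
    sdev = min(feature_sdev, MAX_SDEV)   # replace outliers with the maximum
    expected_score = M * sdev + C        # "expected" score from trend line
    return CY - expected_score

# A subject to the right of the centroid receives a negative correction,
# which later lowers (improves) that subject's verification match score.
print(correction_factor(2.0))   # 2.4 - (0.63*2.0 + 1.4)
print(correction_factor(9.9) == correction_factor(2.5))  # bounded
```

Bounding the standard deviation caps how large a correction any single outlier can receive, matching the thresholding described above.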
- Embodiments of a process for verifying an identity of a claimant may also be implemented in accordance with the foregoing. In one implementation, a biometric capturing component may be used to capture a biometric sample from a claimant who claims a particular identity. In one embodiment, the biometric sample may comprise a speech sample (i.e., a vocal sample) made by the claimant. The captured biometric sample may then be passed to an extraction component that can extract at least one feature vector from the captured biometric sample.
- The extracted feature may then be compared to a pre-generated reference template associated with the claimed identity (i.e., the reference template of an enrolled subject) to determine the degree (i.e., amount) of similarity between the extracted feature and the reference template. The degree of similarity may be represented by a match score that is output as a result of the comparison. In one implementation, this comparison may be carried out by a comparison component coupled to the extraction component.
- Next, a determination may be made as to whether the claimed identity has a correction factor associated therewith. This determination may be accomplished by searching a correction factor data store in which correction factors of enrolled subjects are stored. The correction factor data store may reside in a memory and/or a memory device. If a correction factor for the claimed identity is found during this search, then it may be retrieved from the correction factor data store and used to modify the generated match score and thereby derive a modified match score. By modifying the match score with a correction factor, the degree of similarity between the feature vector and the reference template can be effectively increased for biometric purposes. In one embodiment, the modification of the match score may be performed by adding the correction factor to the match score. In implementations where lower match scores indicate a greater degree of similarity (e.g., a match score of 2 indicates a greater match than a match score of 3) between the feature vector and the reference template, the correction factor may have a negative value so that the match score is lowered in value by the addition of the correction factor.
- A decision component may then compare either the modified match score (if the claimed identity is determined to have a correction factor) or the unmodified match score (if the claimed identity is determined not to have a correction factor) to a decision threshold value to determine whether to accept the claimant as the claimed identity. If the value of the match score/modified match score of the claimant exceeds the decision threshold value, then the claimant may be rejected (i.e., classified as an imposter) by the biometric verification system.
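Putting the verification-side logic together, the decision step might look like the following sketch; the data-store contents, names, and threshold value are assumptions for illustration, not values from the patent.

```python
# Scores above the threshold are rejected (lower score = better match).
DECISION_THRESHOLD = 2.5

# Hypothetical store: only enrollees with unreliable features have entries.
correction_factor_store = {"alice": -0.3}

def accept_claimant(claimed_identity: str, match_score: float) -> bool:
    """Apply any stored correction, then compare against the threshold."""
    corr = correction_factor_store.get(claimed_identity, 0.0)
    modified_score = match_score + corr   # unmodified if no factor exists
    return modified_score <= DECISION_THRESHOLD

print(accept_claimant("alice", 2.7))  # 2.7 - 0.3 = 2.4 -> accepted
print(accept_claimant("bob", 2.7))    # no correction -> rejected
```

Because enrollees without a stored factor receive a zero correction, the decision behavior for reliable users is identical to an uncorrected system.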
- The following references are hereby incorporated by reference herein in their entirety: A. K. Jain, A. Ross and S. Prabhakar, “An Introduction to Biometric Recognition,” IEEE Transactions on Circuits and Systems for Video Technology, Special Issue on Image- and Video-Based Biometrics, Vol. 14, No. 1, pp. 4-20, January 2004; and Ruud M. Bolle, Sharath Pankanti, Nalini K. Ratha, “Evaluation Techniques for Biometric-Based Authentication Systems (FRR),” IBM Computer Science Research Report RC 21759, 2000.
- Based on the foregoing specification, various embodiments may be implemented using computer programming or engineering techniques including computer software, firmware, hardware or any combination or subset thereof. Any such resulting program—having computer-readable code—may be embodied or provided in one or more computer-readable media, thereby making a computer program product (i.e., an article of manufacture) implementation of one or more embodiments described herein. The computer readable media may be, for instance, a fixed drive (e.g., a hard drive), diskette, optical disk, magnetic tape, semiconductor memory such as for example, read-only memory (ROM), flash-type memory, etc., and/or any transmitting/receiving medium such as the Internet and/or other communication network or link. An article of manufacture containing the computer code may be made and/or used by executing the code directly from one medium, by copying the code from one medium to another medium, and/or by transmitting the code over a network. In addition, one of ordinary skill in the art of computer science may be able to combine the software created as described with appropriate general purpose or special purpose computer hardware to create a computer system or computer sub-system embodying embodiments or portions thereof described herein.
- While various embodiments have been described, they have been presented by way of example only, and not limitation. Thus, the breadth and scope of any embodiment should not be limited by any of the above described exemplary embodiments, but should be defined only in accordance with the following claims and their equivalents.
Claims (20)
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US11/263,752 US7788101B2 (en) | 2005-10-31 | 2005-10-31 | Adaptation method for inter-person biometrics variability |
JP2006239926A JP2007128046A (en) | 2005-10-31 | 2006-09-05 | Authentication method, authentication system, and correction element deriving method |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US11/263,752 US7788101B2 (en) | 2005-10-31 | 2005-10-31 | Adaptation method for inter-person biometrics variability |
Publications (2)
Publication Number | Publication Date |
---|---|
US20070100622A1 (en) | 2007-05-03 |
US7788101B2 US7788101B2 (en) | 2010-08-31 |
Family
ID=37997635
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/263,752 Active 2028-11-12 US7788101B2 (en) | 2005-10-31 | 2005-10-31 | Adaptation method for inter-person biometrics variability |
Country Status (2)
Country | Link |
---|---|
US (1) | US7788101B2 (en) |
JP (1) | JP2007128046A (en) |
US7487089B2 (en) * | 2001-06-05 | 2009-02-03 | Sensory, Incorporated | Biometric client-server security system and method |
US7490043B2 (en) * | 2005-02-07 | 2009-02-10 | Hitachi, Ltd. | System and method for speaker verification using short utterance enrollments |
Application events
- 2005-10-31: US application US 11/263,752 filed; issued as US7788101B2 (active)
- 2006-09-05: JP application JP2006239926A filed; published as JP2007128046A (pending)
Patent Citations (30)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4292471A (en) * | 1978-10-10 | 1981-09-29 | U.S. Philips Corporation | Method of verifying a speaker |
US5675704A (en) * | 1992-10-09 | 1997-10-07 | Lucent Technologies Inc. | Speaker verification with cohort normalized scoring |
US5522012A (en) * | 1994-02-28 | 1996-05-28 | Rutgers University | Speaker identification and verification system |
US6081660A (en) * | 1995-12-01 | 2000-06-27 | The Australian National University | Method for forming a cohort for use in identification of an individual |
US6760701B2 (en) * | 1996-11-22 | 2004-07-06 | T-Netix, Inc. | Subword-based speaker verification using multiple-classifier fusion, with channel, fusion, model and threshold adaptation |
US6003002A (en) * | 1997-01-02 | 1999-12-14 | Texas Instruments Incorporated | Method and system of adapting speech recognition models to speaker environment |
US6012027A (en) * | 1997-05-27 | 2000-01-04 | Ameritech Corporation | Criteria for usable repetitions of an utterance during speech reference enrollment |
US6311272B1 (en) * | 1997-11-17 | 2001-10-30 | M-Systems Flash Disk Pioneers Ltd. | Biometric system and techniques suitable therefor |
US6058364A (en) * | 1997-11-20 | 2000-05-02 | At&T Corp. | Speech recognition of customer identifiers using adjusted probabilities based on customer attribute parameters |
US6141644A (en) * | 1998-09-04 | 2000-10-31 | Matsushita Electric Industrial Co., Ltd. | Speaker verification and speaker identification based on eigenvoices |
US6697778B1 (en) * | 1998-09-04 | 2004-02-24 | Matsushita Electric Industrial Co., Ltd. | Speaker verification and speaker identification based on a priori knowledge |
US6577997B1 (en) * | 1999-05-28 | 2003-06-10 | Texas Instruments Incorporated | System and method of noise-dependent classification |
US6681205B1 (en) * | 1999-07-12 | 2004-01-20 | Charles Schwab & Co., Inc. | Method and apparatus for enrolling a user for voice recognition |
US6691089B1 (en) * | 1999-09-30 | 2004-02-10 | Mindspeed Technologies Inc. | User configurable levels of security for a speaker verification system |
US6401063B1 (en) * | 1999-11-09 | 2002-06-04 | Nortel Networks Limited | Method and apparatus for use in speaker verification |
US6591224B1 (en) * | 2000-06-01 | 2003-07-08 | Northrop Grumman Corporation | Biometric score normalizer |
US6567765B1 (en) * | 2000-08-17 | 2003-05-20 | Siemens Corporate Research, Inc. | Evaluation system and method for fingerprint verification |
US20020095287A1 (en) * | 2000-09-27 | 2002-07-18 | Henrik Botterweck | Method of determining an eigenspace for representing a plurality of training speakers |
US20040128130A1 (en) * | 2000-10-02 | 2004-07-01 | Kenneth Rose | Perceptual harmonic cepstral coefficients as the front-end for speech recognition |
US6823305B2 (en) * | 2000-12-21 | 2004-11-23 | International Business Machines Corporation | Apparatus and method for speaker normalization based on biometrics |
US6895376B2 (en) * | 2001-05-04 | 2005-05-17 | Matsushita Electric Industrial Co., Ltd. | Eigenvoice re-estimation technique of acoustic models for speech recognition, speaker identification and speaker verification |
US7487089B2 (en) * | 2001-06-05 | 2009-02-03 | Sensory, Incorporated | Biometric client-server security system and method |
US20050033573A1 (en) * | 2001-08-09 | 2005-02-10 | Sang-Jin Hong | Voice registration method and system, and voice recognition method and system based on voice registration method and system |
US20030036904A1 (en) * | 2001-08-16 | 2003-02-20 | Ibm Corporation | Methods and apparatus for the systematic adaptation of classification systems from sparse adaptation data |
US20040107099A1 (en) * | 2002-07-22 | 2004-06-03 | France Telecom | Verification score normalization in a speaker voice recognition device |
US7409343B2 (en) * | 2002-07-22 | 2008-08-05 | France Telecom | Verification score normalization in a speaker voice recognition device |
US20050055214A1 (en) * | 2003-07-15 | 2005-03-10 | Microsoft Corporation | Audio watermarking with dual watermarks |
US20060036443A1 (en) * | 2004-08-13 | 2006-02-16 | Chaudhari Upendra V | Policy analysis framework for conversational biometrics |
US7490043B2 (en) * | 2005-02-07 | 2009-02-10 | Hitachi, Ltd. | System and method for speaker verification using short utterance enrollments |
US20070129941A1 (en) * | 2005-12-01 | 2007-06-07 | Hitachi, Ltd. | Preprocessing system and method for reducing FRR in speaking recognition |
Cited By (28)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9462975B2 (en) | 1997-03-17 | 2016-10-11 | Adidas Ag | Systems and methods for ambulatory monitoring of physiological signs |
US9750429B1 (en) | 2000-04-17 | 2017-09-05 | Adidas Ag | Systems and methods for ambulatory monitoring of physiological signs |
US8628480B2 (en) | 2005-05-20 | 2014-01-14 | Adidas Ag | Methods and systems for monitoring respiratory data |
US8033996B2 (en) | 2005-07-26 | 2011-10-11 | Adidas Ag | Computer interfaces including physiologically guided avatars |
US8790255B2 (en) | 2005-07-26 | 2014-07-29 | Adidas Ag | Computer interfaces including physiologically guided avatars |
US9504410B2 (en) | 2005-09-21 | 2016-11-29 | Adidas Ag | Band-like garment for physiological monitoring |
US20070177770A1 (en) * | 2006-01-30 | 2007-08-02 | Derchak P A | System and method for identity confirmation using physiologic biometrics to determine a physiologic fingerprint |
US20140304792A1 (en) * | 2006-01-30 | 2014-10-09 | Adidas Ag | System and method for identity confirmation using physiologic biometrics to determine a physiologic fingerprint |
US8762733B2 (en) * | 2006-01-30 | 2014-06-24 | Adidas Ag | System and method for identity confirmation using physiologic biometrics to determine a physiologic fingerprint |
US8475387B2 (en) | 2006-06-20 | 2013-07-02 | Adidas Ag | Automatic and ambulatory monitoring of congestive heart failure patients |
US20080045815A1 (en) * | 2006-06-20 | 2008-02-21 | Derchak P A | Automatic and ambulatory monitoring of congestive heart failure patients |
US9833184B2 (en) | 2006-10-27 | 2017-12-05 | Adidas Ag | Identification of emotional states using physiological responses |
US20080221401A1 (en) * | 2006-10-27 | 2008-09-11 | Derchak P Alexander | Identification of emotional states using physiological responses |
US20100110078A1 (en) * | 2008-10-30 | 2010-05-06 | Ricoh Company, Ltd. | Method and computer program product for plotting distribution area of data points in scatter diagram |
US8412940B2 (en) * | 2008-10-31 | 2013-04-02 | Hitachi, Ltd. | Biometric authentication method and system |
US20110185176A1 (en) * | 2008-10-31 | 2011-07-28 | Hitachi, Ltd. | Biometric authentication method and system |
US20120101822A1 (en) * | 2010-10-25 | 2012-04-26 | Lockheed Martin Corporation | Biometric speaker identification |
US8719018B2 (en) * | 2010-10-25 | 2014-05-06 | Lockheed Martin Corporation | Biometric speaker identification |
US20130246388A1 (en) * | 2010-12-01 | 2013-09-19 | Aware, Inc. | Relationship Detection within Biometric Match Results Candidates |
US9984157B2 (en) * | 2010-12-01 | 2018-05-29 | Aware Inc. | Relationship detection within biometric match results candidates |
US10521478B2 (en) | 2010-12-01 | 2019-12-31 | Aware, Inc. | Relationship detection within biometric match results candidates |
US11250078B2 (en) | 2010-12-01 | 2022-02-15 | Aware, Inc. | Relationship detection within biometric match results candidates |
US10872255B2 (en) * | 2017-08-14 | 2020-12-22 | Samsung Electronics Co., Ltd. | Method of processing biometric image and apparatus including the same |
US10540978B2 (en) * | 2018-05-30 | 2020-01-21 | Cirrus Logic, Inc. | Speaker verification |
US11024318B2 (en) | 2018-05-30 | 2021-06-01 | Cirrus Logic, Inc. | Speaker verification |
WO2020235716A1 (en) * | 2019-05-22 | 2020-11-26 | 엘지전자 주식회사 | Intelligent electronic device and authentication method using message transmitted to intelligent electronic device |
US11315549B2 (en) | 2019-05-22 | 2022-04-26 | Lg Electronics Inc. | Intelligent electronic device and authentication method using message sent to intelligent electronic device |
US11488608B2 (en) * | 2019-12-16 | 2022-11-01 | Sigma Technologies Global Llc | Method and system to estimate speaker characteristics on-the-fly for unknown speaker with high accuracy and low latency |
Also Published As
Publication number | Publication date |
---|---|
JP2007128046A (en) | 2007-05-24 |
US7788101B2 (en) | 2010-08-31 |
Similar Documents
Publication | Title |
---|---|
US7788101B2 (en) | Adaptation method for inter-person biometrics variability |
US20060222210A1 (en) | System, method and computer program product for determining whether to accept a subject for enrollment |
US7356168B2 (en) | Biometric verification system and method utilizing a data classifier and fusion model |
US6519561B1 (en) | Model adaptation of neural tree networks and other fused models for speaker verification |
WO2017113658A1 (en) | Artificial intelligence-based method and device for voiceprint authentication |
US20070219801A1 (en) | System, method and computer program product for updating a biometric model based on changes in a biometric feature of a user |
US6219639B1 (en) | Method and apparatus for recognizing identity of individuals employing synchronized biometrics |
US7603275B2 (en) | System, method and computer program product for verifying an identity using voiced to unvoiced classifiers |
US8209174B2 (en) | Speaker verification system |
Bigun et al. | Multimodal biometric authentication using quality signals in mobile communications |
US20060259304A1 (en) | A system and a method for verifying identity using voice and fingerprint biometrics |
US8219571B2 (en) | Object verification apparatus and method |
US20070219792A1 (en) | Method and system for user authentication based on speech recognition and knowledge questions |
EP2879130A1 (en) | Methods and systems for splitting a digital signal |
Fierrez-Aguilar et al. | Kernel-based multimodal biometric verification using quality signals |
Lip et al. | Comparative study on feature, score and decision level fusion schemes for robust multibiometric systems |
US7983484B2 (en) | Pattern recognition system, pattern recognition method, and pattern recognition program |
KR100882281B1 (en) | Method To Confirm Identification Of Biometric System |
Bigun et al. | Combining biometric evidence for person authentication |
US20050232470A1 (en) | Method and apparatus for determining the identity of a user by narrowing down from user groups |
US7162641B1 (en) | Weight based background discriminant functions in authentication systems |
Varchol et al. | Multimodal biometric authentication using speech and hand geometry fusion |
JP2003044858A (en) | Device and method for personal identification |
Nallagatla et al. | Sequential decision fusion for controlled detection errors |
Czyz et al. | Scalability analysis of audio-visual person identity verification |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| AS | Assignment | Owner name: HITACHI, LTD., JAPAN. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:TAVARES, CLIFFORD;REEL/FRAME:017179/0804. Effective date: 20051028 |
| STCF | Information on status: patent grant | Free format text: PATENTED CASE |
| FEPP | Fee payment procedure | Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); PAYER NUMBER DE-ASSIGNED (ORIGINAL EVENT CODE: RMPN). Entity status of patent owner: LARGE ENTITY |
| FPAY | Fee payment | Year of fee payment: 4 |
| MAFP | Maintenance fee payment | Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552). Year of fee payment: 8 |
| MAFP | Maintenance fee payment | Free format text: PAYMENT OF MAINTENANCE FEE, 12TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1553). Entity status of patent owner: LARGE ENTITY. Year of fee payment: 12 |