Publication number: US 6961453 B2
Publication type: Grant
Application number: US 09/994,173
Publication date: Nov 1, 2005
Filing date: Nov 26, 2001
Priority date: Aug 31, 2001
Fee status: Paid
Also published as: US20030076986
Inventors: Jun Sung Yoon, Dong Hun Kim, Chul Min Joe, Soon Won Jung, Hwi Seok Lee, Byun Jin Lee, Dong Won Lee, Taek Ki Lee
Original Assignee: Secugen Corporation
Method for extracting fingerprint feature data using ridge orientation model
US 6961453 B2
Abstract
Disclosed herein is a method for extracting fingerprint feature data using a ridge orientation model. The method includes the steps of inputting a fingerprint image, extracting ridge orientations according to regions, calculating ridge qualities according to regions and separating the background region from the fingerprint image, extracting regions having a core and a delta, setting an initial ridge orientation model, calculating a ridge orientation function, and extracting the ridge orientations and the core and delta positions of the fingerprint image.
Claims(10)
1. A method for extracting fingerprint feature data using a ridge orientation model, comprising:
dividing a digital fingerprint image of a predetermined format into a plurality of regions, each region with a predetermined size;
calculating ridge orientations in the regions;
calculating qualities of ridges according to regions;
separating the fingerprint image into a fingerprint region and a background region according to the calculated ridge qualities;
extracting at least one position of at least one core or delta in the fingerprint region;
determining a candidate position of a core or a delta outside the fingerprint region and a candidate position of each of the at least one core or delta in the fingerprint region from the extracted at least one position of the at least one core or delta in the fingerprint region;
setting the determined candidate positions both in and outside the fingerprint region as initial parameters of a ridge orientation model;
setting a quality threshold for fingerprint regions;
calculating parameters of a ridge orientation function by minimizing errors between ridge orientation values of the ridge orientation model and ridge orientation values of fingerprint regions with quality higher than the quality threshold;
calculating ridge orientation values in all regions based on the ridge orientation function; and
deciding at least one final position of the at least one core or delta in the fingerprint region from one or more of the parameters of the ridge orientation function.
2. The method according to claim 1, wherein
the ridge qualities are calculated from a difference between a gray level difference of ridge orientation with a minimum gray level difference, and a gray level difference of ridge orientation with a maximum gray level difference, and
the fingerprint image is determined as a fingerprint region if both a gray level difference in a longitudinal orientation of ridges and a gray level difference in a lateral orientation of ridges are higher than a gray threshold, and is determined as a background region if both the gray level difference in the longitudinal orientation of ridges and the gray level difference in the lateral orientation of ridges are lower than the gray threshold.
3. The method according to claim 2, wherein the ridge orientation with the minimum gray level difference is determined to be the longitudinal orientation of the ridges, and the ridge orientation with the maximum gray level difference is determined as the lateral orientation of the ridges.
4. The method according to claim 1, wherein if a region surrounded by fingerprint regions is calculated to have a ridge quality corresponding to the background region, the region surrounded by fingerprint regions and having quality corresponding to the background region is processed as a fingerprint region.
5. The method according to claim 1, wherein the at least one position of the at least one core or delta is extracted by calculating a Poincare Index with respect to each point within a scope of the fingerprint region.
6. The method according to claim 5, wherein extracting at least one position of at least one core or delta in the fingerprint region further comprises expanding the scope and calculating the Poincare index with respect to each point within the expanded scope until the at least one position of the at least one core or delta is extracted.
7. The method according to claim 1, wherein the candidate position of the core or delta located outside the fingerprint region is determined using the following Equation,

O_m(z) = (1/2) Σ_{k=1..K} g_k(arg(z − z_k))

where g_k(arg(z − z_k)) = −π/2 − arg(z − z_k) for z_k a delta candidate position, and g_k(arg(z − z_k)) = π/2 + arg(z − z_k) for z_k a core candidate position,
and
z is a complex value (x+yi) representing a single arbitrary position in a two-dimensional region, and zk is a complex value representing the candidate position.
8. The method according to claim 7, wherein the candidate position is determined by optimizing an error of the following Equation in a region R having a ridge quality higher than a predetermined minimum, using a steepest descent method,

O_e² = ∫_R (O(z) − O_m(z))² dz.
9. The method according to claim 1, wherein the initial ridge orientation model is set by the following Equation,

O_m(z) = O_0 + (1/2) Σ_{k=1..K} g_k(arg(z − z_k); C_{k,1}, C_{k,2}, …, C_{k,L})

where

g_k(θ) = C_{k,l} + ((θ − θ_l) / (2π/L)) · (C_{k,l+1} − C_{k,l}),  θ_l ≤ θ ≤ θ_{l+1},
θ = arg(z − z_k),  θ_{l+1} − θ_l = 2π/L,  C_{k,l} = g_k(θ_l),
C_{k,l} = −π/2 − θ_l for z_k a delta candidate position,
C_{k,l} = π/2 + θ_l for z_k a core candidate position,
and
Oo is 0, z is a complex value (x+yi) representing a single arbitrary position in a two-dimensional region, Zk is a complex value for representing each of the determined candidate positions in turn, K is the total number of cores and deltas, and L is a positive integer.
10. The method according to claim 1, wherein the ridge orientation function is determined by optimizing an error of the following Equation in each region R with a ridge quality higher than a predetermined minimum, using a steepest descent method,

O_e² = ∫_R (O(z) − O_m(z))² dz.
Description
PRIORITY INFORMATION

This application claims priority under 35 U.S.C. 119 to Korean Patent Application Number 2001-053110 filed Aug. 31, 2001 in the Republic of Korea.

BACKGROUND OF THE INVENTION

1. Field of the Invention

The present invention relates generally to a fingerprint recognition method, and more particularly to a method for extracting the ridge orientations, and core and delta positions of a fingerprint, by using a ridge orientation model with the regional ridge orientation information and entire ridge orientation information of a fingerprint image.

2. Description of the Prior Art

As is well known to those skilled in the art, fingerprint recognition is one of the security authentication technologies based on biological information. Fingerprint recognition technology recognizes a fingerprint, which has unique features for each person, by an image processing method, and determines whether the recognized fingerprint belongs to a registered person. In fingerprint recognition technology, the most significant aspect is the process of extracting the minutiae of a fingerprint and generating fingerprint feature data. The process of generating the fingerprint feature data is described briefly as follows. First, a fingerprint acquisition device reads a fingerprint and obtains a fingerprint image. The fingerprint image obtained by the fingerprint acquisition device is divided into a plurality of regions. Thereafter, the orientation values of the fingerprint ridges are extracted according to the regions, and the grayscale values of the ridges are binarized using directional masks. Further, each ridge is thinned into a single line (or skeleton), and minutiae are extracted from the thinned ridges. Erroneous minutiae (pseudo minutiae) are then removed, and the positions and directions of the correct minutiae are formed into fingerprint feature data.

FIG. 1 a is a view showing the bifurcation and ending point of fingerprint minutiae, and FIG. 1 b is a fingerprint image view showing the core and delta of fingerprint minutiae.

Referring to FIG. 1 a, a bifurcation 21 and an ending point 22 are used as the minutiae of a fingerprint. The bifurcation 21 is a point where a fingerprint ridge branches, and the ending point 22 is a point where a fingerprint ridge terminates. Further, a core 23 and a delta 24 shown in FIG. 1 b are also used as the minutiae of a fingerprint. Any single fingerprint contains only zero, one, or two cores and only zero, one, or two deltas; no other counts occur. The core 23 and the delta 24 may be recognized with the naked eye, and have long been used as the references of various fingerprint-classifying methods.

FIG. 2 is a flowchart of a conventional fingerprint minutia extracting method. FIG. 3 a is an example view showing eight directional masks, and FIG. 3 b is an example view showing the application of directional masks to a fingerprint region.

With reference to FIG. 2, a conventional fingerprint minutia extracting method, particularly a method for extracting the ending points and bifurcations of fingerprint ridges as minutiae, will be described in detail. A fingerprint image is inputted through a fingerprint acquisition device at step 11. At step 12 of extracting and correcting a ridge orientation, the entire fingerprint image is divided into square regions, each with a predetermined size. The orientation having the smallest brightness variation in each region is designated as the ridge orientation of the corresponding region. In order to correct the ridge orientation, the orientation value of a corresponding region is determined by averaging the orientation of the corresponding region and the orientations of its surrounding regions.

At step 13, a binarization process is performed using the orientation information as follows. For each pixel P, each grayscale value in a region, of which the center is the pixel P and the size is the same as that of a directional mask, is multiplied by the corresponding coefficient of the directional mask corresponding to the direction at the center point of the region, among masks 25 a to 25 h (a mask 25 h is used in FIG. 3 b). If the sum of the multiplied results is a positive value, the pixel P is in a ridge of the fingerprint and is converted to 1. If it is a negative value, the pixel P is in a valley of the fingerprint and is converted to 0.

FIG. 4 is a view showing a thinning process, and FIGS. 5 a and 5 b are views showing the principle of finding minutiae in a thinned image. At thinning step 14, in order to determine the skeleton of each ridge in the binary image having a constant ridge width as shown in FIG. 4, the outline of the ridge is converted into valley pixels until the width of the ridge becomes 1 (in other words, until only the skeleton of the ridge remains).
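The per-pixel directional-mask binarization of step 13 can be sketched as follows. This is a minimal Python illustration; the 3×3 mask used in the example and the helper name `binarize_pixel` are hypothetical placeholders, not the actual masks 25 a to 25 h of FIG. 3 a.

```python
def binarize_pixel(gray, mask, x, y):
    """Multiply each gray level in the window centered at pixel P(x, y)
    by the corresponding coefficient of the directional mask; a positive
    sum means P lies on a ridge (1), a negative sum means a valley (0)."""
    h = len(mask) // 2
    s = 0
    for dy in range(-h, h + 1):
        for dx in range(-h, h + 1):
            s += gray[y + dy][x + dx] * mask[dy + h][dx + h]
    return 1 if s > 0 else 0
```

A mask whose positive coefficients run along the local ridge direction responds positively when the window is aligned with a ridge of matching orientation, which is why a different mask is selected per region according to the extracted orientation.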
Finally, at step 15 of extracting minutia positions and their directions, for each point whose value is 1 in the thinned image, the number of transitions between 1 and 0 among adjacent pairs of its eight neighboring points is counted (the boundaries of the regions are represented with dotted lines in FIGS. 5 a and 5 b). When the counted number is 2, 4, 6, or 8, the center point is classified as an ending point, a ridge, a bifurcation, or a cross point, respectively. FIG. 5 a shows an ending point with a ridge direction from left to right, and FIG. 5 b shows a bifurcation with a ridge direction from left to right. The ending points and bifurcations are used as the most important features for distinguishing fingerprints from one another.
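The transition-counting rule of step 15 (often called the crossing number) can be sketched as follows; a minimal Python illustration, in which the function name `classify_point` and the binary nested-list image format are assumptions made for the example.

```python
def classify_point(img, x, y):
    """Classify a skeleton pixel by counting 1/0 transitions among its
    eight neighbors, traversed in a closed clockwise order:
    2 transitions -> ending point, 4 -> ridge, 6 -> bifurcation, 8 -> cross."""
    if img[y][x] != 1:
        return None
    # Eight neighbors in clockwise order; the modulo closes the loop.
    nb = [img[y-1][x-1], img[y-1][x], img[y-1][x+1], img[y][x+1],
          img[y+1][x+1], img[y+1][x], img[y+1][x-1], img[y][x-1]]
    transitions = sum(nb[i] != nb[(i + 1) % 8] for i in range(8))
    return {2: "ending", 4: "ridge", 6: "bifurcation", 8: "cross"}.get(transitions)
```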

The accuracy of the conventional method is relatively high. However, the conventional method for extracting the ridge orientation is disadvantageous in that it extracts the ridge orientation according to regions divided in the fingerprint image and determines a representative orientation per region, such that a fine variation of orientation within a region cannot be exactly represented. Further, under the assumption that the ridge orientation does not change rapidly, the conventional method evaluates the average of the ridge orientations of the regions neighboring each region, and corrects the orientation value using this average. However, such an assumption does not hold in regions near a core or a delta, where the ridge orientation changes rapidly. Moreover, the orientation of a region with a tiny wound or a wrinkle on the fingerprint can be corrected by predicting a ridge orientation from the ridge orientations of neighboring regions; however, if the wounded part is large, it is impossible to find the exact ridge orientation. These problems are due to the fact that the extraction and correction of the ridge orientation are based only on local image information.

Further, a conventional core and delta extracting method calculates positions of the core and the delta by detecting a directional variation in local regions. Therefore, such a method is disadvantageous in that it cannot extract the exact positions of the core or the delta, or cannot find the exact position at all, in a fingerprint image with a wound near the core or the delta. In addition, the conventional method is also disadvantageous in that it calculates the positions of the core and the delta using the orientation values mainly calculated according to regions, such that accuracy of the positions depends on a size of a region for calculating the orientations.

As described above, the conventional computer-aided method for extracting the ridge orientation and the core and delta positions has a basic limitation due to its lack of information about the entire ridge flow. As an example, a fingerprint expert knows a variety of types of fingerprints. Even if some regions of a fingerprint are damaged, the fingerprint expert can recognize the entire ridge flow from the ridge orientations in the undamaged regions. Thereby, the expert can find a precise ridge orientation for the damaged region, and extract the precise positions of the core and delta, even if the regions near the core and the delta are damaged.

Therefore, in order to solve the above problems in the conventional computer-aided algorithm, there is required a method for extracting minutiae in consideration of the entire shape (flow) of the fingerprint as well as its local region.

SUMMARY OF THE INVENTION

Accordingly, the present invention has been made keeping in mind the above problems occurring in the prior art, and an object of the present invention is to provide a method for extracting the ridge orientations and core and delta positions of a fingerprint using a ridge orientation model with the regional ridge orientation information and entire ridge orientation information of a fingerprint image.

In order to accomplish the above object, the present invention provides a method for extracting fingerprint feature data using a ridge orientation model, comprising the steps of scanning a fingerprint of a person requiring a fingerprint recognition with a fingerprint input device, and converting the fingerprint into a digital fingerprint image of predetermined format; dividing the digital fingerprint image into a plurality of regions, each with a predetermined size, and calculating ridge orientations in the regions; calculating qualities of ridges according to regions and separating the fingerprint image into a fingerprint region and a background region according to the calculated ridge qualities; evaluating and extracting positions of a core and a delta or regions with a core or delta in the fingerprint region; determining the candidate positions of the core and the delta within and outside the fingerprint region from the extracted positions of the cores and deltas in the previous step, and setting the determined candidate positions as initial parameters of an initial ridge orientation model for the core and delta; calculating a ridge orientation function by calculating parameters with a minimum error between ridge orientation values of the initial ridge orientation model and ridge orientation values of regions with quality higher than a threshold; and calculating ridge orientation values in all regions using the ridge orientation function, and deciding and extracting the positions of the core and the delta from the parameters for core and delta of the ridge orientation function.

According to the present invention, the method sets a model for representing entire feature data contained in a ridge flow of a fingerprint, and obtains a ridge orientation function by selectively utilizing information extracted from divided regions according to the model. Therefore, the exact orientations of the fingerprint ridges at all positions and exact positions of cores and deltas can be found. Further, the ridge orientations of the entire fingerprint image are represented as a few parameters constituting the ridge orientation function, thus enabling ridge orientation information to be compressed. Accordingly, the ridge orientations of the entire fingerprint image may be used as feature data in the process of fingerprint classification or fingerprint recognition.

BRIEF DESCRIPTION OF THE DRAWINGS

The above and other objects, features and other advantages of the present invention will be more clearly understood from the following detailed description taken in conjunction with the accompanying drawings, in which:

FIG. 1 a is a view showing a bifurcation and an ending point of fingerprint minutiae;

FIG. 1 b is a fingerprint image view showing a core and a delta of fingerprint minutiae;

FIG. 2 is a flowchart of a conventional method for extracting fingerprint feature data;

FIG. 3 a is an example view showing eight directional masks;

FIG. 3 b is an example view showing the application of directional masks to a fingerprint region;

FIG. 4 is a view showing the thinning process;

FIGS. 5 a and 5 b are views showing the principle for finding minutiae in a thinned image;

FIG. 6 is a flowchart of a method for extracting fingerprint feature data according to the present invention;

FIG. 7 is an example view showing a mask for extracting ridge orientations of this invention;

FIG. 8 is a view showing an ideal ridge model of this invention;

FIG. 9 a is a view showing a fingerprint image with damaged parts;

FIG. 9 b is a view showing the image after ridge orientation extraction, quality calculation and background separation are performed on the fingerprint image of FIG. 9 a of this invention;

FIG. 10 a is a view showing a fingerprint image having a core and a delta;

FIG. 10 b is a view showing a Poincare index with respect to the core and the delta;

FIG. 11 a is a view showing a fingerprint image having a core in a center part; and

FIG. 11 b is a view showing an initial ridge orientation model processed from the fingerprint image of FIG. 11 a.

DESCRIPTION OF THE PREFERRED EMBODIMENTS

FIG. 6 is a flowchart of a method for extracting the ridge orientations and core and delta positions of a fingerprint in accordance with the present invention. Referring to FIG. 6, the method comprises the steps of acquiring a fingerprint image (step 51), extracting ridge orientations according to regions (step 52), calculating ridge qualities according to regions and separating background region from the fingerprint image (step 53), extracting regions having a core and a delta (step 54), setting an initial ridge orientation model (step 55), calculating a ridge orientation function (step 56), and extracting the ridge orientations and the core and delta positions of the fingerprint image (step 57).

At step 51, a fingerprint acquisition device scans a fingerprint of a person requiring fingerprint recognition, and the acquired fingerprint is converted to a digital fingerprint image of predetermined format.

At step 52, similarly to the conventional method, the fingerprint image is divided into a plurality of regions, each with a predetermined size, and ridge orientations are calculated in the regions. In other words, the entire region of the fingerprint image is divided into square blocks, each with a predetermined size, and a ridge orientation O(x,y) is calculated with respect to each point P(x,y) in each square block using the following Equation [1]. The orientation shared by the most points in each square block, among all orientations, is then determined as the representative ridge orientation of the block.

O(x,y) = min_θ [ Σ_{i=−n..n−1} |P_i^θ − P_{i+1}^θ| ]   [1]

In Equation [1], P_i^θ is the gray level at the position (x + d·i·cos θ, y + d·i·sin θ), θ = kπ/8 where k is 0, 1, 2, …, 7, and d is the average distance between a ridge and a valley.

FIG. 7 is an example view showing an orientation mask for extracting ridge orientations; in particular, FIG. 7 shows the positions of the points P−n to Pn with respect to θ. In Equation [1], d is half of the average distance between adjacent ridges, so that the periodicity of the ridges is maximally reflected in the calculation of the ridge orientation O(x,y). Further, the ridge orientation can be calculated with respect to only some points, rather than all points in the block, to improve calculation speed.
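Equation [1] can be sketched in Python as follows; the function name `ridge_orientation` and the rounding of sample coordinates to the nearest pixel are illustrative assumptions. The routine returns the minimizing direction itself, since that direction is what is taken as the ridge orientation.

```python
import math

def ridge_orientation(gray, x, y, d, n=4):
    """Per Equation [1]: among the eight directions theta = k*pi/8, return
    the one minimizing the sum of absolute gray-level differences between
    consecutive sample points P_i at (x + d*i*cos(theta), y + d*i*sin(theta))."""
    best_theta, best_cost = 0.0, float("inf")
    for k in range(8):
        theta = k * math.pi / 8
        pts = [gray[round(y + d * i * math.sin(theta))]
                   [round(x + d * i * math.cos(theta))]
               for i in range(-n, n + 1)]
        cost = sum(abs(pts[i] - pts[i + 1]) for i in range(len(pts) - 1))
        if cost < best_cost:
            best_theta, best_cost = theta, cost
    return best_theta
```

On an image whose gray level varies only with y (horizontal stripes), the cost along θ = 0 is zero, so that direction is returned, as expected for ridges running along x.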

At step 53, the quality of the fingerprint image in each region is determined according to how well the ridges represent typical ridge features, and each region of the image is classified as either a fingerprint region or a background region according to the calculated quality.

Typical ridges exhibit periodicity, in that there is a constant distance between any two adjacent ridges, and have a constant directionality within a narrow region. Therefore, it is assumed that typical ridges constitute a wave whose cross-section forms the shape of a sine wave and which extends in a single orientation, as shown in FIG. 8.

FIG. 8 is a view showing an ideal ridge model in accordance with this invention. According to this assumption, the ridge quality of a region is higher as the gray level difference between a ridge and a valley (h in FIG. 8) increases, as the periodicity of the ridges increases (in other words, as w in FIG. 8 approaches the average distance between a ridge and a valley), and as the directionality of the ridges becomes more constant.

From the sine wave model shown in FIG. 8, it is deduced that the larger the difference between the gray level difference in the orientation with the minimum gray level difference and the gray level difference in the orientation with the maximum gray level difference, the better the ridge quality. In order to evaluate the ridge quality, the method of the present invention uses the following Equation [2]:

A = Σ_Q Σ_{i=−n..n−1} |P_i^(θa) − P_{i+1}^(θa)|
B = Σ_Q Σ_{i=−n..n−1} |P_i^(θb) − P_{i+1}^(θb)|   [2]

where θ_a = arg min_θ Σ_{i=−n..n−1} |P_i^θ − P_{i+1}^θ|, θ_b = arg max_θ Σ_{i=−n..n−1} |P_i^θ − P_{i+1}^θ|, and the outer sums run over the points of the region Q,

P_i^θ is the gray level at the position (x + w·i·cos θ, y + w·i·sin θ), and w is the average distance between a ridge and a valley. In Equation [2], A is the gray level difference in the longitudinal orientation of the ridges (the x direction of FIG. 8), and B is the gray level difference in the lateral orientation of the ridges (the y direction of FIG. 8). The smaller A is and the larger B is, the higher the ridge quality at an arbitrary point. Namely, the ridge quality at P(x,y) is directly proportional to the difference between A and B. The quality of each block is calculated as the average quality of the points in the block.
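A point-wise version of Equation [2] can be sketched as follows; a minimal Python illustration in which the quality is returned directly as B − A, and the helper name `ridge_quality` and sampling details are assumptions.

```python
import math

def ridge_quality(gray, x, y, w, n=4):
    """Per Equation [2]: A is the gray-level variation along the orientation
    of minimum variation (longitudinal), B along the orientation of maximum
    variation (lateral); the quality is proportional to B - A."""
    def cost(theta):
        pts = [gray[round(y + w * i * math.sin(theta))]
                   [round(x + w * i * math.cos(theta))]
               for i in range(-n, n + 1)]
        return sum(abs(pts[i] - pts[i + 1]) for i in range(len(pts) - 1))
    costs = [cost(k * math.pi / 8) for k in range(8)]
    return max(costs) - min(costs)  # B - A
```

A flat patch yields quality 0 (A = B), while a clean stripe pattern yields a large B − A, matching the intuition that good ridges vary strongly across and weakly along their direction.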

Further, a region where both A and B have low values, in other words, a region having a ridge quality lower than a predetermined threshold, is classified as a background region at step 53. However, background regions surrounded by foreground (fingerprint) regions are classified as fingerprint regions. FIGS. 9 a and 9 b show that the regions on the right side of the fingerprint image of FIG. 9 a are represented as the background region in FIG. 9 b.

As an example, FIG. 9 a is a view showing a fingerprint image with damaged parts, and FIG. 9 b is a view showing the image obtained after ridge orientation extraction, quality calculation and background separation are performed on the fingerprint image of FIG. 9 a.

Referring to FIG. 9 a, reference numeral 60 indicates parts with a poor ridge quality within a fingerprint region. The parts 60 are portions where the fingerprint is damaged, and they are classified not as the background region but as the fingerprint region, as shown in FIG. 9 b (referring to FIG. 9 b, the regions indicated with the circles in FIG. 9 a are processed as fingerprint parts). It is well known in the field that even though these parts have a poor quality, they are processed not as a background region but as a fingerprint region.

Referring to FIG. 6 again, at step 54, the positions or parts where a core or a delta is positioned are determined.

FIG. 10 a is a view showing a fingerprint image having a core and a delta, and FIG. 10 b is a view showing the Poincare Index with respect to the core and the delta. Referring to FIG. 10 a, the positions of the core and the delta can be found using the property that the ridge orientations in a core part C and a delta part D change in the orientations indicated by the arrows of FIG. 10 a. More specifically, referring to FIG. 10 b, the positions of the core and the delta can be obtained by calculating the Poincare Index (Pin) of the following Equation [3] with respect to an arbitrary region, using the fact that the orientation values in regions having a core and a delta are changed by π and −π, respectively. Therefore, regions having the core and the delta are evaluated. The Poincare Index at a point is the value obtained by integrating the differences of the ridge orientations along the boundary of a region including the point.

P_in(x,y) = lim_{ε→0} (1/2π) ∫_0^{2π} ∂O(x + ε·cos θ, y + ε·sin θ)/∂θ dθ   [3]

where ε is the radius of the region.

Equation [3] is used to calculate the Poincare Index in the case of a circular region, and may be extended to the case of any closed curve as well as a circle. Referring to FIG. 10 b, it is well known that the Poincare Index with respect to the core is 1/2, while the Poincare Index with respect to the delta is −1/2. Using Equation [3], the Poincare Index is calculated at each point of the fingerprint region, so the existence of the core and the delta and their positions can be found.

If the regions around the core and the delta are damaged, in other words, if the core and the delta appear in regions with poor qualities, the positions of the core and the delta cannot be exactly found by the Poincare Index. To solve this problem, Equation [3] is applied while expanding the scope of the region for calculating the Poincare Index. This method is able to find the regions where the core and the delta are located. (This is because the Poincare Index is constant regardless of the scope of the region including the core and the delta, as shown in FIG. 10 b and Equation [3].)
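A discrete version of the Poincare Index of Equation [3] can be sketched as follows; a Python illustration that replaces the limit and integral with a sum of wrapped orientation differences around a small square loop. The square loop shape and the wrapping convention are assumptions of this sketch.

```python
import math

def poincare_index(orient, x, y, r=1):
    """Discrete Poincare index at (x, y): sum the wrapped orientation
    differences along a closed square loop around the point and divide
    by 2*pi. A core yields about +1/2, a delta about -1/2, and a plain
    ridge region about 0."""
    loop = [(x - r, y - r), (x, y - r), (x + r, y - r), (x + r, y),
            (x + r, y + r), (x, y + r), (x - r, y + r), (x - r, y)]
    total = 0.0
    for i in range(len(loop)):
        xa, ya = loop[i]
        xb, yb = loop[(i + 1) % len(loop)]
        d = orient[yb][xb] - orient[ya][xa]
        # Ridge orientations are defined modulo pi: wrap each difference
        # into the interval (-pi/2, pi/2].
        while d > math.pi / 2:
            d -= math.pi
        while d <= -math.pi / 2:
            d += math.pi
        total += d
    return total / (2 * math.pi)
```

With the 1/(2π) normalization of Equation [3], an orientation field synthesized from the core term of Equation [6] gives an index of 1/2 at the core position.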

At step 55, the positions of the core and the delta are temporarily determined from the positions or regions extracted at step 54, and the determined positions are set as initial parameters for the core or the delta in a ridge orientation model. At the step 55 of setting an initial ridge orientation model, the ridge orientation model Om(z) is set according to the following Equation [4], using the property that the ridge directions are changed by π and −π at the core and the delta, respectively.

O_m(z) = O_0 + (1/2) Σ_{k=1..K} g_k(arg(z − z_k); C_{k,1}, C_{k,2}, …, C_{k,L})   [4]

where g_k(θ) = C_{k,l} + ((θ − θ_l) / (2π/L)) · (C_{k,l+1} − C_{k,l}) for θ_l ≤ θ ≤ θ_{l+1}, with θ = arg(z − z_k), θ_{l+1} − θ_l = 2π/L, and C_{k,l} = g_k(θ_l).

In Equation [4], z is a complex value (x+yi) representing a single arbitrary position in a two-dimensional region, and z_k is a complex value representing the position of the core or the delta. Further, O_0 is the orientation value when z is infinite, K is the total number of cores and deltas, and L is a positive integer. In Equation [4], the values determining the ridge orientation model are O_0, C_{k,l}, and z_k. Here, O_0 is 0 (zero) in the initial ridge orientation model, and C_{k,l} is calculated by Equation [5].

C_{k,l} = −π/2 − θ_l, if z_k is a delta,
C_{k,l} = π/2 + θ_l, if z_k is a core.   [5]

In the above description, K, the total number of cores and deltas, includes any core or delta that does not exist within the fingerprint region. The initial position of a core or delta existing in the fingerprint region is determined as the extracted position, or as a single arbitrary point or the center of the region temporarily calculated at step 54.

Further, the positions of a core or delta that does not exist within the fingerprint region are calculated using the fact that only zero, one, or two cores and deltas exist on a finger. In addition, considering the limited size of a finger, the initial position of a core or delta outside the image (the acquired fingerprint image) in Equation [6] is set as a point outside the image, but not as a point deviating from the restricted size of the finger. The temporary positions of a core or delta not existing in the fingerprint are calculated using Equations [6] and [7]. In Equation [6], K and z_k are the number and the positions of the cores or deltas, respectively, as calculated above.

O_m(z) = (1/2) Σ_{k=1..K} g_k(arg(z − z_k))   [6]

where g_k(arg(z − z_k)) = −π/2 − arg(z − z_k), if z_k is a delta,
g_k(arg(z − z_k)) = π/2 + arg(z − z_k), if z_k is a core.

In Equation [6], z is a complex value (x+yi) representing a single arbitrary position in a two-dimensional region, and z_k is a complex value representing the position of the core or the delta.
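Equation [6] can be written directly in Python using complex positions; a minimal sketch, where the names `model_orientation`, `cores`, and `deltas` are illustrative assumptions.

```python
import cmath
import math

def model_orientation(z, cores, deltas):
    """Initial ridge orientation model of Equation [6]: each core z_k
    contributes pi/2 + arg(z - z_k) and each delta contributes
    -pi/2 - arg(z - z_k); the model value is half the total."""
    total = sum(math.pi / 2 + cmath.phase(z - zk) for zk in cores)
    total += sum(-math.pi / 2 - cmath.phase(z - zk) for zk in deltas)
    return 0.5 * total
```

For a single core at the origin, a point on the positive real axis gets orientation π/4, and the orientation rotates by π as z makes one full turn around the core, which is the defining property used at step 55.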

Further, an error is defined as in Equation [7], and the core or delta position having the minimum error is calculated using a steepest descent method. In the optimization process of minimizing the error of Equation [7], each position of a core or delta within the fingerprint region (image) and the number of cores and deltas are constants; only the position of a core or delta outside the fingerprint region is treated as a variable and determined through the minimum-error optimization process.

O_e² = ∫_R (O(z) − O_m(z))² dz   [7]

In Equation (7), R is a region with a quality higher than a predetermined threshold.

The z_k determined at step 55 are used as initial values in the error minimizing process to be executed at the following step 56. In other words, they are position candidates for finding the more exact positions of the core or the delta at step 56.
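The steepest descent minimization of Equation [7] can be sketched as follows; a simplified Python illustration that fits the position of a single delta assumed to lie outside the fingerprint region, using a numerical gradient instead of an analytic one. The helper names, the initial candidate, the step size, and the iteration count are all assumptions of this sketch, not the patent's exact procedure.

```python
import cmath
import math

def fit_outside_delta(observed, region, cores, step=0.05, iters=200):
    """Adjust the position zd of one outside delta so that the model
    orientation (Equation [6]) best matches the observed orientations over
    the high-quality region R, by numerical steepest descent on Eq. [7]."""
    def om(z, zd):
        t = sum(math.pi / 2 + cmath.phase(z - zc) for zc in cores)
        t += -math.pi / 2 - cmath.phase(z - zd)
        return 0.5 * t

    def err(zd):
        # Discrete version of O_e^2 = integral over R of (O(z) - O_m(z))^2 dz.
        return sum((observed[z] - om(z, zd)) ** 2 for z in region)

    zd = 0j  # initial candidate position for the outside delta
    eps = 1e-4
    for _ in range(iters):
        # Central-difference gradient with respect to the x and y parts of zd.
        gx = (err(zd + eps) - err(zd - eps)) / (2 * eps)
        gy = (err(zd + eps * 1j) - err(zd - eps * 1j)) / (2 * eps)
        zd -= step * complex(gx, gy)
    return zd
```

Given orientations generated from a delta placed outside the sampled region, the descent moves the candidate so that the model error over R decreases from its value at the initial guess.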

FIGS. 11 a and 11 b are views showing an example of the initial ridge orientation model of this invention. A fingerprint image having the core in its center part as shown in FIG. 11 a is processed as the initial ridge orientation model, as shown in FIG. 11 b.

At step 56, a ridge orientation function is calculated using Equations [4], [5] and [8], and the parameters in Equation [4] are calculated so as to minimize the error between the ridge orientation values of the ridge orientation model and those of the regions with qualities higher than a predetermined threshold.

The detailed description is as follows. The ridge orientation function is calculated by using the orientation values of regions, which satisfy the condition that they are the fingerprint regions and have qualities higher than the predetermined threshold at step 53. The parameters in this function are calculated with the minimum error between these orientation values and orientation values of the ridge orientation model (Equation [4]). As one method of finding the parameters having a minimum error, the error is defined as Equation [8] and then the parameters with the minimum error are found using a steepest descent method. O e 2 = R ( O ( z ) - O m ( Z ) ) 2 z [ 8 ]

In Equation [8], R is a region with a quality higher than a predetermined threshold.

At step 57, the ridge orientation values in all regions are calculated using the ridge orientation function obtained at step 56, and the final positions of the cores and deltas are taken from the core and delta parameters of the ridge orientation function. These parameters are stored as feature data of the fingerprint and can be used to classify and match fingerprints.
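As an illustration of this compact representation, the stored core and delta positions alone suffice to regenerate a dense orientation map. A minimal Python sketch, assuming only the singular-point terms of the model (the patent's full Equation [4] may carry additional parameters); the name `orientation_field` is hypothetical:

```python
import numpy as np

def orientation_field(width, height, cores, deltas):
    """Rebuild a dense ridge-orientation map from a few stored parameters.

    cores, deltas -- lists of complex positions (the stored feature data)
    Returns a (height, width) array of orientations in radians.
    """
    ys, xs = np.mgrid[0:height, 0:width]
    z = xs + 1j * ys                      # complex coordinate of each pixel
    field = np.zeros(z.shape)
    for c in cores:
        field += np.pi / 2 + np.angle(z - c)
    for d in deltas:
        field += -np.pi / 2 - np.angle(z - d)
    return 0.5 * field                    # the 1/2 factor of the model
```

A whole orientation image is thus recovered from only a handful of complex numbers, which is what makes these parameters usable directly as classification features.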

As described above, the present invention provides a method for extracting fingerprint feature data (ridge orientations and core and delta positions) using a ridge orientation model. The method can accurately extract feature data even in regions where the ridge orientation varies rapidly, such as around a core or a delta. Further, even when a poor-quality region such as a cut, scar or wrinkle is large, the method can accurately calculate the ridge orientations in that region from the remaining reliable orientations. The method can also compute the exact ridge orientation at any point from the ridge orientation values calculated according to regions. Moreover, because the ridge orientations at all positions are obtained from the ridge orientation function based on the ridge orientation model (Equation [4]), the orientation information of an entire fingerprint image is represented compactly by the few parameters forming that function. Additionally, these parameters can be utilized as feature data in fingerprint classification and fingerprint recognition.

Although the preferred embodiments of the present invention have been disclosed for illustrative purposes, those skilled in the art will appreciate that various modifications, additions and substitutions are possible, without departing from the scope and spirit of the invention as disclosed in the accompanying claims.
