Publication number: US20020154794 A1
Publication type: Application
Application number: US 10/017,118
Publication date: Oct 24, 2002
Filing date: Dec 7, 2001
Priority date: Mar 6, 2001
Also published as: CN1255756C, CN1493055A, EP1374144A1, EP1374144A4, US7298874, US20040114782, US20080159600, WO2002071316A1
Inventors: Seong-Won Cho
Original Assignee: Seong-Won Cho
Non-contact type human iris recognition method for correcting a rotated iris image
US 20020154794 A1
Abstract
The present invention relates to an iris recognition method for correcting a rotated iris image. The iris image is acquired by an image acquisition device using an infrared illuminator. The inner and outer boundaries of the iris are detected from the input eye image by analyzing the differences in pixel values with a Canny edge detector, so that the boundaries of the iris are more accurately detected from the eye image of a user. If the acquired iris image has been rotated at an angle with respect to the centerline of the iris, the rotated iris image is corrected for subsequent authentication purposes.
Images(8)
Claims(10)
What is claimed is:
1. A method for correcting a rotated iris image, the method comprising the steps of:
illuminating at least a portion of the iris and pupil of a person's eye to detect an eye image;
(a) extracting an iris image from said detected eye image;
(b) detecting an inner and outer boundary of said iris image;
(c) converting said extracted iris image into polar coordinates;
(d) if said iris image is slanted, normalizing the corresponding said polar coordinates of said converted iris image so as to yield a predetermined dimension;
(e) comparing previously obtained iris identification information with said normalized polar coordinates obtained in step (d); and,
(f) determining whether there is a match in data compared in step (e) to authenticate said person.
2. The method of claim 1, further comprising the steps of:
determining whether said iris image is rotated at an angle with respect to the centerline of said iris image;
if yes, temporarily generating a plurality of arrays of said iris image with respect to an array of said converted polar coordinates;
performing a wavelet transform to generate characteristic vectors corresponding to the plurality of said arrays that are temporarily generated;
comparing said respective characteristic vectors generated with previously registered characteristic vectors to obtain similarities; and,
accepting a new characteristic vector corresponding to the maximum similarity among said obtained similarities as the correct characteristic vector of said person.
3. The method of claim 1, wherein the outer boundary of said iris image is obtained by comparing the pixel value representing said detected eye image with coordinates (x, y) with the other pixel values surrounding the inner boundary of said iris image to determine the maximum difference indicative of the outer boundary of said iris image.
4. The method of claim 1, wherein a predetermined percentage of regions around said iris image is converted into polar coordinates for said comparing step (e).
5. The method of claim 1, wherein an infrared light is used for illumination.
6. The method of claim 1, wherein a Canny edge detector is used for detecting the inner boundary of said iris image.
7. A method for correcting a rotated iris image, the method comprising the steps of:
(a) capturing a plurality of iris images from a person's eye;
(b) detecting an inner and outer boundary of said iris images;
(c) converting a predetermined amount of said captured iris images into polar coordinates;
(d) determining whether one of said iris images is slanted;
(e) if yes, temporarily generating a plurality of arrays of said iris image with respect to an array of said converted polar coordinates;
(f) performing a wavelet transform to generate characteristic vectors corresponding to the plurality of said arrays that are temporarily generated; and,
(g) comparing said respective characteristic vectors generated with previously registered characteristic vectors to obtain similarities in order to authenticate said person.
8. The method of claim 7, wherein said step (c) comprises the steps of:
normalizing the corresponding said polar coordinates of said converted iris images so as to yield a predetermined dimension;
comparing previously obtained iris identification information with said normalized polar coordinates obtained in said step (c);
determining whether there is a match in the data compared in said comparing step; and,
accepting a new characteristic vector corresponding to the maximum similarity among said obtained similarities as the correct characteristic vector of said person.
9. The method of claim 7, wherein the outer boundary of said iris image is obtained by comparing the pixel value representing said detected eye image with coordinates (x, y) with the other pixel values surrounding the inner boundary of said iris image to determine the maximum difference indicative of the outer boundary of said iris image.
10. The method of claim 7, wherein a Canny edge detector is used for detecting the inner boundary of said iris image.
Description
CLAIM OF PRIORITY

[0001] This application makes reference to, incorporates the same herein, and claims all benefits accruing under 35 U.S.C. Section 119 from an application for “NON-CONTACT TYPE HUMAN IRIS RECOGNITION METHOD BY CORRECTION OF ROTATED IRIS IMAGE,” filed earlier in the Korean Industrial Property Office on Mar. 6, 2001, and there duly assigned Ser. No. 2001-11441.

BACKGROUND OF THE INVENTION

[0002] 1. Field of Invention

[0003] The present invention relates to a non-contact iris recognition method of authenticating the identity of a person. More particularly, the present invention relates to a method for correcting a rotated iris image during the authentication process.

[0004] 2. Description of the Related Art

[0005] An iris recognition system verifies the identity of a person by distinguishing his or her particular iris pattern. For personal identification, iris recognition is more accurate than other biometric methods, such as voice or fingerprint recognition.

[0006] Much research in the field of iris recognition has focused on acquiring a more accurate eye image using a variety of image acquisition devices and on efficiently extracting unique iris patterns from the human eye. However, conventional iris recognition systems are prone to errors in real applications. For example, an accurate eye image is unlikely to be obtained if the eye is not directly facing the camera but is positioned at a slight angle to it. Thus, there are many instances where the eye image is rotated at an angle when the iris region is scanned for authentication purposes.

[0007] To solve the above problem, the iris identification system must accurately detect the inner and outer boundaries of the iris region and correct the iris image as needed. Most conventional iris recognition methods cannot accurately process a deformed or slanted eye image; at best, they manually readjust the image after defining an arbitrary center of the pupil, or use a mean value of the entire image to readjust the slanted image. Accordingly, there is a need for an iris recognition method that normalizes the rotated or slanted iris image when the subject's eye is not directly facing the camera, or when the rotation is caused by the movement of the user, i.e., tilting of the head.

SUMMARY OF INVENTION

[0008] The present invention provides a non-contact iris recognition method for authenticating the identity of a person.

[0009] One aspect of the present invention provides a human iris recognition method, such that in the event that the iris image is rotated by an angle, the rotated iris image is corrected into a normal iris image. To this end, the iris image with an irregular shape is converted into polar coordinates so that the slanted iris image is reflected at a lower portion of the converted iris image in the polar coordinates. Then the iris image is normalized to predetermined dimensions, so that the iris image with a variety of deformations is corrected.

[0010] Another aspect of the invention provides a method of detecting an iris image from the eye image of a user using an image acquisition device and converting the iris image into polar coordinates, wherein the inner and outer boundaries of the iris are detected using a Canny edge detector and an infrared illuminator. The method of converting to polar coordinates further includes the steps of: comparing the pixel value of the image information at the center coordinates (x, y) of the detected inner boundary of the iris with the other pixel values of the image information, obtained by measuring in the upward, downward, leftward, and rightward directions from the inner boundary; determining the maximum value among the compared pixel values and detecting the outer boundary of the iris; extracting an iris region, defined as the region between the inner and outer boundaries; and, converting the extracted iris region into polar coordinates.

[0011] Another aspect of the inventive method provides, if the iris in the acquired eye image has been slanted, the step of normalizing the converted iris image in the polar coordinates so as to have predetermined dimensions.

[0012] Another aspect of the inventive method provides, if the iris in the acquired eye image has been rotated at an angle with respect to the centerline of the iris, the steps of temporarily generating a plurality of arrays of the iris image caused by the shifts with respect to an array of the converted iris image in the polar coordinates; performing the wavelet transform to generate the characteristic vectors of the iris corresponding to the plurality of arrays of the iris image that have been temporarily generated; comparing the respective characteristic vectors generated by the wavelet transform with the previously registered characteristic vectors to obtain similarities; and, accepting a characteristic vector corresponding to the maximum similarity among the obtained similarities as the characteristic vector of the user.

BRIEF DESCRIPTION OF THE DRAWINGS

[0013] A more complete understanding of the method of the present invention may be had by reference to the following detailed description when taken in conjunction with the accompanying drawings wherein:

[0014] FIG. 1 is a flowchart explaining the operation steps of normalizing the iris image of a person according to the present invention.

[0015] FIG. 2a is a view showing the detection result of a pupil boundary using a Canny edge detector.

[0016] FIG. 2b is a view showing the center coordinates and the diameter of a pupil.

[0017] FIG. 2c shows an iris image upon obtaining the radius and the center of the outer boundary of an iris according to the present invention.

[0018] FIGS. 3(a) to (d) show the process of normalizing a slanted iris image.

[0019] FIGS. 4(a) and (b) show a rotated iris image resulting from the tilting of a head.

[0020] FIGS. 5(a) and (b) show the process of correcting the rotated iris image shown in FIGS. 4(a) and (b).

DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENT

[0021] In the following description, for purposes of explanation rather than limitation, specific details are set forth such as the particular architecture, interfaces, techniques, etc., in order to provide a thorough understanding of the present invention. For purposes of simplicity and clarity, detailed descriptions of well-known devices, circuits, and methods are omitted so as not to obscure the description of the present invention with unnecessary detail.

[0022] FIG. 1 is a flowchart illustrating the operation steps of normalizing the iris image of a person according to the present invention. Referring to FIG. 1, in step 110, the eye image is acquired using any commercially available image acquisition device equipped with an infrared illuminator and a visible-light rejection filter. The image acquisition device typically causes a reflective light to be gathered in the pupil of the eye region, so that information indicative of the iris image can be generated. In step 120, the inner and outer boundaries of the iris are detected to extract only the iris region from the acquired eye image, and the center of the detected inner and outer boundaries is then set. Step 120 can be performed in a variety of ways well known to those skilled in this art. For example, the inner and outer boundaries of the iris can be detected from the differences in pixel values using a Canny edge detector. See, for example, U.S. Pat. No. 5,566,246, filed on Jun. 7, 1995, the content of which is hereby incorporated by reference.

[0023] FIG. 2a is an exemplary view illustrating the detection result of a pupillary boundary, i.e., the inner boundary of the iris, using the Canny edge detector. Referring to FIG. 2a, it is noted that only the pupillary boundary is detected using the Canny edge detector. The Canny edge detector smoothes the acquired image using Gaussian filtering and then detects the boundary using a Sobel operation. The Gaussian filtering process can be expressed as Equation 1, and the Sobel operation as Equation 2:

I_G(x, y) = G(x, y) * I(x, y);  (1)

S_x = I[i-1][j+1] + 2I[i][j+1] + I[i+1][j+1] - I[i-1][j-1] - 2I[i][j-1] - I[i+1][j-1]
S_y = I[i+1][j+1] + 2I[i+1][j] + I[i+1][j-1] - I[i-1][j+1] - 2I[i-1][j] - I[i-1][j-1].  (2)
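The Gaussian-smoothing and Sobel stages in Equations 1 and 2 can be sketched as follows (an illustrative Python sketch by the editor; the function names, kernel size, and sigma are assumptions, not part of the specification):

```python
import numpy as np

def gaussian_kernel(size=5, sigma=1.0):
    # Normalized 2-D Gaussian G(x, y) used for smoothing (cf. Equation 1).
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    k = np.exp(-(xx**2 + yy**2) / (2.0 * sigma**2))
    return k / k.sum()

def filter2d(img, kernel):
    # Plain 2-D filtering (cross-correlation) with zero padding.
    kh, kw = kernel.shape
    padded = np.pad(img, ((kh // 2, kh // 2), (kw // 2, kw // 2)))
    out = np.zeros(img.shape, dtype=float)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = (padded[i:i + kh, j:j + kw] * kernel).sum()
    return out

def sobel_magnitude(img):
    # Gradient magnitude from the Sobel responses S_x and S_y (cf. Equation 2).
    sx = np.array([[-1.0, 0.0, 1.0], [-2.0, 0.0, 2.0], [-1.0, 0.0, 1.0]])
    sy = sx.T
    return np.hypot(filter2d(img, sx), filter2d(img, sy))
```

A full Canny detector would additionally apply non-maximum suppression and hysteresis thresholding to this gradient-magnitude image.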

[0024] When the boundary detecting method employing the Canny edge detector is used, even when a normal eye image is not acquired, that is, when the eye of the user is not directly facing the camera but is at a slight angle to it, the inner boundary of the iris, i.e., the pupillary boundary, can still be correctly detected, and the center coordinates and the radius of the pupil can be obtained. It should be noted that other detection systems known in this art can be implemented to detect the inner boundary of the iris. FIG. 2b shows the center coordinates and diameter of the detected pupil. As shown in FIG. 2b, the pupil's radius is d/2, and the pupil's center coordinates are (x+d/2, y+d/2).

[0025] In this embodiment, the outer boundary of the iris is detected by examining pixel values outward from the pupillary boundary, i.e., the inner boundary of the iris, in the upward, downward, leftward, and rightward directions, and locating where the maximum differences in pixel values occur. The detected maxima are represented by Max{I(x, y) - I(x-1, y)}, Max{I(x, y) - I(x+1, y)}, Max{I(x, y) - I(x, y-1)}, and Max{I(x, y) - I(x, y+1)}, where I(x, y) is the pixel value of the image at the point (x, y). In the event that the iris image shifts due to movement during the scanning process, the inner and outer centers should be adjusted accordingly.
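The outward search for the maximum pixel-value difference can be illustrated with a short sketch (editor's illustration; the function name and arguments are assumptions, and a practical implementation would smooth the radial profile before differencing):

```python
import numpy as np

def outer_boundary_radii(img, cx, cy, r_pupil, r_max):
    # For each of the four directions, scan outward from the pupillary
    # boundary and keep the radius at which the difference between
    # adjacent pixel values is largest; that radius marks the transition
    # from iris to sclera, i.e., the outer boundary.
    directions = {"right": (0, 1), "left": (0, -1), "down": (1, 0), "up": (-1, 0)}
    radii = {}
    for name, (dy, dx) in directions.items():
        best_r, best_diff = r_pupil, -1.0
        for r in range(r_pupil + 1, r_max):
            v0 = float(img[cy + dy * r, cx + dx * r])
            v1 = float(img[cy + dy * (r + 1), cx + dx * (r + 1)])
            if abs(v1 - v0) > best_diff:
                best_diff, best_r = abs(v1 - v0), r
        radii[name] = best_r
    return radii
```

On a synthetic eye image with a dark pupil, mid-gray iris, and bright sclera, all four directions recover the iris/sclera transition radius.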

[0026] FIG. 2c shows the iris image after determining the radius and the center of the outer boundary of the iris according to the present invention. If an incomplete eye image is obtained because the eye is not directly facing the camera but is positioned at a slight angle to it, the centers of the inner and outer boundaries of the iris must be adjusted. First, the radial distances RL, RR, RU, and RD extending from the inner boundary to the outer boundary in the leftward, rightward, upward, and downward directions, respectively, and the radius RI of the inner boundary, i.e., the pupillary boundary, are calculated. The center of the outer boundary is then obtained by determining the bisection points of the calculated distances in the upward, downward, leftward, and rightward directions.

[0027] In step 130, iris patterns are detected only at predetermined radial distances from the inner boundary to the outer boundary (explained later). In step 140, the detected iris pattern is converted into polar coordinates as shown in FIG. 3. In step 150, the converted iris image in the polar coordinates is normalized to obtain an image with predetermined dimensions in its width and height as discussed below.

[0028] The conversion of the extracted iris patterns into the iris image in the polar coordinates can be expressed as the following Equation 3:

I(x(r,θ), y(r,θ)) → I(r,θ)  (3)

[0029] where θ is increased in steps of 0.8 degrees, and r is calculated using the second cosine rule from the distance between the outer center CO and the inner center CI of the iris, the radius RO of the outer boundary, and the value of θ. The iris patterns between the inner and outer boundaries of the iris are extracted using r and θ. If the region between the inner and outer boundaries of the iris is divided into 60 radial segments and θ is varied in steps of 0.8 degrees to yield 450 angular samples, the iris image can be normalized into an image of 27,000 elements (θ × r = 450 × 60).
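The conversion of Equation 3 can be sketched as follows (editor's sketch: instead of the second cosine rule, the sampling point at each angle is linearly interpolated between the inner- and outer-boundary points, which coincides with that rule when the two centers are equal; the 60% coverage described in paragraph [0031] is exposed as a parameter):

```python
import numpy as np

def iris_to_polar(img, inner_center, r_inner, outer_center, r_outer,
                  n_theta=450, n_r=60, coverage=0.6):
    # Sample the iris annulus into an (n_r, n_theta) polar array:
    # theta advances in 360/450 = 0.8 degree steps; at each angle the
    # radial span from the pupillary boundary outward is divided into
    # n_r rings, restricted to `coverage` of the annulus to avoid the
    # illuminator reflection.
    cx_i, cy_i = inner_center
    cx_o, cy_o = outer_center
    polar = np.zeros((n_r, n_theta), dtype=img.dtype)
    for t in range(n_theta):
        theta = 2.0 * np.pi * t / n_theta
        x_in = cx_i + r_inner * np.cos(theta)   # point on pupillary boundary
        y_in = cy_i + r_inner * np.sin(theta)
        x_out = cx_o + r_outer * np.cos(theta)  # point on outer boundary
        y_out = cy_o + r_outer * np.sin(theta)
        for k in range(n_r):
            f = coverage * (k + 0.5) / n_r      # fraction of the annulus
            x = x_in + f * (x_out - x_in)
            y = y_in + f * (y_out - y_in)
            polar[k, t] = img[int(round(float(y))), int(round(float(x)))]
    return polar
```

With the defaults this yields the 60 × 450 = 27,000-element representation described above.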

[0030] FIG. 3(a) shows the slanted iris image. FIG. 3(b) shows the iris image in polar coordinates converted from the slanted iris image, as described in the preceding paragraph. It can be seen from FIG. 3(b) that the lower portion of the converted iris image in the polar coordinates is curved with an irregular shape, which is caused by the slanted iris image. FIG. 3(c) shows an iris image with the dimensions of M pixels in width and N pixels in height, normalized from the irregular image of the iris patterns shown in FIG. 3(b).

[0031] Hereinafter, the normalization process of the slanted iris image will be described with reference to FIGS. 3(a) to (c). Referring to FIG. 3(a), only the iris patterns within a certain fraction, X%, of the distance between the inner and outer boundaries of the iris are taken, to eliminate interference from the illuminator. That is, when the inner and outer boundaries of the iris are detected, the iris patterns are extracted and then converted into polar coordinates. However, where the reflective light from the illuminator falls on the iris, only the iris patterns within 60% of the distance from the inner (pupillary) boundary toward the outer boundary are converted into polar coordinates. The value of 60% in this embodiment was experimentally determined as the range that captures the greatest amount of iris pattern while excluding the reflective light gathered on the iris.

[0032] In FIG. 3(b), the slanted iris image is converted into the iris image in the polar coordinates. As shown in FIG. 3(b), when the iris patterns are converted into polar coordinates, the lower portion of the converted image is curved with an irregular shape due to the slanted iris image. In FIG. 3(c), the irregular image of the iris patterns is normalized to an iris image of M pixels in width and N pixels in height by scaling the image up or down using nearest-neighbor pixel interpolation.
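Nearest-neighbor scaling to the fixed M × N dimensions can be sketched as follows (editor's sketch; the floor-based index mapping is one common convention, not the only one):

```python
import numpy as np

def normalize_nearest(img, out_h, out_w):
    # Rescale a 2-D array to (out_h, out_w) by nearest-neighbor pixel
    # interpolation: each output pixel copies the input pixel whose
    # index is the floored proportional position.
    in_h, in_w = img.shape
    rows = np.arange(out_h) * in_h // out_h
    cols = np.arange(out_w) * in_w // out_w
    return img[rows[:, None], cols[None, :]]
```

The same routine serves for both scaling up and scaling down, as the normalization step requires.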

[0033] In general, the performance of an iris recognition system is evaluated by two factors: the false acceptance rate (FAR) and the false rejection rate (FRR). The FAR is the probability that the system incorrectly identifies an impostor as an enrollee and thus admits the impostor. The FRR is the probability that the system incorrectly identifies an enrollee as an impostor and thus rejects the enrollee. In an actual simulation, the inventive method of detecting the iris boundaries and normalizing the slanted iris image reduced the FAR from 5.5% to 2.83% and the FRR from 5.0% to 2.0% compared with the conventional iris recognition system.
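At a fixed decision threshold, the two rates can be computed from match scores as follows (a minimal sketch; it assumes a higher score means a better match):

```python
def far_frr(impostor_scores, genuine_scores, threshold):
    # FAR: fraction of impostor attempts accepted (score >= threshold).
    # FRR: fraction of genuine attempts rejected (score < threshold).
    far = sum(s >= threshold for s in impostor_scores) / len(impostor_scores)
    frr = sum(s < threshold for s in genuine_scores) / len(genuine_scores)
    return far, frr
```

Raising the threshold trades a lower FAR for a higher FRR, and vice versa.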

[0034] Finally, in step 160, if the iris in the detected eye image is rotated at an angle with respect to the centerline of the iris during the operation, the arrays of the pixels of the iris image information are moved. Hence, a correction of the rotated iris image is performed as described below.

[0035] FIGS. 4(a) and (b) show the rotated iris image resulting from the tilting of the user's head. During the acquisition of an iris image, the user's head may be tilted slightly toward the left or right, as shown in FIG. 4(a). If the eye image acquired in step 110 has been rotated at a certain angle with respect to the centerline of the eye, the process of correcting the rotated image is performed. FIG. 4(a) shows the iris image rotated by about 15 degrees in a clockwise or counterclockwise direction, depending on the direction of the head tilt with respect to the centerline of the eye. When the rotated iris image is converted into an image in the polar coordinates, the iris patterns in the converted image are shifted leftward or rightward, as shown in FIG. 4(b), according to the direction of rotation.

[0036] FIGS. 5(a) and (b) show the process of correcting the rotated iris images shown in FIGS. 4(a) and (b). The process of correcting the rotated iris image, which has resulted from the tilting of the user's head, through comparing and moving the arrays of the iris image information will be described below with reference to FIGS. 5(a) and (b).

[0037] Referring to FIG. 5(a), from the rotated iris image resulting from the tilting of the user's head, a plurality of arrays of the iris image are temporarily generated with respect to the Array(0) of the converted iris image in the polar coordinates. That is, by shifting the columns of Array(0) leftward or rightward, 20 arrays of image information, from Array(0) to Array(-10) and from Array(0) to Array(10), are temporarily generated.

[0038] In order to generate characteristic vectors of the iris corresponding to the plurality of arrays of the iris image that have been temporarily generated, a wavelet transform is performed. The respective characteristic vectors generated by the wavelet transform are compared with previously registered characteristic vectors to obtain similarities. A characteristic vector corresponding to the maximum similarity among the obtained similarities is accepted as the characteristic vector of the user. In other words, by generating the arrays Array(n) of image information on the rotated image as mentioned above and performing the wavelet transform for the respective arrays of the image information as shown in FIG. 5(b), the characteristic vectors fT(n) of the iris corresponding to the temporarily generated plurality of arrays Array(n) of the iris image are generated. The characteristic vectors fT(n) are generated from fT(0) to fT(10) and from fT(0) to fT(-10). The respective generated characteristic vectors fT(n) are compared with each of the characteristic vectors fR of the enrollees, and thus the similarities Sn are obtained. The characteristic vector fT(n) corresponding to the maximum similarity among the obtained similarities Sn is considered the value in which the rotation effect is corrected, and is accepted as the characteristic vector of the user's iris. As a result, the authentication of a person can be performed even if the person's iris moves during the authentication process.
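The shift-and-compare loop above can be sketched as follows (editor's sketch: `feature_fn` stands in for the wavelet transform, and normalized correlation is an assumed similarity measure; the patent does not specify either in this form):

```python
import numpy as np

def best_shift_match(polar, enrolled_vec, feature_fn, max_shift=10):
    # Generate the candidate arrays Array(-10)..Array(10) by circularly
    # shifting the columns of the polar image, extract a characteristic
    # vector f_T(n) from each, and keep the shift whose vector is most
    # similar to the enrolled vector f_R.
    def similarity(a, b):
        a = a - a.mean()
        b = b - b.mean()
        denom = np.linalg.norm(a) * np.linalg.norm(b)
        return float(a @ b / denom) if denom else 0.0
    best = (-np.inf, 0, None)
    for n in range(-max_shift, max_shift + 1):
        shifted = np.roll(polar, n, axis=1)   # Array(n)
        f_n = feature_fn(shifted)             # f_T(n)
        s_n = similarity(f_n, enrolled_vec)   # S_n
        if s_n > best[0]:
            best = (s_n, n, f_n)
    return best  # (maximum similarity, shift, accepted characteristic vector)
```

Because the polar image is periodic in θ, a head tilt appears as a pure column shift, so one of the candidates realigns the pattern with the enrolled template.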

[0039] As described above, according to the non-contact iris recognition method capable of correcting a rotated iris image, detecting the inner and outer boundaries of the iris from the differences in pixel values with the Canny edge detector allows the boundaries of the iris to be detected more correctly from the eye image of the user. If the iris in the eye image acquired by the image acquisition device has been rotated at an arbitrary angle with respect to the centerline of the iris, the rotated iris image is corrected. In addition, if the lower portion of the converted iris image in the polar coordinates is curved and thus has an irregular shape due to the slanted iris image, the iris image is normalized to predetermined dimensions. Hence, the present invention is capable of converting a variety of deformations that may occur during the authentication operation into a correct iris image for authentication purposes, greatly reducing both the false acceptance and false rejection rates.

[0040] It should be noted that the above description merely exemplifies embodiments of the non-contact type human iris recognition method by the correction of the rotated iris image according to the present invention; thus, the present invention is not limited to the above embodiments. A person skilled in the art can make various modifications and changes to the present invention without departing from the technical spirit and scope of the present invention defined by the appended claims.

Patent Citations
Cited Patent | Filing date | Publication date | Applicant | Title
US5291560 * | Jul 15, 1991 | Mar 1, 1994 | Iri Scan Incorporated | Biometric personal identification system based on iris analysis
US5566246 * | Jun 7, 1995 | Oct 15, 1996 | Texas Instruments Incorporated | System and method for ranking and extracting salient contours for target recognition
US5572596 * | Sep 2, 1994 | Nov 5, 1996 | David Sarnoff Research Center, Inc. | Automated, non-invasive iris recognition system and method
US6144754 * | Mar 25, 1998 | Nov 7, 2000 | Oki Electric Industry Co., Ltd. | Method and apparatus for identifying individuals
US6285780 * | Mar 27, 1998 | Sep 4, 2001 | Oki Electric Industry Co., Ltd. | Apparatus for identifying individual animals and image processing method
US6373968 * | Jun 5, 1998 | Apr 16, 2002 | Oki Electric Industry Co., Ltd. | System for identifying individuals
US6526160 * | Jul 9, 1999 | Feb 25, 2003 | Media Technology Corporation | Iris information acquisition apparatus and iris identification apparatus
US6700998 * | Mar 8, 2000 | Mar 2, 2004 | Oki Electric Industry Co, Ltd. | Iris registration unit
US6714665 * | Dec 3, 1996 | Mar 30, 2004 | Sarnoff Corporation | Fully automated iris recognition system utilizing wide and narrow fields of view
Classifications
U.S. Classification: 382/117
International Classification: A61B5/117, H04N7/18, G06T1/00, G06T7/00, G06K9/00, G06T3/00
Cooperative Classification: G06K9/00597
European Classification: G06K9/00S
Legal Events
Date: Dec 7, 2001
Code: AS (Assignment)
Owner name: EVERMEDIA CO., LTD., KOREA, REPUBLIC OF
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:CHO, SEONG-WON;REEL/FRAME:012388/0109
Effective date: 20011205