WO2005114553A2 - Method and system for pupil detection for security applications - Google Patents

Method and system for pupil detection for security applications Download PDF

Info

Publication number
WO2005114553A2
WO2005114553A2 PCT/US2005/015583
Authority
WO
WIPO (PCT)
Prior art keywords
light
imager
source
filter
wavelengths
Prior art date
Application number
PCT/US2005/015583
Other languages
French (fr)
Other versions
WO2005114553A3 (en
Inventor
Julie E. Fouquet
Richard E. Haven
Scott Corzine
Original Assignee
Agilent Technologies, Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Agilent Technologies, Inc. filed Critical Agilent Technologies, Inc.
Priority to JP2007513211A priority Critical patent/JP4799550B2/en
Priority to EP05771748A priority patent/EP1745413B1/en
Priority to DE602005015569T priority patent/DE602005015569D1/en
Publication of WO2005114553A2 publication Critical patent/WO2005114553A2/en
Publication of WO2005114553A3 publication Critical patent/WO2005114553A3/en

Links

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01JMEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J3/00Spectrometry; Spectrophotometry; Monochromators; Measuring colours
    • G01J3/12Generating the spectrum; Monochromators
    • G01J3/26Generating the spectrum; Monochromators using multiple reflection, e.g. Fabry-Perot interferometer, variable interference filters
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/10Image acquisition
    • G06V10/12Details of acquisition arrangements; Constructional details thereof
    • G06V10/14Optical characteristics of the device performing the acquisition or on the illumination arrangements
    • G06V10/143Sensing or illuminating at different wavelengths
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/18Eye characteristics, e.g. of the iris
    • G06V40/19Sensors therefor
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/204Image signal generators using stereoscopic image cameras
    • H04N13/254Image signal generators using stereoscopic image cameras in combination with electromagnetic radiation sources for illuminating objects

Definitions

  • the object may be imaged or detected in daylight and/or in darkness, depending on the application. Examples of such applications include, but are not limited to, personal safety and security.
  • Security applications typically use motion detectors to trigger alarms, bright floodlights, or video cameras when a sufficiently heavy or warm mass moves within their range.
  • Motion detectors are used, for example, in home security systems and commercial security settings. Unfortunately, motion detectors do not always discriminate between human, animal, and inanimate objects. Thus, a large object, such as a dog or a truck, that moves near a motion detector may be detected and unnecessarily create a false positive by triggering an alarm, floodlight, or video camera. False positives result in extra costs for the individuals, businesses, and police departments that are required to respond to all triggered events.
  • a method and system for pupil detection are provided.
  • An object to be imaged or detected is illuminated by a single broadband light source or multiple light sources emitting light at different wavelengths.
  • the light is received by a receiving module, which includes a light-detecting sensor and a hybrid filter.
  • the hybrid filter includes a multi-band narrowband filter and a patterned filter layer.
  • One or more images captured by the sensor are analyzed to determine if one or both pupils are detected. When a pupil (or pupils) is detected, an alert signal is generated.
  • the alert signal may trigger, for example, an alarm system, floodlights, or video cameras.
  • Pupil detection may be used independently or in combination with other features in a security system, such as, for example, motion detectors.
  • FIG. 1 is a block diagram of a system for pupil detection in an embodiment in accordance with the invention
  • FIG. 2 is a flowchart of a method for pupil detection in an embodiment in accordance with the invention
  • FIG. 3 is a diagram of a first application that uses pupil detection in an embodiment in accordance with the invention.
  • FIG. 4 is a diagram of a second application that uses pupil detection in an embodiment in accordance with the invention.
  • FIG. 5a illustrates an image generated with an on-axis light source in accordance with the embodiment of FIG. 3;
  • FIG. 5b depicts an image generated with an off-axis light source in accordance with the embodiment of FIG. 3;
  • FIG. 5c illustrates an image resulting from the difference between the FIG. 5a image and the FIG. 5b image.
  • FIG. 6 is a diagram of a third application that utilizes pupil detection in an embodiment in accordance with the invention.
  • FIG. 7 depicts a sensor in an embodiment in accordance with the invention.
  • FIG. 8 is a cross-sectional diagram of an imager in an embodiment in accordance with the invention.
  • FIG. 9 illustrates a first method for fabricating a dual-band narrowband filter in an embodiment in accordance with the invention
  • FIG. 10 depicts the spectrum for the dual-band narrowband filter of FIG. 9.
  • FIG. 11 illustrates a Fabry-Perot (FP) resonator used in a second method for fabricating a dual-band narrowband filter in an embodiment in accordance with the invention
  • FIG. 12 depicts the spectrum for the Fabry-Perot resonator of FIG. 11;
  • FIG. 13 depicts a coupled-cavity resonator used in the second method for fabricating a dual-band narrowband filter in an embodiment in accordance with the invention
  • FIG. 14 depicts the spectrum for the coupled-cavity resonator of FIG. 13;
  • FIG. 15 illustrates a stack of three coupled-cavity resonators that form a dual-band narrowband filter in an embodiment in accordance with the invention
  • FIG. 16 depicts the spectrum for the dual-band narrowband filter of FIG. 15;
  • FIG. 17 illustrates spectra for polymer filters and a tri-band narrowband filter in an embodiment in accordance with the invention.
  • FIG. 18 depicts a sensor in accordance with the embodiment shown in FIG. 17.
  • Wavelength-dependent imaging is a technique for detecting an object, and typically involves detecting one or more particular wavelengths that reflect off the object.
  • Referring to FIG. 1, there is shown a block diagram of a system for pupil detection in an embodiment in accordance with the invention.
  • the system 100 includes an imager 102 and two light sources 104, 106.
  • two images of a subject's face and/or eyes are captured using imager 102.
  • One of the images is taken using light source 104, while the second image is taken using light source 106.
  • Light sources 104, 106 are shown on opposite sides of imager 102 in the FIG. 1 embodiment.
  • light sources 104, 106 may be located on the same side of imager 102.
  • Light sources 104, 106 emit light at different wavelengths that produce substantially equal image intensity (brightness) in this embodiment in accordance with the invention.
  • Light sources 104, 106 are implemented as light-emitting diodes (LEDs) or multi-mode semiconductor lasers having infrared or near-infrared wavelengths in this embodiment, and each light source 104, 106 may be implemented as one, or multiple, sources.
  • light sources 104, 106 may also be replaced by a single broadband light source emitting light at two or more different wavelengths, such as the sun, for example.
  • Imager 102 is positioned to receive light reflected from the eye or eyes of the subject (not shown). Even though light sources 104, 106 can be at any wavelength, the wavelengths selected in the FIG. 1 embodiment are chosen so that the light will not distract the subject and the iris of the eye will not contract in response to the light. The selected wavelengths are typically in a range that imager 102 can detect.
  • System 100 further includes controller 108, which may be dedicated to system 100 or may be a shared device. Frames of image information are generated by the imager 102 and then processed and analyzed by controller 108 to distinguish a pupil (or pupils) from other features within the field of view of imager 102.
  • Controller 108 is connected to timer 110.
  • Timer 110 represents any known device or technique that enables controller 108 to make time-based determinations.
  • Controlled device 112 receives an alert signal from controller 108. The alert signal is generated when controller 108 detects the pupil or pupils of a subject.
  • Controlled device 112 may be implemented, for example, as an alarm or one or more floodlights or video cameras.
  • Input device 114 may be used to input or alter various parameters associated with controller 108.
  • FIG. 2 is a flowchart of a method for pupil detection in an embodiment in accordance with the invention.
  • Initially a light source emits light within a field of view, as shown in block 200.
  • One or more images are then captured and the image or images analyzed (blocks 202, 204).
  • the analysis includes generating a difference image in this embodiment in accordance with the invention.
  • the difference image will be discussed in more detail in conjunction with FIGS. 3 and 5.
  • the one or more images may be captured and analyzed on a continuous basis. In other embodiments in accordance with the invention, the one or more images may be captured and analyzed at regular intervals.
  • a timer is reset (block 210) and the process returns to block 202. The method continues through blocks 202 to 210 until, at block 206, a pupil (or pupils) is detected.
  • an alert signal is generated at block 214 and the process ends. If the pupil detection should be validated, a determination is made as to whether a specified period of time has expired. When the specified time period has expired, the process continues at block 210 where a timer is reset. The method then returns to block 202 and continues until an alert signal is generated.
  • An alert signal may trigger, for example, an alarm system, floodlights, or video cameras. One or more video cameras may include pan, tilt, and zoom features to allow a security person to focus in on the subject, thereby allowing the user to better identify the subject.
  • Pupil detection may be used independently or in combination with other features in a security system, such as, for example, motion detectors.
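The capture/analyze/alert flow of blocks 200–214 can be sketched as a simple loop. This is an illustrative reading, not the patent's implementation: the `capture`, `analyze`, and `raise_alert` callables are hypothetical, and the validation step is interpreted here as re-sampling until a timeout expires.

```python
import time

def monitor(capture, analyze, raise_alert, validate=False, validation_period=2.0):
    """Sketch of the FIG. 2 loop: capture (block 202), analyze (block 204),
    test for a pupil (block 206), reset the timer when none is found
    (block 210), and generate an alert (block 214)."""
    deadline = time.monotonic() + validation_period   # block 210: reset timer
    while True:
        images = capture()                            # block 202
        if analyze(images):                           # blocks 204/206
            # pupil found: alert immediately, or only once the
            # validation period has elapsed when validation is requested
            if not validate or time.monotonic() >= deadline:
                raise_alert()                         # block 214
                return
        else:
            # no pupil: reset the validation timer and keep monitoring
            deadline = time.monotonic() + validation_period
```

In practice `analyze` would wrap the difference-image processing described for FIGS. 5a–5c, and `raise_alert` would drive the controlled device 112.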
  • Referring to FIG. 3, there is shown a diagram of a first application that uses pupil detection in an embodiment in accordance with the invention.
  • the application may include, for example, a home security system monitoring a hallway or entranceway in the home.
  • the system includes imager 102 and Hght sources 104, 106.
  • Light sources 104, 106 emit light towards the face and/or eyes of subject 300.
  • the eye or eyes reflects light that is captured by imager 102.
  • In this embodiment for pupil detection, two images of the face and/or eyes (not shown) of subject 300 are captured using imager 102.
  • Hght source 104 which is close to or on axis 302 of the imager 102 ("the on-axis Hght").
  • the second image is taken using light source 106 that is located at a larger angle away from axis 302 of the imager 102 ("the off-axis light").
  • Differential reflectivity off a retina of subject 300 is dependent upon the angle 304 between light source 104 and axis 302 of the imager 102, and the angle 306 between light source 106 and axis 302. In general, making angle 304 smaller will increase the retinal return.
  • “retinal return” refers to the light that is reflected off the back of the subject's 300 eye and detected at imager 102.
  • “Retinal return” is also used to include reflection off other tissue at the back of the eye (other than or in addition to the retina). Accordingly, angle 304 is selected such that light source 104 is on or close to axis 302.
  • angle 304 is typically in the range from approximately zero to two degrees.
  • the size of angle 306 is chosen so that only low retinal return from light source 106 will be detected at imager 102. The iris surrounding the pupil blocks this signal, and so pupil size under different lighting conditions should be considered when selecting the size of angle 306.
  • angle 306 is typically in the range from approximately three to fifteen degrees.
  • the size of angles 304, 306 may be different. For example, the field of view to be monitored, the distance at which the pupils should be detected, and the characteristics of a particular subject may determine the size of the angles 304, 306.
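The angles 304 and 306 follow from plain geometry: a source offset laterally from the imager axis subtends an angle of roughly atan(offset/distance) at the subject. A small illustrative helper (names hypothetical, and assuming the source lies in the imager's plane):

```python
import math

def source_angle_deg(source_offset_m, subject_distance_m):
    """Angle (degrees) between a light source and the imager axis, given the
    source's lateral offset from the axis and the subject's distance."""
    return math.degrees(math.atan2(source_offset_m, subject_distance_m))
```

For example, a source 5 cm from the axis viewed from 2 m away sits at about 1.4 degrees, inside the zero-to-two-degree on-axis range quoted above, while the same source at 0.5 m would exceed it.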
  • the images captured by imager 102 are processed and analyzed by controller 108.
  • When one or both pupils of subject 300 are detected, controller 108 generates an alert signal, which is then transmitted to a controlled device (not shown).
  • Light sources 104, 106 are constructed in the same housing with detector 102 in this embodiment in accordance with the invention.
  • Hght sources 106 may be located in a housing separate from Hght sources 104 and detector 102.
  • Hght sources 104 may be located in a housing separate from detector 102 by placing a beam spHtter between detector 102 and the object, which has the advantage of permitting a smaller effective on-axis angle of iUumination.
  • FIG. 4 is a diagram of a second application that uses pupil detection in an embodiment in accordance with the invention.
  • the system includes two detectors 102a, 102b, two on-axis light sources 104a, 104b, two off-axis light sources 106a, 106b, and two controllers 108a, 108b.
  • the system generates a three-dimensional image of the eye or eyes of subject 300 by using two of the FIG. 1 systems.
  • each controller 108a, 108b performs an independent analysis to detect in two dimensions an eye or eyes of a subject 300.
  • Stereo controller 400 uses the data generated by both controllers 108a, 108b to generate a three-dimensional image of the eye or eyes of subject 300.
  • On-axis light sources 104a, 104b and off-axis light sources 106a, 106b may be positioned in any desired configuration.
  • an on-axis light source (e.g. 104b) may be used as the off-axis light source (e.g. 106a) for the opposite system.
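Once each controller has located a pupil in two dimensions, the stereo controller can recover depth by triangulation. A minimal sketch, assuming rectified views and a known baseline between the two detectors (all names and parameters hypothetical; the patent does not specify the stereo math):

```python
def pupil_depth(x_left, x_right, focal_px, baseline_m):
    """Depth (metres) of a pupil from its horizontal pixel position in two
    rectified imagers separated by baseline_m, with focal length in pixels.
    Classic stereo relation: depth = focal * baseline / disparity."""
    disparity = x_left - x_right
    if disparity <= 0:
        raise ValueError("pupil must appear shifted between the two views")
    return focal_px * baseline_m / disparity
```

With the depth and the 2D position in either view, the 3D location of the eye follows by back-projection through that imager's optics.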
  • FIG. 5a illustrates an image generated with an on-axis light source 104 in accordance with the embodiment of FIG. 3.
  • Image 500 shows an eye that is open. The eye has a bright pupil due to a strong retinal return created by on-axis light source 104.
  • FIG. 5b depicts an image generated with off-axis light source 106 in accordance with the embodiment of FIG. 3.
  • Image 502 illustrates a normal, dark pupil.
  • FIG. 5c illustrates image 504 resulting from the difference between the FIG. 5a image and the FIG. 5b image. By taking the difference between images 500, 502, relatively bright spot 506 remains against relatively dark background 508 when the eye is open. There may be vestiges of other features of the eye remaining in the background 508. However, in general, bright spot 506 will stand out in comparison to background 508. When the eye is closed or nearly closed, there will not be bright spot 506 in the differential image.
  • FIGS. 5a-5c illustrate one eye of a subject. Those skilled in the art will appreciate that both eyes may be monitored as well. It will also be understood that a similar effect will be achieved if the images include other features of the subject (e.g. other facial features), as well as features of the subject's environment.
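The differencing described for FIGS. 5a–5c can be sketched in a few lines. This is an illustrative sketch, not the controller's actual algorithm; the threshold fraction is an assumed parameter:

```python
import numpy as np

def pupil_difference_image(on_axis, off_axis, threshold=0.25):
    """Subtract the off-axis image from the on-axis image. Features common
    to both frames cancel, while the bright retinal return (spot 506)
    survives against a dark background (508). Returns the clipped
    difference image and a boolean mask of candidate bright-pupil pixels."""
    diff = on_axis.astype(np.int16) - off_axis.astype(np.int16)
    diff = np.clip(diff, 0, 255).astype(np.uint8)
    mask = diff > int(threshold * 255)
    return diff, mask
```

A closed or nearly closed eye yields an empty mask, matching the observation above that the bright spot disappears from the differential image.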
  • Referring to FIG. 6, there is shown a diagram of a third application that utilizes pupil detection in an embodiment in accordance with the invention.
  • an owner of building 600 wants to be alerted when a person approaches an entrance (not shown) to the building from street 602.
  • One or more imagers 604 are used to detect light and generate images that are processed and analyzed by one or more controllers (not shown).
  • light sources 606 emit light over a desired field of view
  • field of view 608, the number and type of imagers 604, and the number and type of light sources 606 are determined by the application.
  • the distance at which an imager is first able to detect a pupil also influences the number and type of imagers 604 and light sources 606.
  • Higher resolution imagers and a telescope with a large aperture are examples of two techniques that may be used to increase the distance at which a system first detects a pupil or pupils.
  • a controller (not shown) detects one or both pupils and responsively generates an alert signal.
  • the alert signal may trigger, for example, an alarm, floodlights, or one or more video cameras.
  • One or more video cameras may include pan, tilt, and zoom features to allow a security person to focus in on the subject, thereby allowing the user to better identify the subject.
  • a security person may be required to confirm the identity of the subject prior to allowing the subject to enter building 600.
  • FIG. 7 depicts a sensor in an embodiment in accordance with the invention. In this embodiment, sensor 700 is incorporated into imager 102 (FIG. 1), and is configured as a complementary metal-oxide semiconductor (CMOS) imaging sensor.
  • Sensor 700 may be implemented with other types of imaging devices in other embodiments in accordance with the invention, such as, for example, a charge-coupled device (CCD) imager.
  • Patterned filter layer 702 is formed on sensor 700 using different filter materials shaped into a checkerboard pattern. The two filters are determined by the wavelengths being used by light sources 104, 106.
  • patterned filter layer 702 includes regions (identified as 1) that include a filter material for selecting the wavelength used by light source 104, while other regions (identified as 2) include a filter material for selecting the wavelength used by light source 106.
  • patterned filter layer 702 is deposited as a separate layer of sensor 700, such as, for example, on top of an underlying layer, using conventional deposition and photolithography processes while still in wafer form.
  • patterned filter layer 702 can be created as a separate element between sensor 700 and incident light. Additionally, the pattern of the filter materials can be configured in a pattern other than a checkerboard pattern.
  • patterned filter layer 702 can be formed into an interlaced striped or a non-symmetrical configuration (e.g. a 3-pixel by 2-pixel shape). Patterned filter layer 702 may also be incorporated with other functions, such as color imagers.
  • patterned filter layer 702 may include blank regions (e.g. region 1) that do not cover selected areas of sensor 700 with a filter material.
  • the uncovered regions of sensor 700 therefore receive the light from both light sources 104, 106. Since the covered regions pass light from only one light source and block light from the other light source, a gain factor is calculated and applied to the light passing through the covered regions. The gain factor compensates for the light absorbed by the filter material and for differences in sensor sensitivity between the two wavelengths.
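The gain-factor correction can be sketched as follows. The calibration procedure shown (a flat-field ratio under uniform illumination) is an assumption for illustration; the patent only states that a gain factor is calculated and applied:

```python
import numpy as np

def estimate_gain(flat_field, covered_mask):
    """Hypothetical calibration: under uniform illumination from both
    sources, the gain is the ratio of mean uncovered response to mean
    covered response, capturing both filter absorption and the sensor's
    sensitivity difference between the two wavelengths."""
    return flat_field[~covered_mask].mean() / flat_field[covered_mask].mean()

def apply_gain(raw, covered_mask, gain):
    """Scale the pixels under filter material by the calibration gain so
    covered and uncovered regions become directly comparable."""
    out = raw.astype(np.float32)
    out[covered_mask] *= gain
    return out
```

After correction, covered and uncovered pixels report comparable levels for the same scene radiance, which is what makes the later differencing meaningful.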
  • Various types of filter materials can be used in patterned filter layer 702.
  • the filter materials include polymers doped with pigments or dyes.
  • the filter materials may include interference filters, reflective filters, and absorbing filters made of semiconductors, other inorganic materials, or organic materials.
  • FIG. 8 is a cross-sectional diagram of an imager in an embodiment in accordance with the invention. Only a portion of imager 102 is shown in this figure. Imager 102 includes sensor 700 comprised of pixels 800, 802, 804, 806, patterned filter layer 808 including two alternating filter regions 810, 812, glass cover 814, and dual-band narrowband filter 816.
  • Sensor 700 is configured as a CMOS imager and patterned filter layer 808 as two polymers 810, 812 doped with pigments or dyes in this embodiment in accordance with the invention.
  • Each region in patterned filter layer 808 (e.g. a square in the checkerboard pattern) corresponds to a single pixel of sensor 700.
  • Narrowband filter 816 and patterned filter layer 808 form a hybrid filter in this embodiment in accordance with the invention.
  • the light at visible wavelengths (λvis) and at wavelengths (λn) is filtered out in this embodiment, while the light at or near wavelengths λ1 and λ2 transmits through the narrowband filter 816.
  • the light at or near wavelengths λ1 and λ2 passes through glass cover 814.
  • polymer 810 transmits the light at wavelength λ1 while blocking the light at wavelength λ2. Consequently, pixels 800 and 804 receive only the light at wavelength λ1, thereby generating the image taken with the on-axis light source 104.
  • Polymer 812 transmits the light at wavelength λ2 while blocking the light at wavelength λ1, so that pixels 802 and 806 receive only the light at wavelength λ2. In this manner, the image taken with off-axis light source 106 is generated.
  • the shorter wavelength λ1 is associated with on-axis light source 104, and the longer wavelength λ2 with off-axis light source 106 in this embodiment in accordance with the invention.
  • the shorter wavelength λ1 may be associated with off-axis light source 106 and the longer wavelength λ2 with on-axis light source 104 in other embodiments in accordance with the invention.
  • Narrowband filter 816 is a dielectric stack filter in this embodiment in accordance with the invention.
  • Dielectric stack filters are designed to have particular spectral properties.
  • In this embodiment in accordance with the invention the dielectric stack filter is formed as a dual-band narrowband filter.
  • Narrowband filter 816 is designed to have one peak at λ1 and another peak at λ2. Therefore, only the light at or near wavelengths λ1 and λ2 strikes polymer filters 810, 812 in patterned filter layer 808. Patterned filter layer 808 is then used to discriminate between λ1 and λ2. Wavelength λ1 is transmitted through filter 810 (and not through filter 812), while wavelength λ2 is transmitted through filter 812 (and not through filter 810).
  • patterned filter layer 808 provides a mechanism for selecting channels at pixel spatial resolution.
  • channel one is associated with the on-axis image and channel two with the off-axis image.
  • channel one may be associated with the off-axis image and channel two with the on-axis image.
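Selecting the two channels from a checkerboard-filtered frame amounts to a parity split over pixel coordinates. A sketch assuming region 1 sits on even (row+column) parity, which is an arbitrary layout choice for illustration:

```python
import numpy as np

def split_channels(frame):
    """Separate a checkerboard-filtered frame into the two per-wavelength
    channel images at pixel spatial resolution. Pixels belonging to the
    other channel are left as NaN for later interpolation/demosaicing."""
    rows, cols = np.indices(frame.shape)
    parity = (rows + cols) % 2
    ch1 = np.where(parity == 0, frame.astype(float), np.nan)  # λ1 channel
    ch2 = np.where(parity == 1, frame.astype(float), np.nan)  # λ2 channel
    return ch1, ch2
```

In a full pipeline the NaN gaps would be interpolated so both channel images cover the whole frame before the difference image is formed.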
  • Sensor 700 sits in a carrier (not shown) in this embodiment in accordance with the invention.
  • Glass cover 814 typically protects sensor 700 from damage and particle contamination (e.g. dust).
  • Glass cover 814 is formed as a colored glass filter in this embodiment, and is included as the substrate of the dielectric stack filter (i.e., narrowband filter 816).
  • the colored glass filter is designed to have certain spectral properties, and is doped with pigments or dyes.
  • Schott Optical Glass Inc., a company located in Mainz, Germany, is one company that manufactures colored glass that can be used in colored glass filters.
  • Referring to FIG. 9, there is shown a first method for fabricating a dual-band narrowband filter in an embodiment in accordance with the invention.
  • narrowband filter 816 is a dielectric stack filter that is formed as a dual-band narrowband filter.
  • Dielectric stack filters can include any combination of filter types. The desired spectral properties of the completed dielectric stack filter determine which types of filters are included in the layers of the stack.
  • a filter can be fabricated by combining two filters 900, 902.
  • Band-blocking filter 900 filters out the light at wavelengths between the regions around wavelengths λ1 and λ2, while bandpass filter 902 transmits light near and between wavelengths λ1 and λ2.
  • the combination of filters 900, 902 transmits light in the hatched areas, while blocking light at all other wavelengths.
  • FIG. 10 depicts the spectrum for the dual-band narrowband filter of FIG. 9. As can be seen, light transmits through the combined filters only at or near the wavelengths of interest, λ1 (spectrum 1000) and λ2 (spectrum 1002).
  • a dual-band narrowband filter can also be fabricated by stacking coupled-cavity resonators on top of each other, where each coupled-cavity resonator is formed with two Fabry-Perot resonators.
  • FIG. 11 illustrates a Fabry-Perot (FP) resonator used in a second method for fabricating a dual-band narrowband filter in an embodiment in accordance with the invention.
  • Resonator 1100 includes upper distributed Bragg reflector (DBR) layer 1102 and lower DBR layer 1104.
  • the materials that form the DBR layers include N pairs of quarter-wavelength (mλ/4) thick low index material and quarter-wavelength (nλ/4) thick high index material, where the variable N is an integer number and the variables m and n are odd integer numbers.
  • the wavelength is defined as the wavelength of light in a layer, which is equal to the freespace wavelength divided by the layer index of refraction.
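Since the in-layer wavelength is the freespace wavelength divided by the refractive index, the physical thickness of a quarter-wave layer follows directly. A small helper illustrating the relation (a design aid, not the patent's fabrication recipe):

```python
def quarter_wave_thickness(freespace_wavelength_nm, refractive_index, m=1):
    """Physical thickness of an (m*λ/4) DBR layer: the wavelength inside the
    layer is λ0/n, so t = m * λ0 / (4 * n). The multiplier m must be odd
    for the layer to satisfy the quarter-wave condition."""
    if m % 2 == 0:
        raise ValueError("m must be an odd integer")
    return m * freespace_wavelength_nm / (4.0 * refractive_index)
```

For example, at a freespace design wavelength of 800 nm, a layer of index 2.0 is 100 nm thick, while a lower-index layer of the pair is correspondingly thicker.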
  • Cavity 1106 separates the two DBR layers 1102, 1104. Cavity 1106 is configured as a half-wavelength (pλ/2) thick cavity, where p is an integer number. The thickness of cavity 1106 and the materials in DBR layers 1102, 1104 determine the transmission peak for FP resonator 1100.
  • FIG. 12 depicts the spectrum for the Fabry-Perot resonator of FIG. 11.
  • FP resonator 1100 has a single transmission peak 1200.
  • FIG. 13 depicts a coupled-cavity resonator used in the second method for fabricating a dual-band narrowband filter in an embodiment in accordance with the invention.
  • Coupled-cavity resonator 1300 includes upper DBR layer 1302, cavity 1304, strong-coupling DBR 1306, cavity 1308, and lower DBR layer 1310.
  • the strong-coupling DBR 1306 is formed when the lower DBR layer of the top FP resonator (i.e., layer 1104) merges with an upper DBR layer of the bottom FP resonator (i.e., layer 1102). Stacking two FP resonators together splits the single transmission peak 1200 in FIG. 12 into two peaks, as shown in FIG. 14. The number of pairs of quarter-wavelength thick index materials in strong-coupling DBR 1306 determines the coupling strength between cavities 1304, 1308. And the coupling strength between cavities 1304, 1308 controls the spacing between peak 1400 and peak 1402.
  • Dual-band narrowband filter 1500 includes upper DBR layer 1502, cavity 1504, strong-coupling DBR 1506, cavity 1508, weak-coupling DBR 1510, cavity 1512, strong-coupling DBR 1514, cavity 1516, weak-coupling DBR 1518, cavity 1520, strong-coupling DBR 1522, cavity 1524, and lower DBR layer 1526.
  • Stacking three coupled-cavity resonators together splits each of the two peaks 1400, 1402 into a triplet of peaks 1600, 1602, respectively.
  • FIG. 16 depicts the spectrum for the dual-band narrowband filter of FIG. 15.
  • the strength of the coupling in weak-coupling DBRs 1510, 1518 is reduced by increasing the number of mirror pairs in coupling DBRs 1510, 1518.
  • the reduced coupling strength merges each triplet of peaks 1600, 1602 into a single broad, fairly flat transmission band.
  • Changing the number of pairs of quarter-wavelength thick index materials in weak-coupling DBRs 1510, 1518 alters the spacing within the triplet of peaks 1600, 1602.
  • FIG. 17 illustrates spectra for polymer filters and a tri-band narrowband filter in an embodiment in accordance with the invention.
  • a hybrid filter in this embodiment detects light at three wavelengths of interest, λ1, λ2, and λ3.
  • Spectra 1700 and 1702, at wavelengths λ1 and λ3 respectively, represent two signals to be utilized by an imaging system.
  • Light detected at wavelength λ2 (spectrum 1704) is used to determine the amount of light received by the imaging system outside the two wavelengths of interest.
  • the amount of light detected at wavelength λ2 may be used as a reference amount of light detectable by the imaging system.
  • a tri-band narrowband filter transmits light at or near the wavelengths of interest (λ1, λ2, and λ3) while blocking the transmission of light at all other wavelengths in this embodiment in accordance with the invention.
  • Polymer filters in a patterned filter layer then discriminate between the light received at wavelengths λ1, λ2, and λ3.
  • FIG. 18 depicts a sensor in accordance with the embodiment shown in FIG. 17.
  • Patterned filter layer 1800 is formed on sensor 1802 using three different filters. In this embodiment, one region in patterned filter layer (e.g. region 1) transmits the light at wavelength λ1 while blocking the light at wavelengths λ2 and λ3 (see spectrum 1706 in FIG. 17).
  • Another region in patterned filter layer (e.g. region 3) transmits the light at wavelength λ3 while blocking the light at wavelengths λ1 and λ2 (see spectrum 1708 in FIG. 17).
  • the third region transmits light at wavelength λ2 while blocking the light at wavelengths λ1 and λ3 (see spectrum 1710 in FIG. 17).

Abstract

An object (300) to be detected is illuminated by a single broadband light source or multiple light sources (104, 106) emitting light at different wavelengths. The light is captured by an imager (102), which includes a light-detecting sensor (700) covered by a hybrid filter.

Description

METHOD AND SYSTEM FOR PUPIL DETECTION FOR SECURITY APPLICATIONS
BACKGROUND
There are a number of applications in which it is of interest to detect or image an object. The object may be imaged or detected in daylight and/or in darkness, depending on the application. Examples of such applications include, but are not limited to, personal safety and security. Security applications typically use motion detectors to trigger alarms, bright floodlights, or video cameras when a sufficiently heavy or warm mass moves within their range. Motion detectors are used, for example, in home security systems and commercial security settings. Unfortunately, motion detectors do not always discriminate between human, animal, and inanimate objects. Thus, a large object, such as a dog or a truck, that moves near a motion detector may be detected and unnecessarily create a false positive by triggering an alarm, floodlight, or video camera. False positives result in extra costs for the individuals, businesses, and police departments that are required to respond to all triggered events.
SUMMARY
In accordance with the invention, a method and system for pupil detection are provided. An object to be imaged or detected is illuminated by a single broadband light source or multiple light sources emitting light at different wavelengths. The light is received by a receiving module, which includes a light-detecting sensor and a hybrid filter. The hybrid filter includes a multi-band narrowband filter and a patterned filter layer. One or more images captured by the sensor are analyzed to determine if one or both pupils are detected. When a pupil (or pupils) is detected, an alert signal is generated.
The alert signal may trigger, for example, an alarm system, floodlights, or video cameras. Pupil detection may be used independently or in combination with other features in a security system, such as, for example, motion detectors.
BRIEF DESCRIPTION OF THE DRAWINGS
The invention will best be understood by reference to the following detailed description of embodiments in accordance with the invention when read in conjunction with the accompanying drawings, wherein:
FIG. 1 is a block diagram of a system for pupil detection in an embodiment in accordance with the invention; FIG. 2 is a flowchart of a method for pupil detection in an embodiment in accordance with the invention;
FIG. 3 is a diagram of a first application that uses pupil detection in an embodiment in accordance with the invention;
FIG. 4 is a diagram of a second application that uses pupil detection in an embodiment in accordance with the invention;
FIG. 5a illustrates an image generated with an on-axis light source in accordance with the embodiment of FIG. 3;
FIG. 5b depicts an image generated with an off-axis light source in accordance with the embodiment of FIG. 3; FIG. 5c illustrates an image resulting from the difference between the
FIG. 5a image and the FIG. 5b image;
FIG. 6 is a diagram of a third application that utilizes pupil detection in an embodiment in accordance with the invention; FIG. 7 depicts a sensor in an embodiment in accordance with the invention;
FIG. 8 is a cross-sectional diagram of an imager in an embodiment in accordance with the invention;
FIG. 9 illustrates a first method for fabricating a dual-band narrowband filter in an embodiment in accordance with the invention; FIG. 10 depicts the spectrum for the dual-band narrowband filter of
FIG. 9;
FIG. 11 illustrates a Fabry-Perot (FP) resonator used in a second method for fabricating a dual-band narrowband filter in an embodiment in accordance with the invention;
FIG. 12 depicts the spectrum for the Fabry-Perot resonator of FIG. 11;
FIG. 13 depicts a coupled-cavity resonator used in the second method for fabricating a dual-band narrowband filter in an embodiment in accordance with the invention;
FIG. 14 depicts the spectrum for the coupled-cavity resonator of FIG. 13;
FIG. 15 illustrates a stack of three coupled-cavity resonators that form a dual-band narrowband filter in an embodiment in accordance with the invention; FIG. 16 depicts the spectrum for the dual-band narrowband filter of FIG. 15;
FIG. 17 illustrates spectra for polymer filters and a tri-band narrowband filter in an embodiment in accordance with the invention; and
FIG. 18 depicts a sensor in accordance with the embodiment shown in FIG. 17.
DETAILED DESCRIPTION
The following description is presented to enable one skilled in the art to make and use the invention, and is provided in the context of a patent application and its requirements. Various modifications to the disclosed embodiments will be readily apparent to those skilled in the art, and the generic principles herein may be applied to other embodiments. Thus, the invention is not intended to be limited to the embodiments shown, but is to be accorded the widest scope consistent with the appended claims and with the principles and features described herein. It should be understood that the drawings referred to in this description are not drawn to scale.

Embodiments in accordance with the invention described herein utilize wavelength-dependent imaging for security applications. Wavelength-dependent imaging is a technique for detecting an object, and typically involves detecting one or more particular wavelengths that reflect off the object. In some applications, only solar or ambient illumination is required, while in other applications other or additional illumination is needed.

With reference now to the figures and in particular with reference to FIG. 1, there is shown a block diagram of a system for pupil detection in an embodiment in accordance with the invention. The system 100 includes an imager 102 and two light sources 104, 106. In this embodiment for pupil detection, two images of a subject's face and/or eyes (not shown) are captured using imager 102. One of the images is taken using light source 104, while the second image is taken using light source 106. Light sources 104, 106 are shown on opposite sides of imager 102 in the FIG. 1 embodiment. In other embodiments in accordance with the invention, light sources 104, 106 may be located on the same side of imager 102. Light sources 104, 106 emit light at different wavelengths that produce substantially equal image intensity (brightness) in this embodiment in accordance with the invention.
Light sources 104, 106 are implemented as light-emitting diodes (LEDs) or multi-mode semiconductor lasers having infrared or near-infrared wavelengths in this embodiment, and each light source 104, 106 may be implemented as one, or multiple, sources. In other embodiments in accordance with the invention, light sources 104, 106 may also be replaced by a single broadband light source emitting light at two or more different wavelengths, such as the sun, for example. Imager 102 is positioned to receive light reflected from the eye or eyes of the subject (not shown). Even though light sources 104, 106 can be at any wavelength, the wavelengths selected in the FIG. 1 embodiment are chosen so that the light will not distract the subject and the iris of the eye will not contract in response to the light. The selected wavelengths are typically in a range that imager 102 can detect.

System 100 further includes controller 108, which may be dedicated to system 100 or may be a shared device. Frames of image information are generated by the imager 102 and then processed and analyzed by controller
108 to distinguish a pupil (or pupils) from other features within the field of view of imager 102. Controller 108 is connected to timer 110. Timer 110 represents any known device or technique that enables controller 108 to make time-based determinations. Controlled device 112 receives an alert signal from controller 108. The alert signal is generated when controller 108 detects the pupil or pupils of a subject. Controlled device 112 may be implemented, for example, as an alarm or one or more floodlights or video cameras. Input device 114 may be used to input or alter various parameters associated with controller 108. For example, a user may want system 100 to capture a certain number of images within a particular time period or verify pupil detection prior to generating an alert signal. Input device 114 may be implemented, for example, as a computer or a control panel operating pursuant to a security program.

FIG. 2 is a flowchart of a method for pupil detection in an embodiment in accordance with the invention. Initially a light source emits light within a field of view, as shown in block 200. One or more images are then captured and the image or images analyzed (blocks 202, 204). The analysis includes generating a difference image in this embodiment in accordance with the invention. The difference image will be discussed in more detail in conjunction with FIGS. 3 and 5. A determination is made at block 206 as to whether a pupil (or pupils) has been detected. If a pupil has not been detected, a determination is then made at block 208 as to whether a specified period of time has expired. In some embodiments in accordance with the invention, the one or more images may be captured and analyzed on a continuous basis. In other embodiments in accordance with the invention, the one or more images may be captured and analyzed at regular intervals. When the specified time period has expired, a timer is reset (block 210) and the process returns to block 202.
The method continues through blocks 202 to 210 until, at block 206, a pupil (or pupils) is detected. When a pupil is detected, a determination is made at block 212 as to whether the pupil detection should be validated. If not, an alert signal is generated at block 214 and the process ends. If the pupil detection should be validated, a determination is made as to whether a specified period of time has expired. When the specified time period has expired, the process continues at block 210 where a timer is reset. The method then returns to block 202 and continues until an alert signal is generated. An alert signal may trigger, for example, an alarm system, floodlights, or video cameras. One or more video cameras may include pan, tilt, and zoom features to allow a security person to focus in on the subject, thereby allowing the user to better identify the subject. Pupil detection may be used independently or in combination with other features in a security system, such as, for example, motion detectors.

Referring now to FIG. 3, there is shown a diagram of a first application that uses pupil detection in an embodiment in accordance with the invention. The application may include, for example, a home security system monitoring a hallway or entranceway in the home. The system includes imager 102 and light sources 104, 106. Light sources 104, 106 emit light towards the face and/or eyes of subject 300. The eye or eyes reflects light that is captured by imager 102. In this embodiment for pupil detection, two images of the face and/or eyes (not shown) of subject 300 are captured using imager 102. One of the images is taken using light source 104, which is close to or on axis 302 of the imager 102 ("the on-axis light"). The second image is taken using light source 106 that is located at a larger angle away from axis 302 of the imager 102 ("the off-axis light"). When eyes of the subject 300 are open, the difference between the images highlights the pupils of the eyes.
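The capture, analyze, validate, and alert flow of FIG. 2 can be sketched as a simple polling loop. The helpers `capture`, `detect_pupil`, and `alert` are hypothetical stand-ins for the imager, the controller's difference-image analysis, and the controlled device; the patent does not prescribe this particular structure.

```python
import time

def detection_loop(capture, detect_pupil, alert, interval_s=0.5,
                   validate=True, timeout_s=30.0):
    """Polling loop following FIG. 2: capture frames, analyze them,
    optionally require a confirming detection, then raise an alert."""
    pending = False  # True once a first detection awaits validation
    deadline = time.monotonic() + timeout_s
    while True:
        frames = capture()
        found = detect_pupil(frames)
        if found:
            if not validate or pending:
                alert()       # block 214: generate the alert signal
                return True
            pending = True    # block 212: wait for a second detection
        if time.monotonic() > deadline:
            return False      # specified period expired without an alert
        time.sleep(interval_s)
```

With `validate=True` the loop only alerts on a second consecutive detection, mirroring the optional validation branch of the flowchart.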
This is because specular reflection from the retina is detected only in the on-axis image. The diffuse reflections from other facial and environmental features are largely cancelled out, leaving the pupils as the dominant feature in the differential image. Differential reflectivity off a retina of subject 300 is dependent upon the angle 304 between light source 104 and axis 302 of the imager 102, and the angle 306 between light source 106 and axis 302. In general, making angle 304 smaller will increase the retinal return. As used herein, "retinal return" refers to the light that is reflected off the back of the subject's 300 eye and detected at imager 102. "Retinal return" is also used to include reflection off other tissue at the back of the eye (other than or in addition to the retina). Accordingly, angle 304 is selected such that light source 104 is on or close to axis 302. In this embodiment in accordance with the invention, angle 304 is typically in the range from approximately zero to two degrees.

In general, the size of angle 306 is chosen so that only low retinal return from light source 106 will be detected at imager 102. The iris surrounding the pupil blocks this signal, and so pupil size under different lighting conditions should be considered when selecting the size of angle 306. In this embodiment in accordance with the invention, angle 306 is typically in the range from approximately three to fifteen degrees. In other embodiments in accordance with the invention, the size of angles 304, 306 may be different. For example, the field of view to be monitored, the distance at which the pupils should be detected, and the characteristics of a particular subject may determine the size of the angles 304, 306. The images captured by imager 102 are processed and analyzed by controller 108. When one or both pupils of subject 300 are detected, controller 108 generates an alert signal, which is then transmitted to a controlled device (not shown).
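As an illustration of these angle ranges, the lateral offset of a source from the imager axis at a given subject distance follows from simple trigonometry. The distances below are invented examples for illustration, not values from the patent.

```python
import math

def source_offset(distance_m, angle_deg):
    """Lateral distance from the imager axis at which a light source
    sits when placed at a given angle, for a subject at distance_m."""
    return distance_m * math.tan(math.radians(angle_deg))

# For a subject 2 m away, a 2-degree on-axis source sits within about
# 7 cm of the axis, while a 15-degree off-axis source sits about 54 cm out.
```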
Light sources 104, 106 are constructed in the same housing with detector 102 in this embodiment in accordance with the invention. In another embodiment in accordance with the invention, light sources 106 may be located in a housing separate from light sources 104 and detector 102. In yet another embodiment in accordance with the invention, light sources 104 may be located in a housing separate from detector 102 by placing a beam splitter between detector 102 and the object, which has the advantage of permitting a smaller effective on-axis angle of illumination.

FIG. 4 is a diagram of a second application that uses pupil detection in an embodiment in accordance with the invention. The system includes two detectors 102a, 102b, two on-axis light sources 104a, 104b, two off-axis light sources 106a, 106b, and two controllers 108a, 108b. The system generates a three-dimensional image of the eye or eyes of subject 300 by using two of the FIG. 3 systems in an epipolar stereo configuration. In this embodiment, the comparable rows of pixels in each detector 102a, 102b lie in the same plane. In other embodiments in accordance with the invention comparable rows of pixels do not lie in the same plane and adjustment values are generated to compensate for the row configurations. Each controller 108a, 108b performs an independent analysis to detect in two dimensions an eye or eyes of a subject 300. Stereo controller 400 uses the data generated by both controllers 108a, 108b to generate a three-dimensional image of the eye or eyes of subject 300. On-axis light sources 104a, 104b and off-axis light sources 106a, 106b may be positioned in any desired configuration. In some embodiments in accordance with the invention, an on-axis light source (e.g. 104b) may be used as the off-axis light source (e.g. 106a) for the opposite system.

FIG. 5a illustrates an image generated with an on-axis light source 104 in accordance with the embodiment of FIG. 3. Image 500 shows an eye that is open.
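The differencing of the on-axis and off-axis frames described for FIGS. 5a-5c can be sketched in a few lines of NumPy. The 2x2 pixel values below are invented purely for illustration; a real frame would of course be much larger.

```python
import numpy as np

def pupil_mask(on_axis, off_axis, threshold=50):
    """Subtract the off-axis frame from the on-axis frame. Diffuse
    reflections appear in both frames and cancel; the retinal return
    appears only on-axis, so open pupils survive as bright spots."""
    diff = on_axis.astype(np.int16) - off_axis.astype(np.int16)
    return np.clip(diff, 0, 255).astype(np.uint8) > threshold

# Invented example: one pixel carries a strong retinal return (200 vs 40),
# the others reflect both sources about equally and cancel out.
on = np.array([[120, 200], [120, 120]], dtype=np.uint8)
off = np.array([[118, 40], [121, 119]], dtype=np.uint8)
mask = pupil_mask(on, off)  # only the retinal-return pixel is True
```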
The eye has a bright pupil due to a strong retinal return created by on-axis light source 104. FIG. 5b depicts an image generated with off-axis light source 106 in accordance with the embodiment of FIG. 3. Image 502 in FIG. 5b may be taken at the same time as image 500, or it may be taken in an alternate frame (successively or non-successively) to image 500. Image 502 illustrates a normal, dark pupil. FIG. 5c illustrates image 504 resulting from the difference between the
FIG. 5a image and the FIG. 5b image. By taking the difference between images 500, 502, relatively bright spot 506 remains against relatively dark background 508 when the eye is open. There may be vestiges of other features of the eye remaining in the background 508. However, in general, bright spot 506 will stand out in comparison to background 508. When the eye is closed or nearly closed, there will not be bright spot 506 in the differential image. FIGS. 5a-5c illustrate one eye of a subject. Those skilled in the art will appreciate that both eyes may be monitored as well. It will also be understood that a similar effect will be achieved if the images include other features of the subject (e.g. other facial features), as well as features of the subject's environment. These features will largely cancel out in a manner similar to that just described.

Referring now to FIG. 6, there is shown a diagram of a third application that utilizes pupil detection in an embodiment in accordance with the invention. In this embodiment an owner of building 600 wants to be alerted when a person approaches an entrance (not shown) to the building from street 602. One or more imagers 604 are used to detect light and generate images that are processed and analyzed by one or more controllers (not shown). Several light sources 606 emit light over a desired field of view
608. Those skilled in the art will appreciate that field of view 608, the number and type of imagers 604, and the number and type of light sources 606 are determined by the application. The distance at which an imager is first able to detect a pupil also influences the number and type of imagers 604 and light sources 606. Higher resolution imagers and a telescope with a large aperture are examples of two techniques that may be used to increase the distance at which a system first detects a pupil or pupils. When a person approaches building 600, a controller (not shown) detects one or both pupils and responsively generates an alert signal. The alert signal may trigger, for example, an alarm, floodlights, or one or more video cameras. One or more video cameras may include pan, tilt, and zoom features to allow a security person to focus in on the subject, thereby allowing the user to better identify the subject. For example, a security person may be required to confirm the identity of the subject prior to allowing the subject to enter building 600.

FIG. 7 depicts a sensor in an embodiment in accordance with the invention. In this embodiment, sensor 700 is incorporated into imager 102 (FIG. 1), and is configured as a complementary metal-oxide semiconductor (CMOS) imaging sensor. Sensor 700 may be implemented with other types of imaging devices in other embodiments in accordance with the invention, such as, for example, a charge-coupled device (CCD) imager. Patterned filter layer 702 is formed on sensor 700 using different filter materials shaped into a checkerboard pattern. The two filters are determined by the wavelengths being used by light sources 104, 106.
For example, in this embodiment in accordance with the invention, patterned filter layer 702 includes regions (identified as 1) that include a filter material for selecting the wavelength used by light source 104, while other regions (identified as 2) include a filter material for selecting the wavelength used by light source 106. In the FIG. 7 embodiment, patterned filter layer 702 is deposited as a separate layer of sensor 700, such as, for example, on top of an underlying layer, using conventional deposition and photolithography processes while still in wafer form. In another embodiment in accordance with the invention, patterned filter layer 702 can be created as a separate element between sensor 700 and incident light. Additionally, the pattern of the filter materials can be configured in a pattern other than a checkerboard pattern. For example, patterned filter layer 702 can be formed into an interlaced striped or a non-symmetrical configuration (e.g. a 3-pixel by 2-pixel shape). Patterned filter layer 702 may also be incorporated with other functions, such as color imagers.

In other embodiments in accordance with the invention, patterned filter layer 702 may include blank regions (e.g. region 1) that do not cover selected areas of sensor 700 with a filter material. The uncovered regions of sensor 700 therefore receive the light from both light sources 104, 106. Since the covered regions pass light from only one light source and block light from the other light source, a gain factor is calculated and applied to the light passing through the covered regions. The gain factor compensates for the light absorbed by the filter material and for differences in sensor sensitivity between the two wavelengths. Various types of filter materials can be used in patterned filter layer 702. In this embodiment in accordance with the invention, the filter materials include polymers doped with pigments or dyes.
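The gain-factor compensation for covered regions can be written out directly. The transmission and sensitivity values below are hypothetical placeholders; the patent does not give numbers.

```python
def compensate_covered_pixel(value, filter_transmission, sensitivity_ratio):
    """Scale a pixel read through a filter region so it is comparable to
    an uncovered pixel: divide out the fraction of light the filter
    passes and the sensor's relative sensitivity at that wavelength."""
    return value / (filter_transmission * sensitivity_ratio)

# e.g. a filter passing 80% of its band, on a sensor half as sensitive
# at that wavelength, implies a gain of 1 / (0.8 * 0.5) = 2.5
```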
In other embodiments in accordance with the invention, the filter materials may include interference filters, reflective filters, and absorbing filters made of semiconductors, other inorganic materials, or organic materials.

FIG. 8 is a cross-sectional diagram of an imager in an embodiment in accordance with the invention. Only a portion of imager 102 is shown in this figure. Imager 102 includes sensor 700 comprised of pixels 800, 802, 804, 806, patterned filter layer 808 including two alternating filter regions 810, 812, glass cover 814, and dual-band narrowband filter 816. Sensor 700 is configured as a CMOS imager and patterned filter layer 808 as two polymers 810, 812 doped with pigments or dyes in this embodiment in accordance with the invention. Each region in patterned filter layer 808 (e.g. a square in the checkerboard pattern) overlies a pixel in the CMOS imager. Narrowband filter 816 and patterned filter layer 808 form a hybrid filter in this embodiment in accordance with the invention.

When light strikes narrowband filter 816, the light at wavelengths other than the wavelengths of light source 104 (λ1) and light source 106 (λ2) is filtered out, or blocked, from passing through the narrowband filter 816. Thus, the light at visible wavelengths λvis and at wavelengths λn is filtered out in this embodiment, while the light at or near wavelengths λ1 and λ2 transmits through the narrowband filter 816. Thus, only light at or near the wavelengths λ1 and λ2 passes through glass cover 814. Thereafter, polymer 810 transmits the light at wavelength λ1 while blocking the light at wavelength λ2. Consequently, pixels 800 and 804 receive only the light at wavelength λ1, thereby generating the image taken with the on-axis light source 104. Polymer 812 transmits the light at wavelength λ2 while blocking the light at wavelength λ1, so that pixels 802 and 806 receive only the light at wavelength λ2. In this manner, the image taken with off-axis light source 106 is generated.
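Separating the two wavelength channels from a checkerboard-patterned sensor is a simple demosaicing step. A minimal sketch, assuming λ1 pixels sit on the even diagonal sites (the patent does not specify the site assignment), with missing sites left at zero rather than interpolated:

```python
import numpy as np

def split_checkerboard(raw):
    """Separate a frame captured through the checkerboard patterned
    filter into the λ1 (on-axis) and λ2 (off-axis) channels.
    λ1 pixels occupy (row+col)-even sites, λ2 the odd sites; a real
    imager pipeline would interpolate the zeroed sites."""
    rows, cols = np.indices(raw.shape)
    even = (rows + cols) % 2 == 0
    ch1 = np.where(even, raw, 0)
    ch2 = np.where(~even, raw, 0)
    return ch1, ch2

raw = np.arange(4).reshape(2, 2)  # tiny invented 2x2 frame
ch1, ch2 = split_checkerboard(raw)
```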
The shorter wavelength λ1 is associated with on-axis light source 104, and the longer wavelength λ2 with off-axis light source 106 in this embodiment in accordance with the invention. The shorter wavelength λ1, however, may be associated with off-axis light source 106 and the longer wavelength λ2 with on-axis light source 104 in other embodiments in accordance with the invention.

Narrowband filter 816 is a dielectric stack filter in this embodiment in accordance with the invention. Dielectric stack filters are designed to have particular spectral properties. In this embodiment in accordance with the invention, the dielectric stack filter is formed as a dual-band narrowband filter. Narrowband filter 816 is designed to have one peak at λ1 and another peak at λ2. Therefore, only the light at or near wavelengths λ1 and λ2 strikes polymer filters 810, 812 in patterned filter layer 808. Patterned filter layer 808 is then used to discriminate between λ1 and λ2. Wavelength λ1 is transmitted through filter 810 (and not through filter 812), while wavelength λ2 is transmitted through filter 812 (and not through filter 810). Those skilled in the art will appreciate that patterned filter layer 808 provides a mechanism for selecting channels at pixel spatial resolution. In this embodiment in accordance with the invention, channel one is associated with the on-axis image and channel two with the off-axis image. In other embodiments in accordance with the invention, channel one may be associated with the off-axis image and channel two with the on-axis image.

Sensor 700 sits in a carrier (not shown) in this embodiment in accordance with the invention. Glass cover 814 typically protects sensor 700 from damage and particle contamination (e.g. dust). Glass cover 814 is formed as a colored glass filter in this embodiment, and is included as the substrate of the dielectric stack filter (i.e., narrowband filter 816).
The colored glass filter is designed to have certain spectral properties, and is doped with pigments or dyes. Schott Optical Glass Inc., a company located in Mainz, Germany, is one company that manufactures colored glass that can be used in colored glass filters.

Referring now to FIG. 9, there is shown a first method for fabricating a dual-band narrowband filter in an embodiment in accordance with the invention. As discussed in conjunction with the FIG. 8 embodiment, narrowband filter 816 is a dielectric stack filter that is formed as a dual-band narrowband filter. Dielectric stack filters can include any combination of filter types. The desired spectral properties of the completed dielectric stack filter determine which types of filters are included in the layers of the stack. For example, a filter can be fabricated by combining two filters 900, 902. Band-blocking filter 900 filters out the light at wavelengths between the regions around wavelengths λ1 and λ2, while bandpass filter 902 transmits light near and between wavelengths λ1 and λ2. The combination of filters 900, 902 transmits light in the hatched areas, while blocking light at all other wavelengths.

FIG. 10 depicts the spectrum for the dual-band narrowband filter of FIG. 9. As can be seen, light transmits through the combined filters only at or near the wavelengths of interest, λ1 (spectrum 1000) and λ2 (spectrum 1002). A dual-band narrowband filter can also be fabricated by stacking coupled-cavity resonators on top of each other, where each coupled-cavity resonator is formed with two Fabry-Perot resonators. FIG. 11 illustrates a Fabry-Perot (FP) resonator used in a second method for fabricating a dual-band narrowband filter in an embodiment in accordance with the invention. Resonator 1100 includes upper Distributed Bragg reflector (DBR) layer 1102 and lower DBR layer 1104.
The materials that form the DBR layers include N pairs of quarter-wavelength (mλ/4) thick low index material and quarter-wavelength (nλ/4) thick high index material, where the variable N is an integer number and the variables m and n are odd integer numbers. The wavelength is defined as the wavelength of light in a layer, which is equal to the freespace wavelength divided by the layer index of refraction. Cavity 1106 separates the two DBR layers 1102, 1104. Cavity 1106 is configured as a half-wavelength (pλ/2) thick cavity, where p is an integer number. The thickness of cavity 1106 and the materials in DBR layers 1102,
1104 determine the transmission peak for FP resonator 1100. FIG. 12 depicts the spectrum for the Fabry-Perot resonator of FIG. 11. FP resonator 1100 has a single transmission peak 1200.

In this second method for fabricating a dual-band narrowband filter, two FP resonators 1100 are stacked together to create a coupled-cavity resonator. FIG. 13 depicts a coupled-cavity resonator used in the second method for fabricating a dual-band narrowband filter in an embodiment in accordance with the invention. Coupled-cavity resonator 1300 includes upper DBR layer 1302, cavity 1304, strong-coupling DBR 1306, cavity 1308, and lower DBR layer 1310. The strong-coupling DBR 1306 is formed when the lower DBR layer of the top FP resonator (i.e., layer 1104) merges with the upper DBR layer of the bottom FP resonator (i.e., layer 1102). Stacking two FP resonators together splits the single transmission peak 1200 in FIG. 12 into two peaks, as shown in FIG. 14. The number of pairs of quarter-wavelength thick index materials in strong-coupling DBR 1306 determines the coupling strength between cavities 1304, 1308. And the coupling strength between cavities 1304, 1308 controls the spacing between peak 1400 and peak 1402.

FIG. 15 illustrates a stack of three coupled-cavity resonators that form a dual-band narrowband filter in an embodiment in accordance with the invention. Dual-band narrowband filter 1500 includes upper DBR layer 1502, cavity 1504, strong-coupling DBR 1506, cavity 1508, weak-coupling DBR 1510, cavity 1512, strong-coupling DBR 1514, cavity 1516, weak-coupling DBR 1518, cavity 1520, strong-coupling DBR 1522, cavity 1524, and lower DBR layer 1526. Stacking three coupled-cavity resonators together splits each of the two peaks 1400, 1402 into a triplet of peaks 1600, 1602, respectively. FIG. 16 depicts the spectrum for the dual-band narrowband filter of FIG. 15.
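The quarter-wave layer thickness and the single-cavity transmission of FIG. 12 can be modeled with textbook thin-film formulas. This is an idealized lossless sketch with invented example values, not the patent's actual stack design.

```python
import math

def quarter_wave_thickness(freespace_nm, index, m=1):
    """Physical thickness of an m*lambda/4 DBR layer; the in-layer
    wavelength is the freespace wavelength over the layer's index."""
    return m * (freespace_nm / index) / 4.0

def fp_transmission(wavelength_nm, cavity_nm, index, reflectance):
    """Airy transmission of a lossless Fabry-Perot cavity:
    T = 1 / (1 + F*sin^2(delta/2)), round-trip phase
    delta = 4*pi*n*L/lambda; unity peaks occur at n*L = p*lambda/2."""
    delta = 4.0 * math.pi * index * cavity_nm / wavelength_nm
    f_coeff = 4.0 * reflectance / (1.0 - reflectance) ** 2
    return 1.0 / (1.0 + f_coeff * math.sin(delta / 2.0) ** 2)
```

With a p = 1 cavity (n·L = λ/2) the transmission peaks at the design wavelength, and the peak narrows as the mirror reflectance approaches one.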
The strength of the coupling in weak-coupling DBRs 1510, 1518 is reduced by increasing the number of mirror pairs in coupling DBRs 1510, 1518. The reduced coupling strength merges each triplet of peaks 1600, 1602 into a single broad, fairly flat transmission band. Changing the number of pairs of quarter-wavelength thick index materials in weak-coupling DBRs 1510, 1518 alters the spacing within the triplet of peaks 1600, 1602.

Although a hybrid filter has been described with reference to detecting
light at two wavelengths, λ1 and λ2, hybrid filters in other embodiments in accordance with the invention may be used to detect more than two wavelengths of interest. FIG. 17 illustrates spectra for polymer filters and a tri-band narrowband filter in an embodiment in accordance with the invention. A hybrid filter in this embodiment detects light at three wavelengths of interest, λ1, λ2, and λ3. Spectra 1700 and 1702 at wavelengths λ1 and λ3, respectively, represent two signals to be utilized by an imaging system. Light detected at wavelength λ2 (spectrum 1704) is used to determine the amount of light received by the imaging system outside the two wavelengths of interest. The amount of light detected at wavelength λ2 may be used as a reference amount of light detectable by the imaging system. A tri-band narrowband filter transmits light at or near the wavelengths of interest (λ1, λ2, and λ3) while blocking the transmission of light at all other wavelengths in this embodiment in accordance with the invention. Polymer filters in a patterned filter layer then discriminate between the light received at wavelengths λ1, λ2, and λ3.

FIG. 18 depicts a sensor in accordance with the embodiment shown in FIG. 17. Patterned filter layer 1800 is formed on sensor 1802 using three different filters. In this embodiment, one region in the patterned filter layer (e.g. region 1) transmits the light at wavelength λ1 while blocking the light at wavelengths λ2 and λ3 (see spectrum 1706 in FIG. 17). Another region in the patterned filter layer (e.g. region 3) transmits the light at wavelength λ3 while blocking the light at wavelengths λ1 and λ2 (see spectrum 1708 in FIG. 17). The third region transmits light at wavelength λ2 while blocking the light at wavelengths λ1 and λ3 (see spectrum 1710 in FIG. 17).
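One way to use the λ2 reference channel is to subtract it from the signal channels, under the assumption (made by this example, not stated in the patent) that broadband ambient light contributes comparably across the three bands after a responsivity correction:

```python
def ambient_corrected(signal, reference, responsivity_ratio=1.0):
    """Estimate broadband ambient light from the λ2 reference channel
    and subtract it from a λ1 or λ3 signal channel, clamping at zero.
    responsivity_ratio is a hypothetical per-band calibration factor."""
    return max(0.0, signal - responsivity_ratio * reference)
```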

Claims

1. A system for pupil detection, comprising: a source for emitting light towards an object (300); a first imager (102a); a first hybrid filter positioned between the source and the first imager (102a); and a first controller (108a) connected to the first imager (102a) for generating an alert signal when a pupil is detected.
2. The system of claim 1, further comprising: a second imager (102b); a second hybrid filter positioned between the second imager (102b) and the source; a second controller (108b) connected to the second imager (102b); and a stereo controller (400) connected to the first and second controllers
(108a, 108b) for generating at least one three-dimensional image.
3. The system of claim 1 or claim 2, further comprising: a timer (110) connected to the first controller (108a); a controlled device (112) connected to the first controller (108a); and an input device (114) connected to the first controller (108a).
4. The system of one of claims 1-3, wherein the first hybrid filter comprises a dielectric stack filter (816).
5. The system of claim 4, wherein the dielectric stack filter (816) comprises a colored glass filter.
6. The system of claim 4, wherein the dielectric stack filter (816) comprises N coupled-cavity resonators (1300) stacked together, where N is an integer number.
7. The system of one of claims 1-6, wherein the source comprises a single broadband light source emitting light at multiple wavelengths of interest.
8. The imaging system of one of claims 1-6, wherein the source comprises a first light source (104) emitting light at a first wavelength and a second light source (106) emitting light at a second wavelength, and wherein the first light source (104) is positioned at a first angle (304) relative to an axis (302) of the first imager (102) and the second light source (106) is positioned at a second angle (306) relative to the axis (302) of the first imager (102) where the second angle (306) is larger than the first angle (304).
9. A method for wavelength-dependent detection, comprising: receiving light from an object (300), wherein the light includes light propagating at two or more wavelengths of interest; discriminating between light received at or near the wavelengths of interest while simultaneously blocking light received at all other wavelengths; detecting the amount of light received at or near each wavelength of interest and generating one or more images using the light received at or near the wavelengths of interest; determining whether at least one of the one or more images includes a pupil; and generating an alert signal when at least one of the images includes a pupil.
10. The method of claim 9, further comprising determining a difference between the amount of Hght received at each of the wavelengths of interest.
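Claims 8-10 together describe a differential-imaging scheme: a source near the imager axis produces a bright retroreflection from the pupil, while a source at a larger angle does not, so subtracting the two frames isolates pupil candidates. The following is an illustrative sketch of that differencing step only, not the patented implementation; the function name, threshold values, and synthetic frames are all assumptions introduced for the example.

```python
import numpy as np

def detect_pupil(on_axis: np.ndarray, off_axis: np.ndarray,
                 threshold: float = 50.0, min_pixels: int = 20) -> bool:
    """Return True (raise an alert) when a pupil-like retroreflection is found.

    on_axis  -- grayscale frame lit near the imager axis (pupils appear bright)
    off_axis -- grayscale frame lit at a larger angle (pupils appear dark)
    Both frames must have the same shape. Thresholds are illustrative.
    """
    # The retina retroreflects on-axis light back toward the imager, so the
    # pupil is much brighter in the on-axis frame; other facial features
    # reflect both illuminations similarly and largely cancel in the difference.
    diff = on_axis.astype(np.int32) - off_axis.astype(np.int32)
    candidate = diff > threshold          # pixels that brightened on-axis only
    return bool(candidate.sum() >= min_pixels)

# Example: a synthetic 100x100 scene with a 5x5 "pupil" patch that is
# bright only under on-axis illumination.
off = np.full((100, 100), 80, dtype=np.uint8)
on = off.copy()
on[40:45, 40:45] = 200                    # simulated retroreflection
print(detect_pupil(on, off))              # pupil found -> alert
```

In practice the two frames would be captured at the two wavelengths of interest through the hybrid filter of claims 1-6, so that ambient light at other wavelengths is blocked before the subtraction.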
PCT/US2005/015583 2004-05-10 2005-05-04 Method and system for pupil detection for security applications WO2005114553A2 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
JP2007513211A JP4799550B2 (en) 2004-05-10 2005-05-04 Pupil detection method and system
EP05771748A EP1745413B1 (en) 2004-05-10 2005-05-04 Method and system for pupil detection for security applications
DE602005015569T DE602005015569D1 (en) 2004-05-10 2005-05-04 PROCESS AND SYSTEM FOR PUPIL DETECTION FOR SAFETY APPLICATIONS

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US10/843,512 2004-05-10
US10/843,512 US7720264B2 (en) 2004-05-10 2004-05-10 Method and system for pupil detection for security applications

Publications (2)

Publication Number Publication Date
WO2005114553A2 true WO2005114553A2 (en) 2005-12-01
WO2005114553A3 WO2005114553A3 (en) 2006-05-18

Family

ID=35239478

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2005/015583 WO2005114553A2 (en) 2004-05-10 2005-05-04 Method and system for pupil detection for security applications

Country Status (6)

Country Link
US (1) US7720264B2 (en)
EP (1) EP1745413B1 (en)
JP (1) JP4799550B2 (en)
DE (1) DE602005015569D1 (en)
TW (1) TWI346898B (en)
WO (1) WO2005114553A2 (en)

Families Citing this family (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040057622A1 (en) * 2002-09-25 2004-03-25 Bradski Gary R. Method, apparatus and system for using 360-degree view cameras to identify facial features
US7286056B2 (en) * 2005-03-22 2007-10-23 Lawrence Kates System and method for pest detection
US7473884B2 (en) * 2005-04-21 2009-01-06 Avago Technologies Ecbu Ip (Singapore) Pte. Ltd. Orientation determination utilizing a cordless device
US9864752B2 (en) * 2005-12-29 2018-01-09 Nextlabs, Inc. Multilayer policy language structure
US8832048B2 (en) 2005-12-29 2014-09-09 Nextlabs, Inc. Techniques and system to monitor and log access of information based on system and user context using policies
US7796119B2 (en) * 2006-04-03 2010-09-14 Avago Technologies General Ip (Singapore) Pte. Ltd. Position determination with reference
FI119830B (en) * 2006-05-24 2009-03-31 Valtion Teknillinen Spectrometer and interferometric procedure
DE102006034731A1 (en) * 2006-07-27 2008-01-31 Infratec Gmbh Infrarotsensorik Und Messtechnik Tunable dual-band Fabry-Perot filter
US7411497B2 (en) * 2006-08-15 2008-08-12 Lawrence Kates System and method for intruder detection
FR2910150B1 (en) * 2006-12-15 2009-04-17 Sagem Defense Securite METHOD AND INSTALLATION OF IRIS CAPTURE WITH PRIOR DETECTION OF COLOR.
JP2010043979A (en) * 2008-08-13 2010-02-25 Yuichi Kamata Spectral image measuring device
CN102292017B (en) * 2009-01-26 2015-08-05 托比股份公司 The detection to fixation point of being assisted by optical reference signal
JP5613988B2 (en) * 2009-03-09 2014-10-29 株式会社ニコン Display device
WO2012057645A1 (en) * 2010-10-29 2012-05-03 Antonov Dmitry Evgenievich Iris identification method of a person (alternatives)
KR101428229B1 (en) * 2012-11-29 2014-08-07 현대자동차주식회사 Apparatus and method for acquising differential image
TWI557004B (en) * 2014-01-10 2016-11-11 Utechzone Co Ltd Identity authentication system and its method
KR20150144185A (en) * 2014-06-16 2015-12-24 현대자동차주식회사 Method for extracting eye center point
CN105119723A (en) * 2015-09-15 2015-12-02 重庆智韬信息技术中心 Identity authentication and authorization method based on human eye recognition
TWI617845B (en) 2017-03-16 2018-03-11 財團法人工業技術研究院 Image sensing apparatus
US20190012552A1 (en) * 2017-07-06 2019-01-10 Yves Lambert Hidden driver monitoring
WO2020213418A1 (en) * 2019-04-15 2020-10-22 富士フイルム株式会社 Imaging device, signal processing device, signal processing method, and signal processing program
WO2020213419A1 (en) * 2019-04-15 2020-10-22 富士フイルム株式会社 Imaging device, signal processing device, signal processing method, and signal processing program

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0350957A2 (en) 1988-07-14 1990-01-17 Atr Communication Systems Research Laboratories Image pickup apparatus
EP1452127A1 (en) 2003-02-28 2004-09-01 Agilent Technologies, Inc. Apparatus for detecting pupils

Family Cites Families (37)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3795444A (en) * 1972-10-05 1974-03-05 Eastman Kodak Co Exposure control apparatus
JPH07119933B2 (en) * 1987-06-12 1995-12-20 富士写真フイルム株式会社 Color photo printing device
GB2218307A (en) * 1988-04-08 1989-11-08 British Library Board Noise reduction in the playback of recorded sound
JPH0782539B2 (en) * 1988-07-14 1995-09-06 株式会社エイ・ティ・アール通信システム研究所 Pupil imager
US5204703A (en) * 1991-06-11 1993-04-20 The Center For Innovative Technology Eye movement and pupil diameter apparatus and method
US5359669A (en) * 1992-04-13 1994-10-25 Motorola, Inc. Remote retinal scan identifier
JPH06138374A (en) * 1992-10-30 1994-05-20 Canon Inc Optical device having sight line detecting device
JP3214195B2 (en) 1993-11-11 2001-10-02 三菱電機株式会社 Driver photography device
US5422690A (en) * 1994-03-16 1995-06-06 Pulse Medical Instruments, Inc. Fitness impairment tester
US6714665B1 (en) * 1994-09-02 2004-03-30 Sarnoff Corporation Fully automated iris recognition system utilizing wide and narrow fields of view
JP3296119B2 (en) * 1994-12-01 2002-06-24 日産自動車株式会社 Gaze direction measuring device for vehicles
US5912721A (en) * 1996-03-13 1999-06-15 Kabushiki Kaisha Toshiba Gaze detection apparatus and its method as well as information display apparatus
US5873832A (en) * 1996-08-12 1999-02-23 Xeyex Corporation Method and apparatus for measuring properties of the eye using a virtual image
US6447119B1 (en) * 1996-08-12 2002-09-10 Visionrx, Inc. Apparatus for visualizing the eye's tear film
JP2917953B2 (en) * 1997-02-05 1999-07-12 日本電気株式会社 View point position detection device
DE19719695A1 (en) 1997-05-09 1998-11-19 Univ Eberhard Karls Method and device for measuring the attention of living beings within a set of living beings
JPH10321826A (en) * 1997-05-15 1998-12-04 Matsushita Electron Corp Solid-state image-pickup device with built-in optical low-pass filter and its manufacture
JP3477047B2 (en) * 1997-10-02 2003-12-10 三菱電機株式会社 Blink detection face image processing device using retinal reflection image
ATE220465T1 (en) * 1997-10-29 2002-07-15 Calum E Macaulay APPARATUS AND METHOD FOR MICROSCOPY USING SPACIALLY MODULATED LIGHT
FR2773521B1 (en) * 1998-01-15 2000-03-31 Carlus Magnus Limited METHOD AND DEVICE FOR CONTINUOUSLY MONITORING THE DRIVER'S VIGILANCE STATE OF A MOTOR VEHICLE, IN ORDER TO DETECT AND PREVENT A POSSIBLE TREND AS IT GOES TO SLEEP
EP1075208A1 (en) * 1998-04-29 2001-02-14 Carnegie-Mellon University Apparatus and method of monitoring a subject's eyes using two different wavelengths of light
AUPP697398A0 (en) * 1998-11-06 1998-12-03 Lions Eye Institute Of Western Australia Incorporated, The Eye tracker for refractive surgery
JP2001041983A (en) * 1999-07-29 2001-02-16 Matsushita Electric Ind Co Ltd Photo-voltage sensor
US6734911B1 (en) * 1999-09-30 2004-05-11 Koninklijke Philips Electronics N.V. Tracking camera using a lens that generates both wide-angle and narrow-angle views
JP2001189926A (en) * 1999-12-28 2001-07-10 Mitsubishi Electric Corp Image pickup device for road monitor
JP4469476B2 (en) * 2000-08-09 2010-05-26 パナソニック株式会社 Eye position detection method and eye position detection apparatus
WO2002065171A2 (en) * 2001-01-25 2002-08-22 Jax Holdings, Inc. Multi-layer thin film optical filter arrangement
JP2003004535A (en) * 2001-05-22 2003-01-08 Xerox Corp Measurement of change in color according to viewing angle in textile and other surfaces
US6616277B1 (en) * 2001-07-02 2003-09-09 Vision Research Corporation Sequential eye screening method and apparatus
JP2003030659A (en) * 2001-07-16 2003-01-31 Matsushita Electric Ind Co Ltd Iris authentication device and iris image pickup device
JP2003209411A (en) * 2001-10-30 2003-07-25 Matsushita Electric Ind Co Ltd High frequency module and production method for high frequency module
US7683326B2 (en) 2002-07-09 2010-03-23 Gentex Corporation Vehicle vision system with high dynamic range
JP3812505B2 (en) * 2002-07-10 2006-08-23 宇部興産株式会社 Cavity resonator, cavity resonator filter and module substrate
WO2004036299A1 (en) * 2002-10-14 2004-04-29 Koninklijke Philips Electronics N.V. Color filter and liquid crystal display device comprising such filter
US20050093437A1 (en) * 2003-10-31 2005-05-05 Ouyang Michael X. OLED structures with strain relief, antireflection and barrier layers
EP1691670B1 (en) * 2003-11-14 2014-07-16 Queen's University At Kingston Method and apparatus for calibration-free eye tracking
JP2005252732A (en) * 2004-03-04 2005-09-15 Olympus Corp Imaging device

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0350957A2 (en) 1988-07-14 1990-01-17 Atr Communication Systems Research Laboratories Image pickup apparatus
EP1452127A1 (en) 2003-02-28 2004-09-01 Agilent Technologies, Inc. Apparatus for detecting pupils

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP1745413A4

Also Published As

Publication number Publication date
EP1745413A2 (en) 2007-01-24
WO2005114553A3 (en) 2006-05-18
EP1745413B1 (en) 2009-07-22
EP1745413A4 (en) 2008-05-28
US20050249384A1 (en) 2005-11-10
DE602005015569D1 (en) 2009-09-03
US7720264B2 (en) 2010-05-18
JP2007536064A (en) 2007-12-13
JP4799550B2 (en) 2011-10-26
TW200537391A (en) 2005-11-16
TWI346898B (en) 2011-08-11

Similar Documents

Publication Publication Date Title
EP1745413B1 (en) Method and system for pupil detection for security applications
US7217913B2 (en) Method and system for wavelength-dependent imaging and detection using a hybrid filter
US7583863B2 (en) Method and system for wavelength-dependent imaging and detection using a hybrid filter
US10606031B2 (en) Imaging apparatus, imaging system that includes imaging apparatus, electron mirror system that includes imaging apparatus, and ranging apparatus that includes imaging apparatus
US7580545B2 (en) Method and system for determining gaze direction in a pupil detection system
US7280678B2 (en) Apparatus and method for detecting pupils
US20060279745A1 (en) Color imaging system for locating retroreflectors
WO2006004960A2 (en) Method and system for reducing artifacts in image detection
EP3701603B1 (en) Vcsel based biometric identification device
US20070058881A1 (en) Image capture using a fiducial reference pattern
US20070031002A1 (en) Method and system for reducing artifacts in image detection
US20080089605A1 (en) Contrast-based technique to reduce artifacts in wavelength-encoded images
KR20230161767A (en) Semantic camera device and method for determining the quality of material and distance of an object to be photographed by the technology of multi-spectrum

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A2

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KM KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SM SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A2

Designated state(s): GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
WWE Wipo information: entry into national phase

Ref document number: 2005771748

Country of ref document: EP

WWE Wipo information: entry into national phase

Ref document number: 2007513211

Country of ref document: JP

NENP Non-entry into the national phase

Ref country code: DE

WWW Wipo information: withdrawn in national office

Country of ref document: DE

WWP Wipo information: published in national office

Ref document number: 2005771748

Country of ref document: EP