|Publication number||US7531774 B2|
|Application number||US 11/446,109|
|Publication date||May 12, 2009|
|Filing date||Jun 5, 2006|
|Priority date||Jun 5, 2006|
|Also published as||US20070278386, WO2007142634A1|
|Inventors||Richard G. Paxman, John H. Seldin|
|Original Assignee||General Dynamics Advanced Information Systems, Inc.|
|Patent Citations (30), Non-Patent Citations (5), Referenced by (6), Classifications (10), Legal Events (4)|
The present invention relates to systems, methods and algorithms for imaging extended objects in the presence of unknown aberrations and to characterizing the aberrations of an optical system from image data.
Image data collected from an incoherently illuminated scene (for example, image data collected using ambient light from the sun) tends to be degraded in the presence of phase and amplitude aberrations. Aberrations may arise from a variety of sources, such as optical-design residuals, optical-fabrication error, misalignment among optical elements, degradations in reflectivity or transmissivity of portions of optical elements, and atmospheric turbulence.
As a result of aberrations, images created from collected image data in the presence of unknown aberrations may be blurred or otherwise degraded, resulting in loss of resolution, loss of contrast and reduction in interpretability. Previous imaging techniques have attempted to overcome the degrading effects of aberrations in acquired image data. One of these approaches is known as the method of phase diversity and is described in U.S. Pat. No. 4,309,602 to Gonsalves, et al., entitled “Wavefront-Sensing by Phase Retrieval.” This technique involves collecting two images of an object in the presence of unknown aberrations, with one of the two images being degraded by a known amount of defocus and the other image being a focused image. The defocus of one image during data collection creates phase diversity between the two images. The two images are then processed to determine unknown atmospheric phase aberrations by identifying a combination of the object and phase aberrations consistent with the collected images, given the known amount of defocus. Thereafter, the system may be adaptively corrected to eliminate or minimize the phase aberrations in the received imagery.
Another technique previously used to overcome the degrading effects of aberrations in acquired image data is known as “Measurement-Diverse Speckle Imaging”, which is disclosed in U.S. Pat. No. 5,384,455 to Paxman. This technique involves collecting a sequence of two or more pairs of short-exposure images of an object in the presence of unknown aberrations, with each pair of images having measurement-diversity. An iterative process may then be employed to jointly estimate the object that is common to all collected images and the unknown phase aberrations associated with each image pair.
However, as stated above, images collected from an incoherently illuminated scene may be degraded by both phase and amplitude aberrations. Prior attempts at accounting for aberrations in acquired image data may account for phase aberrations in the collected image data, but they fail to account for any amplitude aberrations. As a result, the image created from the data collected by these systems may be degraded due to amplitude aberrations and a substantially diffraction-limited image of the object may not be obtained.
Therefore, there is a need for a system, method and algorithms capable of imaging extended objects which account for both phase and amplitude aberrations caused by the atmosphere and/or the system used for obtaining the image data.
The present invention relates to systems, methods and algorithms for imaging extended objects in the presence of unknown aberrations and to characterizing the aberrations of an optical system from image data.
In one embodiment of the present invention, a method for imaging an object may include the steps of acquiring at least a first image and a second image of the object, the images being measurement-diverse, estimating parameters to represent the object and phase and amplitude aberrations present in the acquired images and calculating a measure of likelihood that the estimated parameters correspond to the object and the phase and amplitude aberrations in the acquired images. Further, the method may include the step of repeating the steps of estimating and calculating until the measure of likelihood is substantially maximized, wherein the estimated parameters are adjusted prior to each repetition, and whereby said steps of estimating, calculating and repeating create an estimated image of the object.
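The estimate/calculate/repeat loop described above can be sketched as a generic gradient ascent on a likelihood measure. This is only an illustrative sketch, not the patented implementation: the step size, stopping tolerance, and the particular likelihood and gradient functions are all assumptions supplied by the caller.

```python
import numpy as np

def maximize_likelihood(loglike, grad, theta0, step=1e-2, tol=1e-8, max_iter=500):
    """Sketch of the estimate/calculate/repeat loop: adjust the parameter
    estimates (object plus phase and amplitude aberration weights), recompute
    the measure of likelihood, and stop once it is substantially maximized
    (no meaningful further improvement)."""
    theta = np.asarray(theta0, dtype=float)
    best = loglike(theta)
    for _ in range(max_iter):
        theta = theta + step * grad(theta)  # adjust the estimated parameters
        cur = loglike(theta)                # measure of likelihood of new estimate
        if cur - best < tol:                # substantially maximized: stop
            break
        best = cur
    return theta
```

In practice the parameter vector would stack the object pixels with the aberration coefficients, and constraints (such as object non-negativity) would be enforced between steps.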
An alternative embodiment of the present invention includes a system for imaging an object. The system may include a processor, at least one IO interface electrically coupled to the processor, at least one detector electrically coupled to the processor via the IO interface, the detector configured to acquire at least a first image and a second image of the object, the images being measurement-diverse and a memory device electrically coupled to the processor. The memory device may also include processor-readable code configured to instruct the processor to estimate parameters to represent the object and phase and amplitude aberrations present in the acquired images and calculate a measure of likelihood that the estimated parameters correspond to the object and the phase and amplitude aberrations in the acquired images. Further, the code may be configured to instruct the processor to repeat the steps of estimation and calculation until the measure of likelihood is substantially maximized, wherein the estimated parameters are adjusted prior to each repetition and whereby the estimation, calculation and repetition create an estimated image of the object.
Another alternative embodiment of the present invention includes a method for imaging an object including the steps of illuminating the object with a laser having a short coherence length, acquiring image data for the object and processing the acquired image data to obtain an estimate of the phase and amplitude aberrations present in the acquired image data. The image data may be acquired so as to have measurement diversity.
These and other objects and advantages of the invention will be apparent from the following description, the accompanying drawings and the appended claims.
While the specification concludes with claims particularly pointing out and distinctly claiming the present invention, it is believed the same will be better understood from the following description taken in conjunction with the accompanying drawings, which illustrate, in a non-limiting fashion, the best mode presently contemplated for carrying out the present invention, and in which like reference numerals designate like parts throughout the Figures, wherein:
The present disclosure will now be described more fully with reference to the Figures in which various embodiments of the present invention are shown. The subject matter of this disclosure may, however, be embodied in many different forms and should not be construed as being limited to the embodiments set forth herein.
In one embodiment of the present invention, the detector arrays 50 and 60 may each comprise a conventional CCD array. However, it is contemplated that any type of optical sensor such as, but not limited to, CMOS devices, vidicons, scanning sensors, microbolometers and film may also be used in the system of the present invention. One skilled in the art will appreciate that, although not shown, detector arrays 50 and 60 may incorporate a mechanism for limiting the exposure time such as, for example, a physical or an electronic shutter. Further, the images may be collected simultaneously using a mechanical or electronic synchronization mechanism known to one of skill in the art.
Additionally, while a beam splitter 40 is illustrated in
A single object 10 to be imaged using the system 100 is illustrated in
While the figures illustrate the use of an optical system for acquiring images using incoherent light, it is contemplated that acoustic signals may also be used in lieu of optical signals. For example, in embodiments of the present invention utilizing acoustic signals, the optical components would obviously be replaced with acoustic components (such as acoustic detectors and the like). However, the processing of the received signals will remain substantially the same, as discussed below, with any differences being obvious to one of skill in the art.
The object 10 to be imaged may be any extended or localized object or objects capable of being imaged by the system 100 and located at any distance from the system so that the system is capable of receiving light reflected by the object. In addition to objects visible to the unaided human eye, the system may also be used for imaging objects such as, but not limited to, objects that are too distant (telescopic) or too small (microscopic) to be seen with the unaided eye, objects that are defined at a wavelength outside of the visible region (such as x-ray, ultraviolet, infrared, millimeter wave, microwave or radio wave) or objects defined by acoustic properties.
Further, as discussed above, image data collected from an incoherently illuminated object tends to be degraded by phase and amplitude aberrations. It should be noted that, although the sources 20 of these aberrations are illustrated generally in the figures, they may represent time-varying aberrations induced by atmospheric turbulence or by mechanical instability in the system used to collect the images or any other mechanism which distorts the received imagery.
In the embodiment of the present invention illustrated in
While the embodiment of the present invention shown in
In addition to acquiring phase-diverse images, it is contemplated that images may be collected in any manner, as long as the images have measurement-diversity. For example, a colored filter may be introduced in one or more of the channels (using, for example, a filter wheel, a dichroic beam splitter or a similar device) to create wavelength diversity between the received images. In addition, the amplitude of the image received in one of the channels may be altered relative to the other channels by using, for example, a pupil having a variable size, an amplitude plate or a similar device to create amplitude diversity between the received images. The collected images, each arising from different system perturbations, may be generally referred to as measurement-diverse images.
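As a concrete illustration of one diversity mechanism, a known defocus in one channel can be represented as a quadratic phase across the pupil. The grid size, unit-radius aperture, and peak-to-valley scaling below are illustrative assumptions, not values taken from the disclosure.

```python
import numpy as np

def defocus_phase(n, waves_pv):
    """Quadratic defocus phase screen over an n-by-n pupil grid, scaled to
    `waves_pv` waves of defocus peak-to-valley across the clear aperture:
    the known perturbation that creates phase diversity in one channel."""
    y, x = np.mgrid[-1:1:n * 1j, -1:1:n * 1j]
    r2 = x ** 2 + y ** 2
    pupil = r2 <= 1.0                    # unit-radius circular clear aperture
    theta = 2.0 * np.pi * waves_pv * r2  # radians; zero at center, max at edge
    return np.where(pupil, theta, 0.0), pupil
```

A screen like this would serve as the known diversity phase for the defocused channel, while an amplitude mask or colored filter would play the analogous role for amplitude or wavelength diversity.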
Where the phase and amplitude aberration sources 20 are fixed in time, a single pair of images (one image from each of the detectors 50 and 60 such as pair 152 illustrated in
In one embodiment of the present invention where time-varying aberrations are present, a first short-exposure image 152a of the object and a second short-exposure diversity image 152 of the same object corresponding to the first aberration realization may be collected. Additional pairs 153-158, taken at later points in time with different aberration realizations, may then be collected. Once the image data have been collected, they may be digitized as shown at step 160 in
The digitization 160 of the image data collected by the first detector 50 and the second detector 60 may be accomplished with a first conventional analog-to-digital (“A/D”) converter 70 and a second conventional A/D converter 80, attached to the first detector 50 and the second detector 60, respectively. However, it is also contemplated that a single A/D converter may receive the image data from each of the detectors and may perform the digitization. Alternatively, the image data may be digitized by the detector array or the A/D converter may be a part of the CPU 90. Further, the digitization may be performed by any one of a number of commercially available devices which may be utilized to digitize images captured on film or some other analog medium. The type of digitization will, of course, be a function of the type of detectors used for collection, as discussed above.
Once the image data have been digitized, they may then be transmitted to, and received by, a processor-based system 90. The system 90 may then store the image data for later use, process the image data, or display the image data. The storage, processing and display of the image data are discussed in detail below with reference to
In addition to the two-channel system illustrated in
In the system illustrated in
Once the image data have been collected using the system illustrated in
The memory 140 may include data structure 142 for storing data and one or more software algorithms 144. It should be noted that the system 90 may take the form of any system capable of processing the data received by the system of the present invention discussed above with respect to
As shown in
The processor 130 may be configured to run software algorithms 144 for performing the processing steps (illustrated as 170 in
Once the processor 130 processes the received data as discussed in detail below, the results of the processing may be stored in memory 140 or output to a display 148. Additionally, the outputs may be sent to, or accessed by, a separate system for use in further processing including, but not limited to, controls processing used to correct aberrations by commanding actuators in a feed-back loop in adaptive optics embodiments.
The processing 170 steps discussed below reference images collected having phase diversity. However, it should be realized that, depending on the data collection technique used, the method of the present invention may slightly differ to account for changes such as, for example, differing types of measurement diversity, multiple channels and series of measurement-diverse image data (such as measurement-diverse speckle data). However, any such changes in the method discussed below will involve only routine skill in the art and will be known to one of skill in the art.
Further, one of skill in the art will recognize that, once the image data are received, the processing steps may be performed according to the method discussed below. The method may be performed to obtain an image which is as close to a diffraction-limited image of an object, or objects, as possible while accounting for both phase and amplitude aberrations due to the atmosphere and/or the imaging system. According to one embodiment of the present invention, a model-based approach may be utilized to accomplish the joint estimation of the object (which may be common to all collected images) and the phase and amplitude aberrations for each aberration realization. Accordingly, an incoherent imaging model may be constructed to characterize the dependence of the imagery upon the object and the optical system, including aberrations due to the system and/or the atmosphere. It may then be possible to determine a probability density function for the collected imagery, given the object and aberrations. The functional form of the probability density function may be interpreted as a likelihood function for a given data set, a candidate object and an aberration estimate. The goal is to vary the object and aberration estimates in order to maximize the likelihood function, yielding a maximum-likelihood estimate. Once the likelihood function is maximized, the object and aberration estimates are a close approximation to the actual object and aberrations.
To jointly estimate the object and the aberrations, a Coherent Transfer Function (“CTF”) for the kth diversity channel may be modeled. This transfer function may be given by:
Hk(u; α, β) = C(u; β) e^(i[φ(u; α) + θk(u)])  (Equation 1)

where k is the index of the phase-diversity channel, C(u; β) is the amplitude of the CTF parameterized by the coefficient vector β, φ(u; α) is the unknown phase aberration parameterized by the coefficient vector α, and θk(u) is the known diversity phase applied in the kth channel.

In Equation 1, the phase and amplitude aberrations may each be expressed as a weighted sum of appropriate basis functions. These expansions may be given by:

φ(u; α) = Σj αj ψj(u)  (Equation 2)

C(u; β) = Σj βj χj(u)  (Equation 3)

where ψj and χj are the basis functions for the phase and the amplitude, respectively, and αj and βj are the corresponding weights.
These equations may represent the phase aberration (Equation 2) and the amplitude aberration (Equation 3) used in the CTF of Equation 1.
Next, given the CTF (Equation 1), an incoherent Point-Spread Function (“PSF”) for the kth channel may be modeled as being proportional to the squared magnitude of the inverse Discrete Fourier Transform (“DFT”) of the CTF. Thus, the PSF may be modeled as:

sk(x; α, β) ∝ |DFT⁻¹{Hk(u; α, β)}|²  (Equation 4)
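The CTF of Equation 1 and the resulting incoherent PSF can be sketched in a few lines; the inputs (pupil amplitude, phase screens) are whatever toy arrays the caller supplies, and the unit-volume normalization is an assumed convenience rather than part of the disclosure.

```python
import numpy as np

def ctf(C, phi, theta_k):
    """Hk(u) = C(u) * exp(i[phi(u) + theta_k(u)]): pupil amplitude C,
    unknown phase aberration phi, and known diversity phase theta_k."""
    return C * np.exp(1j * (phi + theta_k))

def psf(H):
    """Incoherent PSF for one channel: squared magnitude of the inverse
    DFT of the CTF, normalized to unit volume so it acts as a blur kernel."""
    s = np.abs(np.fft.ifft2(H)) ** 2
    return s / s.sum()
```

For an unaberrated, uniform pupil this reduces to a diffraction-limited PSF; inserting a nonzero phi or a tapered C degrades it accordingly.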
Now, given a function for the object f(x), the image data for the kth diversity channel may be modeled as:

dk(x) = f(x) ∗ sk(x; α, β) + n(x)  (Equation 5)

where ∗ denotes convolution and n is a Gaussian random variable which may account for any unknown noise. It should be noted that n may be considered to have uniform variance across all detector elements, representing additive Gaussian noise. However, as will be appreciated by those skilled in the art, other noise models may be utilized including, but not limited to, a signal-dependent Poisson noise model or mixture noise models that account for both signal-dependent and additive noise.
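The forward model above, object convolved with the channel PSF plus additive Gaussian noise, can be simulated as follows. The circular convolution via DFTs and the scalar noise level sigma are illustrative assumptions.

```python
import numpy as np

def simulate_channel(f, s, sigma=0.0, rng=None):
    """dk(x) = (f * sk)(x) + n(x): circular convolution of the object f
    with the channel PSF s via DFTs, plus zero-mean additive Gaussian
    noise of standard deviation sigma (the uniform-variance noise model)."""
    d = np.real(np.fft.ifft2(np.fft.fft2(f) * np.fft.fft2(s)))
    if sigma > 0.0:
        rng = np.random.default_rng() if rng is None else rng
        d = d + rng.normal(0.0, sigma, size=f.shape)
    return d
```

Swapping the Gaussian draw for a Poisson draw on the blurred intensity would give the signal-dependent noise model mentioned above.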
Thus, dk(x) may represent the data associated with the kth channel if the object and aberrations (phase and amplitude) were accurately estimated. As discussed above, these data may be compared to the actual received data for each channel and aberration realization to determine the likelihood that the object and aberration estimates are consistent with the collected data. Constrained, likelihood-based estimators are preferably utilized for this purpose. In one embodiment, a constrained maximum-likelihood estimation under a photon-limited noise model, where a non-negativity object constraint is enforced, may be utilized to develop the object and aberration estimates. In another embodiment, the constrained estimation may ensure that the aberration amplitude is non-negative. In yet another embodiment, a constrained maximum-likelihood estimation under an additive Gaussian noise model may be utilized. Again, however, it will be appreciated by those skilled in the art that other constrained likelihood-based estimators, utilizing various noise models and constraints appropriate to the specific environmental and equipment characteristics of a particular system, may also usefully be employed for this purpose. Further, the particulars of any prior knowledge about the estimated parameters may also be useful in selecting an appropriate estimator.
In one embodiment of the present invention, the maximum-likelihood estimation may be accomplished by maximizing the regularized reduced-Gaussian objective function given by:

L(α, β) = Σu |Σk Dk(u) Sk*(u; α, β)|² / (Σk |Sk(u; α, β)|² + τ)  (Equation 6)

where Dk(u) is the DFT of the actual image data in the kth diversity channel, Sk(u; α, β) is the Optical Transfer Function (the DFT of the PSF sk) implied by the current aberration estimate, the asterisk denotes complex conjugation, and τ is a regularization parameter.
The goal is to determine the maximum-likelihood solution for both the phase and amplitude parameters which most closely represent the phase and amplitude aberrations seen in the actual image data. The aberration parameters may be iteratively varied to determine the maximum value of the reduced-Gaussian objective function expressed in Equation 6. In one embodiment of the present invention, a closed-form expression for the gradient of Equation 6 may be derived which may greatly aid in the iterative search by providing direction as to what parameters to use to represent the aberrations.
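Evaluating a regularized reduced-Gaussian objective of the kind named above can be sketched in the Fourier domain as follows. The array layout (channel-first stacks of DFT planes) and the default regularization value are assumptions of this sketch, not details from the disclosure.

```python
import numpy as np

def reduced_gaussian_objective(D, S, tau=1e-6):
    """Sum over spatial frequency u of |sum_k Dk(u) Sk*(u)|^2 divided by
    (sum_k |Sk(u)|^2 + tau), where D[k] is the DFT of the data in the kth
    diversity channel and S[k] is the OTF implied by the current aberration
    estimate. Larger values indicate estimates more consistent with the
    data; tau regularizes frequencies where the OTFs are weak."""
    num = np.abs((D * np.conj(S)).sum(axis=0)) ** 2
    den = (np.abs(S) ** 2).sum(axis=0) + tau
    return float((num / den).sum())
```

An iterative search would recompute the OTF stack S from each new aberration parameter guess and accept steps that increase this value, ideally guided by an analytic gradient.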
Thus, the systems and methods of the present invention may provide for fine-resolution imaging and wavefront sensing utilizing measurement-diversity concepts. The use of these concepts, in conjunction with the estimation of the object and phase and amplitude aberrations utilizing constrained likelihood-based estimators involving an incoherent imaging model may yield an improved, fine-resolution estimation of the object and the aberrations. It will also be appreciated by one of skill in the art that the method of the present invention, particularly the joint object and aberration estimation techniques, may be utilized to perform post-detection correction of images obtained using imaging systems such as the imaging systems discussed in detail above. In addition, one of skill in the art will appreciate that pre-detection correction may also be accomplished using timely actuation to correct for phase and/or amplitude aberrations.
As will be readily apparent to one of skill in the art, the systems and method of the present invention may be utilized in many types of imaging systems. Additionally, the systems and method of the present invention may be utilized to image the pupil of an imaging system in conjunction with, or instead of, imaging an extended or a localized object.
A second example of an application of the present invention is shown in
Historically, conventional cameras have been used to image the pupil of the as-built system. However, this may result in the design and implementation of a complex and/or expensive auxiliary sensor (pupil camera) that is also subject to design tolerances and may divert light from the primary imaging function of the system. The present invention, however, may eliminate the need for these auxiliary sensors (pupil cameras) because the estimation method of the present invention may provide an image of the clear-aperture of the system.
As illustrated in
A beam splitter 310 (or an equivalent device such as, for example, a silvered mirror or a birefringent prism) may be arranged along the optical axis 315 of the imaging system. A second diversity detector 350, such as the type discussed with reference to
Thus, the detectors 340, 350 may be configured to receive the diverse optical signals and estimate an image of the pupil of the telescope 305 from the acquired images, as illustrated at 330. This image may then be used in calibration of the telescope 305 or any additional measurements which a user may desire to obtain from the telescope 305 where the clear-aperture of the telescope must be known. This estimation of the clear-aperture may be performed by a CPU using the estimation method of the present invention discussed above.
The foregoing descriptions of specific embodiments of the present invention are presented for purposes of illustration and description. They are not intended to be exhaustive or to limit the invention to the precise forms disclosed. Obviously, many modifications and variations are possible in view of the above teachings. While the embodiments were chosen and described in order to best explain the principles of the invention and its practical applications, thereby enabling others skilled in the art to best utilize the invention, various embodiments with various modifications as are suited to the particular use are also possible. The scope of the invention is to be defined only by the claims appended hereto, and by their equivalents.
|Cited Patent||Filing date||Publication date||Applicant||Title|
|US4309602||Nov 1, 1979||Jan 5, 1982||Eikonix Corporation||Wavefront sensing by phase retrieval|
|US4518854||Jun 17, 1982||May 21, 1985||Itek Corporation||Combined shearing interferometer and Hartmann wavefront sensor|
|US4682025||Oct 14, 1986||Jul 21, 1987||Trw Inc.||Active mirror wavefront sensor|
|US5120128||Jan 14, 1991||Jun 9, 1992||Kaman Aerospace Corporation||Apparatus for sensing wavefront aberration|
|US5384455||Apr 12, 1993||Jan 24, 1995||Environmental Research Institute Of Michigan||Measurement-diverse speckle imaging|
|US5689335||Oct 10, 1995||Nov 18, 1997||The Regents Of The University Of California||Apparatus and method for heterodyne-generated two-dimensional detector array using a single element detector|
|US6130419||Jun 25, 1999||Oct 10, 2000||Wavefront Sciences, Inc.||Fixed mount wavefront sensor|
|US6429415||Dec 26, 2001||Aug 6, 2002||Geoffrey B. Rhoads||Wide field imaging through turbulent media|
|US6452146||Jul 26, 2001||Sep 17, 2002||The United States Of America As Represented By The Secretary Of The Air Force||Electro-optical field conjugation system|
|US6532073||Sep 4, 2001||Mar 11, 2003||Fuji Photo Optical Co., Ltd.||Fringe analysis error detection method and fringe analysis error correction method|
|US6570143||Sep 23, 1999||May 27, 2003||Isis Innovation Limited||Wavefront sensing device|
|US6639683||Oct 17, 2000||Oct 28, 2003||Remy Tumbar||Interferometric sensor and method to detect optical fields|
|US6674519||Dec 21, 2001||Jan 6, 2004||Northrop Grumman Corporation||Optical phase front measurement unit|
|US6683291||Nov 14, 2001||Jan 27, 2004||The United States Of America As Represented By The Secretary Of The Air Force||Optimal beam propagation system having adaptive optical systems|
|US6707020||Dec 28, 2000||Mar 16, 2004||Mza Associates Corporation||Adaptive dynamic range wavefront sensor|
|US6727992||Jun 24, 2002||Apr 27, 2004||Zygo Corporation||Method and apparatus to reduce effects of sheared wavefronts on interferometric phase measurements|
|US6781701||Apr 5, 2002||Aug 24, 2004||Intel Corporation||Method and apparatus for measuring optical phase and amplitude|
|US6787747||Sep 24, 2002||Sep 7, 2004||Lockheed Martin Corporation||Fast phase diversity wavefront correction using a neural network|
|US6818876||Jun 15, 1999||Nov 16, 2004||B. F. Goodrich Company||Scintillation-immune adaptive optics reconstructor|
|US6819435||Apr 9, 2001||Nov 16, 2004||Nano Or Technologies Inc.||Spatial and spectral wavefront analysis and measurement|
|US6833906||May 25, 2000||Dec 21, 2004||Canon Kabushiki Kaisha||Projection exposure apparatus, and device manufacturing method using the same|
|US6847456||Apr 27, 2001||Jan 25, 2005||Massachusetts Institute Of Technology||Methods and systems using field-based light scattering spectroscopy|
|US6924899||May 31, 2002||Aug 2, 2005||Optical Physics Company||System for measuring wavefront tilt in optical systems and method of calibrating wavefront sensors|
|US20020001088||Feb 23, 2001||Jan 3, 2002||Ulrich Wegmann||Apparatus for wavefront detection|
|US20040056174 *||Sep 24, 2002||Mar 25, 2004||Specht Donald Francis||Fast phase diversity wavefront correction using a neural network|
|US20040190002||Dec 24, 2003||Sep 30, 2004||Carl Zeiss Smt Ag||Interferometer system, method for recording an interferogram and method for providing and manufacturing an object having a target surface|
|US20050007603||Oct 16, 2002||Jan 13, 2005||Yoel Arieli||Spatial wavefront analysis and 3d measurement|
|US20050046857||Aug 26, 2003||Mar 3, 2005||Bingham Philip R.||Spatial-heterodyne interferometry for transmission (SHIFT) measurements|
|US20050112784||Sep 22, 2004||May 26, 2005||Genicon Sciences Corporation||Analyte assay using particulate labels|
|US20050151960||Jan 12, 2004||Jul 14, 2005||The Boeing Company||Scintillation tolerant optical field sensing system and associated method|
|1||Brady, G. et al.; "Retrieval of complex field using nonlinear optimization", The Institute of Optics, University of Rochester, Rochester, NY 14627, Optical Society of America, 2005, pp. 1-3.|
|2||Brady, G., et al.; "Nonlinear optimization algorithm for retrieving the full complex pupil function", Optics Express, Jan. 23, 2006, pp. 474-486, Vol. 14, No. 2, OSA 2006.|
|3||Jefferies, S. et al; "Sensing wave-front amplitude and phase with phase diversity" Applied Optics, Apr. 10, 2002, vol. 41, No. 11, pp. 2095-2102, Optical Society of America 2002.|
|4||May 2007, PCT Search Report.|
|5||Paxman, R. et al.; "Joint estimation of object and aberrations by using phase diversity" J. Opt. Soc. Am., Jul. 1992, pp. 1072-1085, vol. 9, No. 7, Optical Society of America 1992.|
|Citing Patent||Filing date||Publication date||Applicant||Title|
|US7646419 *||Nov 2, 2006||Jan 12, 2010||Honeywell International Inc.||Multiband camera system|
|US8415599 *||Nov 7, 2008||Apr 9, 2013||Thales||Device for measuring the defects of an imaging instrument with two opto-electronic sensors|
|US9024239 *||Sep 13, 2013||May 5, 2015||Thales||Optic instrument with wavefront analyser|
|US20080106727 *||Nov 2, 2006||May 8, 2008||Honeywell International Inc.||Multiband camera system|
|US20100278378 *||Nov 7, 2008||Nov 4, 2010||Thales||Device for Measuring the Defects of an Imaging Instrument with Two Opto-Electronic Sensors|
|US20140077064 *||Sep 13, 2013||Mar 20, 2014||Thales||Optic instrument with wavefront analyser|
|U.S. Classification||250/201.9, 250/201.3|
|Cooperative Classification||H01L27/14625, H04N5/3572, G02B26/06, G01J9/00|
|European Classification||G02B26/06, G01J9/00, H04N5/357A|
|Jun 5, 2006||AS||Assignment|
Owner name: GENERAL DYNAMICS ADVANCED INFORMATION SYSTEMS, INC
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:PAXMAN, RICHARD G.;SELDIN, JOHN H.;REEL/FRAME:017955/0271
Effective date: 20060605
|Nov 12, 2012||FPAY||Fee payment|
Year of fee payment: 4
|Aug 26, 2015||AS||Assignment|
Owner name: MDA INFORMATION SYSTEMS LLC, MARYLAND
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:GENERAL DYNAMICS ADVANCED INFORMATION SYSTEMS, INC.;REEL/FRAME:036423/0830
Effective date: 20150824
|May 17, 2016||FPAY||Fee payment|
Year of fee payment: 8