|Publication number||US5600123 A|
|Application number||US 08/018,884|
|Publication date||Feb 4, 1997|
|Filing date||Feb 17, 1993|
|Priority date||Feb 17, 1993|
|Original Assignee||Litton Systems, Inc.|
The Government has rights in this invention pursuant to contract No. DASG 60-90-C-0038 awarded by the Department of the Army.
Tracking systems for optically measuring the location or position of an object within a field of view have been developed for numerous applications. These tracking systems typically require the measurement of very small changes in the angle of incidence of light from the object being tracked. Autocollimators, for example, are optical instruments used for measuring the movement of a distant reflecting surface. The basic principle of the autocollimator is that of observing the coincidence of an illuminated reticle or slit with its own image reflected from the distant reflector back through the instrument. When coincidence is observed, the distant reflector is accurately positioned perpendicular to the optical axis of the instrument. If the distant reflector is not perpendicular to the optical axis of the instrument, the reflected light will be displaced from the slit. The displacement of the reflected light is a measure of the angular deviation of the returning light beam, which can be correlated with the angular deviation of the distant reflector.
The use of available autocollimators to provide tracking of the movement of objects has many limitations. Autocollimators are generally relatively large and thus may be inappropriate for applications having size constraints. Additionally, with a 0.5 microradian resolution and a 2 microradian repeatability, the accuracy and repeatability of available autocollimators are not suitable for extremely high precision applications. Furthermore, autocollimators typically require several seconds for a single measurement. Also, the noise level of present autocollimators may be too high for certain applications.
Accordingly, there is a continuing need for an improved optical tracking system that is compact and provides enhanced accuracy, resolution and operational speed at low noise levels.
The present invention provides an optical tracking system used to measure the movement of objects being tracked. The optical tracking system includes a scanning mirror which receives an image within the field of view scanned by the mirror. The scanning mirror is mounted on an actuator which controls the movement of the mirror so that it can be positioned within, or scanned across, the field of view in any desired manner. An optical system optically couples the image from the scanning mirror to an image sensor, such as a charge coupled device, so that the reflected image can be recorded.
Coupled to the scanning mirror is a light dispersive element. Movement of the scanning mirror is thereby correlated with movement of the dispersive element. A light source such as a laser is positioned to direct a beam of radiation to the dispersive element. The dispersive element disperses the light from the light source into a pattern of light. A portion of the light pattern is directed onto the sensor and is within the field of view of the scanning mirror.
A preferred embodiment of the invention utilizes a control system including a data processor with a memory suitable for storing images in digital form and a controller, such as an application specific integrated circuit, that is used to control the light source and the scanning mirror, receive information from system sensors and monitor the relative angle of the scanning mirror to an inertial measurement unit (IMU) beam. The IMU beam serves as a reference light source.
The optical tracking system of the present invention operates considerably faster, more accurately, with increased repeatability and produces less noise than known existing autocollimators. Additionally, the use of the light dispersive element increases the useful field of view in which to monitor the movement of the scanning mirror.
The foregoing and other objects, features and advantages of the invention will be apparent from the following more particular description of preferred embodiments of the invention, as illustrated in the accompanying drawings, in which like reference characters refer to the same parts throughout the different views. The drawings are not necessarily to scale, emphasis instead being placed upon illustrating the principles of the invention.
FIG. 1 is a schematic drawing of the optical components and control system of a preferred embodiment of the invention.
FIG. 2 is a schematic of a light pattern and positioning of the image within the pattern for a preferred embodiment of the invention.
FIG. 3 is a schematic process flow diagram illustrating the tracking method of a preferred embodiment of the invention.
In FIG. 1, an optical tracking system 100 in accordance with the invention utilizes a rotatable scanning mirror 12 for receiving target images. An actuator 14 is coupled to scanning mirror 12 and actuates movement of mirror 12, to scan a field of view in which to find a target image. In the preferred embodiment, scanning mirror 12 has at least a 4° field of view. Alternatively, scanning mirror 12 may have fields of view of differing sizes.
Light from target image 48 is received by scanning mirror 12 and is reflected onto primary mirror 38 of optical system 10. Primary mirror 38 receives image light 48 from scanning mirror 12 and reflects the image light 48 towards secondary mirror 40. Secondary mirror 40 receives and redirects image light 48 through aperture 42 which is centered on primary mirror 38. Image light 48 is then directed by reflecting mirror 36 into the field of view of a sensor, such as charge coupled device (CCD) camera 20.
In the preferred embodiment, CCD camera 20 is 7 mm long and is placed at the focus of a 2 meter focal length, 2 cm aperture, diffraction limited lens, which results in a 0.2° field of view. CCD camera 20 has 500 light detector or sensing elements (sometimes referred to as picture elements, or simply "pixels") with an average spacing of 14 microns. The light passing through the lens of CCD camera 20 forms a disc pattern of 154 microns for a 0.633 micron light source, or a size of 11 pixels. The 2 meter focal length lens of CCD camera 20 provides a pixel spacing of 7 microradians. Alternatively, other cameras of different sizes, characteristics and attributes may be used. Additionally, a second CCD camera may also be used in conjunction with CCD camera 20.
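The camera geometry quoted above is internally consistent, as a short calculation shows (an illustrative sketch, not part of the patent; it applies the standard 2.44·λ·f/D Airy-disc diameter to the values in the text):

```python
import math

# Worked check of the camera geometry described above.
focal_length = 2.0       # m, lens focal length
aperture = 0.02          # m, lens aperture diameter
pixel_pitch = 14e-6      # m, average spacing of the 500 CCD pixels
wavelength = 0.633e-6    # m, light source wavelength

# Angular subtense of one pixel: pitch / focal length.
pixel_angle_urad = pixel_pitch / focal_length * 1e6
print(round(pixel_angle_urad, 3))  # 7.0 microradians per pixel

# Camera field of view: 500 pixels at that angular subtense.
fov_deg = math.degrees(500 * pixel_pitch / focal_length)
print(round(fov_deg, 2))           # ~0.2 degrees

# Diffraction-limited spot (Airy disc) diameter: 2.44 * lambda * f / D.
disc = 2.44 * wavelength * focal_length / aperture
print(round(disc * 1e6))           # ~154 microns
print(round(disc / pixel_pitch))   # ~11 pixels across the disc
```

Each printed value matches the corresponding figure given for the preferred embodiment.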
An inertial measurement unit (IMU) beam 46 serves as a reference point from which the position of scanning mirror 12 is determined. The position of IMU beam 46 remains constant; therefore, the position of scanning mirror 12 is determined by monitoring the movement of scanning mirror 12 relative to IMU beam 46. IMU beam 46 originates from a laser in a fixed position located external to optical tracking system 100. Light from IMU beam 46 passes through hole 16 within scanning mirror 12 and enters optical system 10. Within optical system 10, IMU beam 46 is reflected by primary mirror 38 towards secondary mirror 40. Secondary mirror 40 redirects IMU beam 46 out of optical system 10 through aperture 42. IMU beam 46 then is deflected by reflecting mirror 36 to CCD camera 20. Alternatively, IMU beam 46 may be directed upon CCD camera 20 without passing through hole 16, optical system 10 or reflecting mirror 36.
Referring to both FIGS. 1 and 2, a dispersive element 18 such as a two dimensional reflective grating provides CCD camera 20 with a pattern of light 50 on focal plane 60.
In the preferred embodiment, a laser 26 produces the pattern of light 50. Referring to FIG. 1, laser 26 produces beam 44, which passes through lens 32 and deflects off reflecting mirror 34. Reflecting mirror 34 directs beam 44 into optical system 10 through aperture 42. Beam 44 is directed towards secondary mirror 40 and is redirected to primary mirror 38. Primary mirror 38 then receives and directs beam 44 to dispersive element 18. Dispersive element 18 reflects beam 44 back into optical system 10 as a rectangular pattern of light 50 (FIG. 2). Primary mirror 38 receives the pattern of light 50 and redirects it to secondary mirror 40. Secondary mirror 40 reflects the pattern of light 50 through aperture 42 and out of optical system 10. The pattern of light 50 is then directed to CCD camera 20 by reflecting mirror 36.
Controller 28 receives information from CCD camera 20, photo-potentiometer (photopot) 22, laser 26 and actuator 14. Information received by controller 28 is processed by a data processor such as digital computer 30. Once the information is processed by computer 30, controller 28 issues commands regarding the operation of actuator 14, CCD camera 20, laser 26 and photopot 22. The information which photopot 22 and CCD camera 20 deliver to controller 28 is used in determining the centers of diffracted orders 52. Computer 30 processes this information and determines the center of diffracted orders 52.
Referring to FIGS. 1 and 2, the pattern of light 50 provided by dispersive element 18 is made up of a grid of diffracted orders 52 (spots) and extends across the 4° field of view of scanning mirror 12. CCD area 54 is in the field of view of CCD camera 20. At least one diffracted order 52 is within CCD area 54 at a given time. Also within CCD area 54 is an image of IMU beam 46 (not shown). The pixels in CCD camera 20 receive the light of diffracted orders 52. The intensity of the light received by each pixel is converted into pixel intensity signals.
A centroid calculation is accomplished by applying coordinate-dependent weights, defined by a Weighted Pixel Algorithm (WPA), to digitized pixel intensity signals and summing the weighted intensities. The weighting may be viewed as a multiplication of the digitized pixel intensity signals by an array of constant weights. The sum of the weighted intensities is divided by the sum of the unweighted intensities to yield the centroid of the diffracted order 52. The position of the center of any diffracted order 52 on focal plane 60 can therefore be determined by the centroid software.
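The weighted-sum-over-unweighted-sum computation described above can be sketched as follows (a minimal illustration of intensity-weighted centroiding, not the patent's actual WPA implementation; the 3×3 window contents are hypothetical):

```python
def centroid(window):
    """Intensity-weighted centroid of a 2-D pixel window.

    Each digitized pixel intensity is weighted by its pixel
    coordinates; the weighted sums are divided by the total
    (unweighted) intensity.  Returns (row, col) in fractional
    pixels, so the spot center is located to sub-pixel accuracy.
    """
    total = weighted_r = weighted_c = 0.0
    for r, row in enumerate(window):
        for c, intensity in enumerate(row):
            total += intensity
            weighted_r += r * intensity
            weighted_c += c * intensity
    return weighted_r / total, weighted_c / total

# Hypothetical window with a spot slightly right of the center pixel.
spot = [[0, 1, 0],
        [1, 4, 2],
        [0, 1, 0]]
print(centroid(spot))  # (1.0, 1.111...) -- center row, just right of center column
```

The fractional column value illustrates how the method resolves spot motion far smaller than one pixel, which is what makes the 0.045-pixel accuracy figure plausible.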
An image processing board within computer 30 running the centroid software determines the center of a diffracted order 52 within 0.045 pixels or 0.35 microradians standard deviation error. The sensitivity can be varied by changing the parameters. By centroiding upon a diffracted order 52 within CCD area 54, the position of scanning mirror 12 can be determined in relation to IMU beam 46. The centroiding method provides a 0.35 microradian rms repeatability while taking 15 samples per second. Noise is reduced by averaging.
The movement of scanning mirror 12 produces a corresponding movement in the diffracted orders 52 in relation to CCD area 54. When a diffracted order 56 which is being centroided moves to the edge of CCD area 54, an exchange is made for another diffracted order 58 to accommodate the new position of scanning mirror 12. The movement results in a new mirror position and the center of the new diffracted order 58 is determined.
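The exchange of one diffracted order for another can be sketched as follows (a hypothetical helper, not from the patent; the window size, margin, and spot coordinates are illustrative):

```python
def select_spot(spots, tracked, width, height, margin):
    """If the tracked spot has moved within `margin` pixels of the
    edge of the CCD area, exchange it for the visible diffracted
    order nearest the center; otherwise keep the current spot.
    `spots` holds (x, y) positions of orders on the focal plane.
    """
    x, y = tracked
    near_edge = (x < margin or x > width - margin or
                 y < margin or y > height - margin)
    if not near_edge:
        return tracked
    cx, cy = width / 2, height / 2
    return min(spots, key=lambda s: (s[0] - cx) ** 2 + (s[1] - cy) ** 2)

# Tracked spot has drifted to the right edge of a 500-pixel-wide area;
# hand off to the diffracted order nearest the center.
spots = [(480, 250), (220, 260)]
print(select_spot(spots, (480, 250), 500, 500, 30))  # (220, 260)
```

A real implementation would also record which grid order was selected, since that index carries the coarse mirror position.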
Photopot 22 (FIG. 1) receives signals from secondary mirror 40 through filters 24. The centroiding process speeds up the processing by looking only over small windows within the CCD image plane rather than over the whole pattern of light 50. Rough measurements are made using only one sample, increasing processing speed during fast moves. More accurate measurements are made by increasing the number of samples. A further increase in speed can be made by accessing the focal plane in parallel rather than in serial format.
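The windowing and sample-averaging trade-offs described above can be sketched as follows (illustrative helpers, not from the patent; `frame` is assumed to be a list of pixel rows):

```python
def roi(frame, center, half):
    """Extract a small window around a predicted spot position so
    that only those pixels are centroided, rather than the whole
    pattern of light.  Slicing clamps the window at sensor edges.
    """
    r0 = max(0, center[0] - half)
    c0 = max(0, center[1] - half)
    return [row[c0:c0 + 2 * half + 1]
            for row in frame[r0:r0 + 2 * half + 1]]

def average_samples(samples):
    """Average several (row, col) centroid samples: one sample for
    rough measurements during fast moves, more samples when noise
    must be reduced for an accurate measurement."""
    n = len(samples)
    return (sum(r for r, _ in samples) / n,
            sum(c for _, c in samples) / n)

frame = [[0] * 500 for _ in range(500)]
window = roi(frame, (250, 250), 5)   # 11x11 window around the spot
print(len(window), len(window[0]))   # 11 11
```

Processing an 11×11 window instead of a 500-pixel line is the source of the speed gain; averaging N samples reduces random centroid noise at the cost of measurement time.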
In the preferred embodiment, referring to FIG. 1, dispersive element 18 is mounted on scanning mirror 12 and comprises a flat return mirror with a two dimensional grating on the reflective surface. The grating of dispersive element 18 has +/-10 diffracted orders 52 and produces a rectangular pattern of light over a 4° field when a beam of light is directed at dispersive element 18. The spacing of the grating is set so that each diffracted order 52 (FIG. 2) extends less than 1 field of view of optical system 10. The grating is made with either amplitude modulation, phase modulation or both, so that the diffracted orders 52 (FIG. 2) are of equal intensity within the limits of CCD camera 20. Alternatively, dispersive element 18 may comprise a light source or sources coupled to scanning mirror 12 which produces the pattern of light 50 (FIG. 2).
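The grating design can be checked against the camera field of view with a short calculation (a sketch; it assumes the 21 orders per axis span the full 4° field, which the text implies but does not state explicitly):

```python
# +/-10 diffracted orders per axis gives 21 spots spread over the
# 4-degree field of scanning mirror 12, viewed by a CCD camera with
# a 0.2-degree field of view (figures from the embodiment above).
orders_per_axis = 21      # orders -10 .. +10
mirror_field_deg = 4.0
camera_fov_deg = 0.2

# Angular spacing between adjacent diffracted orders.
spacing_deg = mirror_field_deg / (orders_per_axis - 1)
print(spacing_deg)  # 0.2 degrees

# With the spacing no larger than the camera field of view, at least
# one diffracted order falls within CCD area 54 at any mirror position.
print(spacing_deg <= camera_fov_deg)  # True
```

This is consistent with the statement above that at least one diffracted order 52 is within CCD area 54 at a given time.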
FIG. 3 illustrates the steps which occur in the preferred embodiment of the method of tracking for the present invention. In step 1, the actuator moves the scanning mirror into position to scan the target image. In step 1a, which occurs simultaneously with step 1, the IMU beam is directed into the optical system. Steps 1b-1d also occur simultaneously with step 1. In step 1b, the laser source directs a beam of light at the optical system. In step 1c, the optical system receives the light from the laser source. In step 1d, the dispersive element receives the beam from the optical system and produces a pattern of light comprising a grid of diffracted orders.
In step 2, the optical system receives signals from steps 1, 1a and 1d. The signals received by the optical system in step 2 are the light of the target image from the scanning mirror (step 1), the IMU beam (step 1a), and the pattern of light from the dispersive element (step 1d).
In step 3, a charge coupled device camera receives the signals representing the target image light, the IMU beam and the light pattern from the optical system simultaneously. This information is forwarded to a controller and data processor which processes the information.
In step 4, the information processed by the controller and data processor is used to determine the position of the scanning mirror. A centroiding program is used to determine the position of a diffracted order within the field of view of the CCD camera in relation to the IMU beam. Once the position for a given diffracted order is determined, the position for the scanning mirror is found.
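For one axis, the position determination of step 4 can be sketched as a coarse term (which diffracted order is tracked) plus a fine term (that order's centroid offset from the IMU beam image). This is a hypothetical formulation, not the patent's actual computation; the 7 microradian pixel scale comes from the embodiment above, and the order spacing of about 3491 microradians (0.2°) is an assumed value:

```python
def mirror_angle_urad(order_index, order_spacing_urad,
                      spot_centroid_px, imu_centroid_px,
                      urad_per_pixel=7.0):
    """Recover one axis of the scanning-mirror angle.

    Coarse term: the index of the diffracted order being tracked
    times the angular spacing between orders.  Fine term: the
    tracked order's centroid offset from the IMU reference image,
    converted from pixels to microradians.
    """
    fine = (spot_centroid_px - imu_centroid_px) * urad_per_pixel
    return order_index * order_spacing_urad + fine

# Tracking order +2, with the spot centroided 1.5 pixels from the
# IMU beam image on the focal plane (hypothetical values).
print(mirror_angle_urad(2, 3491.0, 251.5, 250.0))  # 6992.5
```

Because the fine term is a difference of two positions on the same focal plane, common drifts of the camera cancel, which is why the IMU beam serves as the reference.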
While this invention has been particularly shown and described with reference to preferred embodiments thereof, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the invention as defined by the appended claims.
|Cited Patent||Filing date||Publication date||Applicant||Title|
|US3015249 *||Mar 14, 1949||Jan 2, 1962||Northrop Corp||Tracking telescope|
|US3285124 *||Oct 26, 1964||Nov 15, 1966||Kollmorgen Corp||High precision pointing interferometer with modified kosters prism|
|US3316799 *||Nov 7, 1962||May 2, 1967||Barnes Eng Co||Two axis autocollimator using polarized light|
|US3554653 *||Jan 24, 1968||Jan 12, 1971||Lawrence Zielke||Autocollimator|
|US4472054 *||Aug 5, 1981||Sep 18, 1984||Office National D'etudes Et De Recherches Aerospatiales||Method and apparatus for determining a parameter of attitude of a body|
|US4725138 *||May 22, 1985||Feb 16, 1988||Adaptive Optics Associates Incorporated||Optical wavefront sensing system|
|US4807977 *||Jun 26, 1986||Feb 28, 1989||Eltron Research, Inc.||Multi-color electrochromic cells having solid polymer electrolytes and a distinct electrochromic layer|
|US4952056 *||May 5, 1989||Aug 28, 1990||Entwicklungsgemeinschaft Asi||Method of determining the autocollimation angle of a grating coupler|
|US4952809 *||Jul 6, 1988||Aug 28, 1990||Gec-Marconi Limited||Imaging system|
|US5110210 *||Jul 23, 1990||May 5, 1992||Mcdonnell Douglas Corporation||Precision angle sensor|
|Citing Patent||Filing date||Publication date||Applicant||Title|
|US5821526 *||Jan 25, 1996||Oct 13, 1998||Itt Defense, Inc.||Star scanning method for determining the line of sight of an electro-optical instrument|
|US6630660||Aug 15, 2000||Oct 7, 2003||Bae Systems Plc||Image processing system and method for removing or compensating for diffraction spots|
|US7072032 *||Nov 20, 2003||Jul 4, 2006||Kabushiki Kaisha Topcon||Automatic tracking apparatus for reflector|
|US7274802 *||Nov 20, 2003||Sep 25, 2007||Kabushiki Kaisha Topcon||Automatic tracking apparatus for reflector|
|US20040100628 *||Nov 20, 2003||May 27, 2004||Kabushiki Kaisha Topcon||Automatic tracking apparatus for reflector|
|US20040101164 *||Nov 20, 2003||May 27, 2004||Kabushiki Kaisha Topcon||Automatic tracking apparatus for reflector|
|U.S. Classification||250/203.3, 356/121, 250/234|
|International Classification||G01S7/481, G01S17/46, G01C21/16|
|Cooperative Classification||G01S7/4811, G01S17/46, G01S7/4817, G01C21/16|
|European Classification||G01S7/481E, G01S17/46, G01C21/16, G01S7/481B|
|Dec 13, 1993||AS||Assignment|
Owner name: LITTON SYSTEMS, INC., MASSACHUSETTS
Free format text: STATEMENT OF OWNERSHIP WITH ATTACHED EMPLOYEE AGREEMENT.;ASSIGNOR:PURRAZZELLA, JOSEPH;REEL/FRAME:006801/0803
Effective date: 19931122
|Mar 25, 1996||AS||Assignment|
Owner name: HUGHES DANBURY OPTICAL SYSTEMS, INC., CONNECTICUT
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:LITTON SYSTEMS, INC.;REEL/FRAME:007945/0794
Effective date: 19960216
|Jun 19, 2000||AS||Assignment|
|Jun 22, 2000||AS||Assignment|
|Jul 18, 2000||FPAY||Fee payment|
Year of fee payment: 4
|Feb 13, 2001||AS||Assignment|
|Aug 4, 2004||FPAY||Fee payment|
Year of fee payment: 8
|Aug 4, 2008||FPAY||Fee payment|
Year of fee payment: 12
|Aug 11, 2008||REMI||Maintenance fee reminder mailed|