|Publication number||USRE39978 E1|
|Application number||US 09/111,978|
|Publication date||Jan 1, 2008|
|Filing date||Jul 8, 1998|
|Priority date||Jan 29, 1996|
|Also published as||DE69734504D1, DE69734504T2, EP0877914A1, EP0877914A4, EP0877914B1, US5646733, WO1997028421A1|
|Inventors||Leonard H. Bieman|
|Original Assignee||Ismeca Europe Semiconductor Sa|
This application is related to U.S. patent application entitled “Optical Measuring System” filed Jun. 17, 1994 and having U.S. Ser. No. 08/262,130.
This invention relates to non-invasive measuring methods and systems and, in particular, to scanning phase measuring methods and systems for an object at a vision station.
Height distribution of a surface can be obtained by projecting a light stripe pattern onto the surface and then reimaging the light pattern that appears on the surface. A powerful technique for extracting this information, based on taking multiple images (three or more) of the light pattern that appears on the surface while shifting the position (phase) of the projected light stripe pattern, is referred to as phase-shifting interferometry, as disclosed in U.S. Pat. Nos. 4,641,972 and 4,212,073.
The multiple images are usually taken using a CCD video camera with the images being digitized and transferred to a computer where phase shift analysis, based on images being used as “buckets,” converts the information to a contour map of the surface.
The techniques used to obtain the multiple images are based on methods that keep the camera and viewed surface stationary with respect to each other and moving the projected pattern.
A technique for capturing just one bucket image using a line scan camera is described in U.S. Pat. No. 4,965,665 but not enough information is available to do a phase calculation based on multiple buckets.
Other U.S. patents which show phase shifting include U.S. Pat. Nos. 5,202,749 to Pfister; 4,794,550 to Greivenkamp, Jr.; 5,069,548 to Boehnlein; and 5,307,152 to Boehnlein et al.
U.S. Pat. Nos. 5,398,113 and 5,355,221 disclose white light interferometry systems which profile surfaces of objects.
In the above-noted application, an optical measuring system is disclosed which includes a light source, gratings, lenses, and camera. A mechanical translation device moves one of the gratings in a plane parallel to a reference surface to effect a phase shift of a projected image of the grating on the contoured surface to be measured. A second mechanical translation device moves one of the lenses to effect a change in the contour interval. A first phase of the points on the contoured surface is taken, via a four-bucket algorithm, at a first contour interval. A second phase of the points is taken at a second contour interval. A control system, including a computer, determines a coarse measurement using the difference between the first and second phases. The control system further determines a fine measurement using either the first or second phase. The displacement or distance, relative to the reference plane, of each point is determined, via the control system, using the fine and coarse measurements.
An object of the present invention is to provide a method and system, including an optical head, for making an optical phase measurement of a viewed object by generating an image whose intensity is a function of position relative to the optical head, wherein the system is configured in a way which allows multiple images with different phase information to be acquired as the viewed object is moved in a direction perpendicular to the imaging system, and these multiple images are used to calculate a phase image that is proportional to the optical phenomenon that creates the phase change.
Another object of the present invention is to provide a method and system including an optical head for making a phase measurement of imagable electromagnetic radiation returned to a multi-line linear detector array by setting up the optics in the optical head in a manner such that a different phase value is imaged onto each line of the detector array such that each line of the detector array creates an image with a different optical phase value for the same point on the imaged object.
Yet still another object of the present invention is to provide a method and system including an optical head for scanning the height of a surface wherein the optical head includes a light stripe projector and imaging system where the projected pattern does not move relative to the imaging system and the optical head is configured in a way which allows multiple images with different phase information as the surface is moved with respect to the imaging system and these multiple images are used to calculate a phase image that is proportional to the height of the scanned surface.
In carrying out the above objects and other objects of the present invention, a method is provided for high speed scanning phase measuring of an object at a vision station to develop physical information associated with the object. The method includes the steps of projecting a pattern of imagable electromagnetic radiation with at least one projector and moving the object relative to the at least one projector at the vision station to scan the projected pattern of electromagnetic radiation across a surface of the object to generate an imagable electromagnetic radiation signal. The method also includes the steps of receiving the imagable electromagnetic radiation signal from the surface of the object with a detector having a plurality of separate detector elements and maintaining the at least one projector and the detector in fixed relation to each other. Finally, the method includes the steps of measuring an amount of radiant energy in the received electromagnetic radiation signal wherein the detector elements produce images having different phases of the same scanned surface based on the measurement and computing phase values and amplitude values for the different phases from the images.
In one embodiment, preferably the physical information is dimensional information and the imagable electromagnetic radiation is light.
In another embodiment, preferably the physical information is polarization information, the imagable electromagnetic radiation is polarized, a response of the detector elements is polarization-sensitive and the images are based on polarization from the surface.
Further in carrying out the above objects and other objects of the present invention, a system is provided for carrying out the above method steps.
The above objects and other objects, features, and advantages of the present invention are readily apparent from the following detailed description of the best mode for carrying out the invention when taken in connection with the accompanying drawings.
Referring now to
In general, the invention relates to the non-invasive three-dimensional measurement of surface contours using technology such as moire technology with a novel approach that allows continuous scanning of a surface. A more general adaptation of this approach allows the measurement of other optical parameters via the same scanning approach but with a different optical configuration.
The machine vision system 10 typically includes an image digitizer/frame grabber 22 electrically coupled to the optical head 12. The image digitizer/frame grabber 22 samples and digitizes the input images from an image source such as a camera contained within the optical head 12, as described in detail hereinbelow. The frame grabber 22 places each input image into a frame buffer having picture elements. Each of the picture elements may consist of an 8-bit number representing the brightness of that spot in the image.
The system 10 also includes a system bus 26 which receives information from the image digitizer/frame grabber 22 and passes the information on to an IBM-compatible host computer 28, such as a Pentium PC.
The system 10 may include input/output circuits 30 to allow the system 10 to communicate with one or more external peripheral devices such as a drive 31 or robots, programmable controllers, etc. having one or more stages. The drive 31 provides relatively uniform and continuous movement between the object 14 and the head 12. The I/O circuits 30 may support a three axis stepper board (i.e. supports multiple axis control) or other motion boards.
As illustrated in
The system bus 26 may be either a PCI, an EISA, ISA or VL system bus or any other standard bus.
The image digitizer/frame grabber 22 may be a conventional three channel color frame grabber board such as that manufactured by Imaging Technologies, or other frame grabber manufacturers. Alternatively, the image digitizer/frame grabber 22 may comprise a vision processor board such as made by Cognex.
The machine vision system 10 may be programmed at a mass storage unit 32 to include programs for image processing and/or image analysis, as described in greater detail hereinbelow.
A monitor 34 is also provided to display images.
Referring again to
Although taking images with movement in any direction could result in the ability to obtain phase shifts, only two specialized cases are discussed herein. The first case is movement of the object 14 in a direction 20 perpendicular to an optical axis of a lens 40 of the camera 24, thereby creating a camera image. The second case is movement of the object 14 in a direction parallel to the optical axis of the lens 40, thereby creating a second camera image.
As with CCD linear array scanning, the object 14 is moved in the direction 20 which is perpendicular to both the optical axis of the linear array camera lens 40 and the line of pixels in the linear array camera 24. Thus, as the linear array camera 24 is read out line by line, the image of the object 14 moving past is created row by row. Using the trilinear array camera 24 for scanning produces three images of the scanned surface 18 with each image being offset by a certain number of rows. This offset is a function of the spacing between arrays and the rate at which the image of the surface 18 is moved past the sensing elements 25.
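The row-offset relationship described above can be sketched as follows; this is an illustrative simulation (not part of the patent), with `offset_rows` standing in for the spacing between arrays divided by the scan rate:

```python
import numpy as np

def simulate_trilinear_scan(scene, offset_rows):
    """Sketch: a trilinear array scanned past a surface yields three images
    of the same scene, offset by 0, offset_rows and 2*offset_rows scan rows.
    scene is a 2-D array whose first axis is the scan direction."""
    n = scene.shape[0] - 2 * offset_rows  # rows seen by all three lines
    b1 = scene[0:n]
    b2 = scene[offset_rows:offset_rows + n]
    b3 = scene[2 * offset_rows:2 * offset_rows + n]
    return b1, b2, b3

# Row k of the second image corresponds to row k + offset_rows of the scene.
scene = np.arange(20.0).reshape(10, 2)
b1, b2, b3 = simulate_trilinear_scan(scene, offset_rows=2)
```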
The concept of scanning phase measuring of the present invention is analogous to the color sensing by the above-noted color trilinear array except the color filters are not present and each of the three scanning lines measures a different phase of the projected light pattern instead of the color.
In terms of phase shifting technology, each scanning line measures a different “bucket,” and a three “bucket” algorithm is used on the computer 28 for measuring the phase of the projected light pattern and this phase is proportional to the surface height of the object being scanned.
Before the phase is calculated from the readings at each of the scanning lines, the three scanned images are registered so that the phase information from each of the three buckets is from the same point on the scanned surface. The registration correction and the calculation of the phase could be continuous if the electronics can accommodate this mode of operation.
As described above, three scanning lines are utilized. However, there is no reason that more scanning lines cannot be used to increase the number of buckets used in the phase calculation or to average more than one scan line for a bucket. For example, if one had 16 scanning lines, the sum of lines 1 through 4 could be used for bucket 1, the sum of lines 5 through 8 could be used for bucket 2, the sum of lines 9 through 12 could be used for bucket 3, and the sum of lines 13 through 16 could be used for bucket 4.
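The 16-line example above can be sketched as a simple summation; the function name and array layout here are illustrative assumptions, not taken from the patent:

```python
import numpy as np

def lines_to_buckets(lines, lines_per_bucket=4):
    """Sum groups of adjacent scan lines into buckets: with 16 lines and
    lines_per_bucket=4, lines 1-4 form bucket 1, lines 5-8 bucket 2, etc.
    lines is indexed [scan line, pixel]."""
    lines = np.asarray(lines, dtype=float)
    n_buckets = lines.shape[0] // lines_per_bucket
    usable = lines[:n_buckets * lines_per_bucket]
    return usable.reshape(n_buckets, lines_per_bucket, -1).sum(axis=1)

# 16 scanning lines of 8 pixels each collapse to 4 bucket images.
buckets = lines_to_buckets(np.ones((16, 8)))
```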
Case 2 alluded to above would most likely use a CCD area array in the optical head 12 but could use a linear array or single point photodetector. In this case, as the surface 18 is moved toward or away from the optical head 12, images are taken as the phase of the projection changes. The analysis would consist of correcting for registration between images and then using the images to create the buckets needed for the phase calculation. If the camera images telecentrically or nearly telecentrically, then registration would not be required.
Systems that employ the Case 2 set-up have been described for use in white light interferometry systems as described in U.S. Pat. Nos. 5,398,113 and 5,355,221 but not for a moire (light stripe) application.
Although a method is described above for making phase calculations based on a moire (light stripe) system, the described technique could also be applied to any optically based phenomenon where the phase is changed between the images created when moving the object 14 of interest relative to the optical head 12. Techniques that can create this phase change include moire interferometry, white light interferometry, standard monochromatic light optical interferometry, ellipsometry, birefringence, and thermo-wave imaging.
The use of polarization to create an ellipsometer illustrates another optically based phenomenon where phase is changed between the images created when moving the object 14 of interest relative to the optical head 12. The adaptation of this scanning phase measuring technique to ellipsometry and birefringence measurement can be understood as an adaptation of a rotating-analyzer ellipsometer (as described at pp. 410-413 of the book entitled "Ellipsometry and Polarized Light," Azzam and Bashara). The rotating-analyzer ellipsometer projects polarized light onto a surface, and the polarization of the reflected beam (or transmitted beam, depending on geometry) is determined by rotating an analyzer (linear polarizer) in front of the receiving detector. The radiation received at the detector varies as a sinusoidal function at twice the frequency of the rotating analyzer. The amplitude of the signal is proportional to the degree of linear polarization of the light received at the analyzer, and the phase defines the angle of polarization.
Using the scanning phase measuring technique of the present application, the rotating analyzer would be replaced by three or more analyzers, each of which would have a row of detector elements (scanning lines) behind it to image the received radiation at different polarization phase values. The object to be measured would be moved past the fixed projector and detector system on an optical head 12″ as shown in
Reference numerals 60, 61 and 62 designate an analyzer system in front of detector lines wherein 60 designates a linear polarizer parallel to the linear array 24″, 61 designates a linear polarizer at 45 degrees to the linear array 24″, and 62 designates a linear polarizer perpendicular to the linear array 24″.
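The three fixed analyzers at 0, 45 and 90 degrees play the role of three samples of the rotating-analyzer sinusoid. As a hedged sketch (the sinusoidal model and function names below are illustrative, not the patent's equations), modeling the detector reading as I(θ) = c·(1 + p·cos(2θ − 2φ)) lets the polarization angle φ and degree of linear polarization p be recovered from the three readings:

```python
import numpy as np

def polarization_from_three_analyzers(i0, i45, i90):
    """Recover polarization angle phi and degree of linear polarization p
    from analyzer readings at 0, 45 and 90 degrees, assuming the model
    I(theta) = c * (1 + p * cos(2*theta - 2*phi))."""
    c = (i0 + i90) / 2.0           # mean level: I(0) + I(90) averages out p
    cos_term = (i0 - i90) / 2.0    # equals c * p * cos(2*phi)
    sin_term = i45 - c             # equals c * p * sin(2*phi)
    phi = 0.5 * np.arctan2(sin_term, cos_term)  # angle of polarization
    p = np.hypot(cos_term, sin_term) / c        # degree of linear polarization
    return phi, p
```

For example, readings generated from φ = 30° and p = 0.8 are recovered exactly by the three-sample inversion above.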
The example shown in
Referring again to the first embodiment of the invention, the optical head 12 includes the light stripe projector 38, and the camera includes the imaging lens 40 for focusing the scanned surface onto the trilinear array 24. The scanned surface is translated past the optical head 12 in the direction of the arrow 20. To eliminate perspective effects in both projection and imaging, the projection and imaging systems should be either telecentric or nearly telecentric. A nearly telecentric system is created by having the standoff from the optics be much larger than the measurement depth range.
For this discussion, the data from the first linear array in the detector is called b1 (for bucket 1). Likewise, the data from the second and third linear arrays are called b2 and b3, respectively. The pitch of the projected light pattern creates a phase difference of one-half cycle between b1 and b3. Let b1(i,j), b2(i,j) and b3(i,j) designate the light intensity measurement for each linear array, with i indicating the pixel number and j indicating the scan number. For example, b2(25,33) would be the intensity reading of the 25th pixel of the second linear array taken from the 33rd scan.
The phase value, which is proportional to depth, is calculated within the computer 28 using the light intensity readings from the trilinear array as the object 14 is moving uniformly past the optical head 12. The preferred equation is:

phase value(i,j) = arctan[(b1(i,j)−b2(i,j+m)) / (b2(i,j)−b3(i,j+2m))]

where m is an integer that provides the required image shift to match registration between b1, b2 and b3.
In like fashion, the preferred equation for the amplitude value is:

amplitude value(i,j) = ((b1(i,j)−b2(i,j+m))^2 + (b2(i,j)−b3(i,j+2m))^2)^1/2
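The phase and amplitude computations can be sketched as follows; this is a minimal illustrative implementation (not the patent's code), assuming each bucket array is indexed [pixel i, scan j] and using the two-argument arctangent so the phase is defined over a full cycle:

```python
import numpy as np

def phase_and_amplitude(b1, b2, b3, m):
    """Three-bucket phase and amplitude with registration shift m.
    Each b-array is indexed [pixel i, scan j]; b2 and b3 lag b1 by m and
    2*m scans respectively, so they are shifted before differencing."""
    j_max = b1.shape[1] - 2 * m                       # scans where all shifts exist
    d1 = b1[:, :j_max] - b2[:, m:m + j_max]           # b1(i,j) - b2(i,j+m)
    d2 = b2[:, :j_max] - b3[:, 2 * m:2 * m + j_max]   # b2(i,j) - b3(i,j+2m)
    phase = np.arctan2(d1, d2)                        # proportional to height
    amplitude = np.hypot(d1, d2)                      # (d1^2 + d2^2)^(1/2)
    return phase, amplitude
```

With constant inputs b1 = 2, b2 = 1, b3 = 0 the two differences are both 1, giving a uniform phase of π/4 and an amplitude of √2, a quick sanity check on the indexing.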
In some instances, it is desirable to project from more than one angle. For example, projecting from each side of the camera can reduce occlusion problems. Projecting with patterns having different contour intervals (the change in depth for one phase cycle) can be used to eliminate ambiguity if the measurement range is more than one contour interval.
Measurements with more than one projector, such as by including a second projector 42, can be accomplished by cycling the part past the optical head and changing which of the projectors 38 or 42 is on for each cycle. Alternatively, the illuminating projector 38 or 42 can be changed for each scan of the array. For example, assuming two projectors, when j is even, the first projector 38 would be on, and when j is odd, the second projector 42 would be on. For the calculations to work out properly for this alternating system, m, the integer shift value, must be even. Thus, using this alternating approach, the phase value image for the first projector 38 would be: phase value(i,2j) where j=0,1,2, . . . ; and the phase value image for the second projector 42 would be: phase value(i,2j+1) where j=0,1,2, . . . .
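The even/odd demultiplexing described above amounts to splitting the interleaved scan stream into one stream per projector; the sketch below is illustrative (names are not from the patent):

```python
import numpy as np

def split_alternating_scans(scans):
    """scans is indexed [pixel i, scan j], with even scans j illuminated by
    the first projector and odd scans by the second. Returns the two
    per-projector image streams as strided views."""
    return scans[:, 0::2], scans[:, 1::2]

# Scans 0, 2, 4 go to projector 1's image; scans 1, 3, 5 to projector 2's.
scans = np.arange(12).reshape(2, 6)
p1_image, p2_image = split_alternating_scans(scans)
```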
If it is desirable to increase the pitch of the imaged grating pattern, a second grating 44 can be added to the imaging side as illustrated in FIG. 3. In some instances, it is desirable to include an imaging lens between the grating 44 and the array 24. The parts shown in
The beat effect between the two grating patterns is the optical moire effect and will increase the pitch imaged onto the detector. This can be desirable when one wants to use a pitch finer than can be resolved by the detector. That is, the primary pitch is less than the width of a pixel.
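As a hedged aside (this is textbook moire behavior, not a formula quoted from this patent), the coarsening produced by the beat effect follows the standard relation 1/p_beat = |1/p1 − 1/p2| for gratings of pitches p1 and p2:

```python
def moire_beat_pitch(p1, p2):
    """Beat (moire) pitch of two overlaid line gratings with pitches p1, p2,
    per the standard relation 1/p_beat = |1/p1 - 1/p2|."""
    return 1.0 / abs(1.0 / p1 - 1.0 / p2)

# Pitches of 10 and 11 units beat at a 110-unit pitch, an 11x coarsening,
# which is how a primary pitch finer than a pixel can still be resolved.
```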
While the best mode for carrying out the invention has been described in detail, those familiar with the art to which this invention relates will recognize various alternative designs and embodiments for practicing the invention as defined by the following claims.
|Cited Patent||Filing date||Publication date||Applicant||Title|
|US3614237 *||Jan 21, 1969||Oct 19, 1971||Lockheed Aircraft Corp||Method and apparatus for contour measurement|
|US3762818 *||Jul 14, 1971||Oct 2, 1973||Lockheed Aircraft Corp||Contour measurement apparatus|
|US3814521 *||Sep 12, 1972||Jun 4, 1974||Hoffmann La Roche||Object recognition|
|US4212073||Dec 13, 1978||Jul 8, 1980||Balasubramanian N||Method and system for surface contouring|
|US4639139||Sep 27, 1985||Jan 27, 1987||Wyko Corporation||Optical profiler using improved phase shifting interferometry|
|US4641972||Sep 14, 1984||Feb 10, 1987||New York Institute Of Technology||Method and apparatus for surface profilometry|
|US4794550||Oct 15, 1986||Dec 27, 1988||Eastman Kodak Company||Extended-range moire contouring|
|US5069548||Aug 8, 1990||Dec 3, 1991||Industrial Technology Institute||Field shift moire system|
|US5085502||Apr 30, 1987||Feb 4, 1992||Eastman Kodak Company||Method and apparatus for digital moire profilometry calibrated for accurate conversion of phase information into distance measurements in a plurality of directions|
|US5118192||Jul 11, 1990||Jun 2, 1992||Robotic Vision Systems, Inc.||System for 3-D inspection of objects|
|US5135308 *||Mar 8, 1991||Aug 4, 1992||Carl-Zeiss-Stiftung||Method and apparatus for non-contact measuring of object surfaces|
|US5202749||Dec 22, 1989||Apr 13, 1993||Klaus Pfister||Process and device for observing moire patterns of surfaces to be tested by application of the moire method using phase shifting|
|US5307152||Sep 29, 1992||Apr 26, 1994||Industrial Technology Institute||Moire inspection system|
|US5309222||Jul 14, 1992||May 3, 1994||Mitsubishi Denki Kabushiki Kaisha||Surface undulation inspection apparatus|
|US5319445||Sep 8, 1992||Jun 7, 1994||Fitts John M||Hidden change distribution grating and use in 3D moire measurement sensors and CMM applications|
|US5343294||Jun 15, 1992||Aug 30, 1994||Carl-Zeiss-Stiftung||Method for analyzing periodic brightness patterns|
|US5355221||Oct 25, 1993||Oct 11, 1994||Wyko Corporation||Rough surface profiler and method|
|US5398113||Feb 8, 1993||Mar 14, 1995||Zygo Corporation||Method and apparatus for surface topography measurement by spatial-frequency analysis of interferograms|
|US5450204||Mar 26, 1993||Sep 12, 1995||Sharp Kabushiki Kaisha||Inspecting device for inspecting printed state of cream solder|
|US5463227||Jun 24, 1992||Oct 31, 1995||Robotic Vision Systems, Inc.||Method for obtaining three-dimensional data from multiple parts or devices in a multi-pocketed tray|
|US5465152||Jun 3, 1994||Nov 7, 1995||Robotic Vision Systems, Inc.||Method for coplanarity inspection of package or substrate warpage for ball grid arrays, column arrays, and similar structures|
|US5471308 *||Sep 22, 1994||Nov 28, 1995||Zeien; Robert||Phase shifting device|
|US5488478 *||Feb 15, 1993||Jan 30, 1996||British Steel Plc||Method and apparatus for measuring the shape of a surface of an object|
|US5546189||May 19, 1994||Aug 13, 1996||View Engineering, Inc.||Triangulation-based 3D imaging and processing method and system|
|US5561525||Dec 20, 1995||Oct 1, 1996||Nikon Corporation||Interferometer for observing the interference pattern of a surface under test utilizing an adjustable aperture stop|
|US5621530||Apr 26, 1995||Apr 15, 1997||Texas Instruments Incorporated||Apparatus and method for verifying the coplanarity of a ball grid array|
|US5636025||Jun 17, 1994||Jun 3, 1997||Medar, Inc.||System for optically measuring the surface contour of a part using moire fringe techniques|
|US5644392||Sep 12, 1995||Jul 1, 1997||U.S. Natural Resources, Inc.||Scanning system for lumber|
|US5646733||Jan 29, 1996||Jul 8, 1997||Medar, Inc.||Scanning phase measuring method and system for an object at a vision station|
|US5680215||Feb 27, 1995||Oct 21, 1997||Lockheed Missiles & Space Company, Inc.||Vision inspection system and method|
|US5687209||Sep 13, 1996||Nov 11, 1997||Hewlett-Packard Co.||Automatic warp compensation for laminographic circuit board inspection|
|US5734475||Oct 15, 1996||Mar 31, 1998||Ceridian Corporation||Process of measuring coplanarity of circuit pads and/or grid arrays|
|US5745986||Jul 24, 1995||May 5, 1998||Lsi Logic Corporation||Method of planarizing an array of plastically deformable contacts on an integrated circuit package to compensate for surface warpage|
|US5753904||Dec 29, 1995||May 19, 1998||Motorola, Inc.||Tool for detecting missing balls using a photodetector|
|US5766021||Oct 1, 1996||Jun 16, 1998||Augat Inc.||BGA interconnectors|
|1||Mather, D.R., et al., "Machine Vision Inspection for the Challenges of Ramped-Up DVD Production", Advanced Imaging, 55-57, (May 1997).|
|2||Yang, H.S., et al., "Determination of the identity, position and orientation of the topmost object in a pile: some further experiments", 1986 IEEE Int'l Conf on Robotics and Automation (Cat No. 86CH2282-2) San Francisco, CA, vol. 1 XP002118970, 293-298, (Apr. 7-10, 1986).|
|Citing Patent||Filing date||Publication date||Applicant||Title|
|US8570530 *||Jun 3, 2009||Oct 29, 2013||Carestream Health, Inc.||Apparatus for dental surface shape and shade imaging|
|US9228957 *||Aug 1, 2014||Jan 5, 2016||Gii Acquisition, Llc||High speed method and system for inspecting a stream of parts|
|US20100311005 *||Jun 3, 2009||Dec 9, 2010||Carestream Health, Inc.||Apparatus for dental surface shape and shade imaging|
|US20140346097 *||Aug 1, 2014||Nov 27, 2014||Gii Acquisition, Llc Dba General Inspection, Llc||High speed method and system for inspecting a stream of parts|
|U.S. Classification||356/604, 250/237.00G|
|International Classification||G01B11/24, G01B11/25|
|Sep 5, 2008||AS||Assignment|
Owner name: ISMECA SEMICONDUCTOR HOLDING SA, SWITZERLAND
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:ISMECA EUROPE SEMICONDUCTOR SA;REEL/FRAME:021489/0199
Effective date: 20080703
|Dec 12, 2008||FPAY||Fee payment|
Year of fee payment: 12