CA2355757C - High speed camera based sensors - Google Patents


Info

Publication number: CA2355757C
Application number: CA002355757A
Authority: CA (Canada)
Prior art keywords: image data, pixels, array, data, zone
Legal status: Expired - Lifetime
Other languages: French (fr)
Other versions: CA2355757A1 (en)
Inventors: Leonard Metcalfe, Cash Reuser
Current Assignee: LMI Technologies Inc
Original Assignee: LMI Technologies Ltd
Application filed by LMI Technologies Ltd
Publication of CA2355757A1
Application granted
Publication of CA2355757C


Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/02Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness
    • G01B11/04Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness specially adapted for measuring length or width of objects while moving

Abstract

Disclosed herein are TV Camera based electro-optical sensors, providing affordable methods and apparatus for high speed determination of dimensions and other features of objects. Preferred embodiments utilize addressable line or pixel based cameras to effect triangulation based and other machine vision sensing of objects.

Description

HIGH SPEED CAMERA BASED SENSORS

[0001] This application claims benefit of U.S. Provisional Application 60/227,018, filed Aug. 23, 2000.

CROSS REFERENCES TO RELATED CO-PENDING APPLICATIONS
[0002] Reference is made to U.S. Ser. No. 09/931,179 "High Speed and Reliable Determination of Lumber Quality Using Grain Influenced Distortion Effects", and U.S. Ser. No. 09/931,178 "Method and Apparatus for Scanning Lumber and Other Objects", filed the same day.

FIELD OF THE INVENTION
[0003] The invention relates to TV Camera based and other electro-optical sensors and systems, providing affordable methods and apparatus for high speed scanning of dimensions and other features of objects.
[0004] The invention uses one or more light sources combined with TV cameras, which may be randomly scanned in some form, and whose output is used as input to a computer such as a PC. This data is typically analyzed to provide the location or dimensions of objects or parts of objects, and/or the presence or characteristics of certain features of objects.
[0005] The invention is particularly useful for applications in which high measurement speed is needed, such as determining the shape of boards moving at high speed on conveyor lines in sawmills. Data taken with such sensors is used to control sophisticated sawing operations aimed at maximizing yield from boards of variant shape, particularly in their edge regions. Utmost accuracy, and high data density are both required, which in turn demands very fast sensing devices, particularly of the triangulation type.


BACKGROUND OF THE INVENTION
[0006] Triangulation based sensors have found favor in industry, particularly to date in the Wood processing, Automotive and Electronics industries, and for programmable contouring and robotic guidance in general.
[0007] Historic sensor devices employing image scanning based cameras with photodetector arrays to address these industries are typified by Pryor et al: U.S. Pat. No. 5,734,172 entitled Method and apparatus for electro optically determining the dimension, location and attitude of objects, and other patents by the same inventors. Typically such devices are built with laser light sources, generally semiconductor diode lasers, which provide cost effective delivery of concentrated optical energy. However, projection of suitable zones onto parts using standard projection optics can also be used, particularly where grid or other two dimensional patterns of zones are employed. Grids can be projected, for example, by creating same electronically in an LCD or DLP projector, commonly available for PowerPoint presentations. Alternatively, a standard slide projector with a Ronchi ruling in place of a slide can be used in some cases.
[0008] Further references disclosing triangulation measurements with photo-detector arrays are: U.S. Pat. No. 4,891,772, Case et al., entitled Point and line range sensors; and U.S. Pat. No. 4,394,683, Liptay-Wagner et al., entitled New photodetector array based optical measurement systems.
[0009] In the specific area of wood measurement, examples of laser triangulation based sensor units specifically designed for this purpose are Leong et al, US Patent 4,937,445, entitled Apparatus for determining the distances of points on a surface from a reference axis; Cielo et al, US Patent 5,056,922, entitled Method and apparatus for monitoring the surface profile of a moving workpiece; and Chasson, US Patent 4,188,544, entitled Method and Apparatus for Automatically Processing a workpiece employing calibrated scanning.
[0010] Speed of data acquisition is critical to many industrial applications.
However, the sensors described above which are capable of scanning an image in order to determine information concerning the object at high speed use linear photodetector arrays to achieve the desired speed. This allows only one scan line of potential data to be interrogated, limiting the versatility of the sensor considerably and adding extra cost (due to the relative expense of the lower volume linear array devices).
[0011] To date the only known effort to achieve high speed triangulation with matrix arrays has been by IVP corporation, whose commercial literature today addresses a particular method of scanning individual lines of an array and processing onboard. A US patent, 5,982,393, by Forchheimer et al of IVP describes methods by which computing can be done directly on pixel data using processors located on the same image chip, but discloses little about the methods for sensing and processing the data desired. To our knowledge, no other IVP information available to the public exists.
[0012] We do not believe the IVP camera is a totally random access camera, but rather a "smart" camera that can select one line (column) and parallel process (thresholding etc.) on every pixel in the line. But it cannot randomly read out a single pixel, which is desirable in many embodiments of our invention.
[0013] US Patent 5,717,199 by Carbone et al. discloses methods and apparatus by which data can be read randomly from pixels of a camera, in a manner different from that of IVP. However, this patent does not disclose methods by which such devices can actually be used to make practical triangulation or other measurements required in industry, in an apparatus such as disclosed herein.

SUMMARY OF THE INVENTION
[0014] This invention relates to a significant advance over the state of the art in, for example, the sensors used in the wood processing business, such as disclosed in Leong et al, Cielo et al, and others, for measuring boards moving transversely at the high speeds needed to provide information to real time sawing and other operations in lumber mills.
[0015] A preferred embodiment utilizes laser triangulation with one or more points or lines, and specialized camera scanning methods and apparatus, to determine the line or point image locations in the field of view of the camera as rapidly as possible.
[0016] The invention particularly concerns the use of random access photo-detector arrays, combined with detection and computational algorithms to increase the speed of triangulation devices 10 to 100 times.
[0017] It is noted that in the following discussion, the word "laser" is meant to connote not only the laser device itself of whatever kind (typically a semi-conducting diode laser), but also any associated optics and power sources needed to assure that reliable optical energy can be delivered to a zone on the surface of the object to be measured. Typically, but not necessarily, such a zone is produced by focusing the radiation emanating from the laser to a small zone at the mean point of object location in the laser projection direction. In other cases cylindrical optics are used to create line projections. Optics may be either refractive, reflective or diffractive/holographic in nature.
[0018] It should also be noted that light sources other than lasers can be used, such as LEDs. However, laser sources are generally preferable in all applications except where large areas are to be illuminated, such as with structured light grids or other patterns.
[0019] The application is particularly, but not exclusively, concerned with sensors using photo-detector arrays with randomly programmable addressing of pixel elements. Increasingly these arrays are made by the CMOS process, but any suitable array with similar random addressing capability, such as CCD or CID types, can be used. CMOS types are especially attractive as they permit random addressing and are inexpensive.
GOALS OF THE INVENTION
[0020] It is a goal of the invention to provide a sensory device, employing at least one photo-detector array camera operating at the highest possible operational speed, with acceptable accuracy and at reasonable cost.
[0021] It is another goal of the invention to provide method and apparatus for selectively analyzing certain portions of an image in order to improve sensor response speed, and for predicting the portions to analyze in the future.

[0022] It is another goal of the invention to provide method and apparatus for improving sensor response speed by using low resolution A-D conversion, with a subsequent re-reading of pixel intensity values in the areas of interest.

[0023] It is another goal of the invention to provide method and apparatus for selectively controlling laser power or integration time on a pixel basis.

[0024] It is also a goal of the invention to provide a method for increasing the reliability of detection of projected zones on objects with variant reflectance characteristics at different portions of their surface.

[0025] It is another goal of the invention to provide method and apparatus for increasing the speed of camera based sensors in determining the location of features in an image.

[0026] It is a further goal of the invention to provide means for high speed contouring of objects, especially those moving with respect to a sensor.

BRIEF DESCRIPTION OF THE DRAWINGS

[0027] Figures 1A-B illustrate pixel addressable triangulation sensor embodiments of the invention employing a spot shaped zone, either circular or elongate.

[0028] Figures 2A-D illustrate pixel addressable triangulation sensor of the invention employing a multi-spot group of zones for board measurement.

[0029] Figures 3A-C illustrate pixel addressable triangulation sensor of the invention employing a line zone or a multi-line group of zones.

[0030] Figures 4A-B illustrate a pixel scan and processing embodiment using rough approximations of analog pixel values to increase scan speeds.

[0031] Figure 5 illustrates methods to control sensor characteristics such as light power, exposure or data density.

PREFERRED EMBODIMENTS OF THE INVENTION
Figure 1

[0032] Figure 1A illustrates triangulation sensor embodiments of the invention employing a spot shaped zone. For example, consider image 100 of a spot type projected zone 101, projected by laser 102 on object 105, which is imaged by lens 106 onto pixel addressable photo-detector matrix array 107, for example a Photon Vision Systems (Homer, NY) ACS-1 active column imager. It is desirable for many applications that the pixels of the array be able to be read in a non-destructive manner, such that one can re-read their values based on intelligence gathered in a first reading. Such readout is relatively common with CMOS type photo-detector arrays, such as those made by the Photobit company.

[0033] Typically such sensors employ photo-detectors which are photo-detector arrays of either linear or matrix types. Processing to determine zone image position can be done using thresholded centroids and multiple centroids as described in Pryor et al, derivatives as described in Liptay-Wagner et al, US Patent 4,394,683 entitled New photo-detector array based optical measurement systems, or first moment calculations as described in US Patent 4,219,847 by Pinkney et al. entitled Method and apparatus of determining the center of area or centroid of a geometrical area of unspecified shape lying in a larger x-y scan field, or by any other suitable means.

[0034] In the first mode of operation of the instant invention, scan lines of the array are chosen specifically to correspond to the expected location in the x direction of the zone (typically a spot) image across the array. This expected location is advantageously predicted in many instances by determining where the image was on a previous scan, and assuming it is in the immediate vicinity. Or it may be predicted by first determining the location of some other feature on the object, from which the location of the projected zone on the surface can logically be inferred.

[0035] To detect the zone image, one just has to determine pixel intensities on a few scan lines of the photo-detector array, such as the four scan lines 110-113, in order to characterize the location of the zone in the y direction of the array (the direction of change of zone position with change in board dimension). One can determine the centroid (or other delineator of image location) of the zone image 100 seen on the scan lines, using methods described in the referenced patents above, and if desired, average the results to substantially improve resolution and eliminate errors due to surface condition, as taught in US Patent 4,667,231.
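The thresholded-centroid-with-averaging computation just described can be sketched as follows. This is a minimal illustration, not the patent's implementation; the function names and the simple threshold-and-weight scheme are assumptions.

```python
def line_centroid(intensities, threshold):
    """Sub-pixel centroid of the zone image along one scan line,
    weighting each pixel by its intensity above the threshold.
    Returns None if no pixel exceeds the threshold."""
    weights = [v - threshold if v > threshold else 0.0 for v in intensities]
    total = sum(weights)
    if total == 0.0:
        return None
    return sum(i * w for i, w in enumerate(weights)) / total

def zone_position(scan_lines, threshold):
    """Average the per-line centroids over the few scan lines read,
    which, as the text notes, improves resolution."""
    cs = [c for c in (line_centroid(s, threshold) for s in scan_lines)
          if c is not None]
    return sum(cs) / len(cs) if cs else None
```

A symmetric spot profile such as [0, 0, 0, 1, 3, 5, 3, 1, 0, 0] yields a centroid of 5.0; averaging several such lines suppresses per-line noise from surface condition.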

[0036] The choice of which lines to scan depends on the zone image location, and thence on the projected zone spacing on the object. Choosing to scan only where zone image information lies results in a speed advantage in direct proportion to the total number of lines of the array divided by the number actually needing to be scanned. In this instant example, with 4 lines needed or desired to find and characterize the zone image, a 1000 line array normally operating at 10 scans per second, but needing to scan only 4 lines, would be able to run 2500 scans per second.

[0037] However it is not necessary to scan the whole line. As shown, the spot image 100 is clearly only located, let us say, within a pixel band 121, comprising, for example, pixel rows #200 to #230 in the y direction of the array. If we can assure the spot is in this band, we only need to scan these pixels in the x direction to find the zone image. If we thus know that it is within 4 scan lines (of a total, say, of 1000 in the y direction), and in the x direction within 30 pixels out of, say, 1000, we then only need scan roughly 30 x 4 pixels of 1000x1000, or about 1/10,000 of a normal readout of such a 1 "MEGA-Pixel" array. At 10 scans/sec normally, this means that with random pixel addressing, one can read out the pixel intensities at the spot location in about 10 microseconds (in other words, roughly a 100 kHz update rate).
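The speed arithmetic of the last two paragraphs can be made explicit with a small helper, under the idealizing assumption that readout rate scales inversely with the number of pixels read (addressing overhead ignored):

```python
def windowed_scan_rate(full_rate_hz, total_pixels, window_pixels):
    """Idealized readout rate when only a window of pixels is addressed,
    assuming the rate scales inversely with the number of pixels read."""
    return full_rate_hz * total_pixels / window_pixels

# Scanning 4 lines of a 1000-line array, 10 full scans/s:
print(windowed_scan_rate(10, 1000, 4))        # 2500.0 scans/s

# Scanning a ~30 x 4 pixel window of a 1000 x 1000 array:
rate = windowed_scan_rate(10, 1000 * 1000, 30 * 4)
print(round(1e6 / rate, 1))                   # 12.0 microseconds per readout
```

The 12 microsecond figure agrees with the "about 10 microseconds" estimate in the text, the difference being only the rounding of the window size.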

[0038] This readout can be even faster if one doesn't wait for the A-D device converting the analog pixel intensities to settle. This is discussed in figure 4 below.
[0039] It is noted that the scanned lines or pixels do not necessarily have to be contiguous in order to allow a zone location determination to be made (via computation of the centroid of intensity or other criteria). This can speed up the rate at which points can be found. In general, however, the fewer pixels operative per zone image, the lower the resolution of its location.

[0040] Shown in figure 1B is an alternate example of an elongate spot zone image 140 on array 141, in which a window 145 comprising a rectangular matrix of individual pixels is addressed in order to determine the centroid (or other determiner of location) of a single elongate type zone image 140.

[0041] If the array is aligned as shown, at different ranges the spot image falls along the same scan lines of the array, since they are parallel to the plane formed by laser projection of the zone and the imaging lens axis.

Figure 2

[0042] Figure 2A illustrates a pixel addressable matrix photo-detector array triangulation sensor of the invention, similar to figure 1, employing multiple laser sources disposed in the x, or transverse, direction relative to a moving object such as a board which it is desired to measure in the profile sections corresponding to the laser beam locations.

[0043] As shown, three independent laser sources 200, 201 and 202 direct zones of light, 210-212, in this case spot shaped, at a board 220 as shown, traveling on a chain conveyor 223 in the y direction at 30 inches/second. The zones are in a line, 225, parallel to the long axis (x axis) of the board, and perpendicular to the conveyor flow.
[0044] A camera 230, composed of photo-detector matrix array 235 and lens 238 is used to scan the images 240-242 of the zones on the array. Each zone is analyzed for its position on the array as described above, by scanning only certain lines or pixels of the array, thus providing rapid location data of the board with respect to the sensor composed of the laser/camera combination. By taking sequential scans, a complete profile section for the board at each axial zone x location, can be built up during the rapid traverse of the board past the sensor station.

[0045] In this particular example, illustrated in figure 2B, lines #250-260 of the array were scanned to find one of the zone images, in this case zone image 240 chosen as a master. This image was determined from the scan of these lines, to lie between pixel row locations #265 and 275. Since the board is of reasonably similar shape in the longitudinal (Y) direction of the zones projected along its length, it was determined that the same pixel row data (e.g. between rows 265 and 275) could be scanned to find and determine the zone centroids of the other two zone images 241 and 242. The scan lines used to find the other two images are constant in bands 276 and 277, each of which could also be chosen for example to be 10 scan lines wide.

[0046] In another representative example, shown in figure 2C, the centroid of zone image 241 was found to lie on pixel row 272, whereas that of 240 was on row 268. Since the board was assumed linear, but in this case tilted with respect to the camera, or the conveyor, or both, the third zone image was found by scanning a different set of pixels in the y direction, predicted by the slope between 240 and 241 and represented by band 278. Thus knowledge of some results in the image can be used to govern the scan of further data of an image, even in the same camera scan.
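The slope-based prediction of the third zone's band can be sketched as a linear extrapolation from the two measured centroids. The function names and the band half-width are hypothetical; the patent does not prescribe a band size for this step.

```python
def predict_zone_row(x0, row0, x1, row1, x_target):
    """Extrapolate the expected pixel row of a further zone image at
    x_target from two measured zone centroids, assuming the board
    surface is linear (e.g. tilted) between them."""
    slope = (row1 - row0) / (x1 - x0)
    return row0 + slope * (x_target - x0)

def scan_band(center_row, half_width=5):
    """Rows to interrogate, centered on the predicted centroid."""
    c = round(center_row)
    return range(c - half_width, c + half_width + 1)

# Zones at equal x spacing: centroids on rows 268 and 272 predict
# the third zone near row 276.
predicted = predict_zone_row(0, 268, 1, 272, 2)   # 276.0
```

The same extrapolation applies between successive scans in time, as the next paragraph notes for predicting pixel bands for further boards.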

[0047] It can also affect sequential scans. For example if the zone image positions are moving with time in a monotonic way, this could indicate a change in board thickness occurring. The pixels interrogated for further boards could accordingly be chosen differently and predicted as a result of the instant board pixel centroid location.
[0048] Figure 2D illustrates another situation, where a rectangular matrix of points 290-293 is projected (by a projection source not shown) on a board 289, forming, using lens 294, zone images 295-298 on photo-detector array 299. Each of the board locations corresponding to the points 290-293 can thus be determined by triangulation through interrogation of the location of the corresponding zone images on the array. The projected grid of points can be other than the square grid of round zones shown, for example lines of rectangular shaped zones.
[0049] In the case of an object, such as a board or other objects which do not vary much across their surface, the location of image points in both directions is approximately known, thus allowing the scanning of pixels to be contained to only certain regions of the array on any one scan.

Figure 3

[0050] FIG. 3B illustrates a pixel addressable triangulation sensor of the invention employing a line laser source and a single array. As shown, laser beam 300 from laser 301 is expanded into a fan by cylindrical optical element 305 (refractive, reflective or diffractive, as desired) to form a line, 310, disposed longitudinally in the X axis down the long axis of a board 320 moving transversely in Y, typically by movement of a chain conveyor such as 343. Typically the angle of the fan is such that the light falls off the end of the board in both directions, except when multiple sensors of the type shown are employed to cover very long boards (see for example FIG. 7 of U.S. Ser. No. 09/931,178 "Improved method and apparatus for scanning lumber and other objects", filed the same day).

[0051] An addressable matrix array camera 335 of the invention has its optical axis 336 oriented at an angle theta to the projection axis 337 of the laser radiation, and is used to sense the image 340 formed of line 310 on the board. Such a system is capable of generating a large number of points along the line very quickly, the number equal, if desired, to the number of pixel lines in the array in the x direction, assuming the data as to line centroid location at a given x location on the board can be obtained rapidly.
[0052] This can preferably be achieved through a priori knowledge of the line location, realizing that the centroid of the line in the y direction is similar in nature to that of the spot centroids discussed above. Indeed the line is, in some regards, just a succession of spots, which, like the spots, give range to the object from knowledge of their y axis centroids or other parameters at any given x location along the array or the object (which is mapped onto the detector array 345 by the lens 350 of the camera).

[0053] The first example of intelligent processing is to find the edge of the line, or any other position such as the line location in the center of the array, and from there simply move to adjacent points in succession, using the change from one point to the next to predict where a subsequent point should likely be. For many objects with smoothly changing surfaces, this is a monotonic function.

[0054] With reference to the expanded view of figure 3A, an algorithm I have found useful is to look for the edge "P" of the line image 340 on array 345, as shown, corresponding to the edge Po on the board object. Then, once it is found to lie, say, in a band of pixels 380 of a certain width in the pixel row direction (typically centered on point P and the row of pixels represented by it at that moment), we need only look for other points such as P' which lie within this band, assuming the object is of relatively flat shape. Note that in this case the camera has been aligned such that the pixel rows are generally parallel to the typical undisturbed board surface and to the conveyor. It can be appreciated that if such is not the case, mathematical alignment in the readout computer can be used to normalize to this condition.

[0055] If the shape is less known, the band of scanned pixels can be centered on the last point on the line identified, which is not thought to be too far from the point immediately desired. Thus the band at instant point P' would be 360 as shown, whereas at P", the band to be interrogated would be 365.

[0056] If the line is reasonably monotonic, such a strategy works well. If the slope or some other characteristic can be predicted from one or more previous results, this information can be used to predict the desired band of future pixel interrogation at another point on the line. This same argument holds for changes in time as the object moves relative to the sensor.
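The edge-following strategy of paragraphs [0053] to [0056] amounts to re-centering a narrow search band on each newly found centroid as one walks along the line. A sketch, where `find_centroid(col, band)` is a hypothetical callback standing in for whatever per-column centroid routine is used:

```python
def track_line(columns, find_centroid, start_row, half_width=10):
    """Walk along the line image column by column, interrogating only
    a band of rows centered on the previous centroid. Returns the
    per-column centroids (None where the line was not found)."""
    rows, expected = [], start_row
    for col in columns:
        band = (expected - half_width, expected + half_width)
        c = find_centroid(col, band)
        if c is not None:
            expected = c        # re-center the band on the latest result
        rows.append(c)
    return rows
```

For a surface whose profile changes slowly and monotonically, the band stays locked on the line; a discontinuity (wane at a board edge, for instance) would require a wider re-search.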

[0057] It should also be noted that one can also project multiple lines, as shown in figure 3C, in a manner related to the multi-point group of zones of figure 2B.
In this manner a complete grid of points on the object surface can be acquired simultaneously if desired. In this case a cylindrical fan multi-line projector 385, such as made by the Lasiris company, is used, projecting 3 lines 386-388 as shown on object 390. The grid spacing is the separation s of the lines projected.

Figure 4

[0058] Figure 4A illustrates another pixel scan and processing embodiment using rough approximations of analog pixel values to increase scan speeds. As shown, an addressable matrix array 401 of the invention is controlled by scan controller 405 to read out the analog voltages of each pixel commanded by the controller. The pixel voltages are fed to A-D converter 410 and processed to get 12 bit (4096 level dynamic range) digital values 415 for each of the pixel locations read. By allowing the A-D to settle, such a result typically takes 100 nanoseconds.

[0059] The invention however comprehends a rough image location step which can be used to speed the process. For example, if we read the conversion value before complete settling takes place, we may only get 8 bit resolution of pixel detected light intensity, but we can complete the process in 40 nanoseconds, an important 2.5X speed improvement.

[0060] For applications in which a substantial number of pixels are used to calculate a centroid location, the reduction in intensity digitization accuracy may not result in a significant reduction in centroid location accuracy. And even where it might, the invention comprehends a first scan step to locate the zone image, using a reduced resolution of 8 or even 6 bits, followed by a subsequent scan, if desired, at the full resolution. But this scan would only be of substantially the identified pixels from the first scan at which the zone was found to lie. If the initial scan field was much larger than the ultimate zone image scan area, the speed improvement can still be 2X or more under the 8 bit scenario example above.

[0061] This is illustrated in figure 4B, where in step 450 the desired pixel elements of the photodetector array are scanned with a settling time less than the optimum desired.
In step 455, the location of a desired projected zone or natural object feature image is determined, by whatever means is practicable. In step 460, a determination of the accuracy of that location is made. If accuracy is sufficient, the process is complete for this scan at step 465, and the reading is transmitted to a host device. If accuracy is not sufficient, the settling time is reset to a longer value in step 470, and the process repeated with the longer settling time. With some A-D conversion devices, a rescan is not needed, and the result is simply taken from the same A-D signal at a later (and more settled) time.
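The figure 4B loop can be sketched as follows. The three callbacks are hypothetical stand-ins for the hardware readout, the location computation, and the accuracy test; the settle times are the illustrative 40 ns and 100 ns values from the text.

```python
def locate_with_adaptive_settle(read_window, locate, accurate_enough,
                                settle_times_ns=(40, 100)):
    """Read with a short A-D settling time first; if the resulting
    location estimate is judged inaccurate, rescan with a longer
    settle (steps 450-470 of figure 4B)."""
    loc = None
    for t in settle_times_ns:
        pixels = read_window(settle_ns=t)   # step 450: fast (coarse) read
        loc = locate(pixels)                # step 455: find the zone image
        if accurate_enough(loc, pixels):    # step 460: accuracy check
            return loc, t                   # step 465: transmit to host
    return loc, settle_times_ns[-1]         # best effort at longest settle
```

When most readings pass at the short settle, the average readout time approaches the fast-path figure, which is the source of the speed gain claimed above.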

Figure 5

[0062] Figure 5 illustrates methods to control sensor characteristics such as light power, exposure or data density.

[0063] In addition to scan issues, it should be noted that sometimes severe lighting problems can occur, due to the reflection of the incident light from the part, both because of the angle of reflection from the surface with respect to the camera and because of the sometimes discolored nature of the object surface. In operation of such sensors, it is desirable to control the light power and/or the array integration time to give the best performance in all the regions of interest in the image. This can be varied on a sectional basis by choosing which lines are used to provide data to set the laser power, or by using data from one section or one line of scan to set the integration time for the next, or for a rescan of the same line.

[0064] Figure 5 illustrates a pixel addressable camera further incorporating means to control sensor characteristics such as light power, exposure or data density. As shown, the computer 501 commands, via camera controller 502, the reading of pixels in window 505 of addressable photodetector array 510. The pixels in this window are used as the reference pixels from which to calculate light intensity, and thus integration time for the next scan of the array, and/or to set the power or duration of light source (laser or other) 515 via light source controller 520. The window can be rectangular, square, circular, or any shape desired. In an extreme case, even one pixel can be used to effect such control.

[0065] In many cases lighting varies considerably through the complete image. For example, in the line image of fig 3, it is commonplace on highly curved objects to have much more light in one area than another. Since the ratio of such light intensity is typically much the same from part to part, the integration time of readout of pixels in, say, a second window 530 can be set differently from that of window 505, in order to keep the light intensity within the operable limits of the photodetector array.

[0066] In one embodiment, an integration period is started at time zero, with the window 505 read first at time T1 since the surface reflects more in this region, and then at time T2 the pixels in window 530, covering a different region of interest in the image are read after they have had more time to integrate light on their surface, or on capacitors or other electrical devices connected therewith.
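The staggered readout of paragraph [0066] can be sketched as a schedule in which brighter windows are read earlier in the integration period. The window names, brightness values and time units here are illustrative assumptions, not hardware values.

```python
def schedule_window_reads(window_brightness):
    """Given each window's expected relative brightness, return
    (window, read_time) pairs: brighter windows are read earlier
    (less integration), dimmer ones later (more integration), so all
    stay within the array's operable intensity limits."""
    brightest = max(window_brightness.values())
    ordered = sorted(window_brightness.items(), key=lambda kv: -kv[1])
    return [(name, brightest / b) for name, b in ordered]

# If window 505 reflects ~4x more light than window 530, 505 is read
# at T1 = 1 unit and 530 at T2 = 4 units into the integration period.
plan = schedule_window_reads({"505": 4.0, "530": 1.0})
# plan == [("505", 1.0), ("530", 4.0)]
```

Because the brightness ratio is "typically much the same from part to part", the schedule can be computed once per part type rather than per scan.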

[0067] In the case where the light intensity provided by light source 515 is controllably increased (or decreased) under control of light source controller 520 (itself under control of computer 501), a similar tack can be taken. In a first example, as the light is increasing, information in the first window 505 is read, since the surface reflects more in this region. Then, as the light approaches a maximum value of surface irradiation, information in a second window 530, for example, can be read. Multiple windows can be employed in this manner to optimally take data from all parts of the image as desired.
[0068] It is noted that where color filters or polarizing filters are used on such arrays, it is possible to address specific colors or polarizations randomly.

[0069] It is also noted that summing of multiple frames, each individually taken at different exposure times or illumination light intensities (or both), can be used to achieve wider dynamic range when imaging varying target surfaces, which, using the invention, can be rapidly examined in the areas of interest desired.

Claims (26)

What is claimed is:
1. An improved method of triangulation comprising the steps of:

providing an addressable matrix array type TV camera capable of scanning individual pixels or groups of pixels;

providing an image on said array indicative of a location of at least one illuminated zone on an object to be measured;

scanning a limited number of pixels of said matrix array to determine image data relating to said zone, wherein the pixels to be scanned are selected based on knowledge of data from pixels previously scanned during the scanning of said object; and

from said data determining dimension or location of said object.
2. A method according to claim 1 wherein said zone is provided by a laser.
3. A method according to claim 1 wherein the choice of pixel data of said array to be scanned is based on knowledge of image data taken on a previous scan.
4. A method according to claim 1 wherein the image data of said array is acquired at higher resolution than data acquired from a previous scan.
5. A method according to claim 4 wherein higher A-D resolution is used.
6. A method according to claim 4 wherein higher pixel density resolution is used.
7. A method according to claim 1 including the further step of controlling the illumination energy of said zone using said image data.
8. A method according to claim 1 including the further step of controlling the integration time of said pixels using said image data.
9. A method according to claim 1 wherein the choice of pixel data of said array to be scanned is based on knowledge of image data at another location in said image.
10. An improved triangulation sensor for measuring location or dimension of an object, comprising:

an addressable matrix array type TV camera capable of scanning individual pixels or groups of pixels;

a light source means for illuminating at least one point on an object;

lens means to provide an image on said array indicative of a location of at least one illuminated zone on an object to be measured;

means for scanning a limited number of pixels of said matrix array to determine image data relating to said zone, wherein the pixels to be scanned are selected based on knowledge of data from pixels previously scanned during the scanning of said object; and

means for analyzing said data to determine location or dimension of said object.
11. Apparatus according to claim 10 wherein said light source is a laser.
12. Apparatus according to claim 10 wherein the choice of pixel data of said array to be scanned is based on knowledge of image data taken on a previous scan.
13. Apparatus according to claim 10 wherein the pixel data of said array is acquired at higher resolution than previous data.
14. Apparatus according to claim 13 wherein higher A-D resolution is used.
15. Apparatus according to claim 13 wherein higher pixel density resolution is used.
16. Apparatus according to claim 10 further including means for controlling the illumination energy of said zone using said image data.
17. Apparatus according to claim 10 further including means for controlling the integration time of said pixels using said image data.
18. Apparatus according to claim 10 wherein the choice of pixel data of said array to be scanned is based on knowledge of image data at another location in said image.
19. An improved method of triangulation sensor operation comprising the steps of:

providing an addressable matrix array type TV camera capable of scanning individual rows or columns;

providing an image on said matrix array indicative of a location of at least one zone projected on an object to be measured;

scanning a first plurality of rows or columns of said array to determine first image data, said first plurality being less than the total number of rows or columns;

using said first image data, scanning a second plurality of rows or columns of said array to determine second image data, said second plurality being less than the total number of rows or columns, to determine from said second image data the location of a second point of interest; and

from said second image data, determining range to an object at said at least one position on said object.
20. A method according to claim 19 wherein said second image data relates to data from a different position on said object than said first image data.
21. A method according to claim 20 wherein said second image data relates to data from a different object than said first image data.
22. A method according to claim 21 wherein said second image data is acquired at higher resolution than said first image data.
23. A method according to claim 22 wherein said second image data resolution is improved in density of pixels.
24. A method according to claim 23 wherein said second image data resolution is improved in analog to digital conversion of pixels.
25. A method according to claim 24 wherein one or more individual pixels in a column or row are addressed, without interrogating a complete row or column.
26. An improved method of triangulation comprising the steps of:

providing an addressable matrix array type camera capable of scanning individual pixels or groups of pixels;

in the course of scanning an object to be measured, providing an image on said array indicative of at least one zone on said object;

reading out the values of less than all pixels of said matrix array to determine image data relating to said zone, the pixels whose values are to be read being selected based on knowledge of data from at least one pixel whose value was previously read during the scanning of said object; and

from said data, determining a dimension or location of said object.
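Read together with the description, the claimed method reduces to: window the readout around the last-known spot position, locate the illuminated zone within that window, and triangulate. A hypothetical numeric sketch — the centroid step, window size, and optical constants (baseline, focal length, pixel pitch) are assumptions for illustration, not values from the patent:

```python
# Minimal sketch of windowed laser-spot triangulation: only a small pixel
# window around the previous detection is read, per the selective-scan claims.

BASELINE = 0.10       # m, assumed laser-to-lens separation
FOCAL    = 0.016      # m, assumed lens focal length
PITCH    = 10e-6      # m, assumed pixel pitch

def spot_centroid(row, window):
    """Centroid of intensities over a limited pixel window (claimed step)."""
    lo, hi = window
    cols = list(range(lo, hi))
    weights = [row[c] for c in cols]
    return sum(c * w for c, w in zip(cols, weights)) / sum(weights)

def range_from_column(col, center_col=512):
    """Triangulation: range is inversely proportional to the image offset."""
    offset = (col - center_col) * PITCH
    return BASELINE * FOCAL / offset

row = [0.0] * 1024
row[600], row[601], row[602] = 0.5, 1.0, 0.5   # imaged laser spot
prev = 601                                      # detection from the last scan
window = (prev - 5, prev + 5)                   # read only 10 pixels, not 1024
col = spot_centroid(row, window)
print(col, range_from_column(col))
```

Reading ten pixels instead of a full 1024-pixel row is what makes the high-speed operation of the title possible; the previous detection supplies the "knowledge of data from pixels previously scanned" that the claim recites.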
CA002355757A 2000-08-23 2001-08-23 High speed camera based sensors Expired - Lifetime CA2355757C (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US22701800P 2000-08-23 2000-08-23
US60/227,018 2000-08-23

Publications (2)

Publication Number Publication Date
CA2355757A1 CA2355757A1 (en) 2002-02-23
CA2355757C true CA2355757C (en) 2007-11-13

Family

ID=22851410

Family Applications (1)

Application Number Title Priority Date Filing Date
CA002355757A Expired - Lifetime CA2355757C (en) 2000-08-23 2001-08-23 High speed camera based sensors

Country Status (2)

Country Link
US (1) US6825936B2 (en)
CA (1) CA2355757C (en)

Families Citing this family (29)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CA2378625A1 (en) * 2002-03-20 2003-09-20 Martin Castonguay High-performance grade optimizer
WO2004071069A2 (en) * 2003-02-03 2004-08-19 Goodrich Corporation Random access imaging sensor
JP3741282B2 (en) * 2003-07-28 2006-02-01 セイコーエプソン株式会社 INPUT DEVICE, ELECTRONIC DEVICE, AND DRIVE METHOD FOR INPUT DEVICE
US7095002B2 (en) 2004-02-23 2006-08-22 Delphi Technologies, Inc. Adaptive lighting control for vision-based occupant sensing
DE102005011344B4 (en) * 2004-03-15 2014-02-13 Omron Corporation sensor device
US7130725B2 (en) * 2004-12-17 2006-10-31 The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration Method for correcting control surface angle measurements in single viewpoint photogrammetry
DE102005051318B4 (en) * 2005-10-26 2011-11-17 Mathias Reiter Optical shape determination method
US7525114B2 (en) 2006-02-14 2009-04-28 Lmi Technologies Ltd. Multiple axis multipoint non-contact measurement system
CA2536411C (en) * 2006-02-14 2014-01-14 Lmi Technologies Inc. Multiple axis multipoint non-contact measurement system
US7755770B2 (en) * 2006-03-24 2010-07-13 Schlumberger Technology Corporation Method for mapping geometrical features with opto-electronic arrays
US20070273894A1 (en) * 2006-05-23 2007-11-29 Johnson James T Method and apparatus for remote spatial calibration and imaging
US20080094476A1 (en) * 2006-10-18 2008-04-24 Southern Vision Systems, Inc. System and Method of High-Speed Image-Cued Triggering
US20080174691A1 (en) * 2007-01-19 2008-07-24 Quality Vision International Inc. Strobed image acquisition guided by range sensor
US7684030B2 (en) 2007-05-04 2010-03-23 Vab Solutions Inc. Enclosure for a linear inspection system
US20080273760A1 (en) * 2007-05-04 2008-11-06 Leonard Metcalfe Method and apparatus for livestock assessment
CA2683206C (en) * 2009-10-17 2018-07-03 Hermary Opto Electronics Inc. Enhanced imaging method and apparatus
US20110149269A1 (en) * 2009-12-17 2011-06-23 Tom Van Esch Method and device for measuring the speed of a movable member relative a fixed member
US9052497B2 (en) 2011-03-10 2015-06-09 King Abdulaziz City For Science And Technology Computing imaging data using intensity correlation interferometry
US9099214B2 (en) 2011-04-19 2015-08-04 King Abdulaziz City For Science And Technology Controlling microparticles through a light field having controllable intensity and periodicity of maxima thereof
CN104471348A (en) * 2012-03-26 2015-03-25 曼蒂斯影像有限公司 Three dimensional camera and projector for same
US10368053B2 (en) 2012-11-14 2019-07-30 Qualcomm Incorporated Structured light active depth sensing systems combining multiple images to compensate for differences in reflectivity and/or absorption
CN104200503B (en) * 2014-09-02 2017-02-15 广东省宜华木业股份有限公司 Method for making large-breadth wood-grain digital images
US10455216B2 (en) 2015-08-19 2019-10-22 Faro Technologies, Inc. Three-dimensional imager
US10444006B2 (en) 2015-08-19 2019-10-15 Faro Technologies, Inc. Three-dimensional imager
DE102015121673B4 (en) 2015-12-11 2019-01-10 SmartRay GmbH shape investigation
EP3249348B1 (en) * 2016-05-26 2019-07-03 Baumer Electric AG Sensor device for measurement of a surface
US10378889B2 (en) 2017-08-22 2019-08-13 Faro Technologies, Inc. Measurement system having a cooperative robot and three-dimensional imager
CN107607960A (en) * 2017-10-19 2018-01-19 深圳市欢创科技有限公司 A kind of anallatic method and device
US10955236B2 (en) 2019-04-05 2021-03-23 Faro Technologies, Inc. Three-dimensional measuring system

Family Cites Families (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB1488841A (en) 1974-01-18 1977-10-12 Plessey Co Ltd Optical detection apparatus
US4188544A (en) 1977-08-22 1980-02-12 Weyerhaeuser Company Method and apparatus for automatically processing a workpiece employing calibrated scanning
CA1103803A (en) 1978-03-01 1981-06-23 National Research Council Of Canada Method and apparatus of determining the center of area or centroid of a geometrical area of unspecified shape lying in a larger x-y scan field
US4373804A (en) 1979-04-30 1983-02-15 Diffracto Ltd. Method and apparatus for electro-optically determining the dimension, location and attitude of objects
US4667231A (en) 1979-09-07 1987-05-19 Diffracto Ltd. Electro-optical part inspection in the presence of contamination and surface finish variation
US4394683A (en) 1980-06-26 1983-07-19 Diffracto Ltd. New photodetector array based optical measurement systems
US4606645A (en) 1984-10-29 1986-08-19 Weyerhaeuser Company Method for determining localized fiber angle in a three dimensional fibrous material
CA1266562A (en) 1986-09-24 1990-03-13 Donald Stewart Distance measuring apparatus
US4891772A (en) 1987-04-15 1990-01-02 Cyberoptics Corporation Point and line range sensors
US4916629A (en) 1987-06-26 1990-04-10 Weyerhaeuser Company Method for determination of pith location relative to lumber surfaces
CA1307051C (en) 1988-02-26 1992-09-01 Paolo Cielo Method and apparatus for monitoring the surface profile of a moving workpiece
US5252836A (en) 1991-03-07 1993-10-12 U.S. Natural Resources, Inc. Reflective grain defect scanning
SE9402551L (en) 1994-07-22 1995-10-30 Integrated Vision Prod Device for an image processing processor
NZ270892A (en) 1994-08-24 1997-01-29 Us Natural Resources Detecting lumber defects utilizing optical pattern recognition algorithm
US5717199A (en) 1996-01-26 1998-02-10 Cid Technologies, Inc. Collective charge reading and injection in random access charge transfer devices
US6064759A (en) * 1996-11-08 2000-05-16 Buckley; B. Shawn Computer aided inspection machine
US6297488B1 (en) * 1999-04-29 2001-10-02 National Research Council Of Canada Position sensitive light spot detector

Also Published As

Publication number Publication date
US20020060795A1 (en) 2002-05-23
CA2355757A1 (en) 2002-02-23
US6825936B2 (en) 2004-11-30

Similar Documents

Publication Publication Date Title
CA2355757C (en) High speed camera based sensors
US6618155B2 (en) Method and apparatus for scanning lumber and other objects
AU616731B2 (en) Method and apparatus for monitoring the surface profile of a moving workpiece
CA2380917C (en) Optical sub-pixel parts inspection system
CA2152914C (en) Image multispectral sensing
JP2954708B2 (en) Multifocal imaging system
US20020025061A1 (en) High speed and reliable determination of lumber quality using grain influenced distortion effects
CA2379561A1 (en) Optical beam shaper
KR970705304A (en) AN INTELLIGENT SENSOR FOR OPTICAL WHEEL ALIGNMENT
NL8102813A (en) NON-CONTACT MEASUREMENT OF THE PROFILE OF A SURFACE.
JP2004163435A (en) Absolute position detector and measuring method
US4735508A (en) Method and apparatus for measuring a curvature of a reflective surface
EP1269397A2 (en) Large depth of field line scan camera
US20200319315A1 (en) Chip scale integrated scanning lidar sensor
US6031225A (en) System and method for selective scanning of an object or pattern including scan correction
US6556307B1 (en) Method and apparatus for inputting three-dimensional data
US11293877B2 (en) Defect detecting device and defect detecting method
CA1298380C (en) Method and system for increasing the effective dynamic range of a photosensor
JPH05332733A (en) Detection optical system and method for detecting three-dimensional form
EP0871008B1 (en) Device for measuring the dimensions of an object that is very extensive longitudinally and whose cross section has a curved contour
IL277134B1 (en) Method and device for distance measurement
CN107193428B (en) Optical touch screen, touch positioning method thereof and optical distortion calibration method
CA2355756C (en) Method and apparatus for scanning lumber and other objects
CA2599472C (en) Method and apparatus for scanning lumber and other objects
US6411918B1 (en) Method and apparatus for inputting three-dimensional data

Legal Events

Date Code Title Description
EEER Examination request
MKEX Expiry

Effective date: 20210823