WO1988005904A1 - Object locating system - Google Patents

Object locating system

Info

Publication number
WO1988005904A1
WO1988005904A1 PCT/US1988/000153 US8800153W WO8805904A1 WO 1988005904 A1 WO1988005904 A1 WO 1988005904A1 US 8800153 W US8800153 W US 8800153W WO 8805904 A1 WO8805904 A1 WO 8805904A1
Authority
WO
WIPO (PCT)
Prior art keywords
camera
image
normal
light source
recited
Prior art date
Application number
PCT/US1988/000153
Other languages
French (fr)
Inventor
Robert F. Barry
Samuel Kang
Original Assignee
Westinghouse Electric Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Westinghouse Electric Corporation filed Critical Westinghouse Electric Corporation
Publication of WO1988005904A1 publication Critical patent/WO1988005904A1/en
Priority to KR1019880701244A priority Critical patent/KR890700806A/en

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/24Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/24Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B11/25Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
    • G01B11/2513Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object with several lines being projected in more than one direction, e.g. grids, patterns
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/50Depth or shape recovery
    • G06T7/521Depth or shape recovery from laser ranging, e.g. using interferometry; from the projection of structured light

Abstract

Method and apparatus for projecting a known geometry image such as an L from a light source (14) such as a laser onto the surface of an object. A camera (18) with a known distance and angular orientation with respect to the light source is used with an image storage device (20) to capture the light source image on the object. The normal to the surface of the object is then determined from the known size of the projected image, the angles of the intersecting lines in the projected image and the corresponding values in the received image. Relative angles between surfaces can be determined from the normals. An object can be identified by comparing the relative angles between the surfaces of the object to relative angles for known reference objects.

Description

OBJECT LOCATING SYSTEM
The present invention is directed to a method and apparatus for locating an object, to be used for robotic machinery.
There is a need in robot applications to provide a method and apparatus which will locate objects such as plates and locate holes in plates with a high degree of precision. For example, to put a bolt in a hole, it is necessary to locate the hole and approach the hole with the bolt from the proper direction with the bolt lined up along the axis of the hole. Since robot vision systems generally require an extensive length of time to process a camera image, it is important that the equipment used to determine object location be simple and fast, so that overall system performance will be acceptable.
It is a principal object of the invention to provide a method and apparatus which will accurately locate objects and features thereon for robot handling machinery.
Accordingly, with this object in view, the invention resides in a method for locating an object and any feature thereon comprising the steps of (a) projecting a light pattern onto the object with a light source, characterized by (b) digitizing the light pattern reflected from the object using a camera, where the camera and the light source are in a known geometric relationship to each other; (c) determining a normal to a surface of the object; (d) determining the intersection of the normal with the surface of the object; (e) moving the light source toward the camera by a predetermined amount; (f) determining another normal and intersection; and (g) repeating steps (d) and (e) until a contour of the object has been mapped by the intersections.
The invention also resides in an apparatus for locating an object characterized by a light source projecting a predetermined light pattern having intersecting lines where one of the lines lies in a predetermined plane; a rotatable camera having a rotational axis rotating in the predetermined plane and receiving an image of the light pattern reflected off the object; a frame carrying said light source and said camera in a fixed geometrical relationship; camera control means for controlling camera rotation; and means for determining a normal to the surface of the object from the image and the fixed geometrical relationship between said light source and said camera. The preferred embodiment of the invention will be described, by way of example, with reference to the accompanying drawings in which:
Figure 1 is a schematic diagram of the components of the system according to the present invention;
Fig. 2 illustrates the image detected by the camera 18 along with geometry of the present invention; Fig. 3 depicts values from which a surface normal is determined;
Figs. 4A and 4B depict a flowchart of a process for determining the normal to a surface; Fig. 5 illustrates a hole 80; Fig. 6 depicts a hole location routine;
Fig. 7 illustrates the scanning of a contour to map a contour 120 and determine its length;
Fig. 8 depicts a contour length determination routine; Fig. 9 illustrates a method for determining relative angles between surfaces; Fig. 10 depicts a routine for relative angle determination;
Fig. 11 illustrates an object with several surfaces; Fig. 12 depicts a routine for identifying an object using relative angles;
Fig. 13 illustrates a Gaussian representation of an object; and
Fig. 14 depicts a routine for identifying an object using a Gaussian representation.
A computer 10, such as an IBM PC/AT, controls a light source/frame controller 12, such as a Unimate controller model Puma A-75, to project a light pattern from light source 14 onto the surface of an object, as illustrated in Fig. 1. The computer 10, through a camera controller 16, which can be another Unimate controller, controls the orientation and distance of camera 18 from the light source 14. A suitable self-focusing camera is a CCD Model 34 camera available from Pulinex. A frame grabber 20 coupled to the camera, such as an Imaging Technology IT-100 device, captures the image of the light source on the object and provides the camera image to the computer 10. The computer 10 calculates the location of the object and can send a movement command to a robot controller 22 which moves robot 24 to the object of interest.
The camera 18 and light source 14, as illustrated in Fig. 2, are mounted on an indexable, encoder-equipped frame or track 30 which allows the frame 30 to be rotated; the light source to be moved along the frame 30 in both directions; the camera 18 to move in both directions along the frame 30; and the camera 18 to be rotated. This combination of movements along the encoder-equipped frame permits all angles and distances to be recorded by computer 10. The camera 18 only rotates in the light source projection plane so that the camera 18 will always cross the light source image on the object. The angle δ between light source 14 and frame 30 is known, because both the camera 18 and light source 14 are indexably controlled, and is preferably 90°. The distance d between the image surface of the camera 18 and the light source 14 is also known. The angle e between the axis of the camera 18 and the frame 30 is also known. It is possible to project several different types of image patterns onto the object being viewed, preferably using a laser light source and cylindrical mirrors, as described in U.S. Patent Application Serial No. 897,473, filed August 18, 1986, and incorporated by reference herein. A lens is used to collimate each pattern and a half-reflecting mirror to combine light patterns. An alternative is to use a non-coherent light source and a mask. It is also possible to add a focusing lens to the mask arrangement if a ranging device, such as an ultrasonic distance sensor, is used to adjust focus.
One type of projected image pattern is an L-shaped image pattern which, when projected onto the surface of the object, can produce a distorted L-shape in the image plane 32 of the camera 18, as illustrated in Fig. 2. The image examined by the computer 10 is a matrix of ones and zeros where the ones represent the reflected image. From the matrix, the angle θo, the intersection point Po, the end points Ao and Bo, the intersection point An and the lengths L1 and L2 can be determined. From this camera image, the following values can be determined independent of any scale value:
cos θo = PoAn/AoPo (1)

No = PoBo/AoPo = L2/L1 (2)

where No is the ratio of the line lengths and θo is the angle between the lines in the image plane.
From the known relationship between the image plane 32 image and the known size of the light source image 34 projected onto the object plane 36, as illustrated in Fig. 3, the object plane and its normal can be determined. The object plane is determined by two angles α2 and β2 and the real coordinates of the points PL (the location of the light source) (0,0,0), the camera location (d,0,0) and Po (the location of the object) (0,0,d tan e). The normal is determined in accordance with the following equations:
cos α2 = −sin e/√(1 + cos²e tan²θo) (3)

where α2 is indicated in Fig. 3 and e is the angle between the camera 18 and frame 30;

cos β2 = (sin θo − No sin e)/√(No² − 2No sin e sin θo + sin²θo) (4)

tan α = cos e tan θo (5)

where α is shown in Fig. 3; and

tan β = (cos e sin θo)/(No − sin e sin θo) (6)

where β is shown in Fig. 3.
The object plane and the normal to it are defined as follows:

x/tan β + y/tan α + (d tan e − z) = 0 (7)

Normal = (1/tan β)li + (1/tan α)lj − lk (8)

where li, lj and lk are unit vectors.
From the normal to the object surface all of the points in the image plane can be transformed into points on the object plane in accordance with the following transformation equations:

(x', y') = q (a11 xo + a12 yo, a21 xo + a22 yo) (9)

where xo and yo are points on the image plane and q is a scaling factor;

a11 = √(1 + cos²e tan²θo)/(cos e sec θo) (10)

a12 = (sin e/sin θo)/√(1 + cos²e tan²θo) (11)

a21 = 0 (12)

a22 = (No sin e − sin θo)(1 − x)/(sin θo √(No² − 2No sin e sin θo + sin²θo)) (13)

x = (No cos e)(cos β sec θo)/((No² − 2No sin e sin θo + sin²θo)(1 + cos²e tan²θo)) (14)

If the coordinates from the object plane need to be transformed into a real coordinate system such as a robot coordinate system where the geometric relationship between the frame 30 and the robot coordinate system is known, the following equations can be used:

[Equation (15), reproduced in the original only as an image (imgf000008_0002), maps the object plane coordinates into the x and y robot coordinates.]

z = (x/tan β) + (y/tan α) + (d tan e) (16)

where x, y and z are coordinates in the robot coordinate system. It is, of course, possible to produce transformation equations for a transformation directly from the image plane coordinate system to the real or robot coordinate system, and such a transform would merely be the combination of the above transformations into a single transformation. The scaling factor q set forth in equation (9) above can be determined in accordance with the following equation, since the distance between Po and Bo in Fig. 2 can be measured in pixel units and the distance between PL and BL for the light source is known:
q = PB/PoBo = PB/L2 (17)

which is the ratio of the line lengths. Since PoBo can be measured in pixel units and PB = (a/sin β), then q = (a/sin β)/L2, where a is the original beam width PLBL.

Figs. 4A and 4B depict a flowchart of a process performed by computer 10 to determine the surface normal. First, the computer obtains 52 the camera image from the frame grabber 20. This image is in the form of a matrix of gray scale values for each pixel produced by the camera 18. The gray scale array is compared 54 one data point at a time to a threshold. If the gray scale value is above the threshold, a corresponding bit of a line image array is set to a value of one, indicating the existence of a line point at the corresponding position in the camera image. If the gray scale value is below the threshold, a zero is entered in the line image array, indicating darkness.
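A minimal sketch of thresholding step 54, assuming the grabbed frame is available as a NumPy array of gray-scale values (the function name and threshold value are illustrative, not from the patent):

```python
import numpy as np

def binarize(gray_image, threshold=128):
    """Step 54: compare each gray-scale pixel to a threshold and build the
    line image array (1 = line point, 0 = darkness)."""
    return (np.asarray(gray_image) > threshold).astype(np.uint8)
```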
Once the image is transformed into a binary representation, the horizontal image lines on which the horizontal line of the object image should appear are scanned 56 using a chain coding algorithm to find the endpoints and the line length of the horizontal line. An appropriate chain coding routine can be found in Pavlidis, Algorithms for Graphic and Image Processing, Computer Science Press, Rockville, Maryland, 1982, which is incorporated by reference herein. Since the camera scans a constant plane with reference to the light source plane, the horizontal line should appear at approximately the same fixed location (row) in the camera image at all times. This allows the chain coding routine to begin scanning from a known point in the camera image.
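The patent defers to Pavlidis for the chain coding details. The sketch below substitutes a simpler run tracer that exploits the fact just stated, namely that the horizontal line falls near a known row of the binary image; the row index and tolerance are illustrative assumptions:

```python
def trace_horizontal_line(line_image, expected_row, tolerance=2):
    """Step 56: find the endpoints and pixel length of the horizontal line
    by scanning the few rows where it is expected to appear."""
    rows, cols = line_image.shape
    lo = max(0, expected_row - tolerance)
    hi = min(rows, expected_row + tolerance + 1)
    points = [(r, c) for r in range(lo, hi)
              for c in range(cols) if line_image[r, c]]
    if not points:
        return None                            # no line found in the band
    left = min(points, key=lambda p: p[1])     # leftmost line point
    right = max(points, key=lambda p: p[1])    # rightmost line point
    return left, right, right[1] - left[1]
```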
When the endpoints of the horizontal line are found, the endpoints are compared 58 to the camera frame edge. If the endpoints correspond to the camera frame edge, the camera is rotated 60 or 62 by a predetermined angle depending on which edge is coincident with the endpoint. This rotation loop forces the camera 18 to rotate until the horizontal line segment of the image is completely within the frame of the camera image. Once the image of the horizontal line is completely within the camera image frame, the camera 18 is further rotated to center the image in the camera frame. This can be accomplished by calculating the distances between the endpoints of the horizontal line and the frame edge and rotating the camera until the distances are equal. Once the image is centered within the camera frame, the camera is activated to self-focus. As a consequence of the self-focus, the distance between the camera and the image on the object is known.
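The rotation loop 58-62 and the centering refinement might be sketched as follows, reusing binarize and trace_horizontal_line from the sketches above. Here grab_frame and rotate_camera are hypothetical callbacks standing in for the frame grabber 20 and camera controller 16, and the rotation signs depend on the actual camera geometry:

```python
def center_horizontal_line(grab_frame, rotate_camera, cols, expected_row,
                           step_deg=1.0):
    """Rotation loop 58-62 plus centering: rotate until the horizontal line
    is entirely inside the frame, then until it is centered."""
    while True:
        found = trace_horizontal_line(binarize(grab_frame()), expected_row)
        if found is None:
            rotate_camera(step_deg)              # line not visible yet
            continue
        (_, c1), (_, c2), _ = found
        left_gap, right_gap = c1, cols - 1 - c2  # distances to frame edges
        if left_gap == 0:                        # endpoint on left edge
            rotate_camera(-step_deg)
        elif right_gap == 0:                     # endpoint on right edge
            rotate_camera(step_deg)
        elif abs(left_gap - right_gap) <= 1:     # gaps equal: centered
            return
        else:
            rotate_camera(step_deg if left_gap > right_gap else -step_deg)
```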
Next, the computer scans 64 the camera image to find the non-horizontal line. This vertical scanning can be accomplished in several different ways. For example, one endpoint of the line can be used as a horizontal index for a vertical scan of the image array which examines each pixel representation data point vertically until a light point is encountered. If the scan starting with the endpoint does not detect an image point, the horizontal coordinate is indexed and another vertical scan occurs. Once a single image point is found during the vertical scanning, the chain coding algorithm is performed 66 again to determine the endpoints and line length of the non-horizontal line. Once the line lengths and endpoints of both the horizontal and non-horizontal lines are determined, the intersection point of the two lines can be determined 68 in accordance with the well-known slope-intercept equation, even if the intersection point happens to fall in a hole or other cavity and is therefore not directly detectable.
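A sketch of intersection step 68 using the slope-intercept form, assuming neither line is exactly vertical (a vertical second line would need the x = constant special case):

```python
def intersection(p1, p2, p3, p4):
    """Step 68: intersect the line through p1, p2 with the line through
    p3, p4 using the slope-intercept form y = ax + b.  Works even when the
    crossing point itself was not imaged, e.g. when it falls in a hole."""
    a1 = (p2[1] - p1[1]) / (p2[0] - p1[0])     # slope of the first line
    b1 = p1[1] - a1 * p1[0]                    # intercept of the first line
    a2 = (p4[1] - p3[1]) / (p4[0] - p3[0])     # slope of the second line
    b2 = p3[1] - a2 * p3[0]
    x = (b2 - b1) / (a1 - a2)                  # solve a1*x + b1 = a2*x + b2
    return x, a1 * x + b1
```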
An alternative to the method discussed above for determining the intersection point is a Hough transformation, described in detail in U.S. Patent No. 3,069,654. Once the intersection point is determined, the normal to the surface can be calculated 70 in accordance with the equations previously discussed. After the intersection point is transformed into the object plane coordinate system, the normal and transformed intersection point define the location and orientation of the object surface. After the normal and intersection point in the object plane are determined, any of several different analysis routines can be executed 72. For image interpretation tasks, the system is ideally suited for transforming object coordinates into another coordinate system, so that template matching is simplified.
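Step 70 might look as follows under the reconstruction of equations (5), (6) and (8) given earlier; e is the camera-to-frame angle, and theta_o and No are the measured image quantities:

```python
import math

def surface_normal(e, theta_o, No):
    """Step 70: unit normal to the object plane from equations (5), (6), (8)."""
    tan_alpha = math.cos(e) * math.tan(theta_o)              # equation (5)
    tan_beta = (math.cos(e) * math.sin(theta_o)
                / (No - math.sin(e) * math.sin(theta_o)))    # equation (6)
    n = (1.0 / tan_beta, 1.0 / tan_alpha, -1.0)              # equation (8)
    length = math.sqrt(sum(c * c for c in n))
    return tuple(c / length for c in n)
```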
Another of the possible analysis routines is a hole location routine which determines the center and orientation of a hole 80, such as illustrated in Fig. 5. In this routine, the preferred light source image is a cross; however, any image that provides at least three legs that intersect will be acceptable. It is assumed for this routine that the hole 80 is located in a generally predictable location such that the projected cross will encounter the hole 80. Since the location and orientation of the line segments 82 and 84 and their intersection are known, it is possible to continue scanning in the camera image along the known lines to detect line segments 86 and 88 on the other side of the hole 80. Once these segments are found 102, the routine of Fig. 6 transforms 104 the segments 82-88, in accordance with the equations previously discussed, into segments in the object plane, so that the hole 80 will appear in a normal perspective, allowing standard circle equations to be used during calculations instead of the ellipse equations which would have to be used otherwise.
Next, the interior endpoints of the line segments are determined 106 by, for example, comparing horizontal coordinates of the horizontal line segments 82 and 88 and designating the two closest points W2 and W7 as interior points, and performing a similar operation with respect to the vertical segments 84 and 86 to obtain the interior points W4 and W5. It is also possible to find the interior points by first executing an edge finding routine, mentioned later, and then comparing the hole edge to line segment coordinates. Coincidence between coordinates indicates an interior endpoint.
The center (x, y) of the hole 80 is then calculated 108 using the appropriate coordinate values of the points W2, W4, W5 and W7.
The center (x, y), along with the surface normal, defines the location and orientation of the hole 80. From the center and at least one intersection point, for example W2, the radius and diameter of the hole 80 can be determined.
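A sketch of center calculation 108 under the assumption, suggested by the legend of Fig. 6, that the center coordinates are midpoints between the interior points of the two transformed cross legs:

```python
import math

def hole_center_and_radius(W2, W4, W5, W7):
    """Step 108: hole center from the interior endpoints of the cross legs
    (W2/W7 horizontal, W4/W5 vertical), plus the radius from the center and
    one interior point, all in object plane coordinates."""
    cx = (W2[0] + W7[0]) / 2.0                 # midpoint of horizontal chord
    cy = (W4[1] + W5[1]) / 2.0                 # midpoint of vertical chord
    radius = math.hypot(W2[0] - cx, W2[1] - cy)
    return (cx, cy), radius
```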
When the geometric relationship between the frame 30 and a robot coordinate system is known, the computer 10 can perform a transformation 110 from the object plane coordinate system to the robot plane coordinate system to obtain the appropriate values from the perspective of the robot. The computer 10 can then command 112 the robot 24 through the robot controller 22 to insert an appropriate size bolt into the hole 80 along the known normal towards the center of the hole 80. Since the three-dimensional coordinates of the center of the hole 80 are known, the computer 10 can also appropriately command the robot 24 to begin spinning the bolt as it approaches the hole 80. The format of such a command and a description of how the command is communicated to the robot controller 22 can be found in the operating manual for the robot controller 22, and the details of such are within the ordinary skill in the art.
Fig. 7 depicts a contour mapping and measurement method whereby a contour F-G 120 is mapped and measured by moving the light source 14 from point F' to point G' along the frame 30 while determining the normals and intersection points of the normals on the surface 120. The intersection points provide mapping points for the contour 120. The distances between the intersection points can be calculated and summed to determine the length of the contour 120. Fig. 8 illustrates a routine for performing the contour mapping and length measurement. After the normal is determined, the intersection point and normal are stored 132, followed by a determination 134 of whether or not the end point G' has been reached. If not, the light source is moved 136 by a predetermined amount along the frame 30. If the scan endpoint G' has been reached, the contour length is calculated 138 by a simple summation.
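Step 138's summation is straightforward once the intersection points stored at step 132 are collected in scan order; a minimal sketch:

```python
import math

def contour_length(points):
    """Step 138: contour length as the sum of distances between successive
    intersection points stored during the scan from F' to G'."""
    return sum(math.dist(p, q) for p, q in zip(points, points[1:]))
```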
Once the normal Hn for a first surface 140 is determined, it is also possible to determine the normal Ln for a second surface 142, as illustrated in Fig. 9, by moving the camera 18 and light source 14 onto the second surface 142. The angle between the two surfaces 140 and
142 can be determined from the normals. During this process, illustrated in Fig. 10, as the camera 18 is moved from one surface to another surface, the normal for the current point is saved 152 and then compared 154 with the prior normal. If the normals are equal, the camera 18 and light source 14 are moved 156 as a unit by a predetermined amount. If the normals are not equal then the angle between the two surfaces is calculated 158.
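Calculation 158 reduces to the relation given in the Fig. 10 legend, cos(angle) = current normal · prior normal; a sketch for unit normals:

```python
import math

def angle_between(n1, n2):
    """Step 158: angle between two surfaces from their unit normals, using
    cos(angle) = current normal . prior normal."""
    dot = sum(a * b for a, b in zip(n1, n2))
    return math.acos(max(-1.0, min(1.0, dot)))  # clamp against roundoff
```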
If the normals for a plurality of surfaces R-V, as illustrated in Fig. 11, are known, the identity of the object can be determined by comparing the relative angles between the surfaces to the relative angles for reference objects. A routine for identifying an object from surface normals is illustrated in Fig. 12. In this routine, the camera 18 must be in a position to capture 192 the image of the entire object. After the image of the entire object is captured, an edge finding routine, such as described in Sobel, "Neighborhood Coding of Binary Images for Fast Contour Following and General Binary Array Processing", CGIP, No. 8, August, 1978, pp. 127-135, can be executed 194 to determine or identify the regions of the object being examined.
Next, well-known center determination algorithms can be executed 196 to locate the center of each region, and the light source 14 is then moved 198 to the center of a region. The normal for that region is determined 200 using the procedure of Figs. 4A and 4B. If the last region has been examined 202, the routine calculates 204 the angles between all the regions per the equation of Fig. 10 and then compares 206 the relative angles of the object with the relative angles of known reference objects. When the relative angles are equal, the object is identified. By using the relative angles of the objects for object identification, object size and orientation information are not necessary.
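Comparison 206 is not spelled out in the patent; one plausible reading, sketched below, sorts each set of relative angles so the match does not depend on the order in which the regions were visited (the tolerance is an illustrative assumption):

```python
import math

def identify_by_angles(measured, references, tol=math.radians(1.0)):
    """Steps 204-206: compare the object's relative angles with those of
    known reference objects.  references maps object names to angle lists."""
    key = sorted(measured)
    for name, ref_angles in references.items():
        ref = sorted(ref_angles)
        if len(ref) == len(key) and all(abs(a - b) <= tol
                                        for a, b in zip(key, ref)):
            return name                        # relative angles equal
    return None                                # no reference matched
```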
It is also possible to determine the Gaussian distribution of the normals for an object, which then can be used to identify the object, as described in Horn, Robot Vision, McGraw-Hill, New York, 1986. The Gaussian image is determined by scanning the object three-dimensionally and recording the surface normals for the object. The surface normals are then plotted in a spherical coordinate system, as illustrated in Fig. 13, where the coordinate value at each angle in the spherical coordinate system represents the magnitude or number of normals determined for the object in that direction.
A routine for identifying an object using a Gaussian map image is illustrated in Fig. 14. The Gaussian map image is a map of the densities of normal vectors for the surfaces of the object. Once the normal and intersection point of a particular position of the camera and light source are determined, they are stored 222, followed by a determination 224 of whether or not the horizontal scan of the object has been completed. If the horizontal scan has not been completed, the camera 18 and light source 14 are moved 226 as a unit by a predetermined amount. If the horizontal scan has been completed, the camera is reset 228 to the beginning of the horizontal scan so that another horizontal scan can begin, if all vertical scans have not been finished. A determination is then made 230 of whether or not the end of the vertical scan has been reached and, if not, the camera 18 and light source 14 are moved vertically 232 by a predetermined amount. If the scans have been completed, indicating that the object has been completely scanned, the normals are totaled 234 for each angle in the spherical coordinate system. Then the normal totals for each angle of the object are compared 236 with reference normals for reference objects at corresponding angles. If the normal totals are substantially the same, the object has been identified. Other routines can also be provided to transform all image coordinates into coordinates in the object or robot coordinate systems to provide same to an operator for viewing or a robot for possibly picking up the object. It is possible to substitute a single point scanning light source if the frame grabber is used to combine plural images. It is also possible to allow the angle δ to be other than 90°. Another possible use for the present invention is in a position identification device where a touch sensitive screen can be used to input real coordinates of an object, thereby defining the location of the object and its orientation in a real coordinate system.
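The totaling step 234 and comparison 236 might be sketched as a histogram over spherical-coordinate cells; the bin size and tolerance are illustrative assumptions, not patent text:

```python
import math

def gaussian_image(normals, bins=18):
    """Steps 222-234: histogram the recorded unit normals over cells of the
    spherical coordinate system (10-degree cells for bins=18)."""
    counts = {}
    for nx, ny, nz in normals:
        theta = math.acos(max(-1.0, min(1.0, nz)))     # polar angle
        phi = math.atan2(ny, nx) % (2 * math.pi)       # azimuth
        cell = (min(int(theta / math.pi * bins), bins - 1),
                int(phi / (2 * math.pi) * (2 * bins)) % (2 * bins))
        counts[cell] = counts.get(cell, 0) + 1
    return counts

def matches(counts, reference, tol=2):
    """Step 236: the object is identified when the normal totals are
    substantially the same as a reference object's totals."""
    cells = set(counts) | set(reference)
    return all(abs(counts.get(c, 0) - reference.get(c, 0)) <= tol
               for c in cells)
```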
IDENTIFICATION OF REFERENCE NUMERALS USED IN THE DRAWINGS

LEGEND  REF. NO.  FIGURE
COMPUTER  10  1
LIGHT SOURCE/FRAME CONTROLLER  12  1
LIGHT SOURCE  14  1
CAMERA CONTROLLER  16  1
CAMERA  18  1
FRAME GRABBER  20  1
ROBOT CONTROLLER  22  1
ROBOT  24  1
START  50  4A
OBTAIN CAMERA IMAGE FROM FRAME GRABBER  52  4A
COMPARE IMAGE DATA POINTS TO THRESHOLD AND CREATE LINE IMAGE  54  4A
PERFORM CHAIN CODING TO FIND ENDPOINTS AND LINE LENGTH OF HORIZONTAL LINE  56  4A
ROTATE CAMERA COUNTERCLOCKWISE BY PREDETERMINED ANGLE  60  4A
ROTATE CAMERA CLOCKWISE BY PREDETERMINED ANGLE  62  4A
STORE DISTANCE d, ANGLE e AND SCAN VERTICALLY TO FIND NON-HORIZONTAL LINE  64  4B
PERFORM CHAIN CODING TO FIND ENDPOINTS AND LENGTH OF NON-HORIZONTAL LINE  66  4B
ENDPOINT EQUALS FRAME EDGE POINT  58  4A
DETERMINE POINT OF INTERSECTION (ax + b = y)  68  4B
DETERMINE NORMAL VECTOR OF OBJECT PLANE  70  4B
EXECUTE ANALYSIS ROUTINE  72  4B
HOLE LOCATION ROUTINE  100  6
FIND REMAINING LINE SEGMENTS  102  6
TRANSFORM LINE SEGMENTS FROM IMAGE PLANE TO OBJECT PLANE  104  6
DETERMINE INTERIOR END POINTS OF LINE SEGMENTS  106  6
CALCULATE CENTER OF CIRCLE: (W2+W7)/2 = X, (W4+W5)/2 = Y  108  6
PERFORM OBJECT PLANE TO ROBOT PLANE TRANSFORMATION OF CIRCLE CENTER AND NORMAL  110  6
COMMAND ROBOT TO INSERT BOLT IN HOLE  112  6
REMOTE CONTOUR MEASUREMENT ROUTINE  130  8
STORE INTERSECTION POINT AND NORMAL  132  8
END OF CONTOUR SCAN  134  8
MOVE LIGHT SOURCE BY A PREDETERMINED AMOUNT  136  8
SURFACE ANGLES DETERMINATION ROUTINE  150  10
STORE NORMAL AND INTERSECTION POINT  152  10
CURRENT NORMAL EQUALS PRIOR NORMAL  154  10
MOVE CAMERA AND LIGHT SOURCE AS A UNIT BY A PREDETERMINED AMOUNT  156  10
CALCULATE ANGLE: COS(ANGLE) = CURRENT NORMAL · PRIOR NORMAL  158  10
OBJECT IDENTIFICATION ROUTINE  190  12
CAPTURE IMAGE OF ENTIRE OBJECT  192  12
EXECUTE EDGE FINDING ROUTINE  194  12
DETERMINE CENTER OF NTH REGION  196  12
MOVE SOURCE TO CENTER OF NTH REGION  198  12
DETERMINE NORMAL IN OBJECT PLANE  200  12
LAST REGION  202  12
CALCULATE ANGLES BETWEEN ALL REGIONS  204  12
COMPARE ANGLES TO ANGLES OF KNOWN OBJECTS  206  12
GAUSSIAN IMAGE ROUTINE  220  14
STORE NORMAL AND INTERSECTION POINT  222  14
END OF HORIZONTAL SCAN  224  14
MOVE CAMERA AND LIGHT SOURCE AS A UNIT BY A PREDETERMINED HORIZONTAL AMOUNT  226  14
RESET HORIZONTAL SCAN  228  14
END OF VERTICAL SCAN  230  14
MOVE CAMERA AND LIGHT SOURCE AS A UNIT BY A PREDETERMINED VERTICAL AMOUNT  232  14
COUNT NORMALS FOR EACH ANGLE  234  14
COMPARE NORMALS FOR IMAGE WITH REFERENCE NORMALS  236  14

Claims

CLAIMS:
1. A method for locating an object and any feature thereon comprising the steps of
(a) projecting a light pattern (34) onto the object with a light source (14) characterized by (b) digitizing the light pattern reflected from the object using a camera (18), where the camera and the light source are in a known geometric relationship to each other;
(c) determining a normal to a surface of the object;
(d) determining the intersection of the normal with the surface of the object;
(e) moving the light source toward the camera by a predetermined amount; (f) determining another normal and intersection; and
(g) repeating steps (d) and (e) until a contour of the object has been mapped by the intersections.
2. A method as recited in claim 1, characterized by the light pattern being at least two intersecting lines and step (b) comprises:
(bl) comparing the camera image point by point to a threshold; and
(b2) chain coding the image to determine end points of the lines.
3. A method as recited in claim 2, characterized by including rotating the camera until the entire light pattern is in the camera image field.
4. A method as recited in claim 1, characterized by the geometric relationship between the light source/camera and an arbitrary coordinate system being known and transforming the normal into a normal in the arbitrary coordinate system.
5. A method as recited in claim 1, characterized by the object feature being a hole and the light pattern being a cross projected onto the hole including: finding all line segments of the cross; transforming the line segments into the object plane; determining the interior end points of each line segment; and determining the center of the hole.
6. A method as recited in claim 5, further characterized by transforming the center and normal into a robot coordinate system; and commanding the robot to approach the transformed center of the hole along the transformed normal.
7. A method as recited in claim 1, further characterized by calculating the length of the contour from the intersections.
8. A method as recited in claim 1, characterized by including calculating a relative angle between the normals when the normals are not equal, comparing the calculated relative angles to known relative angles for known objects, and identifying the object when the calculated and known relative angles for a known object are substantially identical.
9. A method as recited in claim 8, characterized by counting the normals at each angle over the entire object in a spherical coordinate system producing a Gaussian image of the object, comparing the Gaussian image with known Gaussian images for known objects, and identifying the object when the Gaussian image and a known Gaussian image are substantially identical.
10. An apparatus for locating an object characterized by a light source (14) projecting a predetermined light pattern having intersecting lines where one of the lines lies in a predetermined plane; a rotatable camera (18) having a rotational axis rotating in the predetermined plane and receiving an image of the light pattern reflected off the object; a frame (30) carrying said light source and said camera in a fixed geometrical relationship; camera control means (16) for controlling camera rotation; and means (10) for determining a normal to the surface of the object from the image and the fixed geometrical relationship between said light source and said camera.
11. An apparatus as recited in claim 10, characterized by said normal determination means including means (16) for rotating the camera until the image of the light pattern is completely within the camera viewing range, and a computer for digitizing the image, finding the endpoints of the lines in the digitized image, finding the intersection of the lines in the image, and determining the normal to the object from the fixed geometric relationship and the intersection.
PCT/US1988/000153 1987-02-06 1988-01-11 Object locating system WO1988005904A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1019880701244A KR890700806A (en) 1987-02-06 1988-10-06 Method and device for positioning objects

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US07/011,988 US4791482A (en) 1987-02-06 1987-02-06 Object locating system
US011,988 1987-02-06

Publications (1)

Publication Number Publication Date
WO1988005904A1 true WO1988005904A1 (en) 1988-08-11

Family

ID=21752840

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US1988/000153 WO1988005904A1 (en) 1987-02-06 1988-01-11 Object locating system

Country Status (4)

Country Link
US (1) US4791482A (en)
KR (1) KR890700806A (en)
IE (1) IE880172L (en)
WO (1) WO1988005904A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5714762A (en) * 1993-11-09 1998-02-03 British Nuclear Fuels Plc Determination of the surface properties of an object

Families Citing this family (44)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FI74556C (en) * 1986-04-11 1988-02-08 Valtion Teknillinen FOERFARANDE FOER TREDIMENSIONELL OEVERVAKNING AV ETT MAOLUTRYMME.
JP2739130B2 (en) * 1988-05-12 1998-04-08 株式会社鷹山 Image processing method
JPH0766445B2 (en) * 1988-09-09 1995-07-19 工業技術院長 Image processing method
US4942539A (en) * 1988-12-21 1990-07-17 Gmf Robotics Corporation Method and system for automatically determining the position and orientation of an object in 3-D space
US4979815A (en) * 1989-02-17 1990-12-25 Tsikos Constantine J Laser range imaging system based on projective geometry
DE69033269T2 (en) * 1989-06-20 2000-01-05 Fujitsu Ltd Method and device for measuring the position and position of an object
US4988202A (en) * 1989-06-28 1991-01-29 Westinghouse Electric Corp. Solder joint inspection system and method
US5231678A (en) * 1989-11-15 1993-07-27 Ezel, Inc. Configuration recognition system calculating a three-dimensional distance to an object by detecting cross points projected on the object
US4980971A (en) * 1989-12-14 1991-01-01 At&T Bell Laboratories Method and apparatus for chip placement
JPH041505A (en) * 1990-04-18 1992-01-07 Matsushita Electric Ind Co Ltd Three-dimensional position measuring method and acquiring method for work
DE59008672D1 (en) * 1990-09-27 1995-04-13 Siemens Ag Plant documentation procedures.
US5619587A (en) * 1991-05-10 1997-04-08 Aluminum Company Of America System and method for contactlessly gauging the thickness of a contoured object, such as a vehicle wheel
US5245409A (en) * 1991-11-27 1993-09-14 Arvin Industries, Inc. Tube seam weld inspection device
JPH05266146A (en) * 1992-03-19 1993-10-15 Matsushita Electric Ind Co Ltd Representing device for body shape
US5420689A (en) * 1993-03-01 1995-05-30 Siu; Bernard High speed illumination system for microelectronics inspection
US5424838A (en) * 1993-03-01 1995-06-13 Siu; Bernard Microelectronics inspection system
GB2276446B (en) * 1993-03-26 1996-07-03 Honda Motor Co Ltd Method of measuring the position of a hole
US6407817B1 (en) 1993-12-20 2002-06-18 Minolta Co., Ltd. Measuring system with improved method of reading image data of an object
FR2720155B1 (en) * 1994-05-19 1996-06-28 Lorraine Laminage Three-dimensional measurement of the surface of a large object.
US5901273A (en) * 1995-10-17 1999-05-04 Fuji Xerox Co., Ltd. Two-dimensional position/orientation measuring mark, two-dimensional position/orientation measuring method and apparatus, control apparatus for image recording apparatus, and control apparatus for manipulator
US5969317A (en) * 1996-11-13 1999-10-19 Ncr Corporation Price determination system and method using digitized gray-scale image recognition and price-lookup files
DE50009995D1 (en) * 1999-04-30 2005-05-12 Wagner Christoph METHOD FOR THE OPTICAL FORMULATION OF OBJECTS
AU2003239171A1 (en) 2002-01-31 2003-09-02 Braintech Canada, Inc. Method and apparatus for single camera 3d vision guided robotics
US7289230B2 (en) 2002-02-06 2007-10-30 Cyberoptics Semiconductors, Inc. Wireless substrate-like sensor
US20050224902A1 (en) * 2002-02-06 2005-10-13 Ramsey Craig C Wireless substrate-like sensor
US20070050089A1 (en) * 2005-09-01 2007-03-01 Yunquan Sun Method for detecting the position and orientation of holes using robotic vision system
US7893697B2 (en) 2006-02-21 2011-02-22 Cyberoptics Semiconductor, Inc. Capacitive distance sensing in semiconductor processing tools
JP2009527764A (en) 2006-02-21 2009-07-30 サイバーオプティクス セミコンダクタ インコーポレイテッド Capacitance distance detection in semiconductor processing tools
US20070276539A1 (en) * 2006-05-25 2007-11-29 Babak Habibi System and method of robotically engaging an object
WO2008036354A1 (en) * 2006-09-19 2008-03-27 Braintech Canada, Inc. System and method of determining object pose
CN101517701B (en) 2006-09-29 2011-08-10 赛博光学半导体公司 Substrate-like particle sensor
US7778793B2 (en) * 2007-03-12 2010-08-17 Cyberoptics Semiconductor, Inc. Wireless sensor for semiconductor processing systems
US7957583B2 (en) * 2007-08-02 2011-06-07 Roboticvisiontech Llc System and method of three-dimensional pose estimation
US8559699B2 (en) 2008-10-10 2013-10-15 Roboticvisiontech Llc Methods and apparatus to facilitate operations in image based systems
KR101748180B1 (en) * 2010-12-31 2017-06-16 주식회사 케이티 Method and apparatus of measuring size of object in image
US9200899B2 (en) 2012-03-22 2015-12-01 Virtek Vision International, Inc. Laser projection system and method
WO2014201303A2 (en) * 2013-06-13 2014-12-18 Edge Toy, Inc. Three dimensional scanning apparatuses and methods for adjusting three dimensional scanning apparatuses
US10435279B2 (en) 2015-03-06 2019-10-08 Walmart Apollo, Llc Shopping space route guidance systems, devices and methods
WO2016142794A1 (en) 2015-03-06 2016-09-15 Wal-Mart Stores, Inc Item monitoring system and method
US20180099846A1 (en) 2015-03-06 2018-04-12 Wal-Mart Stores, Inc. Method and apparatus for transporting a plurality of stacked motorized transport units
CA2961938A1 (en) 2016-04-01 2017-10-01 Wal-Mart Stores, Inc. Systems and methods for moving pallets via unmanned motorized unit-guided forklifts
US10303154B2 (en) 2016-10-11 2019-05-28 The Boeing Company Surface based hole target for use with systems and methods for determining a position and a vector of a hole formed in a workpiece
CN112629432B (en) * 2019-09-24 2022-07-15 杭州思看科技有限公司 Interactive hole site multi-angle scanning control method and device
CN110990660A (en) * 2019-11-21 2020-04-10 北京明略软件系统有限公司 Data graph display method, storage medium and device

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3069654A (en) * 1960-03-25 1962-12-18 Paul V C Hough Method and means for recognizing complex patterns
EP0157299A1 (en) * 1984-03-26 1985-10-09 Hitachi, Ltd. Image processing apparatus

Family Cites Families (25)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS5241015B2 (en) * 1972-10-13 1977-10-15
US4115803A (en) * 1975-05-23 1978-09-19 Bausch & Lomb Incorporated Image analysis measurement apparatus and methods
US4105925A (en) * 1977-03-14 1978-08-08 General Motors Corporation Optical object locator
US4187051A (en) * 1978-05-26 1980-02-05 Jerry Kirsch Rotary video article centering, orienting and transfer device for computerized electronic operating systems
US4287769A (en) * 1978-06-01 1981-09-08 Massachusetts Institute Of Technology Apparatus and method whereby wave energy is correlated with geometry of a manufactured part or the like or to positional relationships in a system
US4295740A (en) * 1978-09-05 1981-10-20 Westinghouse Electric Corp. Photoelectric docking device
US4367465A (en) * 1980-04-04 1983-01-04 Hewlett-Packard Company Graphics light pen and method for raster scan CRT
US4316189A (en) * 1980-05-08 1982-02-16 Westinghouse Electric Corp. Electromechanical display apparatus
US4326155A (en) * 1980-06-03 1982-04-20 Griebeler Elmer L Shockwave probe
JPS5726706A (en) * 1980-07-24 1982-02-12 Mitsubishi Electric Corp Detector for shape of body
US4377810A (en) * 1980-12-04 1983-03-22 Data General Corporation Light pen detection circuit and method
JPS57164310A (en) * 1981-04-03 1982-10-08 Hitachi Ltd Automatic assembling device
US4405238A (en) * 1981-05-20 1983-09-20 Ibm Corporation Alignment method and apparatus for x-ray or optical lithography
US4396945A (en) * 1981-08-19 1983-08-02 Solid Photography Inc. Method of sensing the position and orientation of elements in space
US4488173A (en) * 1981-08-19 1984-12-11 Robotic Vision Systems, Inc. Method of sensing the position and orientation of elements in space
US4492847A (en) * 1981-09-30 1985-01-08 Unimation, Inc. Manipulator welding apparatus with sensing arrangements for weld slam tracking
US4438567A (en) * 1981-12-07 1984-03-27 Raiha A P Center locator for alignment of work to machine spindle
US4437114A (en) * 1982-06-07 1984-03-13 Farrand Optical Co., Inc. Robotic vision system
US4628469A (en) * 1982-09-29 1986-12-09 Technical Arts Corporation Method and apparatus for locating center of reference pulse in a measurement system
US4550374A (en) * 1982-11-15 1985-10-29 Tre Semiconductor Equipment Corporation High speed alignment method for wafer stepper
US4567348A (en) * 1983-01-25 1986-01-28 The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration Automated weld torch guidance control system
FI68131C (en) * 1983-06-30 1985-07-10 Valtion Teknillinen REFERENCE FOR A WINDOW MACHINE WITH A GLASS LED WITH A LASER INDICATOR
US4672564A (en) * 1984-11-15 1987-06-09 Honeywell Inc. Method and apparatus for determining location and orientation of objects
US4672562A (en) * 1984-12-11 1987-06-09 Honeywell Inc. Method and apparatus for determining location and orientation of objects
JPS61241612A (en) * 1985-04-19 1986-10-27 Kinkashiya:Kk Three-dimensional form measuring system

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3069654A (en) * 1960-03-25 1962-12-18 Paul V C Hough Method and means for recognizing complex patterns
EP0157299A1 (en) * 1984-03-26 1985-10-09 Hitachi, Ltd. Image processing apparatus

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5714762A (en) * 1993-11-09 1998-02-03 British Nuclear Fuels Plc Determination of the surface properties of an object

Also Published As

Publication number Publication date
KR890700806A (en) 1989-04-27
US4791482A (en) 1988-12-13
IE880172L (en) 1988-08-06

Similar Documents

Publication Publication Date Title
WO1988005904A1 (en) Object locating system
Nitzan Three-dimensional vision structure for robot applications
EP1305567B1 (en) Measurement of cylindrical objects through laser telemetry
US5642293A (en) Method and apparatus for determining surface profile and/or surface strain
JP2919284B2 (en) Object recognition method
US4412121A (en) Implement positioning apparatus and process
Trucco et al. Model-based planning of optimal sensor placements for inspection
US4967370A (en) Robot and sensor error determination system
Alexander et al. 3-D shape measurement by active triangulation using an array of coded light stripes
US10591289B2 (en) Method for measuring an artefact
US4760269A (en) Method and apparatus for measuring distance to an object
JPS6332306A (en) Non-contact three-dimensional automatic dimension measuring method
Furukawa et al. Dense 3D reconstruction with an uncalibrated stereo system using coded structured light
El-Hakim A hierarchical approach to stereo vision
JPH04237310A (en) Three-dimensional positioning method
Stella et al. Self-location of a mobile robot by estimation of camera parameters
JPH0814858A (en) Data acquisition device for three-dimensional object
Uyanik et al. A method for determining 3D surface points of objects by a single camera and rotary stage
Fisher et al. Recognintion of Complex 3-D Objects from Range Data
Cardenas-Garcia et al. Projection moiré as a tool for the automated determination of surface topography
JPH0886616A (en) Method and apparatus for measuring three-dimensional image
Singh et al. Digital photogrammetry for automatic close range measurement of textureless and featureless objects
Huo et al. A Robotic Line Scan System with Adaptive ROI for Inspection of Defects over Convex Free-form Specular Surfaces
Prieto et al. Tolerance control with high resolution 3D measurements
JPH0668765B2 (en) 3D measuring method of circle

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): JP KR

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): AT BE CH DE FR GB IT LU NL SE