WO1999040729A1 - Multilinear array sensor with an infrared line - Google Patents

Multilinear array sensor with an infrared line

Info

Publication number
WO1999040729A1
Authority
WO
WIPO (PCT)
Prior art keywords
sensor
color
infrared
light
line
Application number
PCT/US1999/001674
Other languages
French (fr)
Inventor
Albert D. Edgar
Steven C. Penn
Original Assignee
Applied Science Fiction, Inc.
Application filed by Applied Science Fiction, Inc.
Priority to JP2000531014A (published as JP2002503066A)
Priority to AU25631/99A (published as AU2563199A)
Priority to EP99905481A (published as EP1053644A1)
Publication of WO1999040729A1

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00 Details of colour television systems
    • H04N9/11 Scanning of colour motion picture films, e.g. for telecine
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00 Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/46 Colour picture communication systems
    • H04N1/48 Picture signal generators
    • H04N1/486 Picture signal generators with separate detectors, each detector being used for one specific colour component

Abstract

Surface defect correction technology for photographic images requires an infrared scan along with a conventional color scan. In the present invention, the additional infrared scan needed for surface defect correction is obtained by adding a line of sensors specific to infrared light to a conventional multilinear color sensor array. The invention teaches a practical mode of distinguishing infrared light using a dichroic prism placed over the sensor. This mode has the additional advantage of placing the infrared-specific sensor line in a displaced focus plane to match conventional lenses. Adding a sensor line to a conventional trilinear sensor array requires a quadrilinear array topology. In addition to the direct quadrilinear topology, the invention teaches a method of obtaining full color image information with only two linear sensor lines by interstitially mixing red and blue sensors on a single sensor line, which, in conjunction with the additional infrared line, results in a conventional trilinear sensor topology with a different filter arrangement.

Description

MULTILINEAR ARRAY SENSOR WITH AN INFRARED LINE
TECHNICAL FIELD OF THE INVENTION
This invention relates to the scanning of photographic images, and more particularly in a primary application, to scanning in infrared and visible light in order to prepare for correction of surface defects.
BACKGROUND OF THE INVENTION
FIG. 1 shows a prior art trilinear film scanner, and also introduces some terms that will be used in this application. A lamp 102 transilluminates a filmstrip 104 containing an image 106 to be scanned. Normally the light from the lamp 102 would be diffused or directed by additional optics, not shown, positioned between the lamp 102 and film 104 in order to illuminate the image 106 more uniformly. The image 106 on the film 104 is focused by lens 108 onto a sensor line 110 in a circuit package 112. The sensor line 110 projects back through lens 108 as a line 116 across the image 106. This line 116 is composed of many individual points, or pixels. To scan the entire image 106, the film 104 is moved perpendicularly to the line 116 to scan a two dimensional area, such as image 106. Because the sensors of the sensor line 110 are positioned in lines, this arrangement is called a linear, or line, sensor.
The sensor line 110 may be of a form known in the art as a "trilinear", or three line, array. As shown magnified at 120, the sensor line 110 actually consists of three parallel lines of sensors. In this prior art embodiment, one line of sensors 122 is behind a line of red filters 124. This arrangement could consist of a series of independent filters, but is normally a single long red filter 124 which covers all of the sensors of line 122. Another line of sensors 126 is behind a green filter line 128, and a third line of sensors 130 is behind a blue filter line 132.
As the film 104 is moved, the three lines 122, 126, and 130 each provide an individual image of the film seen with a different color of light. The data from the circuit package 112 is sent along cable 136 to supporting electronics and computer storage and processing means, shown together as computer 138. Inside computer 138 the data for each color image is grouped together, and the three images are registered as the three color planes 140, 142, and 144 of a full color image. Each of these color planes 140, 142 and 144 consists of pixels describing with a number the intensity of the light at each point in the film. For example, pixel 150 of the red color plane 144 may contain the number "226" to indicate a near white light intensity at point 152 on the film 104, as measured at a specific sensor 154 in the array 110, shown enlarged in circle 120 as sensor 156 behind the red filter line.
In FIG. 1 it is noted that there is a spacing between sensor lines 122, 126 and 130, and therefore the same point on the film 104 is not sensed by all three color lines at the same point in time. FIG. 2 illustrates this registration problem in more detail. In FIG. 2 there is a trilinear array (not shown) with red, green, and blue sensor lines
202, 204, and 206. These lines are projected onto a substrate (not shown) which is moved in the direction of the arrow to scan out regions of the image on the substrate. The region seen by each line is different from the region seen by the other lines. For example, at the beginning of an arbitrary time interval, sensor 210 of the blue line 206 may see point 212 of the substrate, while at the end of the time interval, it may see point 214. It is apparent that each of the different sensors 210, 220 and 230 sees a different area during the same time interval.
For example, at the end of the time interval, sensor 220 of the red sensor array 202 sees point
222, which is different than point 214 seen by the blue sensor 210 at the same end time.
However, if the time interval is long enough, there will exist a region of overlap 224 over which all array lines have passed. If the interval between measurements is an integer submultiple of the spacing between the arrays, then there exists a time at which sensor 230 of the green line 204 sees the same point 232 on the substrate as 214, and another time at which sensor 220 of the red line 202 sees point 234, the same as point 214, which in turn will be seen by the blue sensor 210 at a later time. The computer system 138 receiving the information from the scans made by the trilinear array registers the data representing the three color images by shifting the data an amount corresponding to the distance between sensor lines, and discarding the part of each color record outside the full color range overlap 224.
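The registration step just described reduces to shifting each color record by a whole number of scan rows and cropping to the overlap region 224. The following Python sketch illustrates that bookkeeping under the assumption that the line spacing equals an integer number of scan steps; the names (register_trilinear, line_offset_steps) and the particular ordering of the lines are illustrative only and are not taken from the patent.

```python
# A minimal sketch of the registration step described above, assuming the
# interval between measurements is an integer submultiple of the line spacing,
# so each color record shifts by a whole number of scan rows.
# Names (raw_red, line_offset_steps, ...) are illustrative, not from the patent.
import numpy as np

def register_trilinear(raw_red, raw_green, raw_blue, line_offset_steps):
    """Shift the three color records into registration and crop to the overlap.

    raw_*            : 2-D arrays (scan_rows x pixels), one per sensor line
    line_offset_steps: spacing between adjacent sensor lines, in scan rows
    """
    d = line_offset_steps
    rows = raw_red.shape[0]
    # One possible ordering: red line leads, green trails by d rows, blue by 2*d.
    red = raw_red[2 * d:rows]            # drop rows before the full-color overlap
    green = raw_green[d:rows - d]
    blue = raw_blue[0:rows - 2 * d]
    return np.stack([red, green, blue], axis=-1)

# Example: three sensor lines spaced 4 scan rows apart
rgb = register_trilinear(np.zeros((100, 2048)), np.zeros((100, 2048)),
                         np.zeros((100, 2048)), line_offset_steps=4)
print(rgb.shape)  # (92, 2048, 3) -- the overlap region only
```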
Although this illustration has presented a so-called transmission, or film, scanner, a reflection, or print, scanner uses the same principles except that the source light is reflected from the same side as the imaging lens. As is explained later, there are uses for the present invention in both transmission and reflection scanners.
The conventional scanners described above scan in the three visible colors, exclusive of the invisible infrared. There are several reasons that it would be useful to add an infrared record registered to the conventional colored records. For example, examination of old documents under infrared with a reflection scanner is proving useful in the study of historic works, such as the Dead Sea Scrolls, to disclose alterations. Another potential use, presented here without admission that it is known in the art, is to distinguish the "K" or black channel from the cyan, magenta, and yellow channels in a four-color print. Currently, a major commercial use of infrared plus visible scans is a technology called infrared surface defect correction, as explained in FIG. 3. Current applications of infrared surface defect correction are limited to transmission scanners, although it may be extended to reflection scanners, and therefore the specific illustration of a transmission scanner given below is not to be considered a limitation.
In FIG. 3, a lamp 302 transilluminates filmstrip 304 containing an image 306. An electronic camera 308 views the image 306 and outputs red, green, and blue digitized records 310, 312, and 314. In addition, the electronic camera 308 outputs an infrared record 316. There are several ways a conventional camera can be made selectively responsive to visible and infrared light. One way is to provide a filter wheel 320 with four filters: red 322, green 324, blue 326, and infrared 328. If the camera 308 is a monochrome camera whose sensitivity extends into the infrared, then the three visible colors and infrared may be captured at four different times, each time illuminating the film with a different filter in the filter wheel 320.
The cyan, magenta, and yellow dyes that create the image 306 are all transparent to infrared light, and therefore the film 304 appears clear to camera 308 when viewed under infrared light. On the other hand, surface defects such as dust, scratches, and fingerprints refract the light passing through the film 304 away from the camera 308, and therefore appear as darkened points under both visible and infrared light. Because refraction under infrared light is nearly equal to refraction under visible light, the defects appear nearly as dark in the infrared as in the visible spectrum.
Therefore the infrared record 316 is effectively an image of a clear piece of film, including defects, and image 310 contains the same defects plus the red image. The infrared image 316 provides a pixel by pixel "norming" for the effect of defects. For example, defect-free pixel 340 in the red record 310 may contain a 50% brightness measurement. The corresponding defect-free pixel 342 in the infrared record 316 contains 100% brightness because no defect has attenuated the light. Function block 344 divides the 50% brightness level from the red record 310 by the norming 100% brightness level from the infrared record 316 to give a 50% brightness measurement for corrected pixel 346. On the other hand, pixel 350 under scratch 352 in the red record 310 may contain a 40% brightness measurement. The corresponding pixel 354 in the infrared record 316 seeing the same scratch may contain 80% brightness because the scratch has refracted 20% of the light. When function block 344 divides 40% by 80%, a corrected brightness value of 50% is determined for pixel 356. Note that corrected pixels 346 and 356 within the same background area of the image now both contain the same brightness value of 50%, so the effect of the scratch has disappeared. This division is repeated for each pixel to produce the corrected red record 360; and the same division by infrared is applied to the green record 312, and blue record 314, to produce the corrected green and blue records 362 and 364, resulting in a full color corrected image.
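The norming of function block 344 is, in essence, a per-pixel division of each visible record by the registered infrared record. The short sketch below restates the worked example above in code; brightness values are assumed to be linear and scaled to the range 0..1, and all names are illustrative rather than taken from the patent.

```python
# A minimal sketch of the per-pixel "norming" division described above.
# Brightness values are assumed to be linear (proportional to transmitted light)
# and scaled to 0..1; variable names are illustrative, not from the patent.
import numpy as np

def correct_with_infrared(visible, infrared, eps=1e-6):
    """Divide a visible record by the registered infrared record, pixel by pixel."""
    corrected = visible / np.clip(infrared, eps, None)
    return np.clip(corrected, 0.0, 1.0)

# Worked example from the text: a scratch passes only 80% of the light.
red_pixel, ir_pixel = 0.40, 0.80
print(correct_with_infrared(np.array(red_pixel), np.array(ir_pixel)))  # 0.5
```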
There are several ways of generating an infrared scan in conjunction with a visible scan. One method makes four passes across the original image using a light that changes color between passes, as was shown in FIG. 3. Unfortunately, this can take four times as long as a single-pass scanner. Alternatively, one can make a single pass while flashing four lights in rapid succession, but again the hardware may need to move at one-fourth the speed. None of these prior art methods combines the speed obtained with a single-pass multilinear array with the image clarity attained in the prior art by making multiple scans. It is apparent that the introduction of such a system would provide an improvement to the state of the art in infrared surface defect correction, as well as to the other uses of combined infrared and visible scans mentioned above.
SUMMARY OF THE INVENTION
The present invention adds a line to a conventional multilinear sensor array. The added line is specific to the infrared scan. In the most direct embodiment, the added line makes what was a trilinear array containing three lines, one for each of three primary colors, into a quadrilinear array. In a second embodiment, the red and blue sensor lines are combined
into one line that alternates between red and blue sensors. This second embodiment uses only two lines for sensing full color, allowing the third line of existing trilinear layouts to be devoted to infrared.
BRIEF DESCRIPTION OF THE DRAWINGS
For a more complete understanding of the present invention and for further advantages thereof, reference is now made to the following Description of the Preferred Embodiments taken in conjunction with the accompanying Drawings in which:
FIG. 1 shows a conventional trilinear film scanner;
FIG. 2 illustrates registration of a multilinear array;
FIG. 3 explains the operation of infrared surface defect correction;
FIG. 4 shows the present invention with a quadrilinear line and infrared filter;
FIG. 5 graphs the color transmission of available filters;
FIG. 6 shows the preferred embodiment with a dichroic prism;
FIG. 7 illustrates the infrared focus shift common to imaging lenses;
FIG. 8 charts the filter arrangement of a trilinear infrared sensor;
FIG. 9 charts the filter arrangement of an alternate trilinear infrared sensor; and
FIG. 10 presents missing color recovery used with the sensor shown in FIG. 8.
DESCRIPTION OF THE PREFERRED EMBODIMENTS
FIG. 4 shows a prior art trilinear array with the addition of the novel fourth sensor line for infrared scanning. In this figure, sensor line 402 contains individual photosites 404, each behind a red filter. In perspective, it is seen that the line 402 consists of a row of silicon photosensor sites 406 behind red filter material 408 to render the row of photosites 406 responsive primarily to red light. Similarly, the photosites in sensor lines 410 and 412 are made primarily responsive to green and blue light, respectively. Together these three lines 402, 410 and 412, in conjunction with their overlaid filters, form a prior art trilinear sensor array. These three silicon lines with their overlaid filters are typically contained in a package 414 under cover glass 416.
The present invention teaches the addition of another line 420, specific to infrared light, to the device. Silicon sensor material is inherently sensitive to the lower end of the infrared spectrum, so the layout and construction of the extra sensor line 420, shown in perspective as line 424 having photosites 422, may be a copy of one of the other three lines, such as line 428 or 430. The manufacture of a silicon sensor line, and the duplication of multiple copies of a silicon circuit, called "macros", onto a silicon die, are well known in the art.
Line 420 is made primarily responsive to infrared light by blocking visible light from reaching the photosensor sites 422 with a line of infrared-passing, visible-absorbing filter material 424. Infrared filter material 424 would appear black to the human eye. A number of such infrared materials are known in the art. As an example, a double-layer filter, the first layer consisting of the filter material 408 printed to make the red line 402, overlaid with a second layer consisting of the filter material printed to make the green line 428, would together absorb visible light and transmit infrared. In fact, any two or three of the visible colors combined would absorb visible and pass infrared light. Because these filter materials are already present in the printing of the other two lines, this method would enable manufacture of the infrared filter 424 without requiring any additional dye types in the fabrication process.
Unfortunately, a problem arises when using the device described thus far. This problem is explained using FIG. 5, wherein the transmission of typical organic colored dyes commonly used to form the colored filter lines of FIG. 4 is graphed. Although the infrared line is rendered primarily responsive to only infrared light by the infrared filter, the other colors are specific not only to their labeled visible color, but to infrared light as well, as is seen by observing that the labeled color graphs in FIG. 5 transmit infrared light. When a scan is made in which each of the visible records also contains infrared light, a faded and excessively surface-defect-sensitive image results. Therefore, in the prior art, either a light source was used that had no infrared content, or an infrared blocking filter was placed somewhere in the light path, such as at the light source itself or as a component of the cover glass in the sensor circuit package. In the present invention, in order to overcome this problem and sense infrared in one sensor line, infrared light cannot be blocked before reaching the sensor package. Therefore, the infrared blocking filter must be combined with the visible filters at the sensor. In one embodiment of the present invention, the visible color filters of lines 408, 428, and 430 of FIG. 4 are manufactured with a process that absorbs or reflects infrared light. Such a process may use multilayer interference filters commonly known in the art. These filters require many layers, and in order to produce three colors on the same substrate, the substrate would need to be deposited many times, resulting in difficult and expensive manufacturing.
As an improved embodiment, FIG. 4 teaches the use of a cut piece of infrared blocking material 431 laid over the three visible lines 408, 428, and 430, but having a terminating edge 432 between the last visible color sensor line 430 and the infrared line 424. This blocking material 431, shown also as filter 434 in circuit package 414, could be placed on top of the cover glass 416, manufactured as part of the cover glass, or ideally placed under the cover glass and directly over the sensor lines.
An infrared absorbing filter, such as is available from Schott Optical of Germany, is typically much thicker than the organic colored filters, and therefore creates a shadowing parallax problem at the edge 432 over sensor lines that are very close. This shadowing may be minimized by moving the thick filter closer to the sensor lines when placing it under the cover glass 416, and is also made less objectionable by moving the infrared sensor line 424 further from the last visible line 430. In the arrangement of FIG. 4, the spacing S2 between the visible and infrared lines is made greater than the spacing S1 between the visible lines. In particular, for maximum step resolution flexibility, the ratio
S2/S1 should be an integer. In the specific illustration of FIG. 4, the ratio of S2/S1 is two.
A preferred embodiment of the present invention which uses a prism to separate visible from infrared light is shown in FIG. 6. This embodiment has the added advantage of shifting the infrared focus plane to correct for common chromatic aberrations caused by imaging lenses.
In FIG. 6 the three visible color sensor lines 602, 604, and 606 lie under red, green, and blue filter lines 608, 610, and 612, respectively, as previously illustrated in FIG. 4. Also novel infrared sensor line 614 is added to practice the present invention, but unlike FIG. 4, an infrared filter line over sensor line 614 is optional. All these lines are housed within circuit package 620 under a transparent cover glass 622. Also included is a dichroic prism 624 which may be mounted on the cover glass 622, incorporated as a part of the cover glass (shown
separately at 626), or incorporated under the cover glass in contact with the sensor lines as illustrated in the enlarged illustration 600 of FIG. 6.
In the enlarged illustration 600, a first prism 630 has a surface 632 coated such that a light ray 634 is split into a transmitted visible component 636 and a reflected infrared component 638. Such a surface coating is commonly known in the art as a "hot mirror", because the heat, or infrared, component is reflected. Hot mirrors are available from Edmund Scientific Corporation of Barrington, New Jersey. Bonded to the first prism 630 is a second prism 640 constructed with parallel surfaces such that the reflected infrared component 638 is further reflected at surface 642 so as to be directed toward infrared sensor line 614. The reflection surface 642 is preferably coated so as to enhance reflection while preventing light incident from above the second prism 640 from penetrating to the sensor line 614. A coating material commonly used in mirrors for high infrared reflectivity is gold.
The apparatus of FIG. 6 has a significant advantage over the apparatus of FIG. 4, as explained with reference to FIG. 7. In FIG. 7, it is noted that the focus plane for infrared light is displaced relative to the focus plane for visible light. The focus difference is shown greatly exaggerated in FIG. 7. In a typical achromatic lens, the focus shift 702 is about 0.25% of the visible focal length for infinity, which would be about 0.5% of the combined visible focal distance 704 of the lens in FIG. 7 operating at a unity magnification. Returning to FIG. 6, it is noted that the virtual image 650 of the infrared line 614 is displaced by distance 652 below the visible sensor lines. Distance 652 is equal to D1 654 divided by the index of refraction to infrared light of prism section 640. The displacement distance D1 654 is defined by the characteristics of prism section 640, and is preferably chosen to be the distance between the infrared sensor line and the middle visible sensor line.
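As a rough numerical illustration of the relationships just described, the short sketch below computes the infrared focus shift and the apparent displacement of the infrared line for assumed values of the focal distance, the prism thickness D1, and the refractive index; only the two relationships themselves (a shift of roughly 0.5% of the focal distance at unity magnification, and a displacement equal to D1 divided by the infrared index of refraction) come from the text.

```python
# A short numerical sketch of the focus-shift compensation described above.
# The lens and prism values are assumed for illustration; only the relationships
# (shift ~ 0.5% of the focal distance at unity magnification, and apparent
# displacement = D1 / n_ir) come from the text.
focal_distance_mm = 100.0                      # assumed combined visible focal distance (item 704)
focus_shift_mm = 0.005 * focal_distance_mm     # ~0.5% at unity magnification (item 702)

n_ir = 1.51                                    # assumed infrared refractive index of prism section 640
d1_mm = 0.75                                   # assumed IR-to-middle-visible line spacing D1 (item 654)
apparent_displacement_mm = d1_mm / n_ir        # virtual image offset (item 652)

print(f"infrared focus shift  : {focus_shift_mm:.3f} mm")
print(f"virtual IR line offset: {apparent_displacement_mm:.3f} mm")
```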
Note that the distance D1 654 is easily controlled at manufacture as a function of the thickness of prism section 640 and is not affected by misalignments in laying the prism over the sensor lines. Therefore, an otherwise high level of precision is not required in the cleanroom where the circuits are packaged. The displacement distance 652 also introduces a slight magnification of the infrared image which can be corrected through a resize algorithm, such as is commonly known in the art. Although it is conceptually simple to duplicate a fourth silicon sensor line, in practice it is very expensive to make any change to an existing silicon fabrication process. In addition,
more silicon is required for the extra line along with more electronics to support the extra data. Accordingly, it would be an advantage if an existing trilinear line could be adapted to the present invention. Such an adaptation is shown in FIG. 8, which illustrates the application of the invention with other than a quadrilinear array. In FIG. 8, a conventional trilinear silicon sensor is adapted to practice the current invention by altering only the filters deposited over the silicon sensors, and not the silicon layout itself. To practice the invention, one of the sensor lines 802 is rendered specific to infrared scanning by one of the methods discussed above to substantially block noninfrared light. One such method, presented above, uses a prism with an infrared reflecting dichroic coating.
With one of the three sensor lines dedicated to infrared, only two sensor lines remain to receive three visible colors. One line 804 receives a primary color, chosen as green in the preferred embodiment. The last array 806 must therefore sense the remaining two colors. This can be done by alternating the remaining two colors interstitially at pixel boundaries, such as by making sensors receptive to red 810 in even rows such as row 16 and to blue 812 in odd rows such as row 15, as illustrated in FIG. 8.
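For clarity, the following small sketch spells out which color each of the three lines of the FIG. 8 topology senses on a given scan row, with infrared on line 802, green on line 804, and red or blue alternating by row parity on line 806; the function and dictionary layout are illustrative assumptions, not part of the patent.

```python
# A small sketch of the FIG. 8 filter topology described above: one line is
# dedicated to infrared, one to green, and the third alternates red and blue
# by scan row (even rows red, odd rows blue, following the text's example).
def colors_sensed(row):
    """Colors measured at every pixel of a given scan row for the FIG. 8 layout."""
    third_line = "red" if row % 2 == 0 else "blue"
    return {"line_802": "infrared", "line_804": "green", "line_806": third_line}

print(colors_sensed(16))  # {'line_802': 'infrared', 'line_804': 'green', 'line_806': 'red'}
print(colors_sensed(15))  # {'line_802': 'infrared', 'line_804': 'green', 'line_806': 'blue'}
```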
The primary color may be green as presented above; however, in an embodiment more conservative of photons for use in low light, the primary color used for line 804 is chosen to be white, namely the visible light remaining after removal of infrared light with an infrared blocking filter, but with no auxiliary color filter. The two alternating colors in line 806 are then chosen as either cyan and orange or cyan and light red. Such a combination would approximately double the number of photons impinging on the silicon sensors after passing through the colored filters, and therefore the luminance noise would be less due to lowered shot noise. This lowering of luminance noise comes, however, at the expense of weaker color distinction, requiring color amplification and thereby causing increased color noise. In low light this has been found to be a practical trade-off.
The embodiment just described uses row 15 as an example of a row in which each point on a substrate, after scanning, has been sensed with infrared, green, and blue light, and row 16 as an example of a row in which each point on the substrate, after scanning, has been sensed with infrared, red, and green light. It does not matter significantly in what time order the colors are sensed. The color filter topology of FIG. 9 is seen to produce the same color combinations, and in particular rows 15 and 16 of FIG. 9 are seen to sense the same colors as lines 15 and 16 of FIG. 8, albeit at different times. In fact, the infrared sensors also could obviously be interspersed on the other lines in any of the embodiments illustrated in this application without departing from the scope of the invention. The embodiment shown in FIG. 9 is still considered to have a green selective line; the line has just been interspersed in other lines.
In the reproduction of an image made with the color topology taught in FIG. 8, it is necessary to recreate the missing color in each row. Turning to FIG. 10, the sensed colors are named for each pixel of each row, including rows 15 and 16 used in the above examples. The missing colors for each pixel of each row are named in parentheses. It is those colors that need to be estimated using available known color information. The specific color measurements are named using a nomenclature relative to the center pixel: "c" for center, "t" for top, "tr" for top right, "r" for right, "br" for bottom right, "b" for bottom, "bl" for bottom left, "l" for left, and "tl" for top left. In this example Rc, the unmeasured red value for the center pixel, will be estimated. Although this example will estimate the value for one red pixel, the same algorithm could apply to any red pixel within any even row, or the blue value on odd rows by interchanging red and blue wording.
The unmeasured value of Rc will be calculated by combining estimates based on color information from the surrounding pixels. These estimates are named as the six estimates Et, Etr, Ebr, Eb, Ebl, and Etl. In a preferred embodiment, the diagonal estimates contribute to the determination of the unmeasured value just as do the vertical estimates, but at a reduced strength, namely Rc=(Et+Eb)/4+(Etl+Etr+Ebl+Ebr)/8. In another embodiment, more efficient of computer time, the diagonal estimates may be completely ignored, namely Rc=(Et+Eb)/2. Next, calculation of the estimates, using Et and Eb by name, will be disclosed. Any of the other estimates Etl, Etr, Ebl, and Ebr may be similarly calculated with the change in nomenclature. Most basically, each estimate can be simply the red value of the adjacent pixel, namely Et=Rt and Eb=Rb. It is possible to improve the estimate by using the green value that is known for all pixels. In particular, it can be assumed, because the real world tends to be rather monochromic, that the red value changes with position about as fast as the green value changes. Therefore, a particular estimate, such as Et, will use not only the red value of the
adjacent pixel, but will adjust this value by the amount green changes from the adjacent pixel to the center pixel for which red is being estimated.
For example, Et=Rt+(Gc-Gt) and Eb=Rb+(Gc-Gb) to use the linear change in color.
Alternatively, Et=Rt*Gc/Gt and Eb=Rb*Gc/Gb to use the percent change in color. When the color measurements are directly proportional to lumens, the linear change is found to work better when Rt>Gt or Rb>Gb, and the percent change works better when Rt<Gt or Rb<Gb.
The equations produce the same result when Rt=Gt or Rb=Gb. When the color measurements are expressed in the common gamma-corrected space wherein values are proportional to the square root of lumens, the linear change is found to be acceptable in all cases, but with the resultant values clamped so as not to go negative.
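The estimation procedure above can be summarized in a short sketch. The version below follows the patent's nomenclature (Rt, Gt, Gc, and so on) and the stated preference for the clamped linear change in gamma-corrected space, but the function structure, the per-estimate choice between linear and percent change, and the example values are illustrative assumptions rather than the patent's own implementation.

```python
# A minimal sketch of the missing-color estimate described above, for the red
# value Rc of a center pixel on an even (red-bearing) row. Input names follow
# the patent's nomenclature; the function layout itself is illustrative.
def estimate_vertical(R_adj, G_adj, Gc, linear_space=False):
    """One directional estimate (Et or Eb) of the missing red value."""
    if linear_space:
        # Measurements proportional to lumens: pick linear vs. percent change.
        if R_adj > G_adj:
            return R_adj + (Gc - G_adj)                   # linear change in color
        return R_adj * Gc / G_adj if G_adj else R_adj     # percent change in color
    # Gamma-corrected measurements (sqrt of lumens): linear change, clamped at zero.
    return max(R_adj + (Gc - G_adj), 0.0)

def estimate_Rc(Rt, Gt, Rb, Gb, Gc,
                Rtl=None, Gtl=None, Rtr=None, Gtr=None,
                Rbl=None, Gbl=None, Rbr=None, Gbr=None):
    Et = estimate_vertical(Rt, Gt, Gc)
    Eb = estimate_vertical(Rb, Gb, Gc)
    diag = [(Rtl, Gtl), (Rtr, Gtr), (Rbl, Gbl), (Rbr, Gbr)]
    if all(v is not None for pair in diag for v in pair):
        Etl, Etr, Ebl, Ebr = (estimate_vertical(r, g, Gc) for r, g in diag)
        # Preferred embodiment: diagonals contribute at half the vertical weight.
        return (Et + Eb) / 4 + (Etl + Etr + Ebl + Ebr) / 8
    # Simpler embodiment: ignore the diagonal estimates.
    return (Et + Eb) / 2

print(estimate_Rc(Rt=0.42, Gt=0.40, Rb=0.46, Gb=0.44, Gc=0.45))  # ~0.47
```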
Whereas the present invention has been described with respect to specific embodiments thereof, it will be understood that various changes and modifications will be suggested to one skilled in the art and it is intended to encompass such changes and modifications as fall within the scope of the appended claims.

Claims

1. A system for scanning a substrate in visible and infrared light, the system including a plurality of parallel sensor lines, the system including: means for substantially blocking noninfrared light to a first sensor line of the plurality of sensor lines; and means for substantially blocking infrared light to a second sensor line of the plurality of sensor lines.
2. The system of claim 1 and further including means for substantially blocking infrared light to a third sensor line of the plurality of sensor lines, wherein the second sensor line is made primarily responsive to visible light of a first color, and the third sensor line is primarily responsive to visible light of a second color.
3. The system of claim 2 and further including: a first filter for passing light of the visible light of the first color, free of infrared light to the second sensor line, and a second filter for passing light of the visible light of the second color, free of infrared light to the third sensor line.
4. The system of claim 2 and further including: an infrared blocking filter for blocking infrared light to the second and third sensor lines, and a first filter for passing light of the visible light of the first color plus infrared light to the second sensor line, and a second filter for passing light of the visible light of the second color to the third sensor line.
5. The system of claim 2 and further including: means for substantially blocking infrared light to a fourth sensor line of the plurality of sensor lines, wherein the fourth sensor line is primarily responsive to a visible light of a third color.
6. The system of claim 4 wherein the infrared blocking filter absorbs infrared light.
7. The system of claim 4 wherein the infrared blocking filter reflects infrared light.
8. The system of claim 4 wherein the infrared blocking filter reflects infrared light toward the first sensor line.
9. The system of claim 1 wherein said infrared light blocking and noninfrared light blocking means comprise a prism.
10. The system of claim 9 wherein the prism includes a first surface having an interference filter to transmit noninfrared light and reflect infrared light.
11. The system of claim 10 wherein the prism further includes a second surface substantially parallel to the first surface which redirects the reflected infrared light to the first sensor line.
12. The system of claim 2 wherein the third sensor line further comprises a plurality of individual sensors wherein a first individual sensor of the third sensor line is primarily responsive to a high color within the visible light of the second color, and a second individual sensor of the third sensor line is primarily responsive to a low color within the visible light of the second color.
13. The system of claim 12 wherein the color of the visible light of the second color is magenta, the high color is blue, and the low color is red.
14. The system of claim 13 wherein the color of the visible light of the first color is green.
15. The system of claim 12 wherein the color of the visible light of the first color is white.
16. The system of claim 15 wherein the high color is cyan.
17. The system of claim 12 wherein the low color corresponding to the position of the first individual sensor is estimated as a function of the low color measured by the second individual sensor, and the difference in the first color measured by respective individual sensors of the second sensor lines at positions corresponding to the first individual sensor and the second individual sensor.
18. A sensor responsive to visible light and infrared light, comprising: a substrate; a first linear sensor disposed on said substrate and being primarily responsive to visible light of a first color; a second linear sensor disposed on said substrate and being primarily responsive to visible light of a second color; a third linear sensor disposed on said substrate and being primarily responsive to visible light of a third color; a fourth linear sensor disposed on said substrate and being primarily responsive to infrared light; means for substantially blocking infrared light from said first, second, and third linear sensors; and means for substantially blocking visible light from said fourth linear sensor.
19. The sensor of Claim 18 wherein said linear sensors are spaced apart by a distance, such that the distance between said first and second sensors and the distance between said second and third sensors is less than the distance between said third and fourth sensors.
20. The sensor of Claim 18 wherein said blocking means includes a prism.
PCT/US1999/001674 1998-02-04 1999-01-26 Multilinear array sensor with an infrared line WO1999040729A1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
JP2000531014A JP2002503066A (en) 1998-02-04 1999-01-26 Multiple linear array sensor with infrared line
AU25631/99A AU2563199A (en) 1998-02-04 1999-01-26 Multilinear array sensor with an infrared line
EP99905481A EP1053644A1 (en) 1998-02-04 1999-01-26 Multilinear array sensor with an infrared line

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US7360298P 1998-02-04 1998-02-04
US60/073,602 1998-02-04

Publications (1)

Publication Number Publication Date
WO1999040729A1 (en)

Family

ID=22114684

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US1999/001674 WO1999040729A1 (en) 1998-02-04 1999-01-26 Multilinear array sensor with an infrared line

Country Status (6)

Country Link
US (1) US6590679B1 (en)
EP (1) EP1053644A1 (en)
JP (1) JP2002503066A (en)
AU (1) AU2563199A (en)
TW (1) TW401676B (en)
WO (1) WO1999040729A1 (en)

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2001020425A2 (en) * 1999-09-13 2001-03-22 Applied Science Fiction, Inc. Method and apparatus for scanning images
EP1100254A1 (en) * 1999-11-12 2001-05-16 Noritsu Koki Co., Ltd. Apparatus for reading images from photographic film
WO2002025345A2 (en) * 2000-09-22 2002-03-28 Applied Science Fiction Lens focusing device, system and method for use with multiple light wavelengths
US6369873B1 (en) 2000-06-13 2002-04-09 Eastman Kodak Company Thermal processing system and method including a kiosk
EP1241866A2 (en) 2001-03-15 2002-09-18 Canon Kabushiki Kaisha Image processing method for correcting defects of scanned image
US6630319B1 (en) 1998-08-24 2003-10-07 Bioline Limited Thermostable DNA polymerase
US6781724B1 (en) 2000-06-13 2004-08-24 Eastman Kodak Company Image processing and manipulation system
US6862117B1 (en) 1999-12-30 2005-03-01 Eastman Kodak Company Method and apparatus for reducing the effect of bleed-through on captured images
US6950608B2 (en) 2003-12-23 2005-09-27 Eastman Kodak Company Capture of multiple interlaced images on a single film frame using micro-lenses and method of providing multiple images to customers
CN100344143C (en) * 1999-11-12 2007-10-17 诺日士钢机株式会社 Apparatus for reading images from photographic film

Families Citing this family (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7164510B1 (en) 1998-09-17 2007-01-16 Canon Kabushiki Kaisha Image scanning apparatus and method, and storage medium
JP3775312B2 (en) * 2002-03-04 2006-05-17 ノーリツ鋼機株式会社 Image processing method, image processing program, and computer-readable recording medium recording the program
US7551772B2 (en) * 2004-11-30 2009-06-23 Hewlett-Packard Development Company, L.P. Blur estimation in a digital image
JP2006333078A (en) 2005-05-26 2006-12-07 Canon Inc Image reader
JP2007267359A (en) * 2006-03-03 2007-10-11 Ricoh Co Ltd Image reading apparatus and image forming apparatus
US7737394B2 (en) * 2006-08-31 2010-06-15 Micron Technology, Inc. Ambient infrared detection in solid state sensors
US20090159799A1 (en) * 2007-12-19 2009-06-25 Spectral Instruments, Inc. Color infrared light sensor, camera, and method for capturing images
KR20090120159A (en) * 2008-05-19 2009-11-24 삼성전자주식회사 Apparatus and method for combining images
JP5333964B2 (en) 2008-06-27 2013-11-06 株式会社ジャパンディスプレイ Photodetection device, electro-optical device, and electronic apparatus
US8357899B2 (en) * 2010-07-30 2013-01-22 Aptina Imaging Corporation Color correction circuitry and methods for dual-band imaging systems
KR102071325B1 (en) * 2013-09-27 2020-04-02 매그나칩 반도체 유한회사 Optical sensor sensing illuminance and proximity
US8988539B1 (en) * 2013-09-27 2015-03-24 The United States Of America As Represented By Secretary Of The Navy Single image acquisition high dynamic range camera by distorted image restoration
US20160255323A1 (en) 2015-02-26 2016-09-01 Dual Aperture International Co. Ltd. Multi-Aperture Depth Map Using Blur Kernels and Down-Sampling
CN112804426B (en) * 2020-12-30 2022-09-30 凌云光技术股份有限公司 Multi-line linear array camera capable of correcting defective pixels, method and storage medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4670779A (en) * 1984-01-10 1987-06-02 Sharp Kabushiki Kaisha Color-picture analyzing apparatus with red-purpose and green-purpose filters
US4680638A (en) * 1982-07-16 1987-07-14 British Broadcasting Corporation Concealment of defects in a video signal
US5003379A (en) * 1989-10-16 1991-03-26 Eastman Kodak Company Telecine scanning apparatus with spectrally-shifted sensitivities responsive to negative or print film dyes
EP0716538A2 (en) * 1994-12-06 1996-06-12 Canon Kabushiki Kaisha Image pickup device

Family Cites Families (92)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE2821868A1 (en) 1978-05-19 1979-11-22 Karl Sirowatka Recording damage on moving film or bands - using infrared scanning and pen recorders or warning devices
US4302108A (en) 1979-01-29 1981-11-24 Polaroid Corporation Detection of subsurface defects by reflection interference
US4260899A (en) 1979-06-14 1981-04-07 Intec Corporation Wide web laser scanner flaw detection method and apparatus
US4301469A (en) 1980-04-30 1981-11-17 United Technologies Corporation Run length encoder for color raster scanner
US4462860A (en) 1982-05-24 1984-07-31 At&T Bell Laboratories End point detection
US4442454A (en) 1982-11-15 1984-04-10 Eastman Kodak Company Image processing method using a block overlap transformation procedure
JPS61275625A (en) 1985-05-31 1986-12-05 Fuji Photo Film Co Ltd Calibrating method for color photographic image information
NL8501956A (en) 1985-07-09 1987-02-02 Philips Nv IMAGE RECOVERY CIRCUIT.
DE3534019A1 (en) 1985-09-24 1987-04-02 Sick Optik Elektronik Erwin OPTICAL RAILWAY MONITORING DEVICE
US4677465A (en) 1985-11-01 1987-06-30 Eastman Kodak Company Digital color image processing method with shape correction of histograms used to produce color reproduction functions
JPS62116937A (en) 1985-11-16 1987-05-28 Dainippon Screen Mfg Co Ltd Film attaching and detaching device for drum type image scanning and recording device
JPH01143945A (en) 1987-11-30 1989-06-06 Fuji Photo Film Co Ltd Detecting method for defect in tape
US4858003A (en) 1988-01-12 1989-08-15 Eastman Kodak Company Mechanism for handling slides and film strips
US5194950A (en) 1988-02-29 1993-03-16 Mitsubishi Denki Kabushiki Kaisha Vector quantizer
US5047968A (en) 1988-03-04 1991-09-10 University Of Massachusetts Medical Center Iterative image restoration device
CA1309166C (en) 1988-05-20 1992-10-20 Toshinobu Haruki Image sensing apparatus having automatic iris function of automatically adjusting exposure in response to video signal
US4977521A (en) 1988-07-25 1990-12-11 Eastman Kodak Company Film noise reduction by application of bayes theorem to positive/negative film
US5010401A (en) 1988-08-11 1991-04-23 Mitsubishi Denki Kabushiki Kaisha Picture coding and decoding apparatus using vector quantization
US4989973A (en) 1988-10-03 1991-02-05 Nissan Motor Co., Ltd. Surface condition estimating apparatus
DE68926785T2 (en) 1989-03-28 1997-01-02 Yokogawa Medical Syst IMAGE PROCESSING DEVICE
US4994918A (en) 1989-04-28 1991-02-19 Bts Broadcast Television Systems Gmbh Method and circuit for the automatic correction of errors in image steadiness during film scanning
US4972091A (en) 1989-05-16 1990-11-20 Canadian Patents And Development Limited/Societe Canadienne Des Brevets Et D'exploitation Limitee Method and apparatus for detecting the presence of flaws in a moving sheet of material
US5058982A (en) 1989-06-21 1991-10-22 Orbot Systems Ltd. Illumination system and inspection apparatus including same
US4937720A (en) 1989-10-13 1990-06-26 Sundstrand Corporation PWM inverter circuit analytically compensating for DC link distortion
JPH07115534A (en) 1993-10-15 1995-05-02 Minolta Co Ltd Image reader
US5264924A (en) 1989-12-18 1993-11-23 Eastman Kodak Company Mechanism for deriving noise-reduced estimates of color signal parameters from multiple color/luminance image sensor outputs
DE3942273A1 (en) 1989-12-21 1991-06-27 Broadcast Television Syst METHOD FOR HIDDEN ERRORS IN A VIDEO SIGNAL
US5267030A (en) 1989-12-22 1993-11-30 Eastman Kodak Company Method and associated apparatus for forming image data metrics which achieve media compatibility for subsequent imaging application
US5091972A (en) 1990-09-17 1992-02-25 Eastman Kodak Company System and method for reducing digital image noise
GB9023013D0 (en) 1990-10-23 1990-12-05 Crosfield Electronics Ltd Method and apparatus for generating representation of an image
JP2528383B2 (en) 1990-11-22 1996-08-28 大日本スクリーン製造株式会社 Pinhole erasing method
US5155596A (en) 1990-12-03 1992-10-13 Eastman Kodak Company Film scanner illumination system having an automatic light control
US5465163A (en) 1991-03-18 1995-11-07 Canon Kabushiki Kaisha Image processing method and apparatus for processing oversized original images and for synthesizing multiple images
JPH04291139A (en) 1991-03-20 1992-10-15 Nippon Steel Corp Method for reporting portion with flaw on strip
US5452018A (en) 1991-04-19 1995-09-19 Sony Electronics Inc. Digital color correction system having gross and fine adjustment modes
US5149960B1 (en) 1991-07-03 1994-08-30 Donnelly R R & Sons Method of converting scanner signals into colorimetric signals
EP0527097A3 (en) 1991-08-06 1995-03-01 Eastman Kodak Co Apparatus and method for collectively performing tile-based image rotation, scaling and digital halftone screening
US5200817A (en) 1991-08-29 1993-04-06 Xerox Corporation Conversion of an RGB color scanner into a colorimetric scanner
JP2549479B2 (en) 1991-12-06 1996-10-30 日本電信電話株式会社 Motion compensation inter-frame band division coding processing method
US5266805A (en) * 1992-05-05 1993-11-30 International Business Machines Corporation System and method for image recovery
US5371542A (en) 1992-06-23 1994-12-06 The United States Of America As Represented By The Secretary Of The Navy Dual waveband signal processing system
CA2093840C (en) 1992-07-17 1999-08-10 Albert D. Edgar Duplex film scanning
US5583950A (en) 1992-09-16 1996-12-10 Mikos, Ltd. Method and apparatus for flash correlation
US5300381A (en) 1992-09-24 1994-04-05 Eastman Kodak Company Color image reproduction of scenes with preferential tone mapping
US5568270A (en) 1992-12-09 1996-10-22 Fuji Photo Film Co., Ltd. Image reading apparatus which varies reading time according to image density
US5453611A (en) * 1993-01-01 1995-09-26 Canon Kabushiki Kaisha Solid-state image pickup device with a plurality of photoelectric conversion elements on a common semiconductor chip
EP0606654B1 (en) * 1993-01-01 2000-08-09 Canon Kabushiki Kaisha Image reading device
JPH06311307A (en) 1993-04-22 1994-11-04 Minolta Camera Co Ltd Image forming device
EP0624848A3 (en) 1993-05-04 1994-11-30 Eastman Kodak Company A technique for the detection and removal of local defects in digital continuous-tone images
KR950004881A (en) 1993-07-31 1995-02-18 김광호 Color image processing method and device
JPH0766977A (en) 1993-08-24 1995-03-10 Minolta Co Ltd Picture processing unit
US6128416A (en) 1993-09-10 2000-10-03 Olympus Optical Co., Ltd. Image composing technique for optimally composing a single image from a plurality of digital images
GB2283633B (en) 1993-11-05 1997-10-29 Sony Uk Ltd Anti-alias filter control for a split picture
US5729631A (en) 1993-11-30 1998-03-17 Polaroid Corporation Image noise reduction system using a wiener variant filter in a pyramid image representation
US5477345A (en) 1993-12-15 1995-12-19 Xerox Corporation Apparatus for subsampling chrominance
US5509086A (en) 1993-12-23 1996-04-16 International Business Machines Corporation Automatic cross color elimination
KR100300950B1 (en) 1994-01-31 2001-10-22 윤종용 Method and apparatus for correcting color
US5516608A (en) 1994-02-28 1996-05-14 International Business Machines Corporation Method for controlling a line dimension arising in photolithographic processes
DE4424577A1 (en) 1994-07-13 1996-01-18 Hoechst Ag Transporter protein for cationic xenobiotic(s) and pharmaceuticals, and related DNA and transformed cells
JPH0877341A (en) 1994-08-29 1996-03-22 Xerox Corp Equipment and method for color image processing
DE4432787A1 (en) 1994-09-15 1996-03-21 Philips Patentverwaltung Method and circuit for detecting and masking errors in a video signal
US5561611A (en) 1994-10-04 1996-10-01 Noran Instruments, Inc. Method and apparatus for signal restoration without knowledge of the impulse response function of the signal acquisition system
EP0707408A1 (en) 1994-10-11 1996-04-17 International Business Machines Corporation Optical scanner device for transparent media
US5565931A (en) 1994-10-31 1996-10-15 Vivo Software. Inc. Method and apparatus for applying gamma predistortion to a color image signal
US5649032A (en) 1994-11-14 1997-07-15 David Sarnoff Research Center, Inc. System for automatically aligning images to form a mosaic image
CH690639A5 (en) 1994-11-29 2000-11-15 Zeiss Carl Fa Apparatus for scanning digitize image templates and processes for their operation.
US5771107A (en) 1995-01-11 1998-06-23 Mita Industrial Co., Ltd. Image processor with image edge emphasizing capability
US5979011A (en) 1995-04-07 1999-11-09 Noritsu Koki Co., Ltd Dust removing apparatus
US5582961A (en) 1995-06-06 1996-12-10 Eastman Kodak Company Photographic elements which achieve colorimetrically accurate recording
US5710643A (en) 1995-06-29 1998-01-20 Agfa Divisionn, Bayer Corporation Optical path for a scanning system
US6104839A (en) 1995-10-16 2000-08-15 Eastman Kodak Company Method and apparatus for correcting pixel values in a digital image
JP3669448B2 (en) 1995-10-31 2005-07-06 富士写真フイルム株式会社 Image reproduction method and apparatus
US5641596A (en) 1995-12-05 1997-06-24 Eastman Kodak Company Adjusting film grain properties in digital images
US5892595A (en) 1996-01-26 1999-04-06 Ricoh Company, Ltd. Image reading apparatus for correct positioning of color component values of each picture element
EP0794454B1 (en) 1996-03-04 2005-05-11 Fuji Photo Film Co., Ltd. Film scanner
US5982951A (en) 1996-05-28 1999-11-09 Canon Kabushiki Kaisha Apparatus and method for combining a plurality of images
US5963662A (en) 1996-08-07 1999-10-05 Georgia Tech Research Corporation Inspection system and method for bond detection and validation of surface mount devices
GB9613685D0 (en) 1996-06-28 1996-08-28 Crosfield Electronics Ltd An illumination unit
US6075905A (en) 1996-07-17 2000-06-13 Sarnoff Corporation Method and apparatus for mosaic image construction
US5808674A (en) 1996-09-09 1998-09-15 Eastman Kodak Company Producing and improved digital image from digital signals corresponding to pairs of photosites
JPH10178564A (en) 1996-10-17 1998-06-30 Sharp Corp Panorama image generator and recording medium
JP3493104B2 (en) 1996-10-24 2004-02-03 シャープ株式会社 Color image processing equipment
US5982941A (en) 1997-02-07 1999-11-09 Eastman Kodak Company Method of producing digital image with improved performance characteristic
EP0893914A3 (en) 1997-07-24 2002-01-02 Nikon Corporation Image processing method, image processing apparatus, and storage medium for storing control process
US6078701A (en) 1997-08-01 2000-06-20 Sarnoff Corporation Method and apparatus for performing local to global multiframe alignment to construct mosaic images
US5969372A (en) 1997-10-14 1999-10-19 Hewlett-Packard Company Film scanner with dust and scratch correction by use of dark-field illumination
US6078051A (en) 1998-01-08 2000-06-20 Xerox Corporation Image input device and method for providing scanning artifact detection
US6239886B1 (en) 1998-01-08 2001-05-29 Xerox Corporation Method and apparatus for correcting luminance and chrominance data in digital color images
US6057040A (en) 1998-01-22 2000-05-02 Vision--Ease Lens, Inc. Aminosilane coating composition and process for producing coated articles
JP4096407B2 (en) 1998-04-22 2008-06-04 株式会社ニコン Image processing apparatus and computer-readable storage medium
JP2000196813A (en) 1998-12-25 2000-07-14 Canon Inc Image reader
ES2234554T3 (en) 1999-07-20 2005-07-01 Tecnoforming S.P.A. LIGHT ALLOY RIM WITH A STAINLESS STEEL FRONT COVERING ELEMENT.

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4680638A (en) * 1982-07-16 1987-07-14 British Broadcasting Corporation Concealment of defects in a video signal
US4670779A (en) * 1984-01-10 1987-06-02 Sharp Kabushiki Kaisha Color-picture analyzing apparatus with red-purpose and green-purpose filters
US5003379A (en) * 1989-10-16 1991-03-26 Eastman Kodak Company Telecine scanning apparatus with spectrally-shifted sensitivities responsive to negative or print film dyes
EP0716538A2 (en) * 1994-12-06 1996-06-12 Canon Kabushiki Kaisha Image pickup device

Cited By (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6630319B1 (en) 1998-08-24 2003-10-07 Bioline Limited Thermostable DNA polymerase
WO2001020425A3 (en) * 1999-09-13 2001-10-25 Applied Science Fiction Inc Method and apparatus for scanning images
WO2001020425A2 (en) * 1999-09-13 2001-03-22 Applied Science Fiction, Inc. Method and apparatus for scanning images
EP1100254A1 (en) * 1999-11-12 2001-05-16 Noritsu Koki Co., Ltd. Apparatus for reading images from photographic film
CN100344143C (en) * 1999-11-12 2007-10-17 诺日士钢机株式会社 Apparatus for reading images from photographic film
US6862117B1 (en) 1999-12-30 2005-03-01 Eastman Kodak Company Method and apparatus for reducing the effect of bleed-through on captured images
US6369873B1 (en) 2000-06-13 2002-04-09 Eastman Kodak Company Thermal processing system and method including a kiosk
US6781724B1 (en) 2000-06-13 2004-08-24 Eastman Kodak Company Image processing and manipulation system
WO2002025345A3 (en) * 2000-09-22 2002-09-12 Applied Science Fiction Lens focusing device, system and method for use with multiple light wavelengths
WO2002025345A2 (en) * 2000-09-22 2002-03-28 Applied Science Fiction Lens focusing device, system and method for use with multiple light wavelengths
EP1241866A3 (en) * 2001-03-15 2002-10-09 Canon Kabushiki Kaisha Image processing method for correcting defects of scanned image
EP1241866A2 (en) 2001-03-15 2002-09-18 Canon Kabushiki Kaisha Image processing method for correcting defects of scanned image
US7006705B2 (en) 2001-03-15 2006-02-28 Canon Kabushiki Kaisha Image processing for correcting defects of read image
US7245784B2 (en) 2001-03-15 2007-07-17 Canon Kabushiki Kaisha Image processing for correcting defects of read image
US7391928B2 (en) 2001-03-15 2008-06-24 Canon Kabushiki Kaisha Image processing for correcting defects of read image
US6950608B2 (en) 2003-12-23 2005-09-27 Eastman Kodak Company Capture of multiple interlaced images on a single film frame using micro-lenses and method of providing multiple images to customers

Also Published As

Publication number Publication date
US6590679B1 (en) 2003-07-08
TW401676B (en) 2000-08-11
JP2002503066A (en) 2002-01-29
AU2563199A (en) 1999-08-23
EP1053644A1 (en) 2000-11-22

Similar Documents

Publication Publication Date Title
US6590679B1 (en) Multilinear array sensor with an infrared line
US5969372A (en) Film scanner with dust and scratch correction by use of dark-field illumination
US5266805A (en) System and method for image recovery
US7355193B2 (en) Dust and scratch detection for an image scanner
EP0901614B1 (en) Luminance-priority color sensor
EP1096785B1 (en) Method of scanning using a photosensor with multiple sensor areas of different sizes
JPH01237619A (en) Optical apparatus
JPS63214060A (en) Electronic picture forming apparatus
JP4497671B2 (en) Image processing method of image reading apparatus
US5750985A (en) High speed and high precisioin image scanning apparatus
JP3622391B2 (en) Color separation optical device
US5847754A (en) High resolution film scanner/telecine which uses a microlithographic diffuser laminate to aid in the reduction of visibility of surface imperfections in the film
US5675425A (en) Color image reading device having an optical element for creating double images
JPS62188952A (en) Film image information reader
US20030147562A1 (en) Method and device for identifying and/or correcting defects during digital image processing
JP3398163B2 (en) Image reading device
JPS5943869B2 (en) Mechanism for document illumination and imaging for sensor device
JP4231583B2 (en) Optical print head
JP2001045225A (en) Image reader
JP2798449B2 (en) Image reading device
JPH02281868A (en) Color image reader
JPS59143466A (en) Photoelectric conversion device
JPH01209878A (en) Color picture reader
JPS5957569A (en) Color picture reader
JPS60191547A (en) Color picture reading element

Legal Events

Date Code Title Description
AK Designated states. Kind code of ref document: A1. Designated state(s): AL AM AT AU AZ BA BB BG BR BY CA CH CN CU CZ DE DK EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MD MG MK MN MW MX NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT UA UG UZ VN YU ZW
AL Designated countries for regional patents. Kind code of ref document: A1. Designated state(s): GH GM KE LS MW SD SZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG
DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
121 Ep: the epo has been informed by wipo that ep was designated in this application
WWE Wipo information: entry into national phase. Ref document number: 1999905481. Country of ref document: EP
ENP Entry into the national phase. Ref country code: JP. Ref document number: 2000 531014. Kind code of ref document: A. Format of ref document f/p: F
NENP Non-entry into the national phase. Ref country code: KR
WWP Wipo information: published in national office. Ref document number: 1999905481. Country of ref document: EP
REG Reference to national code. Ref country code: DE. Ref legal event code: 8642
WWW Wipo information: withdrawn in national office. Ref document number: 1999905481. Country of ref document: EP