WO2007110097A1 - Image capturing device with improved image quality - Google Patents

Image capturing device with improved image quality

Info

Publication number
WO2007110097A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
capturing device
image capturing
projection system
detector
Prior art date
Application number
PCT/EP2006/002861
Other languages
French (fr)
Inventor
Gal Shabtay
Ephraim Goldenberg
Original Assignee
Tessera Technologies Hungary Kft.
Priority date
Filing date
Publication date
Application filed by Tessera Technologies Hungary Kft.
Priority to CN2006800540272A (CN101438577B)
Priority to US12/225,591 (US8233073B2)
Priority to EP06723830A (EP1999947B1)
Priority to JP2009501859A (JP5241700B2)
Priority to PCT/EP2006/002861 (WO2007110097A1)
Priority to KR1020087026489A (KR101388564B1)
Publication of WO2007110097A1

Classifications

    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B21/00Projectors or projection-type viewers; Accessories therefor
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B9/00Optical objectives characterised both by the number of the components and their arrangements according to their sign, i.e. + or -
    • G02B9/12Optical objectives characterised both by the number of the components and their arrangements according to their sign, i.e. + or - having three components only
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/50Constructional details
    • H04N23/55Optical parts specially adapted for electronic image sensors; Mounting thereof
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B13/00Optical objectives specially designed for the purposes specified below
    • G02B13/001Miniaturised objectives for electronic devices, e.g. portable telephones, webcams, PDAs, small digital cameras
    • G02B13/0015Miniaturised objectives for electronic devices, e.g. portable telephones, webcams, PDAs, small digital cameras characterised by the lens design
    • G02B13/002Miniaturised objectives for electronic devices, e.g. portable telephones, webcams, PDAs, small digital cameras characterised by the lens design having at least one aspherical surface
    • G02B13/0035Miniaturised objectives for electronic devices, e.g. portable telephones, webcams, PDAs, small digital cameras characterised by the lens design having at least one aspherical surface having three lenses
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B13/00Optical objectives specially designed for the purposes specified below
    • G02B13/001Miniaturised objectives for electronic devices, e.g. portable telephones, webcams, PDAs, small digital cameras
    • G02B13/009Miniaturised objectives for electronic devices, e.g. portable telephones, webcams, PDAs, small digital cameras having zoom function
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/0025Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 for optical correction, e.g. distorsion, aberration
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80Camera processing pipelines; Components thereof
    • H04N23/81Camera processing pipelines; Components thereof for suppressing or minimising disturbance in the image signal generation
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/74Projection arrangements for image reproduction, e.g. using eidophor

Definitions

  • a portable electronic device such as a mobile phone, a webcam or a portable computer including an image capturing device as described herein before .
  • the computing unit of the inventive image capturing device can be implemented into the processor
  • Fig. 1 schematically shows an embodiment of an image capturing device according to the present invention
  • FIGs. 2a and 2b show an example of a rectangular distortion pattern that is separable in X & Y coordinates and can be used in an embodiment of the present invention
  • Figs. 3a and 3b show an example of a distortion pattern 600 with circular symmetry that can be used in an embodiment of the present invention
  • Figs. 4a and 4b show curves of a separable X-Y-transformation similar to the one presented in Fig. 2b, which can be used for designing the optical projection system and for programming the computing unit of the image capturing device according to an embodiment of the present invention
  • FIG. 5 shows a schematic illustration representing the PSF at a border region and at the centre of the image for a conventional image capturing system and an image capturing system according to an embodiment of the invention.
  • Fig. 6 shows a graph illustrating the inverse effective resolution versus the distance from the center of the object being captured for a conventional image capturing system compared with an image capturing system according to the present invention.
  • Fig. 7 shows an exemplary optical design of the present invention that provides image expansion at the center and image compression at the borders when compared with a standard imager.
  • FIG. 1 a schematic example of an image capturing device 1 of the present invention is shown wherein light rays 3
  • an optical projection system 5 comprising a first lens 7, an aperture 9, a second lens 11 behind the aperture 9 and a third lens 13. Finally, the light rays 3 impinge onto the detecting surface 15 of a CCD or CMOS image detector 17.
  • the lenses 7, 11, 13 of the projection system 5 have surface geometries that are designed such that, while being projected onto the detector, the image of the object is
  • the lenses 7, 11, 13 are designed such that for small angles of incidence the structure of the three lenses resembles a telephoto design, i.e., the effective focal length is greater than the physical length of the lens module.
  • the lenses are designed to resemble a retro-photo system, i.e., the effective focal length is smaller than the physical length of the lens module.
  • the light of the object projected onto the detecting surface 15 is then captured by sensor pixels of the detector 17 and is transformed into electrical signals. These signals are transmitted to a computing unit 19.
  • the computing unit 19 is connected with a memory 21 and a display unit 23.
  • the signals coming from the detector 17 can be stored in the memory 21 and/or they can be directly displayed on the display unit 23. E.g. in video applications, the signals are normally both stored in a memory and displayed on a screen in real time.
  • the processing of the data may be performed before or after storing the data and the stored data correspond to either the distorted image captured by the detector or the processed undistorted image. Processing the signals can be performed by software or by dedicated hardware or off camera, e.g., for single use or multi use digital cameras.
  • the signal processing can be performed using pipelined architectures or without using pipelined architectures.
  • the image data can be read from the
  • Before displaying the picture of the captured object, the image data have to be computed by the computing unit in order to reverse the distortion introduced by the projection system. Different signal processing schemes may be applied to different displays depending on the
  • Fig. 2a shows a rectangular pattern.
  • Fig. 2b a representation of the pattern of Fig. 2a is shown as it is
  • the projection is distorted such that the pattern is expanded in a center region and compressed in a border region.
  • the transformation representing the distortion is separable in
  • Fig. 3a shows a pattern of circular symmetry having equidistant rings.
  • Fig. 3b a representation of the pattern of Fig. 3a is shown as it is projected by the optical projection system of an embodiment of the present invention.
  • the projection is distorted such that the pattern is expanded in a center region and compressed in a border region.
  • Figs. 4a and 4b show exemplary transformation functions of a separable transformation similar to the one used in Fig. 2b, which can be used for computing an undistorted image from the signals provided by the image sensor.
  • distortion is of further advantage since it transforms a rectangular detecting array that captures a distorted image to a rectangular non-distorting image after applying the required processing for correcting the distorting image.
  • a transfer function can be used which depends only on the radial distance of a pixel from the center.
  • polar coordinates can be used for calculating the coordinates of
  • FIG. 5 shows a schematic illustration representing the PSF at a border region and at the centre of the image for a conventional image capturing system (full line A) and an image capturing system according to an embodiment of the invention (broken line B).
  • the FWHM of the PSF is larger than the pixel size and oversampling occurs in the border region.
  • the FWHM of the PSF is smaller than the pixel size such that optical information is wasted.
  • the FWHM of the PSF for an image capturing system according to an embodiment of the invention can be comparable to the size of the pixels both in the center and in the border region. In the optimum case, no oversampling occurs and no optical information is wasted.
  • the optical geometrical distortion matches the optical resolution (limited by the characteristics of the optics and the
  • Fig. 6 presents schematically this phenomenon in an exemplary embodiment of the present invention:
  • the y-axis shows the inverse "d" (in units of lines per mm) of the
  • the distance at the object plane from the centre of the object to a point on the object plane is assumed to be 4mm
  • the F-number (F/#) is assumed to be 3
  • the field of view of the lens module is approximately +/-30° and the
  • the resolution curve shows the resolution at the object plane (i.e., 400mm away from the lens module) .
  • the diffraction limit is approximately 1.5 μm (lambda X F/#);
  • the FWHM of the PSF (at the sensor plane) of a standard mass-produced triplet image capturing device can reach approximately 3 times the diffraction limit at the central part of the image, i.e., approximately 4.5 μm;
  • the resolution of the inventive image capturing device is approximately double that of the conventional image capturing device at the centre of the image and equal to
  • the width of the PSF of this embodiment is fairly constant across the entire capturing device and it is equal to 4.5 μm. This allows utilizing the sensor pixels more efficiently.
  • the different magnification values result in different resolution values across the
  • Fig. 7 an exemplary optical design according to the invention that results in a radial distortion, which provides image expansion in the center and image compression at the borders for a standard field of view of
  • the spherical and aspherical coefficients and the apertures of all optical surfaces along with the materials from which the lenses are made are provided as follows:
  • CONIC CONSTANT (CC) -1.692041, SEMI-MAJOR AXIS (b) -3.156326, SEMI-MINOR AXIS (a) -2.625717
  • SURFACE NO. 2 — CONIC+POWER-SERIES ASPHERE: CONIC CONSTANT (CC) 4.121475, SEMI-MAJOR AXIS (b) -0.491762, SEMI-MINOR AXIS (a) 1.112892

Abstract

An image capturing device (1) is disclosed comprising an electronic image detector (17) having a detecting surface (15), an optical projection system (5) for projecting an object within a field of view onto the detecting surface (15), and, optionally, a computing unit (19) for manipulating electronic information obtained from the image detector (17), wherein, the projection system (5) is adapted to project the object in a distorted way such that, when compared with a standard lens system, the projected image is expanded in a center region of the field of view and is compressed in a border region of the field of view. Preferably, the projection system (5) is adapted such that its point spread function in the border region of the field of view has a full width at half maximum corresponding essentially to the size of corresponding pixels of the image detector (17).

Description

Image capturing device with improved image quality
The present invention relates to an image capturing device and particularly to an image capturing device with improved image quality for electronic devices .
Background of the Invention
Recently, image capturing devices have become widely used in portable and non-portable devices such as cameras, mobile phones, webcams and notebooks. These image capturing devices conventionally include an electronic image detector such as a CCD or CMOS sensor, a lens system for projecting an object in a field of view onto the detector and an electronic circuitry for receiving and storing electronic data provided by the detector.
Resolution and optical zoom are two important performance parameters of such image capturing devices.
Resolution of an image capturing device means the minimum distance two point sources in an object plane can have such that the image capturing device is able to distinguish these point sources. The resolution depends on the fact that due to diffraction and aberrations each optical system projects a point source not as a point but as a disc of predetermined width and having a certain light intensity distribution. The response of an optical system to a point light source is known as point spread function (PSF) .
The overall resolution of an image capturing device mainly depends on the smaller one of two components: the optical resolution of the optical projection system and the resolution of the detector.
Herein, the optical resolution of an optical projection system shall be defined as the full width at half maximum (FWHM) of its PSF. In other words, the peak values of the light intensity distribution of a projection of two point light sources must be spaced at least by the FWHM of the PSF in order for the image capturing device to be able to distinguish the two point light sources. However, the resolution could also be defined as a different value depending on the PSF, e.g. 70% of the width at half maximum. This definition of the optical resolution might depend on the sensitivity of the detector and the evaluation of the signals received from the detector.
The resolution of the detector is defined herein as the pitch (i.e. distance middle to middle) of two adjacent sensor pixels of the detector.
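As an aside, a minimal numerical sketch of how these two definitions interact is given below; the Gaussian PSF model, the 1.9 μm sigma and the 2.2 μm pixel pitch are illustrative assumptions, not values from this document. By the definitions above, whichever of the two distances is larger is the one that limits the device.

    import numpy as np

    def gaussian_psf_fwhm(sigma_um):
        # FWHM of a Gaussian intensity profile with standard deviation sigma
        return 2.0 * np.sqrt(2.0 * np.log(2.0)) * sigma_um

    def limiting_resolution_um(psf_sigma_um, pixel_pitch_um):
        # Overall resolution is limited by the larger of the optical FWHM
        # and the detector pitch (both expressed as distances on the sensor).
        fwhm = gaussian_psf_fwhm(psf_sigma_um)
        return max(fwhm, pixel_pitch_um)

    # Hypothetical example: sigma = 1.9 um gives FWHM ~ 4.5 um, sampled by a
    # 2.2 um pitch, so here the optics (not the detector) limit the resolution.
    print(limiting_resolution_um(1.9, 2.2))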
Optical zoom signifies the capability of the image capturing device to capture a part of the field of view of an original image with better resolution compared with a non-zoomed image. Herein, it is assumed that in conventional image capturing devices the overall resolution is usually limited by the resolution of the detector, i.e. that the FWHM of the PSF can be smaller than the distance between two neighbouring sensor pixels. Accordingly, the resolution of the image capturing device can be increased by selecting a partial field of view and increasing the magnification of the optical projection system for this partial field of view.
E.g., X2 optical zoom refers to a situation where all sensor pixels of the image detector capture half of the image, in each dimension, compared with that of X1 optical zoom.
The difference between "optical zoom" and "digital zoom" in this document is that applying "digital zoom" merely corresponds to signal interpolation where no additional information is actually provided. "Optical zoom" for that matter includes a magnification of the projected partial image and provides more information and better resolution.
Prior art in realizing optical zoom consists of either changing the distance between the lenses and/or changing the focal length of some of the lenses in a lens module.
One conventional way for obtaining optical zoom is through a mechanical apparatus which effectively changes the magnification of the optical system. This can be achieved by changing the distance of lenses of the optical projection system by mechanically displacing one or more of the lenses while controlling the location of the image plane. However, such conventional image capturing systems require a complex mechanical system including several lenses and a control to drive this lens system. The mechanical system is large, heavy, subject to mechanical failure and expensive.
Alternatively, other prior art techniques for obtaining optical zoom are based on variable focal length lenses. In such embodiments, individual lenses in a lens system are capable of changing their focal length in the presence of an electric field or mechanical pressure. These lenses are typically filled with fluid of one or more types and are capable of changing their shape and therefore the lens' focal length. Such solutions typically result in poor image quality in comparison with fixed focal length systems. Moreover, they are often prone to fatigue and aging effects.
The mentioned optical systems typically require moving parts and/or special drivers with possibly high-voltage circuitry and do not result in a cost-effective solution.
It is therefore an object of the present invention to provide an image capturing device having an improved image quality which is adapted to prevent the above drawbacks. Particularly, it is an object to provide an image capturing device having small size, no or few moving parts and being able to provide increased resolution compared to conventional image capturing devices having the same field of view.
Summary of the Invention
The present invention is based on the finding that in simple triplet-like imagers as they are conventionally used as image capturing devices for e.g. mobile phones, the PSF is a function of the angle of incidence. This brings up the effect of a non-uniform resolution across a captured image. I.e., the maximum attainable resolution is a function of the location at the image plane. This effect is on top of any other effect that limits the spatial resolution, such as e.g. defocus.
The detector's sensor pixels sample the image of an object projected onto the detecting surface of the detector to form digital image signals thereof. Typically, the pixel dimensions are comparable with the width of the point spread function of the optical projection system at the centre of the image. In conventional image capturing devices the PSF at the borders of the image is wider than the PSF at the centre of the image due to the above-mentioned dependency of the resolution on the angle of incidence. Accordingly, there is commonly over-sampling at the border region of the image in conventional devices. That means that e.g. two or more sensor pixels are provided in the area of the PSF of a single object point in the border region such that they do not provide distinguishable optical information. In other words, the sensor pixels sample the image signal at a higher rate than required by the sampling theorem.
According to a first aspect, the present invention provides an image capturing device, comprising: an electronic image detector having a detecting surface, an optical projection system for projecting an object within a field of view onto the detecting surface, and, optionally, a computing unit for manipulating electronic information obtained from the image detector, wherein the projection system is adapted to project the object in a distorted way such that the projected image is expanded in the center region of the field of view and is compressed in the border region of the field of view.
The electronic image detector can be any detector being able to convert optical information projected onto its detecting surface into electrical signals. Examples are CCD or CMOS detectors. The detector can have one or more output lines to provide the electrical signals sequentially or in parallel to the computing unit for processing the signals or to a memory for storing the signals.
The computing unit can be any electronic circuitry. E.g. it can be an integrated chip device. As described herein later in more detail, the computing unit can be adapted to process the signals received by the detector in order to obtain an undistorted image of high quality or to provide a zoomed partial image.
The optical projection system includes one or more optical elements like e.g. optical lenses or reflecting surfaces of mirrors or prisms or the like. The optical elements are arranged such that a predetermined field of view including the object can be projected onto the detecting surface of the image detector.
It is to be noted that the optical projection system can have a fixed focus. This means that its optical magnification can not be changed by external manipulation. Especially, there need not be moving parts provided in the optical projection system for adapting the focus.
Generally, there is nothing dynamic in the system (i.e., no change in time), but the effective focal length (EFL), which determines the magnification, changes as a function of the location across the sensor plane. The EFL is large at the center and small at the borders. The magnification changes across the image and causes the distortion.
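One way to picture such a fixed but position-dependent magnification is an EFL that falls off with image height. The sketch below uses an assumed smooth EFL profile and assumed numbers, purely for illustration; it is not the patent's optical prescription.

    import numpy as np

    def local_efl_mm(r_mm, efl_center_mm=5.0, efl_border_mm=3.0, r_max_mm=2.0):
        # Assumed EFL profile: large at the image center, small at the border,
        # with no moving parts (a fixed property of the lens design).
        t = min(max(r_mm / r_max_mm, 0.0), 1.0)
        return efl_center_mm + (efl_border_mm - efl_center_mm) * t ** 2

    def image_height_mm(field_angle_rad, n_steps=1000):
        # Integrate dr = EFL(r) * d(theta) to get the distorted image height
        # for a given field angle (paraxial approximation, illustration only).
        r, d_theta = 0.0, field_angle_rad / n_steps
        for _ in range(n_steps):
            r += local_efl_mm(r) * d_theta
        return r

    # Equal angular steps near the axis map to larger image-height steps than
    # the same angular steps near the border, which is exactly the distortion.
    for angle_deg in (5, 15, 25):
        print(angle_deg, round(image_height_mm(np.radians(angle_deg)), 3))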
The image capturing system of the present invention differs from prior art systems in that its projection system is adapted to project the object to be imaged in a distorted way. The center of the field of view is expanded or stretched whereas the border region closer to the circumference of the field of view is compressed. In other words, the projection system projects the object with a greater magnification at the center of the field of view and with a smaller magnification at the border of the field of view. As a result, the image projected onto the detecting surface is distorted. This is in contrast to conventional systems which usually aim for undistorted projection in order to avoid post-processing.
It is to be noted that the conventional projection system and the projection system according to the present invention can both have the same field of view. The projection system according to the present invention differs mainly in that it provides a non-homogeneous magnification across the field of view wherein, compared to a conventional system, the center has a higher magnification and the border region has a lower magnification.
In this context it benefits from the fact that in a conventional image capturing system there is oversampling in the border region of the field of view. By choosing a smaller magnification in this region, the area of the PSF on the detecting surface can be reduced. No information contained in the projected image is lost as long as the FWHM of the PSF is larger than the size of the sensor pixels.
On the other hand, by choosing a higher magnification in the center of the field of view the overall resolution in this region can be enhanced. This is due to the fact that, in conventional image capturing devices, at the center of the field of view, the area of the PSF on the detecting surface is smaller than the sensor pixel size which therefore limits the overall resolution.
Hence, when the magnification is changed at the center by increasing the focal length, the F-number is increased. This implies that the maximal limit of resolution is decreased. However, since the F-number is larger, there are fewer aberrations and the optical resolution remains approximately the same.
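In round numbers this trade-off follows from the usual relations F/# = EFL / (entrance pupil diameter) and diffraction-limited spot width of roughly lambda * F/#; the pupil diameter and wavelength below are assumed values chosen only for illustration.

    def f_number(efl_mm, pupil_diameter_mm):
        # Standard definition of the F-number
        return efl_mm / pupil_diameter_mm

    def diffraction_spot_um(wavelength_um, f_num):
        # Order-of-magnitude diffraction-limited spot width ~ lambda * F/#
        return wavelength_um * f_num

    # Raising the EFL from 3 mm to 5 mm at an assumed 1 mm pupil raises F/#
    # from 3 to 5 and the diffraction spot from about 1.5 um to about 2.5 um,
    # while aberrations generally shrink at the larger F-number.
    for efl_mm in (3.0, 5.0):
        n = f_number(efl_mm, 1.0)
        print(efl_mm, n, diffraction_spot_um(0.5, n))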
In other words, the PSF of two neighboring points in the object plane can be narrow enough so that they are separated by a FWHM, and are therefore optically distinguishable, but the two points are projected onto the same pixel of the detector such that their optical information cannot be separated electronically. By increasing the magnification, the area of the PSF on the detecting surface and the distance between two PSFs of neighboring points are increased. The above two neighboring points are then projected onto different sensor pixels and additional information on the projected image can be retrieved by the sensor pixels. However, if the magnification in the center is increased more than a predetermined limit at which the area (FWHM) of the PSF becomes larger than the pixel size, oversampling occurs and further increasing the magnification does not provide more information.
Accordingly, the image capturing device of the present invention can provide a distorted image having a higher resolution at the center while the overall resolution of the device is not decreased in any part of the field of view when compared to a conventional image capturing device having the same field of view. As a result, the image is projected in a distorted way such that the resolution of the image detector is used more economically.
According to one embodiment of the present invention the projection system is adapted such that its point spread function in the border region of the field of view has a full width at half maximum being less than three times, preferably less than twice the size of pixels in the corresponding region of the image detector. Preferably, the local magnification of the projection system is chosen such that the FWHM of the PSF in the border region corresponds to the size of corresponding pixels of the image detector. That means that the size S of the sensor pixels is essentially the same as the FWHM, e.g. FWHM < 2*S, preferably FWHM < 1.5*S, more preferred 0.8*S < FWHM < 1.2*S and most preferred FWHM = S.
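The matching criterion can be written down directly; the helper below only restates the ranges quoted above and is not part of the claimed device.

    def sampling_match(fwhm_um, pixel_size_um):
        # Classify how well the local PSF width matches the pixel size S,
        # using the ranges quoted above (FWHM close to S is the preferred case).
        ratio = fwhm_um / pixel_size_um
        if 0.8 <= ratio <= 1.2:
            return "well matched: 0.8*S <= FWHM <= 1.2*S"
        if ratio < 0.8:
            return "detector-limited: optical information is wasted"
        if ratio < 2.0:
            return "acceptable: FWHM < 2*S, mild oversampling"
        return "strong oversampling: FWHM >= 2*S"

    print(sampling_match(2.0, 2.2))   # hypothetical center-of-field values
    print(sampling_match(4.5, 2.2))   # hypothetical border-of-field values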
Incidentally, the "size of a pixel" can be defined as the distance between two neighboring pixels, from middle to middle. Therein, a pixel is the smallest unit adapted to receive information about a minimum area in the object plane. In case of a monochrome detector, each pixel detects information about the light intensity of such minimum area. In a color detector, a pixel can be composed of several subpixels, e.g. three or more subpixels, each being adapted to detect information about the light intensity in a specific color range of such minimum area, e.g. for the red, green and blue spectral range, respectively. For example, a pixel can be composed of four subpixels arranged in a rectangle wherein in a first row there is a subpixel sensitive to a red light spectrum neighboring a subpixel sensitive to a green light spectrum and in the second row there is a subpixel sensitive to a green light spectrum neighboring a subpixel sensitive to a blue light spectrum. Accordingly, one pixel contains four subpixels. Another example is where the different columns of the detector are sensitive to different colors, e.g., one column is sensitive to red, another column is sensitive to green and another column is sensitive to blue, and so forth. In this case a pixel is composed of three subpixels, each sensitive to a different color.
In a further embodiment of the present invention, the projection system is adapted to magnify the center region of the projected image such that the optical magnification of the projected image in the center region of the field of view is more than two times, preferably more than three times, and more preferably more than four times the optical magnification of the projected image in the border region of the field of view. In fact, the difference of magnification between the border region and the center of the field of view can be up to six times and depends mainly on the difference between the PSFs in the two regions of a non-distorting projection system having the same field of view (or the willingness to lose some information at the borders). In other words, the greater the oversampling in the border region of the field of view in a corresponding non-distorting system, the bigger the difference in magnification between center and border can be.
In a further embodiment of the present invention, the projection system is adapted such that a local magnification of any partial area of the field of view is selected such that the PSF in such partial area has a FWHM essentially corresponding to the size of corresponding pixels of the image detector onto which the partial area is projected. In an image capturing system having such projection system the optical resolution is adapted in an optimum way to the resolution of the detector.
In a further embodiment of the present invention, the computing unit is adapted to compute an undistorted picture of the projected object from data received from the image detector. The image detector generates data corresponding to the distorted projection of the object to be captured. These data are provided to the computing unit. This unit has been programmed, in software or in hardware, to calculate the undistorted image from the distorted image data. For this purpose, the precise way of distortion generated by the projection system must be known, estimated or measured.
For example, starting with distorted image data, undistorted pixels of an output image can be produced e.g. in raster order. Each pixel has a certain magnification value that is known a priori and is used to determine its value from the distorted image. The magnification values for each pixel can for example be calculated using an algorithm for reversing the distortion. Or the values can be determined in advance experimentally, e.g. by using a test picture and comparing the original with the distorted projection of the picture, and can then be stored as a look-up table in a memory from which the values are retrieved when calculating the undistorted images.
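A minimal sketch of such a precomputed table is given below. It assumes the forward distortion is known as a radial function; the polynomial used here is a made-up placeholder standing in for the real, designed or measured distortion.

    import numpy as np

    def forward_distortion(r):
        # Placeholder radial model on normalized radius r in [0, 1]:
        # the center of the field is expanded, the border is compressed.
        return np.clip(1.4 * r - 0.4 * r ** 3, 0.0, 1.0)

    def build_lookup_table(height, width):
        # For every undistorted output pixel, store the fractional (y, x)
        # coordinates in the distorted sensor image from which to fetch it.
        cy, cx = (height - 1) / 2.0, (width - 1) / 2.0
        ys, xs = np.mgrid[0:height, 0:width]
        dy, dx = ys - cy, xs - cx
        r = np.hypot(dy, dx) / np.hypot(cy, cx)
        scale = np.where(r > 0, forward_distortion(r) / np.maximum(r, 1e-9), 1.0)
        return cy + dy * scale, cx + dx * scale

    src_y, src_x = build_lookup_table(480, 640)   # assumed sensor dimensions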
Signal interpolation can be used to improve the quality of the distortion-corrected image. For example, as the centre of a pixel in a distorted image may not precisely correspond to the centre of a corresponding undistorted pixel, values of neighbouring pixels can be used to calculate an interpolated pixel value. The type of interpolation can be bilinear or cubic or any other type.
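Continuing the sketch above, the stored source coordinates are generally non-integer, so a bilinear interpolation between the four surrounding sensor pixels is one simple option (illustrative code, assuming the look-up table from the previous sketch):

    import numpy as np

    def bilinear_sample(image, src_y, src_x):
        # Sample a single-channel image at fractional coordinates by mixing
        # the four neighbouring pixels with bilinear weights.
        h, w = image.shape
        y0 = np.clip(np.floor(src_y).astype(int), 0, h - 2)
        x0 = np.clip(np.floor(src_x).astype(int), 0, w - 2)
        fy, fx = src_y - y0, src_x - x0
        top = image[y0, x0] * (1 - fx) + image[y0, x0 + 1] * fx
        bottom = image[y0 + 1, x0] * (1 - fx) + image[y0 + 1, x0 + 1] * fx
        return top * (1 - fy) + bottom * fy

    # undistorted = bilinear_sample(distorted_frame, src_y, src_x)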
In a further embodiment of the present invention, the image detector has pixels including different types of subpixels, each type being sensitive to a predetermined range of colors to detect different color components of the projected image, respectively. Therein, the computing unit is adapted to compute data from different types of subpixels differently. In other words, the detector is a color detector wherein each pixel comprises subpixels sensitive to a different color. For example, an R-subpixel sensitive to red, a B-subpixel sensitive to blue and a G-subpixel sensitive to green are provided. For such a color detector, it is convenient to calculate the undistorted image by taking into account that the distortion of the projection system is usually different for different colors, an effect which is commonly known to generate so-called chromatic aberrations.
Therefore, in one further embodiment of the present invention, the computing unit is adapted to correct image errors due to chromatic aberrations.
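One hedged way of folding this into the same correction step is to keep one coordinate table per color channel; the per-channel scale factors below are invented purely to illustrate lateral chromatic aberration and are not values from the patent.

    # Assumed per-channel radial scale factors: red and blue are warped
    # slightly differently from green by the projection system.
    CHANNEL_SCALE = {"R": 1.002, "G": 1.000, "B": 0.998}

    def per_channel_tables(base_src_y, base_src_x, center):
        # Derive one source-coordinate table per color channel from a base
        # (green) table by scaling radially about the image center.
        cy, cx = center
        return {name: (cy + (base_src_y - cy) * s, cx + (base_src_x - cx) * s)
                for name, s in CHANNEL_SCALE.items()}

    # Each subpixel type is then resampled with its own table, so distortion
    # and lateral color error can be corrected in a single pass.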
According to a further embodiment of the present invention, the optical projection system is adapted for projecting in a distorted way such that the distortion of the projected image is separable in an x-direction and in a y-direction perpendicular to the x-direction. Accordingly, the computing unit can include a transformation algorithm for computationally correcting the distortion of the detected image separably in an x-direction and in a y-direction perpendicular to the x-direction. Such a separable transformation can simplify and accelerate the process of correcting the distortion in the captured image.
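A separable distortion can be undone as two one-dimensional resampling passes, one per axis; the 1-D mapping below is a placeholder chosen only so that the center is expanded and the border compressed, and nearest-neighbour resampling keeps the sketch short.

    import numpy as np

    def axis_map(u):
        # Placeholder 1-D forward mapping on normalized coordinates in [-1, 1].
        return np.clip(1.3 * u - 0.3 * u ** 3, -1.0, 1.0)

    def correct_separable(image):
        # Undo a separable distortion with one column pass and one row pass.
        h, w = image.shape
        u_x = np.linspace(-1.0, 1.0, w)
        u_y = np.linspace(-1.0, 1.0, h)
        src_x = np.clip(np.round((axis_map(u_x) + 1) / 2 * (w - 1)).astype(int), 0, w - 1)
        src_y = np.clip(np.round((axis_map(u_y) + 1) / 2 * (h - 1)).astype(int), 0, h - 1)
        return image[:, src_x][src_y, :]

    corrected = correct_separable(np.zeros((480, 640)))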
According to a further embodiment of the present invention, while computing an undistorted image, the computing unit includes an algorithm to computationally compress data corresponding to the center region of the projected image and not to compress data corresponding to the border region of the projected image. In other words, while computationally reversing the distortion of the projected image, it is taken into account that the border region of the projected area has already been more compressed optically by the projection system than the center region, which was magnified. Accordingly, in order to obtain an undistorted image having the same magnification across the entire image area, it is sufficient to computationally compress the center region to a degree to which the border has already been compressed optically.
According to a further embodiment of the present invention, the computing unit is adapted to crop and compute a zoomed, undistorted partial image from the center region of the projected image. For this purpose, it takes advantage of the fact that the projected image acquired by the detector has a higher resolution at its center than at its border region. For normal pictures of the entire field of view, the center region is compressed computationally. However, if a zoomed partial image of a part of the image close to the center is to be taken, this can be done by simply cropping the partial image and compressing it less or not compressing it at all depending on the desired zoom and the degree of distortion of the partial image. In other words, with respect to a non-zoomed image, the image is expanded and cropped so that all pixel information will be used to describe the zoomed image.
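A sketch of such a zoom path is given below; the crop size is derived from the same kind of placeholder forward mapping as in the earlier sketches, and the concrete numbers (zoom factor, frame size) are arbitrary examples rather than figures from the patent.

    import numpy as np

    def forward_map(u):
        # Placeholder per-axis forward distortion on [0, 1]: the central part
        # of the field of view occupies a larger fraction of the sensor.
        return np.clip(1.4 * u - 0.4 * u ** 3, 0.0, 1.0)

    def zoomed_center_crop(distorted, zoom=2.0, out_size=(480, 640)):
        # Cut out the central 1/zoom part of the field of view. Because that
        # part was optically expanded, the crop already covers most of the
        # output resolution and needs only mild resampling, not digital-zoom
        # style interpolation of invented detail.
        h, w = distorted.shape
        frac = forward_map(1.0 / zoom)        # sensor fraction covered by the crop
        ch, cw = int(h * frac), int(w * frac)
        y0, x0 = (h - ch) // 2, (w - cw) // 2
        crop = distorted[y0:y0 + ch, x0:x0 + cw]
        # Resample the crop to the output size (nearest neighbour for brevity;
        # in practice the residual distortion inside the crop is corrected too).
        ys = (np.arange(out_size[0]) * ch // out_size[0]).clip(0, ch - 1)
        xs = (np.arange(out_size[1]) * cw // out_size[1]).clip(0, cw - 1)
        return crop[np.ix_(ys, xs)]

    zoomed = zoomed_center_crop(np.zeros((480, 640)), zoom=2.0)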
Hence, a great advantage of the image capturing device of the present invention is that zoomed partial images at the center of the projected object can be obtained without losing resolution. In contrast to conventional fixed focus devices, the zoomed image is not generated by expanding the original image by computational interpolation known as "digital zoom". Instead, the original distorted image is simply compressed less while computationally correcting its distortion. Therefore, no virtual image information is produced by interpolation but real information "hidden" in the high resolution distorted image is used for generating the zoomed partial picture.
According to a further embodiment of the present invention, the computing unit is adapted to perform the manipulation of electronic information obtained from the image detector separately for separate information packages, each package corresponding to a portion of the projected image. In other words, the information provided by the image detector is divided into several packages. The computing unit then does not manipulate the entire information provided by the image detector in one single step but manipulates one package after the other. Between the manipulation of two packages there can be an interruption which can e.g. be used for storing the data corresponding to the package of information of the undistorted picture obtained in the previous manipulation. Using such "pipeline processing" can make the process of manipulating and storing the data of the undistorted image more flexible.
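A toy version of this package-by-package processing, in which the frame is handled in horizontal strips and each corrected strip can be stored before the next one is touched (strip height and the processing step are arbitrary stand-ins):

    import numpy as np

    def process_in_strips(distorted, strip_height=32, correct=lambda s: s):
        # Yield distortion-corrected strips one at a time so that each package
        # can be stored (or transmitted) before the next one is processed.
        h = distorted.shape[0]
        for y0 in range(0, h, strip_height):
            strip = distorted[y0:y0 + strip_height]
            yield y0, correct(strip)   # 'correct' stands in for the real algorithm

    frame = np.zeros((480, 640))
    for y0, corrected_strip in process_in_strips(frame):
        pass  # e.g. write corrected_strip to memory starting at row y0

In practice each strip would need a small overlap with its neighbours, because the distortion correction fetches source pixels from nearby rows; the sketch ignores this for brevity.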
According to a further embodiment of the present invention, the optical projection system includes at least one lens formed by injection molding. Such a lens consisting e.g. of a resin material is much cheaper than lenses made of polished glass. However, such a low cost lens normally provides images of reduced quality due to aberrations and/or lens errors resulting from production tolerances. In the image capturing device of the present invention, the reduced image quality can be corrected using the computing unit. The image errors generated by the low cost lens are known a priori, e.g. by measuring or by computational simulation. As the projected image is manipulated anyway by the computing unit while correcting its distortion, these image errors can be accounted for in the same processing step without causing additional complexity or costs. I.e. image quality loss due to the low cost lens can be compensated with the image capturing device's computing unit.
According to a further embodiment of the present invention, the pixels of the image detector have uniform size over the entire detector surface. Therefore, conventional image detectors can be used. For example, for obtaining improved resolution and zoom capability, an existing image capturing system can simply be retrofitted with an adapted projection system and a computing unit according to the present invention wherein the detector need not be replaced. Accordingly, costs can be saved when retrofitting or replacing an existing image capturing device as a standard detector can be used for embodying the present invention.
According to a further embodiment, an image capturing device of the present invention has a volume of less than 500 mm³, preferably less than 200 mm³ and more preferably less than 100 mm³. Having such a small volume, it can be easily incorporated in portable devices such as mobile phones, digital cameras or laptops.
According to a further embodiment of the present invention, the optical projection system comprises less than four lenses, preferably less than three lenses and more preferably only one lens. Although a reduced number of lenses induces more aberrations, these aberrations can be accounted for while computationally processing the electronic image data.
According to a further embodiment of the present invention, the optical projection system has a fixed focal length. That means that there are no moving parts included. Accordingly, the costs of the projection system are reduced and the risk of failure of a mechanical moving mechanism is eliminated.
According to a further embodiment of the present invention, the image capturing system further includes a storing unit, wherein electronic information obtained from the image detector corresponding to the projected distorted image is stored in the storing unit. By storing the information corresponding to the projected distorted image, the information can be protected against unauthorized access to a certain degree. The information cannot be accessed without knowing the algorithm for computing the undistorted picture. Therefore, an unauthorized person reading the "encrypted" information cannot easily print or view the images.
For example, in a single use camera the information can be stored in the "encrypted" format. Such a single use camera normally does not include its own computing unit. Instead, the stored information is processed externally in order to reproduce the undistorted images. Such processing can only be performed by a person knowing the "key", i.e. the algorithm for reversing the distortion introduced by the image projection system.
According to a further embodiment of the present invention, the optical projection system is adapted for projecting an object within a field of view with a distortion of radial symmetry onto the detecting surface.
According to a further embodiment of the present invention, the optical projection system is adapted such that the field of view projected onto the detecting surface of the detector and the detecting surface of the detector have the same shape. For example, if the detector has a rectangular detecting surface with a given aspect ratio (height/width), then the field of view projected onto the detecting surface preferably is rectangular as well, having the same aspect ratio.
Alternatively, the optical projection system can be adapted for projecting an object within a field of view with a distortion such that the geometry of the projected image corresponds to the geometry of the electronic image detector. E.g., in the case of an image detector of a rectangular shape, the optical projection system can be adapted to distort the projected image in such a way that it matches the shape of the detector. In such a case, the field of view of the image capturing system can preferably have a rectangular geometry, but it is also possible to have a field of view of any other geometry.
According to further aspects of the present invention, there is provided a portable electronic device such as a mobile phone, a webcam or a portable computer including an image capturing device as described hereinbefore. In such applications, the computing unit of the inventive image capturing device can be implemented in the processor provided in such portable devices. Alternatively, it can be embedded into a microchip including the sensor of the image capturing device.
Brief description of the drawings
Further details and advantages of the present invention will appear to those skilled in the art from the following description of preferred embodiments thereof in conjunction with the appended drawings, wherein:
Fig. 1 schematically shows an embodiment of an image capturing device according to the present invention;
Figs. 2a and 2b show an example of a rectangular distortion pattern that is separable in X & Y coordinates and can be used in an embodiment of the present invention;
Figs. 3a and 3b show an example of a distortion pattern with circular symmetry that can be used in an embodiment of the present invention;
Figs. 4a and 4b show curves of a separable X-Y transformation similar to the one presented in Fig. 2b, which can be used for designing the optical projection system and for programming the computing unit of the image capturing device according to an embodiment of the present invention;
Fig. 5 shows a schematic illustration representing the PSF at a border region and at the centre of the image for a conventional image capturing system and an image capturing system according to an embodiment of the invention;
Fig. 6 shows a graph illustrating the inverse effective resolution versus the distance from the center of the object being captured for a conventional image capturing system compared with an image capturing system according to the present invention;
Fig. 7 shows an exemplary optical design of the present invention that provides image expansion at the center and image compression at the borders when compared with a standard imager.
Description of preferred embodiments
In Fig. 1, a schematic example of an image capturing device 1 of the present invention is shown, wherein light rays 3 coming from an object to be projected (on the left side in the figure, not shown) pass through an optical projection system 5 comprising a first lens 7, an aperture 9, a second lens 11 behind the aperture 9 and a third lens 13. Finally, the light rays 3 impinge onto the detecting surface 15 of a CCD or CMOS image detector 17.
The lenses 7, 11, 13 of the projection system 5 have surface geometries that are designed such that, while being projected onto the detector, the image of the object is distorted such that its center region is expanded whereas its border region is compressed, when compared to a non-distorted projected image of the same field of view. The lenses 7, 11, 13 are designed such that for small angles of incidence the structure of the three lenses resembles a telephoto design, i.e., the effective focal length is greater than the physical length of the lens module. For large angles of incidence, the lenses are designed to resemble a retro-photo system, i.e., the effective focal length is smaller than the physical length of the lens module.
The light of the object projected onto the detecting surface 15 is then captured by the sensor pixels of the detector 17 and is transformed into electrical signals. These signals are transmitted to a computing unit 19. The computing unit 19 is connected with a memory 21 and a display unit 23.
The signals coming from the detector 17 can be stored in the memory 21 and/or they can be directly displayed on the display unit 23. E.g., in video applications, the signals are normally both stored in a memory and displayed on a screen in real time. In the case where the data are to be stored, the processing of the data may be performed before or after storing the data, and the stored data correspond to either the distorted image captured by the detector or the processed undistorted image. Processing the signals can be performed by software or by dedicated hardware or off camera, e.g., for single use or multi use digital cameras. The signal processing can be performed using pipelined architectures or without using pipelined architectures.
When a picture of the stored image is to be displayed at a later point in time, the image data can be read from the memory. Before displaying the picture of the captured object, the image data have to be processed by the computing unit in order to reverse the distortion introduced by the projection system. Different signal processing schemes may be applied to different displays depending on the resolution of the display, which may not match the resolution of the detector.
It is to be noted that one skilled in the art, a priori knowing the distortion effected by the projection system, will be able to program the computing unit in such a way as to calculate the undistorted image.
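For illustration only, a minimal sketch of such a computation is given below. It assumes the known distortion is available as a function distort(x, y) mapping undistorted (ideal) pixel coordinates, relative to the image center, to the coordinates at which the projection system actually images that point; the names are illustrative, not part of the disclosure.

    import numpy as np

    def compute_undistorted(raw, distort):
        # raw:     2-D array captured by the detector (the distorted image)
        # distort: function (x, y) -> (xd, yd); forward distortion of the projection
        #          system, known a priori from design data or measurement
        h, w = raw.shape
        cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
        ys, xs = np.mgrid[0:h, 0:w]
        xd, yd = distort(xs - cx, ys - cy)     # where each output pixel was imaged
        xi = np.clip(np.round(xd + cx).astype(int), 0, w - 1)
        yi = np.clip(np.round(yd + cy).astype(int), 0, h - 1)
        return raw[yi, xi]                     # reverse the distortion by lookup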
Fig. 2a shows a rectangular pattern. In Fig. 2b, a representation of the pattern of Fig. 2a is shown as it is projected by the optical projection system of an embodiment of the present invention. The projection is distorted such that the pattern is expanded in a center region and compressed in a border region. In this specific example the transformation representing the distortion is separable in the horizontal and vertical axes.
Fig. 3a shows a pattern of circular symmetry having equidistant rings. In Fig. 3b, a representation of the pattern of Fig. 3a is shown as it is projected by the optical projection system of an embodiment of the present invention. The projection is distorted such that the pattern is expanded in a center region and compressed in a border region.
Figs. 4a and 4b show exemplary transformation functions of a separable transformation similar to the one used in Fig. 2b, which can be used for computing an undistorted image from the signals provided by the image sensor. When the projection distortion function provided by the optical projection system is known, by simulation or by measuring, such transformation functions can be derived as an inverse of the projection distortion function, e.g. by a polynomial approximation. Accordingly, from the position xa of a pixel in the projected distorted image, the position xna of this pixel in a non-distorted image can be calculated using the transfer function shown in Fig. 4a. The same applies for the y-coordinate, wherein the transfer function of Fig. 4b can be used.
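As an illustration of deriving such an inverse by polynomial approximation, one axis could be handled as follows; the sample distortion curve below is invented for the example and is not the distortion of the disclosed design.

    import numpy as np

    # Sampled forward distortion of the projection system for one axis, known a priori
    # from design data or measurement: ideal (non-distorted) coordinate -> distorted
    # coordinate. The curve below is purely illustrative: it expands the center and
    # compresses the border.
    x_ideal     = np.linspace(0.0, 1.0, 50)            # normalised field coordinate
    x_distorted = 1.4 * x_ideal - 0.4 * x_ideal**3

    # Fit the inverse transfer function, i.e. the curve giving the non-distorted
    # position xna as a function of the distorted position xa (cf. Figs. 4a and 4b).
    to_ideal = np.poly1d(np.polyfit(x_distorted, x_ideal, deg=5))

    xa  = 0.5                  # position of a pixel in the projected, distorted image
    xna = to_ideal(xa)         # corresponding position in the non-distorted image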
Using a separable transformation is therefore advantageous for implementing one-dimensional operators for performing the required processing, for storing the transformation function in a small one-dimensional array, and for fast processing algorithms. An X-Y separable coordinate distortion is of further advantage since it transforms a rectangular detecting array that captures a distorted image into a rectangular non-distorted image after applying the required processing for correcting the distorted image.
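A minimal sketch of this separable processing, assuming the two inverse transfer functions have already been tabulated as one-dimensional lookup tables lut_x and lut_y (these names are illustrative), is:

    import numpy as np

    def undistort_separable(raw, lut_x, lut_y):
        # lut_x[j]: source column in the raw (distorted) image for output column j
        # lut_y[i]: source row    in the raw (distorted) image for output row    i
        # Only these two small 1-D arrays have to be stored, and each axis is
        # processed with a one-dimensional operator.
        cols = np.clip(np.round(lut_x).astype(int), 0, raw.shape[1] - 1)
        rows = np.clip(np.round(lut_y).astype(int), 0, raw.shape[0] - 1)
        return raw[np.ix_(rows, cols)]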
Alternatively, when using an optical projection system with radially symmetric distortion (as shown e.g. in Fig. 3b), a transfer function can be used which depends only on the radial distance of a pixel from the center. Therein, polar coordinates can be used for calculating the coordinates of the non-distorted image.
Fig. 5 shows a schematic illustration representing the PSF at a border region and at the centre of the image for a conventional image capturing system (full line A) and an image capturing system according to an embodiment of the invention (broken line B). On the x-axis, the size of the pixels is represented schematically.
It can be seen that in the conventional system, the FWHM of the PSF is larger than the pixel size and oversampling occurs in the border region. In the center, the FWHM of the PSF is smaller than the pixel size such that optical information is wasted. In contrast hereto, the FWHM of the PSF for an image capturing system according to an embodiment of the invention can be comparable to the size of the pixels both in the center and in the border region. In the optimum case, no oversampling occurs and no optical information is wasted.
Summarizing, with respect to the present invention, the following is to be noted: In general, the optical geometrical distortion matches the optical resolution (limited by the characteristics of the optics and the aberrations associated with it) to the digital resolution introduced by the pixels of the digital detector array. The geometrical distortion is chosen in accordance with the desired maximum zoom value. Nonetheless, since the image is stretched at the centre of the image, it must be shrunk at the borders to maintain the same field of view. This shrinking effect can be implemented such that the image quality at the borders will be comparable to that of a standard image capturing device. This is possible due to the dependence of spatial resolution on the location in the image.
Fig. 6 schematically presents this phenomenon in an exemplary embodiment of the present invention: The y-axis shows the inverse "d" (in units of lines per mm) of the effective resolution of a standard device (solid line a) and of a capturing device according to the invention (solid line b), i.e., the inverse of the minimal distance at which two point sources can still be distinguished (rather than being seen as a single point source). The x-axis represents the distance at the object plane from the centre of the object to a point on the object plane. In this example, the effective focal length of the standard device is assumed to be 4mm, the F-number (F/#) is assumed to be 3, the field of view of the lens module is approximately +/-30° and the distance of the object from the lens module is assumed to be 400mm. The resolution curve shows the resolution at the object plane (i.e., 400mm away from the lens module). For example, in a 2Mpixel image capturing device one can assign values to the y-axis taking into account the following assumptions:
1. The diffraction limit is approximately 1.5μm (lambda × F/#);
2. The FWHM of the PSF (at the sensor plane) of a standard mass-produced triplet image capturing device can reach approximately 33% of the diffraction limit at the central part of the image, i.e., approximately 4.5μm;
3. There is another approximately 50% resolution degradation at the edges of the image, meaning that the resolution at the edges (at the sensor plane) is approximately 9.0μm.
This means that whenever pixels are smaller than 9.0μm there is over-sampling at the edges of the image when a standard image capturing device is employed. Furthermore, whenever pixels are larger than 4.5μm the CMOS sensor of a standard imager does not capture the entire image information at the image centre. For this reason, it is still preferable to use, for example, a 4.5μm CMOS sensor (or 2.2μm pixels that detect different colors).
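The arithmetic behind these numbers can be reproduced as follows; the wavelength of 0.5 μm is an assumption (a typical mid-visible value), since only the F-number is stated above.

    wavelength_um = 0.5                               # assumed mid-visible wavelength
    f_number      = 3.0

    diffraction_limit = wavelength_um * f_number      # ~1.5 um
    fwhm_center       = 3.0 * diffraction_limit       # resolution ~33% of the limit -> ~4.5 um
    fwhm_edge         = 2.0 * fwhm_center             # ~50% further degradation     -> ~9.0 um

    print(diffraction_limit, fwhm_center, fwhm_edge)  # 1.5 4.5 9.0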
Under these conditions, the effective inverse resolution d in units of lines per mm (measured at the object plane) of a standard imager (solid line a) and of the proposed imager according to the present invention (solid line b) is given in Fig. 6. Therein, the x-axis represents the distance x from the centre of the object to another point on the object, wherein for x=230mm the viewing angle is approximately 30°, which corresponds to the object's borders. It can be seen that in this embodiment the resolution of the inventive image capturing device is approximately double that of the conventional image capturing device at the centre of the image and equal to the resolution of the conventional device at its borders.
It is important to note that solid line b in Fig. 6 takes into account the nonlinear magnification curve of the proposed optical system and therefore provides high resolution at its centre. The width of the PSF of this embodiment is fairly constant across the entire capturing device and is equal to 4.5μm. This allows utilizing the sensor pixels more efficiently. The different magnification values result in different resolution values across the object.
It is to be noted that in the foregoing description the different magnification, resolution and the like have always been mentioned with respect to the border region or to the centre region of the projecting surface. However, as one skilled in the art will easily recognize, there is no abrupt transition between these two regions, but the mentioned parameters change continuously from the border to the centre.
Fig. 7 shows an exemplary optical design according to the invention that results in a radial distortion, providing image expansion in the center and image compression at the borders for a standard field of view of +/-30°. The spherical and aspherical coefficients and the apertures of all optical surfaces, along with the materials from which the lenses are made, are provided as follows:
SURFACE DATA

SURF   RADIUS        THICKNESS   MEDIUM     INDEX      V-NUMBER
0      INFINITE      INFINITE    AIR
1      2.18431 O     0.72193     BK10       1.49782    66.95   SCHOTT
2      -13.45090 O   0.60470     AIR
APS    INFINITE      0.90947     AIR
4      -2.51855 O    0.37974     GLM-NdVd   1.82364S   23.16
5      -5.69831 O    0.85729     AIR
6      4.07431 O     1.30406     PICKUP     1.49782P   66.95
7      3.27945 O     1.22632     AIR
8      INFINITE      0.00000     AIR
IMG    INFINITE
SPECIAL SURFACE DATA

SURFACE NO. 1 — CONIC+POWER-SERIES ASPHERE
G1 0.012681(R**2)   G3 -0.011721(R**4)   G6 -0.020081(R**6)   G10 1.000000E-11(R**8)
CONIC CONSTANT (CC) -1.692041   SEMI-MAJOR AXIS (b) -3.156326   SEMI-MINOR AXIS (a) -2.625717

SURFACE NO. 2 — CONIC+POWER-SERIES ASPHERE
G1 0.125679(R**2)   G3 -0.070979(R**4)   G6 0.001758(R**6)   G10 1.000000E-11(R**8)
CONIC CONSTANT (CC) 51.703471   SEMI-MAJOR AXIS (b) -0.255218   SEMI-MINOR AXIS (a) 1.852814

SURFACE NO. 4 — CONIC+POWER-SERIES ASPHERE
G1 0.214030(R**2)   G3 0.071936(R**4)   G6 -0.052975(R**6)   G10 1.000000E-11(R**8)
CONIC CONSTANT (CC) 4.121475   SEMI-MAJOR AXIS (b) -0.491762   SEMI-MINOR AXIS (a) 1.112892

SURFACE NO. 5 — CONIC+POWER-SERIES ASPHERE
G1 0.037959(R**2)   G3 0.057576(R**4)   G6 -0.043016(R**6)   G10 1.000000E-11(R**8)
CONIC CONSTANT (CC) 14.212248   SEMI-MAJOR AXIS (b) -0.374587   SEMI-MINOR AXIS (a) 1.460996

SURFACE NO. 6 — CONIC+POWER-SERIES ASPHERE
G1 -0.172504(R**2)   G3 -0.036871(R**4)   G6 0.005550(R**6)   G10 1.000000E-11(R**8)
CONIC CONSTANT (CC) -0.092289   SEMI-MAJOR AXIS (b) 4.488549   SEMI-MINOR AXIS (a) 4.276415

SURFACE NO. 7 — CONIC+POWER-SERIES ASPHERE
G1 0.040726(R**2)   G3 -0.048180(R**4)   G6 -0.004017(R**6)   G10 1.000000E-11(R**8)
CONIC CONSTANT (CC) -37.594416   SEMI-MAJOR AXIS (b) -0.089616   SEMI-MINOR AXIS (a) -0.542117
SURF   SEMI-APERTURE
1      1.3808
2      1.2536
3      0.5500
4      1.0715
5      1.3162
6      1.8825
7      2.2152
8      2.2005
9      2.2005
The foregoing description is only exemplary and shall not restrict the scope of the present invention as it is defined in the appended claims. Furthermore, the full disclosure included in US patents 5,909,312 and 6,343,307 B1, concerning inventions made in part by the same inventors as the present application, shall be incorporated herein by reference.

Claims

1. Image capturing device, comprising:
an electronic image detector having a detecting surface, and
an optical projection system for projecting an object within a field of view onto the detecting surface,
wherein,
the projection system is adapted to project the object in a distorted way such that the projected image is expanded in a center region of the field of view and is compressed in a border region of the field of view.
2. Image capturing device according to claim 1, wherein
the detecting surface of the image detector includes pixels of predetermined size, and
the projection system is adapted such that its point spread function in the border region of the field of view has a full width at half maximum being less than three times, preferably less than twice, the size of corresponding pixels of the image detector.
3. Image capturing device according to claim 2, wherein the projection system is adapted such that its point spread function in the border region of the field of view has a full width at half maximum essentially corresponding to the size of corresponding pixels of the image detector.
4. Image capturing device according to one of claims 1 to 3, wherein the projection system is adapted to magnify the center region of the projected image such that the optical magnification of the projected image in the center region of the field of view is more than two times, preferably more than three times, and more preferably more than four times the optical magnification of the projected image in the border region of the field of view.
5. Image capturing device according to one of claims 1 to 4, wherein the projection system is adapted such that a local magnification of any partial area of the field of view is selected such that the point spread function in such partial area has a full width at half maximum essentially corresponding to the size of corresponding pixels of the image detector onto which the partial area is projected.
6. Image capturing device according to one of claims 1 to 5, further comprising a computing unit for manipulating electronic information obtained from the image detector.
7. Image capturing device according to claim 6, wherein the image detector has pixels including different types of subpixels, each type being sensitive to a predetermined range of colors to detect different color components of the projected image, respectively, and wherein the computing unit is adapted to compute data from different types of subpixels differently.
8. Image capturing device according to claim 7, wherein the computing unit is adapted to correct image errors due to chromatic aberrations.
9. Image capturing device according to one of claims 6 to 8, wherein the computing unit is adapted to computationally correct the distortion of the detected image introduced by the optical projection system.
10. Image capturing device according to claim 9, wherein the optical projection system is adapted for projecting in a distorted way such that the distortion of the projected image is separable in an x-direction and in a y-direction perpendicular to the x-direction.
11. Image capturing device according to one of claims 6 to 10, wherein the computing unit includes an algorithm to computationally compress data corresponding to the center region of the projected image and not to compress data corresponding to the border region of the projected image.
12. Image capturing device according to one of claims 6 to 11, wherein the computing unit is adapted to crop and compute a zoomed, undistorted partial image from the center region of the projected image.
13. Image capturing device according to one of claims 6 to 12, wherein the computing unit is adapted to perform the manipulation of electronic information obtained from the image detector separately for separate information packages, each package corresponding to a portion of the projected image.
14. Image capturing device according to one of claims 1 to 13, wherein the optical projection system includes at least one lens made of plastic or glass formed by injection molding.
15. Image capturing device according to one of claims 1 to 14, wherein the pixels of the image detector have uniform size over the entire detector surface.
16. Image capturing device according to one of claims 1 to 15, having a volume of less than 1000 mm3, preferably less than 500 mm3, preferably less than 200 mm3 and more preferably less than 100 mm3.
17. Image capturing device according to one of claims 1 to 16, wherein the optical projection system comprises less than six lenses, preferably less than four lenses, preferably less than three lenses and more preferably only one lens.
18. Image capturing device according to one of claims 1 to 17, wherein the optical projection system has a fixed focal length.
19. Image capturing device according to one of claims 1 to 18, further including a storing unit, wherein electronic information obtained from the image detector corresponding to the projected distorted image is stored in the storing unit.
20. Image capturing device according to one of claims 1 to 19, wherein the optical projection system is adapted for projecting an object within a field of view with a distortion of radial symmetry onto the detecting surface.
21. Image capturing device according to one of claims 1 to 19, wherein the optical projection system is adapted such that the field of view projected onto the detecting surface of the detector and the detecting surface of the detector have the same shape.
22. Mobile phone including an image capturing device according to one of claims 1 to 21 incorporated therein.
23. Portable computer including an image capturing device according to one of claims 1 to 21 incorporated therein.
24. Webcam including an image capturing device according to one of claims 1 to 21 incorporated therein.
PCT/EP2006/002861 2006-03-29 2006-03-29 Image capturing device with improved image quality WO2007110097A1 (en)

Priority Applications (6)

Application Number Priority Date Filing Date Title
CN2006800540272A CN101438577B (en) 2006-03-29 2006-03-29 Image acquisition apparatus with improved image quality
US12/225,591 US8233073B2 (en) 2006-03-29 2006-03-29 Image capturing device with improved image quality
EP06723830A EP1999947B1 (en) 2006-03-29 2006-03-29 Image capturing device with improved image quality
JP2009501859A JP5241700B2 (en) 2006-03-29 2006-03-29 Imaging device with improved image quality
PCT/EP2006/002861 WO2007110097A1 (en) 2006-03-29 2006-03-29 Image capturing device with improved image quality
KR1020087026489A KR101388564B1 (en) 2006-03-29 2006-03-29 Image capturing device with improved image quality

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/EP2006/002861 WO2007110097A1 (en) 2006-03-29 2006-03-29 Image capturing device with improved image quality

Publications (1)

Publication Number Publication Date
WO2007110097A1 true WO2007110097A1 (en) 2007-10-04

Family

ID=36579609

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2006/002861 WO2007110097A1 (en) 2006-03-29 2006-03-29 Image capturing device with improved image quality

Country Status (6)

Country Link
US (1) US8233073B2 (en)
EP (1) EP1999947B1 (en)
JP (1) JP5241700B2 (en)
KR (1) KR101388564B1 (en)
CN (1) CN101438577B (en)
WO (1) WO2007110097A1 (en)

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2009064512A1 (en) * 2007-11-15 2009-05-22 Sony Ericsson Mobile Communications Ab System and method for generating a photograph with variable image quality
US7621647B1 (en) * 2006-06-23 2009-11-24 The Elumenati, Llc Optical projection system and method of use
US8542289B1 (en) 2008-02-08 2013-09-24 Google Inc. Mapping a two-dimensional image to a cylindrical surface using a tuned distortion curve
WO2014033099A2 (en) 2012-08-27 2014-03-06 Digital Optics Corporation Europe Limited Rearview imaging systems for vehicle
US8711245B2 (en) 2011-03-18 2014-04-29 Digitaloptics Corporation Europe Ltd. Methods and systems for flicker correction
WO2014072837A2 (en) 2012-06-07 2014-05-15 DigitalOptics Corporation Europe Limited Mems fast focus camera module
US9001268B2 (en) 2012-08-10 2015-04-07 Nan Chang O-Film Optoelectronics Technology Ltd Auto-focus camera module with flexible printed circuit extension
US9007520B2 (en) 2012-08-10 2015-04-14 Nanchang O-Film Optoelectronics Technology Ltd Camera module with EMI shield
US9525807B2 (en) 2010-12-01 2016-12-20 Nan Chang O-Film Optoelectronics Technology Ltd Three-pole tilt control system for camera module
US9817206B2 (en) 2012-03-10 2017-11-14 Digitaloptics Corporation MEMS auto focus miniature camera module with fixed and movable lens groups
US10101636B2 (en) 2012-12-31 2018-10-16 Digitaloptics Corporation Auto-focus camera module with MEMS capacitance estimator
WO2020184286A1 (en) * 2019-03-14 2020-09-17 Ricoh Company, Ltd. Imaging device, image capturing optical system, and movable apparatus

Families Citing this family (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080212895A1 (en) * 2007-01-09 2008-09-04 Lockheed Martin Corporation Image data processing techniques for highly undersampled images
JP5633218B2 (en) * 2010-07-13 2014-12-03 富士通株式会社 Image processing apparatus and image processing program
US10132925B2 (en) * 2010-09-15 2018-11-20 Ascentia Imaging, Inc. Imaging, fabrication and measurement systems and methods
JP2013083925A (en) * 2011-09-29 2013-05-09 Canon Inc Imaging apparatus and control method therefor
CN103018907A (en) * 2012-12-19 2013-04-03 虢登科 Display method and head-mounted display
CN108989649B (en) * 2013-08-01 2021-03-19 核心光电有限公司 Thin multi-aperture imaging system with auto-focus and method of use thereof
US9888181B2 (en) 2014-02-28 2018-02-06 Sharp Kabushiki Kaisha Camera module and image capturing apparatus with shake correction of image capturing lens or image sensor
US10438331B2 (en) * 2014-06-26 2019-10-08 Intel Corporation Distortion meshes against chromatic aberrations
CN105205806B (en) * 2015-08-19 2018-03-02 广东科杰机械自动化有限公司 A kind of precision compensation method based on machine vision
US9781350B2 (en) * 2015-09-28 2017-10-03 Qualcomm Incorporated Systems and methods for performing automatic zoom
KR20180001048A (en) * 2016-06-24 2018-01-04 삼성전기주식회사 Optical system
CN106358034A (en) * 2016-10-19 2017-01-25 深圳市麦极客图像技术有限公司 Device and equipment for recording and watching VR videos and VR video recording and playing system
CN109752852B (en) * 2018-12-12 2021-04-13 重庆爱奇艺智能科技有限公司 Display system for head-mounted equipment and design method thereof
CN110996082B (en) * 2019-12-17 2021-11-09 成都极米科技股份有限公司 Projection adjusting method and device, projector and readable storage medium

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5432404A (en) * 1993-12-10 1995-07-11 Hitachi, Ltd. Apparatus for detecting a geometric distortion of an image on a display device
US5909312A (en) 1996-10-02 1999-06-01 Ramot University Authority For Applied Research & Industrial Development Ltd. Phase-only filter for generating an arbitrary illumination pattern
US5940217A (en) 1998-05-06 1999-08-17 Intel Corporation Anti-aliasing diffractive aperture and optical system using the same
US6248988B1 (en) 1998-05-05 2001-06-19 Kla-Tencor Corporation Conventional and confocal multi-spot scanning optical microscope
WO2004063989A2 (en) 2003-01-16 2004-07-29 D-Blur Technologies Ltd. Camera with image enhancement functions
US20050276475A1 (en) * 2004-06-14 2005-12-15 Canon Kabushiki Kaisha Image processing device, image processing method and image processing program

Family Cites Families (25)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4554585A (en) 1983-08-12 1985-11-19 Rca Corporation Spatial prefilter for variable-resolution sampled imaging systems
FR2626130A1 (en) 1988-01-19 1989-07-21 Labo Electronique Physique TELEVISION SHOOTING CAMERA HAVING INCREASED RESOLUTION IN PART OF THE FIELD OF IMAGE
US4897722A (en) 1988-04-07 1990-01-30 General Electric Company Widescreen television transmission system utilizing conventional equipment including a camera and VCR
US5124840A (en) 1989-06-08 1992-06-23 Trumbull Donald E Portable viewing apparatus
US5175616A (en) 1989-08-04 1992-12-29 Her Majesty The Queen In Right Of Canada, As Represented By The Minister Of National Defence Of Canada Stereoscopic video-graphic coordinate specification system
JPH0411467A (en) 1990-04-28 1992-01-16 Olympus Optical Co Ltd Electronic image pickup device
US5673086A (en) 1990-10-05 1997-09-30 Canon Kabushiki Kaisha Image aspect ratio conversion processing apparatus
JPH05176212A (en) 1991-12-19 1993-07-13 Canon Inc Image pickup device
US5309241A (en) 1992-01-24 1994-05-03 Loral Fairchild Corp. System and method for using an anamorphic fiber optic taper to extend the application of solid-state image sensors
JP2759727B2 (en) 1992-04-22 1998-05-28 日本ビクター株式会社 Display device
GB2269955B (en) 1992-08-07 1996-02-14 British Broadcasting Corp Method of showing 16:9 pictures on 4:3 displays
US5905530A (en) 1992-08-24 1999-05-18 Canon Kabushiki Kaisha Image pickup apparatus
JPH0767025A (en) 1993-08-30 1995-03-10 Olympus Optical Co Ltd Video processor
US5696560A (en) 1994-07-25 1997-12-09 Magma, Inc. Motion picture distribution system
JPH08184759A (en) 1995-01-05 1996-07-16 Nikon Corp Zoom lens provided with anamorphic converter
JP2947726B2 (en) 1995-03-01 1999-09-13 鹿島建設株式会社 Image system for remote control support
JPH08307753A (en) 1995-05-09 1996-11-22 Minolta Co Ltd Video system consisting of camera and display device and display device
JP4643773B2 (en) * 1996-12-17 2011-03-02 テセラ・テクノロジーズ・ハンガリー・ケイエフティー Electronic zoom image input method
US6738057B1 (en) * 1998-12-22 2004-05-18 Micron Technology, Inc. Compensation for optical distortion at imaging plane
FR2826221B1 (en) 2001-05-11 2003-12-05 Immervision Internat Pte Ltd METHOD FOR OBTAINING AND DISPLAYING A VARIABLE RESOLUTION DIGITAL PANORAMIC IMAGE
US7098949B2 (en) 2002-07-29 2006-08-29 Hewlett-Packard Development Company, L.P. Apparatus and method for improved-resolution digital zoom in a portable electronic imaging device
US7227984B2 (en) 2003-03-03 2007-06-05 Kla-Tencor Technologies Corporation Method and apparatus for identifying defects in a substrate surface by using dithering to reconstruct under-sampled images
JP4197994B2 (en) 2003-06-19 2008-12-17 コニカミノルタオプト株式会社 Imaging device
US7502063B2 (en) 2004-08-09 2009-03-10 Aptina Imaging Corporation Camera with scalable resolution
JP4804194B2 (en) * 2006-03-30 2011-11-02 キヤノン株式会社 LENS DEVICE AND IMAGING DEVICE


Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7621647B1 (en) * 2006-06-23 2009-11-24 The Elumenati, Llc Optical projection system and method of use
US7959307B1 (en) 2006-06-23 2011-06-14 The Elumenati, Llc Optical projection system and method of use
WO2009064512A1 (en) * 2007-11-15 2009-05-22 Sony Ericsson Mobile Communications Ab System and method for generating a photograph with variable image quality
US8542289B1 (en) 2008-02-08 2013-09-24 Google Inc. Mapping a two-dimensional image to a cylindrical surface using a tuned distortion curve
US9525807B2 (en) 2010-12-01 2016-12-20 Nan Chang O-Film Optoelectronics Technology Ltd Three-pole tilt control system for camera module
US8711245B2 (en) 2011-03-18 2014-04-29 Digitaloptics Corporation Europe Ltd. Methods and systems for flicker correction
US9817206B2 (en) 2012-03-10 2017-11-14 Digitaloptics Corporation MEMS auto focus miniature camera module with fixed and movable lens groups
WO2014072837A2 (en) 2012-06-07 2014-05-15 DigitalOptics Corporation Europe Limited Mems fast focus camera module
US9001268B2 (en) 2012-08-10 2015-04-07 Nan Chang O-Film Optoelectronics Technology Ltd Auto-focus camera module with flexible printed circuit extension
US9007520B2 (en) 2012-08-10 2015-04-14 Nanchang O-Film Optoelectronics Technology Ltd Camera module with EMI shield
WO2014033099A2 (en) 2012-08-27 2014-03-06 Digital Optics Corporation Europe Limited Rearview imaging systems for vehicle
US10101636B2 (en) 2012-12-31 2018-10-16 Digitaloptics Corporation Auto-focus camera module with MEMS capacitance estimator
WO2020184286A1 (en) * 2019-03-14 2020-09-17 Ricoh Company, Ltd. Imaging device, image capturing optical system, and movable apparatus
US11595631B2 (en) 2019-03-14 2023-02-28 Ricoh Company, Ltd. Imaging device, image capturing optical system, and movable apparatus

Also Published As

Publication number Publication date
JP2009531723A (en) 2009-09-03
US8233073B2 (en) 2012-07-31
EP1999947A1 (en) 2008-12-10
KR101388564B1 (en) 2014-04-23
CN101438577B (en) 2013-03-27
CN101438577A (en) 2009-05-20
JP5241700B2 (en) 2013-07-17
US20090225171A1 (en) 2009-09-10
KR20090005103A (en) 2009-01-12
EP1999947B1 (en) 2012-08-29

Similar Documents

Publication Publication Date Title
US8233073B2 (en) Image capturing device with improved image quality
US8203644B2 (en) Imaging system with improved image quality and associated methods
JP4981124B2 (en) Improved plenoptic camera
US20120099005A1 (en) Methods and systems for reading an image sensor based on a trajectory
US8525914B2 (en) Imaging system with multi-state zoom and associated methods
US8400555B1 (en) Focused plenoptic camera employing microlenses with different focal lengths
US5739852A (en) Electronic imaging system and sensor for use therefor with a nonlinear distribution of imaging elements
US8466989B2 (en) Camera having image correction function, apparatus and image correction method
JP4434653B2 (en) Portable electronic imaging device with digital zoom function and method for providing digital zoom function
US20080002041A1 (en) Adaptive image acquisition system and method
KR20070004202A (en) Method for correcting lens distortion in digital camera
US20090185028A1 (en) Camera apparatus and image recording/reproducing method
JPH06165024A (en) Image pickup device, image reproducing device and video system
US9743007B2 (en) Lens module array, image sensing device and fusing method for digital zoomed images
JP2007049266A (en) Picture imaging apparatus
JP2020043430A (en) Imaging apparatus
JP3052160B2 (en) Solid-state imaging device
Cvetković et al. MULTIPLE SENSORS’LENSLETS FOR SECURE DOCUMENT SCANNERS
WO2018020424A1 (en) A method for image recording and an optical device for image registration
JP2002218296A (en) Image pickup device
WO2008122145A1 (en) Adaptive image acquisition system and method
KR20090078933A (en) Apparatus and method for providing digital zoom

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 06723830

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 3892/KOLNP/2008

Country of ref document: IN

WWE Wipo information: entry into national phase

Ref document number: 200680054027.2

Country of ref document: CN

Ref document number: 2006723830

Country of ref document: EP

WWE Wipo information: entry into national phase

Ref document number: 2009501859

Country of ref document: JP

NENP Non-entry into the national phase

Ref country code: DE

WWE Wipo information: entry into national phase

Ref document number: 1020087026489

Country of ref document: KR

WWE Wipo information: entry into national phase

Ref document number: 12225591

Country of ref document: US