
Publication number: US 20050041121 A1
Publication type: Application
Application number: US 10/919,226
Publication date: Feb 24, 2005
Filing date: Aug 16, 2004
Priority date: Oct 9, 1997
Also published as: EP1782618A1, US7738015, US7746385, US7787022, US7973828, US8203621, US8279301, US8493466, US8698916, US8885074, US20080211937, US20090027520, US20100295959, US20110080499, US20110134287, US20130033624, US20130044243, US20140198238, WO2006018056A1
Inventors: Eran Steinberg, Yury Prilutsky, Peter Corcoran, Petronel Bigioi, Michael DeLuca
Original Assignee: Eran Steinberg, Yury Prilutsky, Peter Corcoran, Petronel Bigioi, Deluca Michael J.
Red-eye filter method and apparatus
US 20050041121 A1
Abstract
A digital image acquisition system having no photographic film, such as a digital camera, has a flash unit for providing illumination during image capture and a red-eye filter for detecting a region within a captured image indicative of a red-eye phenomenon, the detection being based upon a comparison of the captured image and a reference image of nominally the same scene taken without flash. In the embodiment the reference image is a preview image of lower pixel resolution than the captured image, the filter matching the pixel resolutions of the captured and reference images by up-sampling the preview image and/or sub-sampling the captured image. The filter also aligns at least portions of the captured image and reference image prior to comparison to allow for, e.g. movement in the subject.
Claims(50)
1. A digital image acquisition system having no photographic film, comprising a portable apparatus for capturing digital images, a flash unit for providing illumination during image capture, and a red-eye filter for detecting a region within a captured image indicative of a red-eye phenomenon, said detection being based upon a comparison of said captured image and at least one reference eye color characteristic.
2. The system of claim 1, wherein said at least one reference eye color characteristic comprises an eye color characteristic of a reference image of nominally the same scene taken without flash.
3. A system according to claim 2, wherein the reference image is a preview image of lower pixel resolution than the captured image, the filter including programming code for matching the pixel resolutions of the captured and reference images by at least one of up-sampling the preview image and sub-sampling the captured image.
4. A system according to claim 3, the filter including further programming code for aligning at least portions of the captured image and reference image prior to said comparison.
5. A system according to claim 4, wherein the filter also includes a shape analyser so that the filter is configured to identify a region in the captured image having both a shape and color indicative of a red-eye phenomenon for subsequent comparison with the corresponding region in the reference image.
6. A system according to claim 4, wherein the filter also includes a shape analyser to determine subsequent to said comparison whether a region designated as indicative of a red-eye phenomenon has a shape indicative of a red-eye phenomenon.
7. A system according to claim 2, the filter further including programming code for aligning at least portions of the captured image and reference image prior to said comparison.
8. A system according to claim 7, wherein the filter also includes a shape analyser so that the filter is configured to identify a region in the captured image having both a shape and color indicative of a red-eye phenomenon for subsequent comparison with the corresponding region in the reference image.
9. A system according to claim 7, wherein the filter also includes a shape analyser to determine subsequent to said comparison whether a region designated as indicative of a red-eye phenomenon has a shape indicative of a red-eye phenomenon.
10. A system according to claim 2, wherein the filter detects said region indicative of a red-eye phenomenon by identifying a region in the captured image at least having a color indicative of a red-eye phenomenon and comparing said identified region with the corresponding region in the reference image, the filter further designating said region as indicative of a red-eye phenomenon if said corresponding region does not have a color indicative of a red-eye phenomenon.
11. A system according to claim 10, wherein the decision as to whether a region has a color indicative of a red-eye phenomenon is determined on a statistical basis as a global operation on the entire region.
12. A system according to claim 2, wherein the filter also includes a shape analyser so that the filter is configured to identify a region in the captured image having both a shape and color indicative of a red-eye phenomenon for subsequent comparison with the corresponding region in the reference image.
13. A system according to claim 2, wherein the filter also includes a shape analyser to determine subsequent to said comparison whether a region designated as indicative of a red-eye phenomenon has a shape indicative of a red-eye phenomenon.
14. A system according to claim 2, wherein said digital image acquisition system is a digital camera.
15. A system according to claim 2, wherein said digital image acquisition system is a combination of a digital camera and an external processing device.
16. A system as claimed in claim 15, wherein said red-eye filter is located in said external processing device.
17. A system according to claim 2, further including a pixel modifier for modifying the color of the pixels within a region indicative of a red-eye phenomenon.
18. A system according to claim 1, wherein the filter also includes a shape analyser so that the filter is configured to identify a region in the captured image having both a shape and color indicative of a red-eye phenomenon for subsequent comparison with a reference eye shape characteristic and the reference eye color characteristic, respectively.
19. A system according to claim 18, further including a pixel modifier for modifying the color of the pixels within a region indicative of a red-eye phenomenon.
20. A system according to claim 1, further including a pixel modifier for modifying the color of the pixels within a region indicative of a red-eye phenomenon.
21. A digital image acquisition system having no photographic film, comprising a portable apparatus for capturing digital images, a flash unit for providing illumination during image capture, and a red-eye filter for detecting red-eye phenomenon in a captured image based upon a comparison of said captured image and a reference image of nominally the same scene taken without flash, wherein the reference image is a preview image of lower pixel resolution than the captured image, the filter including means for matching the pixel resolutions of the captured and reference images by at least one of up-sampling the preview image and sub-sampling the captured image.
22. A system according to claim 21, the filter further including means for aligning at least portions of the captured image and reference image prior to said comparison.
23. A system according to claim 22, wherein the filter detects said red-eye phenomenon by identifying a region in the captured image having a color indicative of a red-eye phenomenon and comparing said identified region with the corresponding region in the reference image, the filter further designating said region as indicative of a red-eye phenomenon if said corresponding region does not have a color indicative of a red-eye phenomenon.
24. A system according to claim 21, wherein the decision as to whether a region has a color indicative of a red-eye phenomenon is determined on a statistical basis as a global operation on the entire region.
25. A system according to claim 21, wherein the filter detects said red-eye phenomenon by identifying a region in the captured image having a color indicative of a red-eye phenomenon and comparing said identified region with the corresponding region in the reference image, the filter further designating said region as indicative of a red-eye phenomenon if said corresponding region does not have a color indicative of a red-eye phenomenon.
26. A digital image acquisition system having no photographic film, comprising a portable apparatus for capturing digital images, a flash unit for providing illumination during image capture, and a red-eye filter for detecting red-eye phenomenon in a captured image based upon a comparison of said captured image and a reference image of nominally the same scene taken without flash, the filter further including means for aligning at least portions of the captured image and reference image prior to said comparison.
27. A system according to claim 26, wherein the filter detects said red-eye phenomenon by identifying a region in the captured image having a color indicative of a red-eye phenomenon and comparing said identified region with the corresponding region in the reference image, the filter further designating said region as indicative of a red-eye phenomenon if said corresponding region does not have a color indicative of a red-eye phenomenon.
28. A system according to claim 27, wherein the decision as to whether a region has a color indicative of a red-eye phenomenon is determined on a statistical basis as a global operation on the entire region.
29. A digital image acquisition system having no photographic film, comprising a portable apparatus for capturing digital images, a flash unit for providing illumination during image capture, and a red-eye filter for detecting red-eye phenomenon in a captured image based upon a comparison of said captured image and a reference image of nominally the same scene taken without flash, wherein the filter further comprising means for detecting said red-eye phenomenon by identifying a region in the captured image having a color indicative of a red-eye phenomenon and comparing said identified region with the corresponding region in the reference image, the filter further designating said region as indicative of a red-eye phenomenon if said corresponding region does not have a color indicative of a red-eye phenomenon.
30. A system according to claim 29, wherein the decision as to whether a region has a color indicative of a red-eye phenomenon is determined on a statistical basis as a global operation on the entire region.
31. A method of detecting a red-eye phenomenon within a digital image acquired by a digital image acquisition device having no photographic film, the device comprising a portable apparatus for capturing digital images, and a flash unit for providing illumination during image capture, the method comprising identifying a region within a captured image indicative of a red-eye phenomenon including comparing said captured image and at least one reference eye color characteristic.
32. The method of claim 31, wherein said at least one reference eye color characteristic comprises an eye color characteristic of a reference image of nominally the same scene taken without flash.
33. A method according to claim 32, wherein the reference image is a preview image of lower pixel resolution than the captured image, the method further comprising matching the pixel resolutions of the captured and reference images including up-sampling the preview image or sub-sampling the captured image, or a combination thereof.
34. A method according to claim 33, further comprising aligning at least portions of the captured image and reference image prior to said comparison.
35. A method according to claim 34, further comprising analysing a shape so that the identifying comprises identifying a region in the captured image having both a shape and color indicative of a red-eye phenomenon for subsequent comparison with the corresponding region in the reference image.
36. A method according to claim 34, further comprising analysing a shape to determine subsequent to said comparison whether a region designated as indicative of a red-eye phenomenon has a shape indicative of a red-eye phenomenon.
37. A method according to claim 32, further comprising aligning at least portions of the captured image and reference image prior to said comparison.
38. A method according to claim 37, further comprising analysing a shape so that the identifying comprises identifying a region in the captured image having both a shape and color indicative of a red-eye phenomenon for subsequent comparison with the corresponding region in the reference image.
39. A method according to claim 38, further comprising analysing a shape to determine subsequent to said comparison whether a region designated as indicative of a red-eye phenomenon has a shape indicative of a red-eye phenomenon.
40. A method according to claim 32, further comprising detecting said region indicative of a red-eye phenomenon by identifying a region in the captured image at least having a color indicative of a red-eye phenomenon and comparing said identified region with the corresponding region in the reference image, and designating said region as indicative of a red-eye phenomenon if said corresponding region does not have a color indicative of a red-eye phenomenon.
41. A method according to claim 40, further comprising deciding whether a region has a color indicative of a red-eye phenomenon by determining on a statistical basis as a global operation on the entire region.
42. A method according to claim 32, further comprising analysing a shape so that the identifying comprises identifying a region in the captured image having both a shape and color indicative of a red-eye phenomenon for subsequent comparison with the corresponding region in the reference image.
43. A method according to claim 32, further comprising analysing a shape to determine subsequent to said comparison whether a region designated as indicative of a red-eye phenomenon has a shape indicative of a red-eye phenomenon.
44. A method according to claim 32, wherein said digital image acquisition system is a digital camera.
45. A method according to claim 32, wherein said digital image acquisition system is a combination of a digital camera and an external processing device.
46. A method as claimed in claim 45, wherein said red-eye filter is located in said external processing device.
47. A method according to claim 32, further comprising modifying the color of the pixels within a region indicative of a red-eye phenomenon.
48. A method according to claim 31, further comprising analysing a shape, so that the identifying comprises identifying a region in the captured image having both a shape and color indicative of a red-eye phenomenon for subsequent comparison with a reference eye shape characteristic and the reference eye color characteristic, respectively.
49. A method according to claim 48, further comprising modifying the color of the pixels within a region indicative of a red-eye phenomenon.
50. A method according to claim 31, further comprising modifying the color of the pixels within a region indicative of a red-eye phenomenon.
Description
    PRIORITY AND RELATED APPLICATIONS
  • [0001]
    This application is a continuation-in-part (CIP) application which claims the benefit of priority to U.S. patent application Ser. No. 10/772,767, filed Feb. 4, 2004, which is a CIP that claims the benefit of priority to U.S. patent application Ser. No. 10/635,862, which is a CIP that claims the benefit of priority to U.S. patent application Ser. No. 10/170,511, filed Jun. 12, 2002, which is a continuation of U.S. patent application Ser. No. 08/947,603, filed Oct. 9, 1997, now U.S. Pat. No. 6,407,777, issued Jun. 18, 2002, each of which is hereby incorporated by reference. This application is also related to U.S. patent application Ser. Nos. 10/635,918, filed Aug. 5, 2003, and 10/773,092, filed Feb. 4, 2004.
  • FIELD OF THE INVENTION
  • [0002]
    The invention relates generally to the area of digital photography, and more specifically to filtering “red-eye” artefacts from a flash-induced digital camera image.
  • BACKGROUND OF THE INVENTION
  • [0003]
    “Red-eye” is a phenomenon in flash photography where a flash is reflected within a subject's eye and appears in a photograph as a red dot where the black pupil of the subject's eye would normally appear. The unnatural glowing red of an eye is due to internal reflections from the vascular membrane behind the retina, which is rich in blood vessels. This objectionable phenomenon is well understood to be caused in part by a small angle between the flash of the camera and the lens of the camera. This angle has decreased with the miniaturization of cameras with integral flash capabilities. Additional contributors include the relative closeness of the subject to the camera and ambient light levels.
  • [0004]
    The red-eye phenomenon can be minimized by causing the iris to reduce the opening of the pupil. This is typically done with a “pre-flash”, a flash or illumination of light shortly before a flash photograph is taken, which causes the iris to close. Unfortunately, the pre-flash occurs an objectionable 0.2 to 0.6 seconds before the flash photograph. This delay is readily discernible and easily within the reaction time of a human subject. Consequently, the subject may believe the pre-flash is the actual photograph and be in a less than desirable position at the time of the actual photograph. Alternatively, the subject must be informed of the pre-flash, typically losing any spontaneity of the subject captured in the photograph.
  • [0005]
    Those familiar with the art have developed complex analysis processes which operate within a camera prior to invoking a pre-flash. Various conditions are monitored before the pre-flash is generated; these include the ambient light level and the distance of the subject from the camera. Such a system is described in U.S. Pat. No. 5,070,355, which is hereby incorporated by reference. Although that invention minimizes the occasions on which a pre-flash is used, it does not eliminate the need for a pre-flash.
  • [0006]
    Digital cameras are becoming more popular and smaller in size. Digital cameras have several advantages over film cameras. Digital cameras eliminate the need for film, as the image is digitally captured and stored in a memory array for display on a display screen on the camera itself. This allows photographs to be viewed and enjoyed virtually instantaneously as opposed to waiting for film processing. Furthermore, the digitally captured image may be downloaded to another display device such as a personal computer or color printer for further enhanced viewing. Digital cameras include microprocessors for image processing, compression and camera system control. It is possible to exploit the computational capabilities of such microprocessors to improve red-eye detection and elimination. Thus, what are needed are better tools for eliminating the red-eye phenomenon within, for example, a digital camera having a flash unit, without the distraction of a pre-flash.
  • [0007]
    U.S. Patent Application 2002/0150306 (Baron), which is hereby incorporated by reference, describes a method for the removal of flash artefacts by capturing two digital images of a subject, one with flash and one without flash, and subtracting one image from the other to provide an artefact image which is then thresholded and subtracted from the flash image. However, the technique is directed to flash artefacts in general, and not specifically to red-eye removal. There is no attempt to identify red-eye regions as compared to any other flash-induced artefacts. Indeed, there is no attempt to identify particular regions at all, since the technique is simply one of subtraction and thresholding.
  • BRIEF SUMMARY OF THE INVENTION
  • [0008]
    In accordance with the present invention there is provided a digital image acquisition system having no photographic film, comprising a portable apparatus for capturing digital images, a flash unit for providing illumination during image capture, and a red-eye filter for detecting a region within a captured image indicative of a red-eye phenomenon, said detection being based upon a comparison of said captured image and at least one reference eye color characteristic.
  • [0009]
    According to one embodiment of the invention, the at least one reference eye color characteristic comprises an eye color characteristic of a reference image of nominally the same scene taken without flash.
  • [0010]
    According to this embodiment, the reference image may be a preview image of lower pixel resolution than the captured image, and the filter may include programming code for matching the pixel resolutions of the captured and reference images by up-sampling the preview image and/or sub-sampling the captured image.
  • [0011]
    To allow for inadvertent movement of the subject between the taking of the two images, the filter preferably further includes programming code for aligning at least portions of the captured image and reference image prior to said comparison.
  • [0012]
    In the embodiment the filter detects said region indicative of a red-eye phenomenon by identifying a region in the captured image at least having a color indicative of a red-eye phenomenon and comparing said identified region with the corresponding region in the reference image. The filter may further designate the region as indicative of a red-eye phenomenon if said corresponding region does not have a color indicative of a red-eye phenomenon. The decision as to whether a region has a color indicative of a red-eye phenomenon may be determined on a statistical basis as a global operation on the entire region.
  • [0013]
    According to a further embodiment, the filter includes a shape analyser so that the filter is configured to identify a region in the captured image having both a shape and color indicative of a red-eye phenomenon for subsequent comparison with a reference eye shape characteristic and the reference eye color characteristic, respectively.
  • [0014]
    In another embodiment, a pixel modifier is included for modifying the color of the pixels within a region indicative of a red-eye phenomenon.
  • [0015]
    A method in accordance with the present invention is also provided for detecting a red-eye phenomenon within a digital image acquired by a digital image acquisition device having no photographic film. The device includes a portable apparatus for capturing digital images, and a flash unit for providing illumination during image capture. The method includes identifying a region within a captured image indicative of a red-eye phenomenon including comparing the captured image and at least one reference eye color characteristic.
  • [0016]
    According to one embodiment, the at least one reference eye color characteristic includes an eye color characteristic of a reference image of nominally the same scene taken without flash.
  • [0017]
    The reference image may be a preview image of lower pixel resolution than the captured image. The method may further include matching the pixel resolutions of the captured and reference images including up-sampling the preview image or sub-sampling the captured image, or a combination thereof.
  • [0018]
    The method may further include aligning at least portions of the captured image and reference image prior to said comparison.
  • [0019]
    The method may further include analysing a shape so that the identifying comprises identifying a region in the captured image having both a shape and color indicative of a red-eye phenomenon for subsequent comparison with the corresponding region in the reference image. A shape may be analysed to determine subsequent to the comparison whether a region designated as indicative of a red-eye phenomenon has a shape indicative of a red-eye phenomenon.
  • [0020]
    The method may also include detecting a region indicative of a red-eye phenomenon by identifying a region in the captured image at least having a color indicative of a red-eye phenomenon and comparing the identified region with the corresponding region in the reference image, and designating the region as indicative of a red-eye phenomenon if the corresponding region does not have a color indicative of a red-eye phenomenon.
  • [0021]
    The method may also include deciding whether a region has a color indicative of a red-eye phenomenon by determining on a statistical basis as a global operation on the entire region.
  • [0022]
    The method may also include modifying the color of the pixels within a region indicative of a red-eye phenomenon.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • [0023]
    FIG. 1 is a block diagram of a camera apparatus operating in accordance with an embodiment of the present invention.
  • [0024]
    FIG. 2 illustrates the workflow of the initial stage of a red-eye filter using preview data according to the embodiment.
  • [0025]
    FIGS. 3-a to 3-d illustrate the pixelation process of an image at different resolutions.
  • [0026]
    FIG. 3-e is an enlargement of a hypothetical digitization of an eye in an image.
  • [0027]
    FIG. 4 illustrates the pixel differences between a red-eye image and a non-red-eye image.
  • [0028]
    FIGS. 5-a to 5-d and 6-a and 6-b illustrate the detailed workflow of the red-eye filter according to the embodiment, and alternatives.
  • DESCRIPTION OF A PREFERRED EMBODIMENT
  • [0029]
    FIG. 1 shows a block diagram of an image acquisition system, such as a digital camera apparatus, operating in accordance with the present invention. The digital acquisition device, also generically referred to in this application as a camera 20, includes a processor 120. It can be appreciated that many of the processes implemented in the digital camera may be implemented in or controlled by software operating in a microprocessor (μProc), central processing unit (CPU), controller, digital signal processor (DSP) and/or an application specific integrated circuit (ASIC), collectively depicted as block 120 and termed the “processor”. Generically, all user interface and control of peripheral components such as buttons and display is handled by a μ-controller 122. The processor 120, in response to a user input at 122, such as half pressing the shutter button (pre-capture mode 32), initiates and controls the digital photographic process. Ambient light exposure is determined using light sensor 40 in order to automatically determine if a flash is to be used. The distance to the subject is determined using focusing means 50, which also focuses the image on image capture means 60. If a flash is to be used, processor 120 causes the flash means 70 to generate a photographic flash in substantial coincidence with the recording of the image by image capture means 60 upon full depression of the shutter button. The image capture means 60 digitally records the image in color. The image capture means is known to those familiar with the art and may include a CCD (charge coupled device) or CMOS sensor to facilitate digital recording. The flash may be selectively generated either in response to the light sensor 40 or in response to a manual input 72 from the user of the camera. The image recorded by image capture means 60 is stored in image store means 80, which may comprise computer memory such as dynamic random access memory or a non-volatile memory. The camera is equipped with a display 100, such as an LCD, for preview and post-view of images. In the case of preview images, which are generated in the pre-capture mode 32, the display 100 can assist the user in composing the image, as well as being used to determine focusing and exposure. In the case of post-view, the image display can assist the user in viewing suspected red-eye regions and in deciding manually whether a region should be corrected after viewing it. A temporary storage space 82 is used to store one or a plurality of the preview images and may be part of the image store means 80 or a separate component. The preview image is usually generated by the same image capture means 60 and, for speed and memory efficiency, may be generated by subsampling the image 124 using software, which can be part of the general processor 120 or dedicated hardware, before displaying 100 or storing 82 the preview image. Depending on the settings of this hardware subsystem, the pre-acquisition image processing may require that some predetermined criteria be satisfied prior to storing a preview image. Such criteria may be chronological, such as saving an image every 0.5 seconds; more sophisticated criteria may include analysis of the image for changes, or the detection of faces in the image. A straightforward preferred embodiment is to constantly replace the previously saved preview image with a newly captured preview image during the pre-capture mode 32, until the final full resolution image is captured by full depression of the shutter button.
  • [0030]
    The red-eye filter 90 can be integral to the camera 20 or part of an external processing device 10 such as a desktop computer, a hand-held device, a cell phone handset or a server. In this embodiment, the filter receives the captured image from the full resolution image storage 80 as well as one or a plurality of preview images from the temporary storage 82. The filter 90 analyzes the stored image for characteristics of red-eye and, if found, modifies the image to remove the red-eye phenomenon, as will be described in more detail below. The red-eye filter includes a pixel locator 92 for locating pixels having a color indicative of red-eye; a shape analyzer 94 for determining if a grouping of at least a portion of the pixels located by the pixel locator comprises a shape indicative of red-eye; a falsing analyzer 96 for processing the image around the grouping for details indicative of an image of an eye; and a pixel modifier 98 for modifying the color of pixels within the grouping. The modified image may be displayed on image display 100, saved on persistent storage 112, which can be internal or removable storage such as a CF card, SD card or the like, or downloaded to another device, such as a personal computer, server or printer, via image output means 110, which can be tethered or wireless.
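By way of illustration only, and not as part of the original disclosure, the four-stage structure of the red-eye filter 90 (pixel locator 92, shape analyzer 94, falsing analyzer 96, pixel modifier 98) could be organized as a simple pipeline along the following lines. The class and parameter names are hypothetical, and images are assumed to be held as NumPy RGB arrays.

```python
# Hypothetical sketch of the four-stage red-eye filter pipeline (blocks 92-98).
# Names are illustrative; images are assumed to be NumPy uint8 RGB arrays.

class RedEyeFilter:
    def __init__(self, locate, analyze_shape, is_false_positive, modify):
        self.locate = locate                        # block 92: pixels with a red-eye color
        self.analyze_shape = analyze_shape          # block 94: groupings with an eye-like shape
        self.is_false_positive = is_false_positive  # block 96: falsing analysis (e.g. preview comparison)
        self.modify = modify                        # block 98: recolor the confirmed regions

    def apply(self, captured, preview=None):
        candidate_mask = self.locate(captured)
        regions = self.analyze_shape(candidate_mask)
        confirmed = [r for r in regions
                     if not self.is_false_positive(r, captured, preview)]
        return self.modify(captured.copy(), confirmed)
```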
  • [0031]
    In a further embodiment, where the red-eye filter 90 is located in an external application on a separate device 10, such as a desktop computer, the final captured image stored in block 80, along with a representation of the preview image temporarily stored in 82, may be stored prior to modification on the storage device 112, or transferred together via the image output means 110 to the external device 10, to be processed later by the red-eye filter 90.
  • [0032]
    FIG. 2 details the initial stage of the workflow of this embodiment. It will be understood that both this initial stage and the subsequent red-eye correction stage (see FIGS. 5-a to 5-d) will typically be performed by software in the camera and/or the separate device 10. A preview image (normally of lower resolution than the final image) is generated while the camera is in the pre-capture mode 32, such as when the user half presses the shutter button. While in this mode, shown in FIG. 2 as the preview mode 210, the camera constantly captures the preview images 220. The capture interval is usually semi-real time, meaning fractions of a tenth of a second or less. The camera saves each new preview image if it satisfies some test criteria, 222. If not, the camera continues, 211, to capture the next preview image without saving the current one. The process continues until the final full resolution image is acquired 280 and saved 282 by fully depressing the shutter button.
  • [0033]
    In a simple embodiment, if the test criteria are met, or if no test criteria exist, the system will constantly replace the previously saved preview image with the new preview image, 230. Alternatively, where multiple preview images can be saved, 240, the new image will be placed on a chronological FIFO (First In First Out) stack, where the system continuously captures and saves new preview images 244 while each time clearing the oldest image 242 from the stack, until the user takes the final picture. The reason for storing multiple preview images is that the last image, or any single image, may not be the best reference image for comparison with the final full resolution image in the red-eye correction process. By storing multiple images, a better reference image can be obtained, and a closer alignment between the preview and the final captured image can be achieved. This concept is discussed further in FIGS. 5-a to 5-c, in the alignment stage 540. Other reasons for capturing multiple images are that a single image may be blurred due to motion, the subject may have had their eyes closed, the exposure may not have been set correctly, and so on. In yet another embodiment, the multiple images may assist in creating a single higher quality reference image, either of higher resolution or created by taking different portions of different regions from the multiple images. This concept of sub-pixel resolution may be combined with the upsampling process described in FIGS. 5-a to 5-c, block 534.
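As a rough illustration of the single-slot versus multi-image preview storage just described, the sketch below keeps at most a fixed number of preview frames, discarding the oldest when a new one arrives. The class name, the default criteria callback and the buffer depth are all assumptions, not part of the original disclosure.

```python
# Hypothetical sketch of preview-image buffering (blocks 230 and 240-244):
# keep at most `depth` preview frames; when full, drop the oldest one.
from collections import deque


class PreviewBuffer:
    def __init__(self, depth=1):
        # depth=1 reproduces the simple "replace the previously saved preview" behaviour (230);
        # depth>1 gives the chronological multi-image store (240-244).
        self.frames = deque(maxlen=depth)

    def offer(self, frame, satisfies_criteria=lambda f: True):
        # Save the frame only if it passes the test criteria (block 222),
        # e.g. a face or eyes were detected, or enough time has elapsed.
        if satisfies_criteria(frame):
            self.frames.append(frame)  # deque(maxlen=...) discards the oldest automatically

    def latest(self):
        return self.frames[-1] if self.frames else None
```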
  • [0034]
    The test criteria 222 may involve actual analysis of the preview image content before deciding whether the new preview image should replace a previously saved image. Such criteria may be based on image analysis, such as the existence of faces in the image or the detection of eyes, or on metadata analysis, such as the exposure conditions, whether a flash is going to be used, the distance to the subjects, etc.
  • [0035]
    As part of the red-eye filter 90, the full resolution image and the preview image or images will be loaded into working memory, 292 and 294 respectively, unless they are already in memory, in which case they will simply be accessed through a pointer. Referring to FIGS. 3-a to 3-e, the digitization process at various resolutions is explained and depicted. FIG. 3-a illustrates the grid-like nature of a sensor such as that illustrated in FIG. 1, block 60. The sensor comprises multiple cells 302 which determine the camera resolution. For example, a sensor of 2000×3000 cells will be a 6 million pixel sensor (it will be understood that each cell in fact comprises a plurality of individual sensor elements sensitive to different colors, e.g. RGB or RGBG, to create each colored image pixel).
  • [0036]
    FIG. 3-b depicts the optical projection of a face 310 onto the sensor 60. Each of the cells 302 records the average light information it receives for the image. This is the process of digitization and quantization.
  • [0037]
    The degree of detail is determined by the resolution of the sensor, as depicted in FIG. 3-c. In this illustration a much smaller sensor is used, and in this case the same face 310 is digitized into a smaller number of pixels, or alternatively subsampled from the full resolution sensor data into a smaller number of pixel representations.
  • [0038]
    FIG. 3-d depicts the inverse process, where the subsampled image of FIG. 3-c is upsampled to the same size as the original. Naturally, some of the detail is lost in this process. As an illustrative example, while in FIG. 3-b the face occupied roughly 25×25=625 pixels, in FIG. 3-d the face is made up of only 5×5=25 pixels.
  • [0039]
    Of course, the above is for illustration purposes only. In practice, sensors have a much higher resolution than in this illustration, and a normal eye will be depicted by a much larger pixel count in order to be noticeable. FIG. 3-e displays such a digitized eye. In this figure, an eye 350, as imaged on a sensor 60, will be roughly 25 pixels wide, 352. Of particular interest for this invention, the inner portion, the iris 360, will in this case be roughly 8 pixels in diameter, as illustrated at 462.
  • [0040]
    According to a preferred embodiment of this invention, the preview image and the final image, or portions of them, need to be aligned as depicted in FIG. 5, block 540. As explained above, the reference image and the final image may have different resolutions. The discrepancy in resolution may lead to differences in content, or pixel values, even though no data was changed in the subject image. In particular, edge regions when downsampled and then upsampled may have a blurring or an averaging effect on the pixels. Thus direct comparison of different resolution images, even when aligned, may lead to false contouring. In addition, the reference image may be acquired prior to or after the final image is captured. Due to the above reasons, there is a need to match the two images, both in content and pixel resolution, as described below.
  • [0041]
    FIG. 4 better illustrates the effect of the sub-sampling and up-sampling process when finding the pixel-wise difference between two images. In this case, the input images are those illustrated in FIGS. 3-b and 3-d, i.e. high resolution and low resolution respectively. In this figure, white squares such as 430 mean that there is no difference between the two images. Checkered squares or pixels, such as 420, mean that there is a difference between the images.
  • [0042]
    Flat regions should display no significant differences due to resolution changes. Differences arise mainly for two reasons. The first is edge regions, where changes in value occur, such as in blocks 410. However, there is another cause of difference which is of interest to this invention and is displayed in 430. In these pixels, the difference is caused by the actual change in the color of the eye from a normal eye to a red eye. Not only is there a change in the pixel value, but the change is more specifically a shift to a red or light color from the normal color of the iris or from the black color of the pupil.
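The effect described above can be reproduced numerically. The sketch below, which is illustrative only, down-samples an RGB array by a fixed factor, up-samples it back, and flags the pixels that differ from the original by more than a threshold; the flagged pixels are then mostly edges, plus any pixels whose color genuinely changed (such as a red pupil). The factor and threshold values are assumptions.

```python
# Sketch of the FIG. 4 comparison: difference mask between a full-resolution image
# and its down-sampled-then-up-sampled version. Assumes an (H, W, 3) uint8 RGB array.
import numpy as np


def resample_difference_mask(full_rgb, factor=5, threshold=20):
    h, w, c = full_rgb.shape
    h, w = h - h % factor, w - w % factor               # crop so the factor divides evenly
    crop = full_rgb[:h, :w].astype(float)
    # Box-average down-sample, then nearest-neighbour up-sample back to the cropped size.
    small = crop.reshape(h // factor, factor, w // factor, factor, c).mean(axis=(1, 3))
    up = np.repeat(np.repeat(small, factor, axis=0), factor, axis=1)
    diff = np.abs(crop - up).max(axis=-1)
    # True mostly at edges, plus at pixels whose color genuinely changed (e.g. a red pupil).
    return diff > threshold
```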
  • [0043]
    FIGS. 5-a to 5-d illustrate the workflow of the red-eye filter 90 of this embodiment, as well as variations thereof.
  • [0044]
    Referring first to FIG. 5-a, there are two input images into the filter, namely a full resolution image 510, I(x,y) which is the one that was captured by full depression of the shutter button and needs to be analyzed for red-eye artefacts, and a preview image 520, P(x,y) which is used as a reference image and is nominally the same scene as the image I(x,y) but taken without the flash. The preview image may be a result of some image processing taking into account multiple preview images and creating a single image, 522. Methods of improving image quality based on multiple images are familiar to those versed in the art of image processing. The resulting output from the analysis process of 522 is a single preview image.
  • [0045]
    The preview image 520 is normally, but not necessarily, of lower resolution than the full resolution image 510, typically being generated by clocking out a subset of the sensor cells or by averaging the raw sensor data. Therefore, the two images, or alternatively the relevant regions in the images (i.e. the regions containing or suspected to contain eyes, which can be determined by image processing techniques known in the art), need to be matched in pixel resolution, 530. In the present context “pixel resolution” means the size of the image, or relevant region, in terms of the number of pixels constituting the image or region concerned. Such a process may be performed by either upsampling the preview image, 534, downsampling the acquired image, 532, or a combination thereof. Those familiar with the art will be aware of several techniques best suited to such sampling. The result of step 530 is a pair of images I′(x,y) and P′(x,y) corresponding to the original images I(x,y) and P(x,y), or relevant regions thereof, with matching pixel resolution. The system and method of the preferred embodiment involve the detection and removal of red-eye artefacts. The actual removal of the red-eye will eventually be performed on the full resolution image. However, all or portions of the detection of red-eye candidate pixel groupings, the subsequent testing of said pixel groupings for determining false red-eye groupings, and the initial step of the removal, where the image is presented to the user for confirmation of the correction, can be performed on the entire image, the subsampled image, or a subset of regions of the entire image or the subsampled image.
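As a hedged sketch of the resolution-matching step 530, the snippet below resizes both images (or both regions of interest) to a common pixel size, which covers up-sampling the preview (534), sub-sampling the captured image (532), or a mixture of the two. It assumes the Pillow library and that both inputs are PIL images; the choice of bilinear resampling is illustrative.

```python
# Sketch of block 530: bring the captured image I(x,y) and the preview P(x,y)
# to a common pixel resolution. Assumes Pillow (PIL) Image objects as inputs.
from PIL import Image


def match_pixel_resolution(captured, preview, work_size=None):
    """Return (I', P') resized to a common (width, height).

    By default the preview is up-sampled to the captured image's size (block 534);
    passing a smaller work_size sub-samples the captured image instead (block 532).
    """
    target = work_size or captured.size
    i_prime = captured.resize(target, Image.BILINEAR)
    p_prime = preview.resize(target, Image.BILINEAR)
    return i_prime, p_prime
```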
  • [0046]
    Although nominally of the same scene, the preview image and the finally acquired full resolution image may differ spatially due to the temporal lag between capturing the two images. Therefore, the two images, or relevant regions thereof, may need to be aligned, 540, especially in respect of regions of the images containing or suspected to contain eyes. Essentially, alignment means transforming at least one of the images, and in this embodiment the preview image P′(x,y), to obtain maximum correlation between the images, or relevant regions thereof, based on measurable characteristics such as color, texture, edge analysis. Those familiar in the art are aware of several algorithms to achieve such alignment; see, for example, U.S. Pat. No. 6,295,367 which is hereby incorporated by reference and which describes alignment of images due to object and camera movement, and U.S. Pat. No. 5,933,546, which is also hereby incorporated by reference and which addresses the use of multi-resolution data for pattern matching.
  • [0047]
    Further discussion of the alignment is presented in FIG. 5-c. In this figure, the inputs are the two images I′(x,y) and P′(x,y) as defined in FIG. 5-a. The alignment may be global for the entire image or local for specific regions. A simple linear alignment may be, for example, a shift in the horizontal direction by H pixels and/or in the vertical direction by V pixels, or a combination of the two. Mathematically, the shifted image, P″(x,y), can be described as:
    P″(x,y)=P′(x−H,y−V)
    However, a simple translation operation may not suffice to align the images. There may therefore be a need for X-Y shearing, which is a symmetrical shift of the object's points in the direction of the axis to correct for perspective changes; X-Y tapering, where the object is pinched by shifting its coordinates towards the axis, the greater the magnitude of the coordinate the further the shift; or rotation around an arbitrary point.
  • [0049]
    In general, the alignment process may involve an affine transformation, defined as a special class of projective transformations that do not move any objects from the affine space R³ to the plane at infinity or conversely, or any transformation that preserves collinearity (i.e. all points lying on a line initially still lie on a line after transformation) and ratios of distances (e.g. the midpoint of a line segment remains the midpoint after transformation). Geometric contraction, expansion, dilation, reflection, rotation, shear, similarity transformations, spiral similarities and translation are all affine transformations, as are their combinations. In general, the alignment 540 may be achieved via an affine transformation which is a composition of rotations, translations, dilations and shears, all well known to one familiar with the art of image processing.
  • [0050]
    If it is determined through a correlation process that a global transformation suffices (block 542=YES), one of the images, for simplicity the preview image, will undergo an affine transformation, 544, to align it with the final full resolution image. Mathematically, this transformation can be depicted as:
    P″=AP′+q
    where A is a linear transformation and q is a translation.
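The translation and affine formulas above can be realized in many ways; the sketch below illustrates only the simplest case, a brute-force search over integer shifts (H, V) that maximizes normalized cross-correlation between two same-size images, i.e. the restricted transformation P″(x,y)=P′(x−H,y−V). The search range, the use of np.roll (which wraps at the borders) and the scoring are assumptions; a full implementation would extend this to shears, rotations and local warps.

```python
# Sketch: brute-force search for the integer translation (H, V) that best aligns
# the preview P' to the captured image I' (a restricted case of P'' = A P' + q).
import numpy as np


def best_translation(i_gray, p_gray, max_shift=8):
    """i_gray, p_gray: same-size 2-D float arrays. Returns (H, V) maximizing correlation."""
    best, best_score = (0, 0), -np.inf
    for h in range(-max_shift, max_shift + 1):
        for v in range(-max_shift, max_shift + 1):
            # np.roll wraps around at the borders; acceptable for a small-shift sketch.
            shifted = np.roll(np.roll(p_gray, v, axis=0), h, axis=1)
            a = i_gray - i_gray.mean()
            b = shifted - shifted.mean()
            score = (a * b).sum() / (np.sqrt((a * a).sum() * (b * b).sum()) + 1e-9)
            if score > best_score:
                best, best_score = (h, v), score
    return best  # apply as P''(x, y) = P'(x - H, y - V)
```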
  • [0052]
    However, in some cases a global transformation may not work well, in particular where the subject has moved, as can happen when photographing animate subjects. In such cases, in particular in images with multiple human subjects who move independently of one another, the alignment process 540 may be broken down, 546, into numerous local regions, each with its own affine transformation. What is important is to align the eyes between the images. Therefore, according to this alternative, one or more local alignments may be performed, 548, for regions in the vicinity of the eyes, such as faces.
  • [0053]
    Only after the images are aligned can one compare the potential red-eye colors.
  • [0054]
    In the preferred embodiment of FIG. 5-a, the preview image information is used as part of the falsing stage 96. Blocks 92, 94 and 98 correspond to the same blocks in FIG. 1, being the stages of pixel locator, shape analyzer and pixel modification respectively. This embodiment can incorporate pixel locator 92, shape analyzer 94 and pixel modifier 98 as described in U.S. Pat. No. 6,407,777 (DeLuca), incorporated by reference above, the functions of the pixel locator 92 and shape analyzer 94 being performed on the image I′(x,y) and the pixel modifier 98 operating on the original acquired image I(x,y). Block 96, which is the falsing stage, is improved in this embodiment as compared to the falsing stage of DeLuca.
  • [0055]
    Referring to block 96, for each region of the image I′(x,y) suspected as red-eye, step 596-2, as identified by steps 92 and 94, the suspected region is tested by comparing the pixel values of the region with the pixel values of the corresponding region in the aligned preview image P″(x,y), 596-6. However, prior to doing so, the regions need to be prepared and modified for such comparison, 596-4.
  • [0056]
    Because the regions may not match exactly, a pixel-by-pixel comparison may not suffice. One reason for a mismatch is the original size discrepancy; for edges, this phenomenon is graphically illustrated in FIG. 4. Other reasons for a mismatch are potential movement of the subject, or averaging performed in the low resolution preview image that may lose high frequency color data; such effects are referred to as smoothing and aliasing. In addition, even if the alignment is optimal, there may be sub-pixel misalignment that cannot be accounted for. Moreover, there may be color differences between the preview image, shot using available light, and the acquired full resolution image, which is shot using flash. In many cases, the color transformation from one image to the other is not global and uniform. Hence the need for a process of preparing the regions for comparison.
  • [0057]
    This process, illustrated in block 596-4, is further described in FIG. 5-d. The underlying concept behind step 596-4 is to distinguish between differences caused by the acquisition process and differences caused by the existence of red-eye in the image. This problem is well known to one familiar with the art of statistical pattern matching, scene analysis and image recognition. An example of such an application taking into account differences due to resolution is described in U.S. Pat. No. 5,933,546, which is hereby incorporated by reference.
  • [0058]
    If a region in the aligned preview image P″(x,y) is red and the equivalent region in the image I′(x,y) is also red, 596-6, that region will be eliminated from I′(x,y) as a red-eye artefact, 596-9, and the corresponding region will be eliminated as a red-eye artefact from the original full resolution image I(x,y). Otherwise, the region will continue to be suspected as red-eye, 596-8. The process continues, 596-3, for all suspected regions.
  • [0059]
    The comparison of the regions for a color value is done as a global operation on the entire region, and the decision as to whether a region is red is made statistically for the entire region and not pixel by pixel, i.e. it does not depend on the value of any particular individual pixel. Such an approach accounts for inconsistencies at the pixel level that can be resolved statistically when analyzing the larger collection of pixels making up a region. For example, some parts of the eye may not be fully red, or may display other artefacts such as a glint of high luminance. Another example of the need for a global statistical operation is the presence of noise in the image. Techniques for such global comparison are known in the art.
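One way such a global, statistical test could look is sketched below: instead of inspecting individual pixels, it compares aggregate statistics over the whole region, so isolated glints or noise do not flip the decision. The color test, the fraction and the dominance ratio are illustrative assumptions rather than values taken from the disclosure.

```python
# Sketch of a global statistical redness test for a whole region (cf. blocks 596-6 / 592-6).
# Thresholds are illustrative only.
import numpy as np


def region_is_red(region_rgb, red_fraction=0.4, dominance=1.4):
    """region_rgb: (N, 3) or (H, W, 3) uint8 pixels belonging to one suspected region."""
    px = region_rgb.reshape(-1, 3).astype(float)
    r, g, b = px[:, 0], px[:, 1], px[:, 2]
    # A pixel "votes" red if its red channel clearly dominates green and blue.
    votes = (r > dominance * g) & (r > dominance * b)
    # The region is declared red only if enough of it votes red overall, so glints,
    # noise and partially red areas are absorbed statistically.
    return votes.mean() > red_fraction


def is_red_eye_artefact(region_captured, region_preview):
    # Red in the flash image but not in the no-flash preview: keep as suspected red-eye.
    return region_is_red(region_captured) and not region_is_red(region_preview)
```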
  • [0060]
    Based on the information above, the regions finally identified as red-eye artefacts can be modified, 98, to eliminate the red-eye from the original full resolution image I(x,y). The modification can be done using any one of numerous available techniques such as luminance reduction, chrominance reduction, or subtraction of the artefact, as described in US Published Patent Application 2002/0150306 (Baron), which is hereby incorporated by reference.
  • [0061]
    FIG. 5-d describes the preparation of regions suspected of red-eye for comparison, as described in FIG. 5-a, block 596-4. As discussed above, a simple pixel-level comparison may not be enough to determine whether or not a region is of a red-eye nature. The process of preparation may include a combination of several components, such as creating color balance between the regions of the preview image and the final image, 1510, analyzing the texture, or differences in high frequency patterns, between the two regions that may have occurred due to the change in resolution, 1520, and comparing the edges between the two regions, 1530, where differences may have occurred due to changes in exposure, color balance, resolution or alignment, and in particular sub-pixel alignment. The color balance step 1510 comprises marking each red-eye region in I′(x,y) and the corresponding region in P″(x,y), steps 1512 and 1514, determining the difference in color balance between the region in I′(x,y) surrounding, but not including, the suspected red-eye region and the corresponding region of P″(x,y), step 1516, and transforming the entire region, including the suspected red-eye region, based on the color balance difference so determined, step 1518.
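The color-balance part of this preparation (steps 1512 to 1518) could be sketched as follows: estimate a per-channel gain from the surround of the suspected region (excluding the pupil itself) in I′ versus the corresponding surround in P″, then apply that gain to the entire preview region before comparison. The mask convention and the ratio-based transform are assumptions.

```python
# Sketch of color-balance preparation (steps 1512-1518): estimate the channel-wise gain
# between the surround of a suspected region in I' and in P'', then apply it to the
# whole P'' region before comparison. Masks and names are illustrative.
import numpy as np


def color_balance_region(i_region, p_region, pupil_mask):
    """i_region, p_region: (H, W, 3) float arrays of the same crop in I' and P''.
    pupil_mask: boolean (H, W), True inside the suspected red-eye area."""
    surround = ~pupil_mask
    # Per-channel gain computed only from the surround (step 1516).
    gain = (i_region[surround].mean(axis=0) + 1e-6) / (p_region[surround].mean(axis=0) + 1e-6)
    # Transform the entire region, suspected pupil included (step 1518).
    return np.clip(p_region * gain, 0, 255)
```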
  • [0062]
    As an alternative embodiment of this invention, the preview image can be used as part of the pixel locator stage 92, as illustrated in FIG. 5-b, rather than as part of the falsing analyzer 96. In FIG. 5-b, blocks 510, 520, 522, 530, 532, 534, 540, 94 and 98 are identical to those in FIG. 5-a. According to this embodiment, the use of the preview image in order to detect red-eye artefacts is implemented as part of the red-eye identification process, otherwise described as the pixel locator 92 in FIG. 1 but here identified as Pixel Analyser and Region Segmenter 592.
  • [0063]
    After the suspected red-eye regions are identified, the process continues via the shape analysis 94, false detection elimination 96 and correction 98 as described in FIG. 1. In this case, the falsing analysis 96 may be performed according to DeLuca.
  • [0064]
    According to this embodiment, after the alignment step 540, steps 592-1a and 592-1b analyse both images I′(x,y) and P″(x,y) for the presence of pixels having a color indicative of red-eye (592-1a), for example in the manner of DeLuca, and then identify clusters of contiguous red pixels so detected (592-1b). This is known as segmentation and is more fully described in U.S. Patent Application 2002/0176623, which is hereby incorporated by reference.
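Grouping contiguous red-indicative pixels into candidate regions (step 592-1b) is classic connected-component labeling. A minimal sketch using SciPy is shown below; the red-pixel predicate and the minimum cluster size are assumptions, not values from the disclosure.

```python
# Sketch of steps 592-1a/592-1b: flag pixels with a red-indicative color, then group
# contiguous flagged pixels into candidate regions via connected-component labeling.
import numpy as np
from scipy import ndimage


def red_pixel_mask(img_rgb, dominance=1.4):
    # Illustrative color test; a real filter would use a calibrated red-eye color range.
    r, g, b = (img_rgb[..., i].astype(float) for i in range(3))
    return (r > dominance * g) & (r > dominance * b)


def candidate_regions(img_rgb, min_pixels=10):
    labels, _count = ndimage.label(red_pixel_mask(img_rgb))
    slices = ndimage.find_objects(labels)             # one bounding box per labeled cluster
    return [s for i, s in enumerate(slices, start=1)
            if (labels[s] == i).sum() >= min_pixels]  # discard tiny clusters as noise
```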
  • [0065]
    Now, each region (cluster) with red content in the acquired image I′(x,y), step 592-2, is compared with the corresponding region in the aligned preview image P″(x,y). The regions will need to be prepared, 592-4, as previously described in relation to block 596-4 of FIG. 5-a. If the regions are red in both cases, 592-6=YES, the region will not be marked as red-eye, no action will be taken and the process will continue to the next suspected region, 592-3. If the region is red in the acquired image I′(x,y) while the corresponding region is not red in the preview image P″(x,y), 592-6=NO, then the region will be marked as suspected red-eye, 592-8.
  • [0066]
    FIG. 6-a shows a modification of the embodiment of FIG. 5-b in which Step 540 (Align Images) has been divided into two steps, Step 541 (If Possible Globally Align Images) and Step 592-3 (If Required Locally Align Images). Step 541 corresponds to Steps 542 and 544 of FIG. 5-c. However, if a global alignment is not possible or practical, the local alignment is deferred until after red pixel identification and clustering has been performed, since the presence of such clusters in the two images I′(x,y) and P′(x,y) will assist in the local alignment. FIG. 6-b shows a similar modification applied to FIG. 5-a.
  • [0067]
    In the embodiments of the invention, in the comparison stages 592-6 and 596-6, the pixel values do not necessarily have to be compared with red, but may alternatively or additionally be compared with other values such as yellow, white, pink, brown or another color indicative of a red-eye phenomenon, or with a range of values, to accommodate other flash-related eye artefacts that are not purely red. Because the eye surface is retro-reflective (due to the smoothness created by tears and the spherical shape of the eyeball), the technique described in this specification can also assist in the detection of eyes in an image. The existence of an eye can be found by comparing the specular reflection of the flash in the eye with the same region where no flash was used, and where there is thus no such reflection. This comparison may assist in locating eyes in general, and not just eyes with red-eye artefacts. This process may be implemented by finding the change in the small specular reflections that occur in the eye region when flash illumination is used, as described in WO 03/026278 (Jarman). The specular reflections may be used as another indication of suspected regions, as defined in blocks 592-2 and 596-2, by comparing the specular reflection in the flash image with the absence of specular reflection in the preview image.
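The specular-reflection cue could be approximated as a simple brightness comparison: a small, saturated cluster present in the flash-image region but absent from the aligned no-flash preview region is a hint that the region contains an eye. The brightness and area thresholds below are assumptions for illustration.

```python
# Sketch of the specular-reflection cue: a tiny, very bright spot present in the flash
# image region but absent from the aligned no-flash preview region suggests an eye.
import numpy as np


def has_flash_glint(region_flash, region_preview, brightness=240, max_area=30):
    """Both regions are (H, W, 3) uint8 crops of the same area in I' and P''."""
    glint_flash = region_flash.max(axis=-1) >= brightness
    glint_preview = region_preview.max(axis=-1) >= brightness
    # A small saturated cluster only in the flash image is the retro-reflection of the flash.
    return 0 < glint_flash.sum() <= max_area and glint_preview.sum() == 0
```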
  • [0068]
    As an alternative to the binary decision of retaining or eliminating a region, 596-8 and 596-9, each region may be assigned a continuous probability, in which case the process is revised from making a binary decision to adjusting that probability. The quantitative change in probability may be decided based on analysis of the confidence level of the comparison, 592-4 and 596-4.
  • [0069]
    The preferred embodiments described above may be modified by adding or changing operations, steps and/or components in many ways to produce advantageous alternative embodiments. For example, the reference image can be a post-view image rather than a preview image, i.e. an image taken without flash immediately after the flash picture is taken.
  • [0070]
    A red-eye correction procedure may begin as described by block 92 with detecting a human face in a digital image and, based on this detection, finding the eyes in the face (see, e.g., U.S. Pat. No. 6,252,976, U.S. Publ. Pat. App. No. 2003/0044070 and U.S. Pat. No. 6,278,491, which are hereby incorporated by reference). This procedure may also be used for creating the regional alignment 546 and color balance 1510.
  • [0071]
    A range of alternative techniques may be employed to detect and verify the existence of red-eye defects in an image (see, e.g., U.S. Publ. Pat. Apps. No. 2003/0044177 and 2003/0044178, which are hereby incorporated by reference). These techniques may be incorporated into the pixel locator, shape analyzer, falsing analyzer and pixel modifier corresponding to blocks 92, 94, 96 and 98. A camera may include software or firmware for automatically detecting a red-eye image using a variety of image characteristics such as image brightness, contrast, the presence of human skin and related colors. The analysis of these image characteristics may be utilized, based on certain pre-determined statistical thresholds, to decide if red-eye defects exist and if a flash was used to take the original image.
  • [0072]
    The preferred embodiments described herein may involve expanded digital acquisition technology that inherently involves digital cameras, but which may also be integrated with other devices having an acquisition component, such as camera-equipped cell-phones, toy cameras, etc. The digital camera or other image acquisition device of the preferred embodiment has the capability to record not only image data but also additional data referred to as meta-data. The file header of an image file, such as JPEG, TIFF, JPEG-2000, etc., may include capture information, including the preview image, for processing and red-eye detection at a later post-processing stage, which may be performed in the acquisition device or in a separate device such as a personal computer. The preferred embodiments described herein serve to improve the detection of red-eyes in images, while eliminating or reducing the occurrence of false positives, and to improve the correction of the detected artefacts.
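    As a sketch of this later post-processing idea, the snippet below recovers a reduced-resolution reference image stored in the file header; the patent does not mandate a particular container, so the standard EXIF thumbnail is used here as an analogous stand-in, read with the third-party piexif library (an assumption, not part of the specification).

```python
import io

import piexif
from PIL import Image

def load_capture_and_reference(path):
    """Return (captured image, embedded reference image or None) for a JPEG file."""
    captured = Image.open(path)
    thumb_bytes = piexif.load(path).get("thumbnail")  # JPEG bytes of the embedded preview
    reference = Image.open(io.BytesIO(thumb_bytes)) if thumb_bytes else None
    return captured, reference
```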
  • [0073]
    While an exemplary drawing and specific embodiments of the present invention have been described and illustrated, it is to be understood that the scope of the present invention is not to be limited to the particular embodiments discussed. Thus, the embodiments shall be regarded as illustrative rather than restrictive, and it should be understood that variations may be made in those embodiments by workers skilled in the art without departing from the scope of the present invention, as set forth in the claims below and structural and functional equivalents thereof.
  • [0074]
    In addition, in methods that may be performed according to the preferred embodiments herein and that may have been described above, the operations have been described in selected typographical sequences. However, the sequences have been selected and so ordered for typographical convenience and are not intended to imply any particular order for performing the operations, unless a particular order is expressly set forth or is understood by those skilled in the art to be necessary.
  • [0075]
    Thus, the preferred embodiments described herein provide an improved method and apparatus for detecting red-eye phenomenon within images taken by a digital camera having a flash while eliminating or reducing the occurrence of false positives by using preview information.
  • [0076]
    In addition to all of the references cited above that have been incorporated by reference, the sections entitled BACKGROUND, SUMMARY OF THE INVENTION, ABSTRACT and BRIEF DESCRIPTION OF THE DRAWINGS, are hereby incorporated by reference into the DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENT.
Patent Citations
Cited PatentFiling datePublication dateApplicantTitle
US5016107 *May 9, 1989May 14, 1991Eastman Kodak CompanyElectronic still camera utilizing image compression and digital storage
US5202720 *Jan 31, 1990Apr 13, 1993Minolta Camera Kabushiki KaishaPhotographic camera with flash unit
US5748764 *Apr 3, 1995May 5, 1998Eastman Kodak CompanyAutomated detection and correction of eye color defects due to flash illumination
US5751836 *Sep 27, 1996May 12, 1998David Sarnoff Research Center Inc.Automated, non-invasive iris recognition system and method
US5761550 *Feb 20, 1997Jun 2, 1998Kancigor; BarryTelescoping flash unit for a camera
US5862217 *Mar 28, 1996Jan 19, 1999Fotonation, Inc.Method and apparatus for in-camera encryption
US5862218 *Apr 4, 1996Jan 19, 1999Fotonation, Inc.Method and apparatus for in-camera image marking and authentication
US6016354 *Oct 23, 1997Jan 18, 2000Hewlett-Packard CompanyApparatus and a method for reducing red-eye in a digital image
US6035072 *Dec 8, 1997Mar 7, 2000Read; Robert LeeMapping defects or dirt dynamically affecting an image acquisition device
US6172706 *Apr 5, 1996Jan 9, 2001Canon Kabushiki KaishaVideo camera with automatic zoom adjustment based on distance between user's eyes
US6204858 *May 30, 1997Mar 20, 2001Adobe Systems IncorporatedSystem and method for adjusting color data of pixels in a digital image
US6252976 *May 26, 1998Jun 26, 2001Eastman Kodak CompanyComputer program product for redeye detection
US6396963 *Dec 29, 1998May 28, 2002Eastman Kodak CompanyPhotocollage generation and modification
US6407777 *Oct 9, 1997Jun 18, 2002Deluca Michael JosephRed-eye filter method and apparatus
US6505003 *Oct 12, 2001Jan 7, 2003Eastman Kodak CompanyHybrid cameras that revise stored electronic image metadata at film unit removal and methods
US6510520 *Jun 26, 1998Jan 21, 2003Fotonation, Inc.Secure storage device for transfer of digital camera data
US6516154 *Jul 17, 2001Feb 4, 2003Eastman Kodak CompanyImage revising camera and method
US6700614 *Sep 24, 1999Mar 2, 2004Ricoh Company, Ltd.Autofocus apparatus
US6707950 *Jun 22, 1999Mar 16, 2004Eastman Kodak CompanyMethod for modification of non-image data in an image processing chain
US6718051 *Oct 16, 2000Apr 6, 2004Xerox CorporationRed-eye detection method
US6724941 *Sep 30, 1999Apr 20, 2004Fuji Photo Film Co., Ltd.Image processing method, image processing device, and recording medium
US6728401 *Aug 17, 2000Apr 27, 2004Viewahead TechnologyRed-eye removal using color image processing
US6873743 *Mar 29, 2002Mar 29, 2005Fotonation Holdings, LlcMethod and apparatus for the automatic real-time detection and correction of red-eye defects in batches of digital images or in handheld appliances
US6885766 *Jan 30, 2002Apr 26, 2005Imaging Solutions AgAutomatic color defect correction
US6895112 *Feb 13, 2001May 17, 2005Microsoft CorporationRed-eye detection based on red region detection with eye confirmation
US6900882 *Dec 4, 2001May 31, 2005Fuji Photo Film Co., Ltd.Print processing method, printing order receiving machine and print processing device
US6912298 *Aug 11, 2003Jun 28, 2005Adobe Systems IncorporationObject detection using dynamic probability scans
US6984039 *Dec 1, 2003Jan 10, 2006Eastman Kodak CompanyLaser projector having silhouette blanking for objects in the output light path
US7035461 *Aug 22, 2002Apr 25, 2006Eastman Kodak CompanyMethod for detecting objects in digital images
US7042501 *Dec 11, 1998May 9, 2006Fuji Photo Film Co., Ltd.Image processing apparatus
US7042505 *Jun 12, 2002May 9, 2006Fotonation Ireland Ltd.Red-eye filter method and apparatus
US7062086 *Oct 25, 2004Jun 13, 2006Microsoft CorporationRed-eye detection based on red region detection with eye confirmation
US7216289 *Mar 16, 2001May 8, 2007Microsoft CorporationMethod and apparatus for synchronizing multiple versions of digital data
US7352394 *Feb 4, 2004Apr 1, 2008Fotonation Vision LimitedImage modification based on red-eye filter analysis
US20020041329 *May 17, 1999Apr 11, 2002Eran SteinbergIn camera messaging and advertisement system
US20020054224 *Nov 21, 2001May 9, 2002Eastman Kodak CompanyCustomizing digital image transfer
US20030007687 *Jul 5, 2001Jan 9, 2003Jasc Software, Inc.Correction of "red-eye" effects in images
US20030021478 *Jul 23, 2002Jan 30, 2003Minolta Co., Ltd.Image processing technology for identification of red eyes in image
US20030025811 *Aug 9, 2002Feb 6, 2003Eastman Kodak CompanyCustomizing a digital camera based on demographic factors
US20030044063 *Jul 9, 2002Mar 6, 2003Guenter MeckesMethod for processing digital photographic image data that includes a method for the automatic detection of red-eye defects
US20030044070 *Jul 9, 2002Mar 6, 2003Manfred FuersichMethod for the automatic detection of red-eye defects in photographic image data
US20030044177 *Jul 9, 2002Mar 6, 2003Knut OberhardtMethod for the automatic detection of red-eye defects in photographic image data
US20030044178 *Jul 9, 2002Mar 6, 2003Knut OberhardtMethod for the automatic detection of red-eye defects in photographic image data
US20030058349 *Sep 23, 2002Mar 27, 2003Fuji Photo Film Co., Ltd.Method, apparatus, and program for image processing
US20030095197 *Sep 20, 2001May 22, 2003Eastman Kodak CompanySystem and method for deciding when to correct image-specific defects based on camera, scene, display and demographic data
US20030118216 *Feb 6, 2003Jun 26, 2003Goldberg David A.Obtaining person-specific images in a public venue
US20040017481 *Apr 8, 2003Jan 29, 2004Olympus Optical Co., Ltd.Digital camera, image pickup method, and image format conversion method
US20040027593 *Oct 12, 2001Feb 12, 2004David WilkinsTechniques for resolution independent rendering of images
US20040032512 *Aug 8, 2003Feb 19, 2004Kia SilverbrookCorrection of distortions in digital images
US20040032526 *Aug 8, 2003Feb 19, 2004Silverbrook Research Pty LtdMethod of processing digital image to correct for flash effects
US20040033071 *Aug 1, 2003Feb 19, 2004Naoki KuboElectronic flash device and camera having the same
US20040037460 *Aug 22, 2002Feb 26, 2004Eastman Kodak CompanyMethod for detecting objects in digital images
US20040046878 *Jul 31, 2002Mar 11, 2004Nick JarmanImage processing to remove red-eyed features
US20040056975 *Sep 22, 2003Mar 25, 2004Daisuke HataAutofocus apparatus
US20040057705 *Sep 10, 2003Mar 25, 2004Nidec Copal CorporationMotor driving apparatus
US20040090461 *Oct 31, 2003May 13, 2004Adams Guy De Warrenne BruceInterface devices
US20040093432 *Nov 7, 2002May 13, 2004Eastman Kodak CompanyMethod and system for conducting image processing from a mobile client device
US20040114796 *Dec 10, 2003Jun 17, 2004Toshihiko KakuImage correction apparatus and image pickup apparatus
US20040114797 *Oct 22, 2003Jun 17, 2004Guenter MeckesMethod for automatic determination of color-density correction values for the reproduction of digital image data
US20040114829 *Oct 10, 2003Jun 17, 2004Intelligent System Solutions Corp.Method and system for detecting and correcting defects in a digital image
US20040114904 *Dec 11, 2002Jun 17, 2004Zhaohui SunSystem and method to compose a slide show
US20040119851 *Dec 10, 2003Jun 24, 2004Fuji Photo Film Co., Ltd.Face recognition method, face recognition apparatus, face extraction method, and image pickup apparatus
US20050001024 *Dec 3, 2002Jan 6, 2005Yosuke KusakaElectronic apparatus, electronic camera, electronic device, image display apparatus, and image transmission system
US20050013602 *Jul 16, 2004Jan 20, 2005Pentax CorporationDigital camera having red-eye reduction bracket photographing mode
US20050013603 *Jul 13, 2004Jan 20, 2005Shoji IchimasaImage sensing apparatus, image processing apparatus, and control method therefor
US20050024498 *Jul 27, 2004Feb 3, 2005Fuji Photo Film Co., Ltd.Image processor and image processing system
US20050031224 *Aug 5, 2003Feb 10, 2005Yury PrilutskyDetecting red eye filter and apparatus using meta-data
US20050047655 *Aug 29, 2003Mar 3, 2005Huitao LuoDetecting and correcting redeye in an image
US20050047656 *Aug 29, 2003Mar 3, 2005Huitao LuoSystems and methods of detecting and correcting redeye in an image suitable for embedded applications
US20050053279 *Oct 25, 2004Mar 10, 2005Microsoft CorporationRed-eye detection based on red region detection with eye confirmation
US20050057715 *Sep 7, 2004Mar 17, 2005Shunichi HashimotoReflective liquid crystal display device, method of manufacturing the same, and liquid crystal display unit
US20050058340 *Oct 4, 2004Mar 17, 2005Microsoft CorporationRed-eye detection based on red region detection with eye confirmation
US20050058342 *Oct 8, 2004Mar 17, 2005Microsoft CorporationRed-eye detection based on red region detection with eye confirmation
US20050062856 *Sep 17, 2004Mar 24, 2005Fuji Photo Film Co., Ltd.Camera equipped with red-eye correction feature
US20050063083 *Oct 24, 2003Mar 24, 2005Dart Scott E.Systems and methods for the implementation of a digital images schema for organizing units of information manageable by a hardware/software interface system
US20050074164 *Sep 17, 2004Apr 7, 2005Fuji Photo Film Co., Ltd.Image processing apparatus and method, red-eye detection method, as well as programs for executing the image processing method and the red-eye detection method
US20050078191 *Oct 14, 2003Apr 14, 2005Hewlett-Packard Development Company LpSystem and method to allow undoing of certain digital image modifications
US20050090461 *Feb 19, 2004Apr 28, 2005Leadlay Peter F.Polyketides and their synthesis
US20050117132 *Dec 1, 2003Jun 2, 2005Eastman Kodak CompanyLaser projector having silhouette blanking for objects in the output light path
US20050129331 *Nov 4, 2004Jun 16, 2005Omron CorporationPupil color estimating device
US20050140801 *Feb 4, 2004Jun 30, 2005Yury PrilutskyOptimized performance and performance for red-eye filter method and apparatus
US20060008171 *Jul 6, 2004Jan 12, 2006Microsoft CorporationDigital photography with flash/no flash extension
US20060039690 *Aug 30, 2005Feb 23, 2006Eran SteinbergForeground/background segmentation in digital images with differential exposure calculations
US20060045352 *Sep 1, 2004Mar 2, 2006Eastman Kodak CompanyDetermining the age of a human subject in a digital image
US20060050300 *Aug 16, 2005Mar 9, 2006Canon Kabushiki KaishaInformation processing apparatus and method for managing order data of on-line orders, program for the same
US20060066628 *Sep 30, 2004Mar 30, 2006Microsoft CorporationSystem and method for controlling dynamically interactive parameters for image processing
US20060082847 *Oct 13, 2005Apr 20, 2006Fuji Photo Film Co., Ltd.Image correction apparatus and method of controlling same
US20060093212 *Oct 28, 2004May 4, 2006Eran SteinbergMethod and apparatus for red-eye detection in an acquired digital image
US20060093213 *May 6, 2005May 4, 2006Eran SteinbergMethod and apparatus for red-eye detection in an acquired digital image based on image quality pre and post filtering
US20060093238 *Jul 15, 2005May 4, 2006Eran SteinbergMethod and apparatus for red-eye detection in an acquired digital image using face recognition
US20060098867 *Nov 10, 2004May 11, 2006Eastman Kodak CompanyDetecting irises and pupils in images of humans
US20060098875 *Nov 4, 2005May 11, 2006Fuji Photo Film Co., Ltd.Image search apparatus for images to be detected, and method of controlling same
US20060119832 *Jan 24, 2006Jun 8, 2006Takayuki IidaPrint processing method, printing order receiving machine and print processing device
US20060120599 *Sep 21, 2005Jun 8, 2006Eran SteinbergMethod and apparatus for red-eye detection in an acquired digital image
US20070116379 *Nov 18, 2005May 24, 2007Peter CorcoranTwo stage detection for photographic eye artifacts
US20070133863 *Oct 10, 2006Jun 14, 2007Hitachi, Ltd.Image Alignment Method, Comparative Inspection Method, and Comparative Inspection Device for Comparative Inspections
US20080002060 *Jun 27, 2007Jan 3, 2008Fotonation Vision LimitedOptimized Performance and Performance for Red-Eye Filter Method and Apparatus
US20080043121 *Jul 2, 2007Feb 21, 2008Fotonation Vision LimitedOptimized Performance and Performance for Red-Eye Filter Method and Apparatus
Referenced by
Citing PatentFiling datePublication dateApplicantTitle
US7042505Jun 12, 2002May 9, 2006Fotonation Ireland Ltd.Red-eye filter method and apparatus
US7171044 *Oct 8, 2004Jan 30, 2007Microsoft CorporationRed-eye detection based on red region detection with eye confirmation
US7352394Feb 4, 2004Apr 1, 2008Fotonation Vision LimitedImage modification based on red-eye filter analysis
US7436998May 6, 2005Oct 14, 2008Fotonation Vision LimitedMethod and apparatus for red-eye detection in an acquired digital image based on image quality pre and post filtering
US7536036 *Oct 28, 2004May 19, 2009Fotonation Vision LimitedMethod and apparatus for red-eye detection in an acquired digital image
US7606417Oct 20, 2009Fotonation Vision LimitedForeground/background segmentation in digital images with differential exposure calculations
US7619665Nov 17, 2009Fotonation Ireland LimitedRed eye filter for in-camera digital image processing within a face of an acquired subject
US7636470 *Oct 4, 2004Dec 22, 2009Microsoft CorporationRed-eye detection based on red region detection with eye confirmation
US7636486Nov 10, 2004Dec 22, 2009Fotonation Ireland Ltd.Method of determining PSF using multiple instances of a nominally similar scene
US7639888Nov 10, 2004Dec 29, 2009Fotonation Ireland Ltd.Method and apparatus for initiating subsequent exposures based on determination of motion blurring artifacts
US7639889Dec 29, 2009Fotonation Ireland Ltd.Method of notifying users regarding motion artifacts based on image analysis
US7652717 *Jan 26, 2010Eastman Kodak CompanyWhite balance correction in digital camera images
US7660478Feb 9, 2010Fotonation Vision Ltd.Method of determining PSF using multiple instances of nominally scene
US7660517 *Mar 16, 2006Feb 9, 2010The Trustees Of Columbia University In The City Of New YorkSystems and methods for reducing rain effects in images
US7680342Mar 16, 2010Fotonation Vision LimitedIndoor/outdoor classification in digital images
US7684630Mar 23, 2010Fotonation Vision LimitedDigital image adjustable compression and resolution using face detection information
US7685341May 6, 2005Mar 23, 2010Fotonation Vision LimitedRemote control apparatus for consumer electronic appliances
US7689009Mar 30, 2010Fotonation Vision Ltd.Two stage detection for photographic eye artifacts
US7692696Dec 27, 2005Apr 6, 2010Fotonation Vision LimitedDigital image acquisition system with portrait mode
US7693311Apr 6, 2010Fotonation Vision LimitedPerfecting the effect of flash within an image acquisition devices using face detection
US7694048Apr 6, 2010Fotonation Vision LimitedRemote control apparatus for printer appliances
US7697778Apr 13, 2010Fotonation Vision LimitedMethod of notifying users regarding motion artifacts based on image analysis
US7702136Jul 5, 2007Apr 20, 2010Fotonation Vision LimitedPerfecting the effect of flash within an image acquisition devices using face detection
US7738015Aug 16, 2004Jun 15, 2010Fotonation Vision LimitedRed-eye filter method and apparatus
US7746385Jun 29, 2010Fotonation Vision LimitedRed-eye filter method and apparatus
US7773118Aug 10, 2010Fotonation Vision LimitedHandheld article with movement discrimination
US7787022May 13, 2008Aug 31, 2010Fotonation Vision LimitedRed-eye filter method and apparatus
US7792970Dec 2, 2005Sep 7, 2010Fotonation Vision LimitedMethod for establishing a paired connection between media devices
US7804531Sep 28, 2010Fotonation Vision LimitedDetecting red eye filter and apparatus using meta-data
US7809162Oct 30, 2008Oct 5, 2010Fotonation Vision LimitedDigital image processing using face detection information
US7844076Nov 30, 2010Fotonation Vision LimitedDigital image processing using face detection and skin tone information
US7844135Nov 30, 2010Tessera Technologies Ireland LimitedDetecting orientation of digital images using face detection information
US7847839Aug 7, 2008Dec 7, 2010Fotonation Vision LimitedDetecting red eye filter and apparatus using meta-data
US7847840Dec 7, 2010Fotonation Vision LimitedDetecting red eye filter and apparatus using meta-data
US7848549Dec 7, 2010Fotonation Vision LimitedDigital image processing using face detection information
US7852384Dec 14, 2010Fotonation Vision LimitedDetecting red eye filter and apparatus using meta-data
US7853043Dec 14, 2009Dec 14, 2010Tessera Technologies Ireland LimitedDigital image processing using face detection information
US7855737Dec 21, 2010Fotonation Ireland LimitedMethod of making a digital camera image of a scene including the camera user
US7860274Oct 30, 2008Dec 28, 2010Fotonation Vision LimitedDigital image processing using face detection information
US7864990Dec 11, 2008Jan 4, 2011Tessera Technologies Ireland LimitedReal-time face tracking in a digital image acquisition device
US7865036Sep 14, 2009Jan 4, 2011Tessera Technologies Ireland LimitedMethod and apparatus of correcting hybrid flash artifacts in digital images
US7868922Aug 21, 2006Jan 11, 2011Tessera Technologies Ireland LimitedForeground/background segmentation in digital images
US7869628Jan 11, 2011Tessera Technologies Ireland LimitedTwo stage detection for photographic eye artifacts
US7903870 *Feb 22, 2007Mar 8, 2011Texas Instruments IncorporatedDigital camera and method
US7912245Mar 22, 2011Tessera Technologies Ireland LimitedMethod of improving orientation and color balance of digital images using face detection information
US7912285 *Sep 13, 2010Mar 22, 2011Tessera Technologies Ireland LimitedForeground/background segmentation in digital images with differential exposure calculations
US7916190Mar 29, 2011Tessera Technologies Ireland LimitedRed-eye filter method and apparatus
US7916897Jun 5, 2009Mar 29, 2011Tessera Technologies Ireland LimitedFace tracking for controlling imaging parameters
US7916971May 24, 2007Mar 29, 2011Tessera Technologies Ireland LimitedImage processing method and apparatus
US7920723Aug 2, 2006Apr 5, 2011Tessera Technologies Ireland LimitedTwo stage detection for photographic eye artifacts
US7953251Nov 16, 2010May 31, 2011Tessera Technologies Ireland LimitedMethod and apparatus for detection and correction of flash-induced eye defects within digital images using preview or other reference images
US7953252Nov 22, 2010May 31, 2011Tessera Technologies Ireland LimitedTwo stage detection for photographic eye artifacts
US7953287Oct 17, 2008May 31, 2011Tessera Technologies Ireland LimitedImage blurring
US7957597Sep 13, 2010Jun 7, 2011Tessera Technologies Ireland LimitedForeground/background segmentation in digital images
US7962629Sep 6, 2010Jun 14, 2011Tessera Technologies Ireland LimitedMethod for establishing a paired connection between media devices
US7965875Jun 12, 2007Jun 21, 2011Tessera Technologies Ireland LimitedAdvances in extending the AAM techniques from grayscale to color images
US7970182Jun 28, 2011Tessera Technologies Ireland LimitedTwo stage detection for photographic eye artifacts
US7970183Nov 22, 2010Jun 28, 2011Tessera Technologies Ireland LimitedTwo stage detection for photographic eye artifacts
US7970184Nov 22, 2010Jun 28, 2011Tessera Technologies Ireland LimitedTwo stage detection for photographic eye artifacts
US7995804Aug 9, 2011Tessera Technologies Ireland LimitedRed eye false positive filtering using face location and orientation
US7995855Aug 9, 2011Tessera Technologies Ireland LimitedImage processing method and apparatus
US8000526Aug 16, 2011Tessera Technologies Ireland LimitedDetecting redeye defects in digital images
US8005265Sep 8, 2008Aug 23, 2011Tessera Technologies Ireland LimitedDigital image processing using face detection information
US8014592 *Jun 27, 2005Sep 6, 2011Thomson LicensingEdge based CMY automatic picture registration
US8036458Nov 8, 2007Oct 11, 2011DigitalOptics Corporation Europe LimitedDetecting redeye defects in digital images
US8036460Oct 11, 2011DigitalOptics Corporation Europe LimitedAnalyzing partial face regions for red-eye detection in acquired digital images
US8050465Nov 1, 2011DigitalOptics Corporation Europe LimitedReal-time face tracking in a digital image acquisition device
US8055029Jun 18, 2007Nov 8, 2011DigitalOptics Corporation Europe LimitedReal-time face tracking in a digital image acquisition device
US8055067Jan 18, 2007Nov 8, 2011DigitalOptics Corporation Europe LimitedColor segmentation
US8055090Sep 14, 2010Nov 8, 2011DigitalOptics Corporation Europe LimitedDigital image processing using face detection information
US8081254Dec 20, 2011DigitalOptics Corporation Europe LimitedIn-camera based method of detecting defect eye with high accuracy
US8086031May 13, 2009Dec 27, 2011Microsoft CorporationRegion detection
US8090205Jul 28, 2005Jan 3, 2012Thomson LicensingMethod for edge matching in film and image processing
US8126208Dec 3, 2010Feb 28, 2012DigitalOptics Corporation Europe LimitedDigital image processing using face detection information
US8126217Apr 3, 2011Feb 28, 2012DigitalOptics Corporation Europe LimitedTwo stage detection for photographic eye artifacts
US8126218May 30, 2011Feb 28, 2012DigitalOptics Corporation Europe LimitedTwo stage detection for photographic eye artifacts
US8131016Dec 3, 2010Mar 6, 2012DigitalOptics Corporation Europe LimitedDigital image processing using face detection information
US8131021Apr 4, 2011Mar 6, 2012DigitalOptics Corporation Europe LimitedTwo stage detection for photographic eye artifacts
US8135184May 23, 2011Mar 13, 2012DigitalOptics Corporation Europe LimitedMethod and apparatus for detection and correction of multiple image defects within digital images using preview or other reference images
US8155397Sep 26, 2007Apr 10, 2012DigitalOptics Corporation Europe LimitedFace tracking in a camera processor
US8155468Jun 13, 2011Apr 10, 2012DigitalOptics Corporation Europe LimitedImage processing method and apparatus
US8160308Apr 17, 2012DigitalOptics Corporation Europe LimitedTwo stage detection for photographic eye artifacts
US8169486May 24, 2007May 1, 2012DigitalOptics Corporation Europe LimitedImage acquisition method and apparatus
US8170294May 1, 2012DigitalOptics Corporation Europe LimitedMethod of detecting redeye in a digital image
US8170350May 1, 2012DigitalOptics Corporation Europe LimitedForeground/background segmentation in digital images
US8175342Apr 3, 2011May 8, 2012DigitalOptics Corporation Europe LimitedTwo stage detection for photographic eye artifacts
US8175385May 8, 2012DigitalOptics Corporation Europe LimitedForeground/background segmentation in digital images with differential exposure calculations
US8180115May 15, 2012DigitalOptics Corporation Europe LimitedTwo stage detection for photographic eye artifacts
US8180173May 15, 2012DigitalOptics Corporation Europe LimitedFlash artifact eye defect correction in blurred images using anisotropic blurring
US8184900 *Aug 20, 2007May 22, 2012DigitalOptics Corporation Europe LimitedAutomatic detection and correction of non-red eye flash defects
US8199222Jun 12, 2012DigitalOptics Corporation Europe LimitedLow-light video frame enhancement
US8203621Jun 14, 2010Jun 19, 2012DigitalOptics Corporation Europe LimitedRed-eye filter method and apparatus
US8212864Jan 29, 2009Jul 3, 2012DigitalOptics Corporation Europe LimitedMethods and apparatuses for using image acquisition data to detect and correct image defects
US8212882May 27, 2010Jul 3, 2012DigitalOptics Corporation Europe LimitedHandheld article with movement discrimination
US8212897Mar 15, 2010Jul 3, 2012DigitalOptics Corporation Europe LimitedDigital image acquisition system with portrait mode
US8213737Jun 20, 2008Jul 3, 2012DigitalOptics Corporation Europe LimitedDigital image enhancement with reference images
US8224039Jul 17, 2012DigitalOptics Corporation Europe LimitedSeparating a directional lighting variability in statistical face modelling based on texture space decomposition
US8224108Dec 4, 2010Jul 17, 2012DigitalOptics Corporation Europe LimitedDigital image processing using face detection information
US8233674May 23, 2011Jul 31, 2012DigitalOptics Corporation Europe LimitedRed eye false positive filtering using face location and orientation
US8243182Aug 14, 2012DigitalOptics Corporation Europe LimitedMethod of making a digital camera image of a scene including the camera user
US8244053Dec 16, 2009Aug 14, 2012DigitalOptics Corporation Europe LimitedMethod and apparatus for initiating subsequent exposures based on determination of motion blurring artifacts
US8264575Sep 11, 2012DigitalOptics Corporation Europe LimitedRed eye filter method and apparatus
US8264576Sep 11, 2012DigitalOptics Corporation Europe LimitedRGBW sensor array
US8265388Sep 11, 2012DigitalOptics Corporation Europe LimitedAnalyzing partial face regions for red-eye detection in acquired digital images
US8270674Jan 3, 2011Sep 18, 2012DigitalOptics Corporation Europe LimitedReal-time face tracking in a digital image acquisition device
US8270751Sep 18, 2012DigitalOptics Corporation Europe LimitedMethod of notifying users regarding motion artifacts based on image analysis
US8285067Apr 17, 2011Oct 9, 2012DigitalOptics Corporation Europe LimitedMethod of notifying users regarding motion artifacts based on image analysis
US8320641 *Nov 27, 2012DigitalOptics Corporation Europe LimitedMethod and apparatus for red-eye detection using preview or other reference images
US8326066Mar 8, 2010Dec 4, 2012DigitalOptics Corporation Europe LimitedDigital image adjustable compression and resolution using face detection information
US8330831Dec 11, 2012DigitalOptics Corporation Europe LimitedMethod of gathering visual meta data using a reference image
US8334926Dec 18, 2012DigitalOptics Corporation Europe LimitedIn-camera based method of detecting defect eye with high accuracy
US8345114Jan 1, 2013DigitalOptics Corporation Europe LimitedAutomatic face and skin beautification using face detection
US8355039Jan 15, 2013DigitalOptics Corporation Europe LimitedScene background blurring including range measurement
US8363085Sep 16, 2010Jan 29, 2013DigitalOptics Corporation Europe LimitedScene background blurring including determining a depth map
US8363908May 3, 2007Jan 29, 2013DigitalOptics Corporation Europe LimitedForeground / background separation in digital images
US8379917Feb 19, 2013DigitalOptics Corporation Europe LimitedFace recognition performance using additional image features
US8384793Jul 30, 2009Feb 26, 2013DigitalOptics Corporation Europe LimitedAutomatic face and skin beautification using face detection
US8385610Jun 11, 2010Feb 26, 2013DigitalOptics Corporation Europe LimitedFace tracking for controlling imaging parameters
US8417055Apr 9, 2013DigitalOptics Corporation Europe LimitedImage processing method and apparatus
US8442349Dec 22, 2006May 14, 2013Nokia CorporationRemoval of artifacts in flash images
US8452169 *Feb 8, 2012May 28, 2013Csr Technology Inc.Control of artificial lighting of a scene to reduce effects of motion in the scene on an image being acquired
US8482620Mar 4, 2009Jul 9, 2013Csr Technology Inc.Image enhancement based on multiple frames and motion estimation
US8494232Feb 25, 2011Jul 23, 2013DigitalOptics Corporation Europe LimitedImage processing method and apparatus
US8494286Feb 5, 2008Jul 23, 2013DigitalOptics Corporation Europe LimitedFace detection in mid-shot digital images
US8494299Feb 8, 2010Jul 23, 2013DigitalOptics Corporation Europe LimitedMethod of determining PSF using multiple instances of a nominally similar scene
US8494300Apr 6, 2010Jul 23, 2013DigitalOptics Corporation Europe LimitedMethod of notifying users regarding motion artifacts based on image analysis
US8498452Aug 26, 2008Jul 30, 2013DigitalOptics Corporation Europe LimitedDigital image processing using face detection information
US8503800Feb 27, 2008Aug 6, 2013DigitalOptics Corporation Europe LimitedIllumination detection using classifier chains
US8503818Sep 25, 2007Aug 6, 2013DigitalOptics Corporation Europe LimitedEye defect detection in international standards organization images
US8509496Nov 16, 2009Aug 13, 2013DigitalOptics Corporation Europe LimitedReal-time face tracking with reference images
US8509561Feb 27, 2008Aug 13, 2013DigitalOptics Corporation Europe LimitedSeparating directional lighting variability in statistical face modelling based on texture space decomposition
US8515138May 8, 2011Aug 20, 2013DigitalOptics Corporation Europe LimitedImage processing method and apparatus
US8520082Oct 19, 2010Aug 27, 2013DigitalOptics Corporation Europe LimitedImage acquisition method and apparatus
US8520093Aug 31, 2009Aug 27, 2013DigitalOptics Corporation Europe LimitedFace tracker and partial face tracker for red-eye filter method and apparatus
US8593542Jun 17, 2008Nov 26, 2013DigitalOptics Corporation Europe LimitedForeground/background separation using reference images
US8605170 *May 2, 2008Dec 10, 2013Sony CorporationImaging device, method of processing captured image signal and computer program
US8649604Jul 23, 2007Feb 11, 2014DigitalOptics Corporation Europe LimitedFace searching and detection in a digital image acquisition device
US8675991Jun 2, 2006Mar 18, 2014DigitalOptics Corporation Europe LimitedModification of post-viewing parameters for digital images using region or feature information
US8682097Jun 16, 2008Mar 25, 2014DigitalOptics Corporation Europe LimitedDigital image enhancement with reference images
US8698914Jun 22, 2010Apr 15, 2014S1 CorporationMethod and apparatus for recognizing a protrusion on a face
US8698924Nov 8, 2010Apr 15, 2014DigitalOptics Corporation Europe LimitedTone mapping for low-light video frame enhancement
US8711234May 28, 2013Apr 29, 2014Csr Technology Inc.Image enhancement based on multiple frames and motion estimation
US8723912Sep 16, 2010May 13, 2014DigitalOptics Corporation Europe LimitedScene background blurring including face modeling
US8743274Oct 21, 2012Jun 3, 2014DigitalOptics Corporation Europe LimitedIn-camera based method of detecting defect eye with high accuracy
US8823830Oct 23, 2012Sep 2, 2014DigitalOptics Corporation Europe LimitedMethod and apparatus of correcting hybrid flash artifacts in digital images
US8831381Jan 23, 2013Sep 9, 2014Qualcomm IncorporatedDetecting and correcting skew in regions of text in natural images
US8878967Oct 11, 2010Nov 4, 2014DigitalOptics Corporation Europe LimitedRGBW sensor array
US8896725Jun 17, 2008Nov 25, 2014Fotonation LimitedImage capture device with contemporaneous reference image capture mechanism
US8923564Feb 10, 2014Dec 30, 2014DigitalOptics Corporation Europe LimitedFace searching and detection in a digital image acquisition device
US8948468Jun 26, 2003Feb 3, 2015Fotonation LimitedModification of viewing parameters for digital images using face detection information
US8983148Sep 12, 2011Mar 17, 2015Fotonation LimitedColor segmentation
US8989453Aug 26, 2008Mar 24, 2015Fotonation LimitedDigital image processing using face detection information
US8989516Dec 16, 2008Mar 24, 2015Fotonation LimitedImage processing method and apparatus
US9007480Jul 30, 2009Apr 14, 2015Fotonation LimitedAutomatic face and skin beautification using face detection
US9014480Mar 12, 2013Apr 21, 2015Qualcomm IncorporatedIdentifying a maximally stable extremal region (MSER) in an image by skipping comparison of pixels in the region
US9047540Mar 14, 2013Jun 2, 2015Qualcomm IncorporatedTrellis based word decoder with reverse pass
US9053361Jan 23, 2013Jun 9, 2015Qualcomm IncorporatedIdentifying regions of text to merge in a natural image or video frame
US9053545Mar 19, 2007Jun 9, 2015Fotonation LimitedModification of viewing parameters for digital images using face detection information
US9064191Mar 8, 2013Jun 23, 2015Qualcomm IncorporatedLower modifier detection and extraction from devanagari text images to improve OCR performance
US9076242Mar 14, 2013Jul 7, 2015Qualcomm IncorporatedAutomatic correction of skew in natural images and video
US9129381Jun 17, 2008Sep 8, 2015Fotonation LimitedModification of post-viewing parameters for digital images using image region or feature information
US9141874Mar 7, 2013Sep 22, 2015Qualcomm IncorporatedFeature extraction and use with a probability density function (PDF) divergence metric
US9160897Jun 11, 2008Oct 13, 2015Fotonation LimitedFast motion estimation method
US9183458Mar 12, 2013Nov 10, 2015Qualcomm IncorporatedParameter selection and coarse localization of interest regions for MSER processing
US9224034Dec 22, 2014Dec 29, 2015Fotonation LimitedFace searching and detection in a digital image acquisition device
US9262699Mar 14, 2013Feb 16, 2016Qualcomm IncorporatedMethod of handling complex variants of words through prefix-tree based decoding for Devanagiri OCR
US9412007Aug 31, 2009Aug 9, 2016Fotonation LimitedPartial face detector red-eye filter method and apparatus
US20030044178 *Jul 9, 2002Mar 6, 2003Knut OberhardtMethod for the automatic detection of red-eye defects in photographic image data
US20040223063 *Aug 5, 2003Nov 11, 2004Deluca Michael J.Detecting red eye filter and apparatus using meta-data
US20050031224 *Aug 5, 2003Feb 10, 2005Yury PrilutskyDetecting red eye filter and apparatus using meta-data
US20050058340 *Oct 4, 2004Mar 17, 2005Microsoft CorporationRed-eye detection based on red region detection with eye confirmation
US20050058342 *Oct 8, 2004Mar 17, 2005Microsoft CorporationRed-eye detection based on red region detection with eye confirmation
US20050140801 *Feb 4, 2004Jun 30, 2005Yury PrilutskyOptimized performance and performance for red-eye filter method and apparatus
US20060039690 *Aug 30, 2005Feb 23, 2006Eran SteinbergForeground/background segmentation in digital images with differential exposure calculations
US20060093212 *Oct 28, 2004May 4, 2006Eran SteinbergMethod and apparatus for red-eye detection in an acquired digital image
US20060093213 *May 6, 2005May 4, 2006Eran SteinbergMethod and apparatus for red-eye detection in an acquired digital image based on image quality pre and post filtering
US20060098237 *Nov 10, 2004May 11, 2006Eran SteinbergMethod and apparatus for initiating subsequent exposures based on determination of motion blurring artifacts
US20060098891 *Nov 10, 2004May 11, 2006Eran SteinbergMethod of notifying users regarding motion artifacts based on image analysis
US20060120599 *Sep 21, 2005Jun 8, 2006Eran SteinbergMethod and apparatus for red-eye detection in an acquired digital image
US20060152603 *Jan 11, 2005Jul 13, 2006Eastman Kodak CompanyWhite balance correction in digital camera images
US20060204034 *Jun 26, 2003Sep 14, 2006Eran SteinbergModification of viewing parameters for digital images using face detection information
US20060204055 *Jun 26, 2003Sep 14, 2006Eran SteinbergDigital image processing using face detection information
US20060204110 *Dec 27, 2004Sep 14, 2006Eran SteinbergDetecting orientation of digital images using face detection information
US20060282551 *May 6, 2005Dec 14, 2006Eran SteinbergRemote control apparatus for printer appliances
US20060282572 *May 6, 2005Dec 14, 2006Eran SteinbergRemote control apparatus for consumer electronic appliances
US20060284982 *Dec 2, 2005Dec 21, 2006Petronel BigioiMethod for establishing a paired connection between media devices
US20060285754 *May 30, 2006Dec 21, 2006Eran SteinbergIndoor/Outdoor Classification in Digital Images
US20070053671 *Mar 16, 2006Mar 8, 2007Kshitiz GargSystems and methods for reducing rain effects in images
US20070110305 *Oct 30, 2006May 17, 2007Fotonation Vision LimitedDigital Image Processing Using Face Detection and Skin Tone Information
US20070116379 *Nov 18, 2005May 24, 2007Peter CorcoranTwo stage detection for photographic eye artifacts
US20070147820 *Dec 27, 2005Jun 28, 2007Eran SteinbergDigital image acquisition system with portrait mode
US20070160307 *Mar 19, 2007Jul 12, 2007Fotonation Vision LimitedModification of Viewing Parameters for Digital Images Using Face Detection Information
US20070263104 *Mar 25, 2007Nov 15, 2007Fotonation Vision LimitedDetecting Red Eye Filter and Apparatus Using Meta-Data
US20070296833 *May 24, 2007Dec 27, 2007Fotonation Vision LimitedImage Acquisition Method and Apparatus
US20080013798 *Jun 12, 2007Jan 17, 2008Fotonation Vision LimitedAdvances in extending the aam techniques from grayscale to color images
US20080043122 *Jul 5, 2007Feb 21, 2008Fotonation Vision LimitedPerfecting the Effect of Flash within an Image Acquisition Devices Using Face Detection
US20080049970 *Aug 20, 2007Feb 28, 2008Fotonation Vision LimitedAutomatic detection and correction of non-red eye flash defects
US20080112599 *Nov 7, 2007May 15, 2008Fotonation Vision Limitedmethod of detecting redeye in a digital image
US20080118144 *Jun 27, 2005May 22, 2008Shu LinEdge Based Cmy Automatic Picture Registration
US20080122959 *Jul 28, 2005May 29, 2008Shu LinMethod for Edge Matching in Film and Image Processing
US20080143854 *Nov 18, 2007Jun 19, 2008Fotonation Vision LimitedPerfecting the optics within a digital image acquisition device using face detection
US20080186389 *Feb 21, 2008Aug 7, 2008Fotonation Vision LimitedImage Modification Based on Red-Eye Filter Analysis
US20080205712 *Feb 27, 2008Aug 28, 2008Fotonation Vision LimitedSeparating Directional Lighting Variability in Statistical Face Modelling Based on Texture Space Decomposition
US20080211937 *May 13, 2008Sep 4, 2008Fotonation Vision LimitedRed-eye filter method and apparatus
US20080219517 *Feb 27, 2008Sep 11, 2008Fotonation Vision LimitedIllumination Detection Using Classifier Chains
US20080219581 *Sep 18, 2007Sep 11, 2008Fotonation Vision LimitedImage Processing Method and Apparatus
US20080231713 *Mar 25, 2007Sep 25, 2008Fotonation Vision LimitedHandheld Article with Movement Discrimination
US20080232711 *Mar 5, 2008Sep 25, 2008Fotonation Vision LimitedTwo Stage Detection for Photographic Eye Artifacts
US20080240555 *Aug 2, 2006Oct 2, 2008Florin NanuTwo Stage Detection for Photographic Eye Artifacts
US20080267461 *Jul 3, 2008Oct 30, 2008Fotonation Ireland LimitedReal-time face tracking in a digital image acquisition device
US20080284875 *May 2, 2008Nov 20, 2008Sony CorporationImaging device, method of processing captured image signal and computer program
US20080292193 *May 24, 2007Nov 27, 2008Fotonation Vision LimitedImage Processing Method and Apparatus
US20080309769 *Jun 11, 2008Dec 18, 2008Fotonation Ireland LimitedFast Motion Estimation Method
US20080309770 *Jun 18, 2007Dec 18, 2008Fotonation Vision LimitedMethod and apparatus for simulating a camera panning effect
US20080316327 *Jun 17, 2008Dec 25, 2008Fotonation Ireland LimitedImage capture device with contemporaneous reference image capture mechanism
US20080316328 *Jun 17, 2008Dec 25, 2008Fotonation Ireland LimitedForeground/background separation using reference images
US20080316341 *Aug 15, 2008Dec 25, 2008Fotonation Vision LimitedDetecting red eye filter and apparatus using meta-data
US20080317339 *Jun 19, 2008Dec 25, 2008Fotonation Ireland LimitedMethod and apparatus for red-eye detection using preview or other reference images
US20080317357 *Jun 16, 2008Dec 25, 2008Fotonation Ireland LimitedMethod of gathering visual meta data using a reference image
US20080317378 *Jun 16, 2008Dec 25, 2008Fotonation Ireland LimitedDigital image enhancement with reference images
US20090003661 *Sep 3, 2008Jan 1, 2009Fotonation Vision LimitedSeparating a Directional Lighting Variability In Statistical Face Modelling Based On Texture Space Decomposition
US20090003708 *Jun 17, 2008Jan 1, 2009Fotonation Ireland LimitedModification of post-viewing parameters for digital images using image region or feature information
US20090027520 *Aug 19, 2008Jan 29, 2009Fotonation Vision LimitedRed-eye filter method and apparatus
US20090040342 *Oct 17, 2008Feb 12, 2009Fotonation Vision LimitedImage Blurring
US20090052749 *Oct 30, 2008Feb 26, 2009Fotonation Vision LimitedDigital Image Processing Using Face Detection Information
US20090052750 *Oct 30, 2008Feb 26, 2009Fotonation Vision LimitedDigital Image Processing Using Face Detection Information
US20090080713 *Sep 26, 2007Mar 26, 2009Fotonation Vision LimitedFace tracking in a camera processor
US20090102949 *Jul 5, 2007Apr 23, 2009Fotonation Vision LimitedPerfecting the Effect of Flash within an Image Acquisition Devices using Face Detection
US20090123063 *Nov 8, 2007May 14, 2009Fotonation Vision LimitedDetecting Redeye Defects in Digital Images
US20090179999 *Jul 16, 2009Fotonation Ireland LimitedImage Processing Method and Apparatus
US20090185753 *May 6, 2008Jul 23, 2009Fotonation Ireland LimitedImage processing method and apparatus
US20090189998 *Jan 29, 2009Jul 30, 2009Fotonation Ireland LimitedMethods And Apparatuses For Using Image Acquisition Data To Detect And Correct Image Defects
US20090196466 *Feb 5, 2008Aug 6, 2009Fotonation Vision LimitedFace Detection in Mid-Shot Digital Images
US20090208056 *Dec 11, 2008Aug 20, 2009Fotonation Vision LimitedReal-time face tracking in a digital image acquisition device
US20090238452 *May 13, 2009Sep 24, 2009Microsoft CorporationRegion Detection
US20090244296 *Mar 26, 2008Oct 1, 2009Fotonation Ireland LimitedMethod of making a digital camera image of a scene including the camera user
US20090273685 *Aug 21, 2006Nov 5, 2009Fotonation Vision LimitedForeground/Background Segmentation in Digital Images
US20090303343 *Jun 16, 2009Dec 10, 2009Fotonation Ireland LimitedLow-light video frame enhancement
US20100026831 *Feb 4, 2010Fotonation Ireland LimitedAutomatic face and skin beautification using face detection
US20100026832 *Jul 30, 2009Feb 4, 2010Mihai CiucAutomatic face and skin beautification using face detection
US20100040284 *Sep 14, 2009Feb 18, 2010Fotonation Vision LimitedMethod and apparatus of correcting hybrid flash artifacts in digital images
US20100053362 *Aug 31, 2009Mar 4, 2010Fotonation Ireland LimitedPartial face detector red-eye filter method and apparatus
US20100053368 *Aug 31, 2009Mar 4, 2010Fotonation Ireland LimitedFace tracker and partial face tracker for red-eye filter method and apparatus
US20100054533 *Mar 4, 2010Fotonation Vision LimitedDigital Image Processing Using Face Detection Information
US20100054549 *Mar 4, 2010Fotonation Vision LimitedDigital Image Processing Using Face Detection Information
US20100060727 *Nov 16, 2009Mar 11, 2010Eran SteinbergReal-time face tracking with reference images
US20100085439 *Apr 8, 2010Hon Hai Precision Industry Co., Ltd.Image capture device and method thereof
US20100092039 *Dec 14, 2009Apr 15, 2010Eran SteinbergDigital Image Processing Using Face Detection Information
US20100146165 *Feb 24, 2010Jun 10, 2010Fotonation Vision LimitedRemote control apparatus for consumer electronic appliances
US20100165140 *Mar 8, 2010Jul 1, 2010Fotonation Vision LimitedDigital image adjustable compression and resolution using face detection information
US20100182454 *Dec 17, 2009Jul 22, 2010Fotonation Ireland LimitedTwo Stage Detection for Photographic Eye Artifacts
US20100182458 *Mar 15, 2010Jul 22, 2010Fotonation Ireland LimitedDigital image acquisition system with portrait mode
US20100201826 *Feb 8, 2010Aug 12, 2010Fotonation Vision LimitedMethod of determining psf using multiple instances of a nominally similar scene
US20100201827 *Dec 16, 2009Aug 12, 2010Fotonation Ireland LimitedMethod and apparatus for initiating subsequent exposures based on determination of motion blurring artifacts
US20100238309 *May 27, 2010Sep 23, 2010Fotonation Vision LimitedHandheld Article with Movement Discrimination
US20100260414 *Jun 27, 2010Oct 14, 2010Tessera Technologies Ireland LimitedDetecting redeye defects in digital images
US20100272363 *Jul 23, 2007Oct 28, 2010Fotonation Vision LimitedFace searching and detection in a digital image acquisition device
US20100278452 *Dec 22, 2006Nov 4, 2010Nokia CorporationRemoval of Artifacts in Flash Images
US20100302394 *May 28, 2009Dec 2, 2010Phanish Hanagal Srinivasa RaoBlinked eye artifact removal for a digital imaging device
US20100328472 *Apr 6, 2010Dec 30, 2010Fotonation Vision LimitedMethod of Notifying Users Regarding Motion Artifacts Based on Image Analysis
US20110025859 *Feb 3, 2011Tessera Technologies Ireland LimitedForeground/Background Segmentation in Digital Images
US20110026780 *Jun 11, 2010Feb 3, 2011Tessera Technologies Ireland LimitedFace tracking for controlling imaging parameters
US20110053654 *Mar 3, 2011Tessera Technologies Ireland LimitedMethod of Making a Digital Camera Image of a Scene Including the Camera User
US20110063465 *Mar 17, 2011Fotonation Ireland LimitedAnalyzing Partial Face Regions for Red-Eye Detection in Acquired Digital Images
US20110069182 *Mar 24, 2011Tessera Technologies Ireland LimitedTwo Stage Detection For Photographic Eye Artifacts
US20110069208 *Nov 22, 2010Mar 24, 2011Tessera Technologies Ireland LimitedTwo Stage Detection For Photographic Eye Artifacts
US20110081052 *Apr 7, 2011Fotonation Ireland LimitedFace recognition performance using additional image features
US20110102628 *May 5, 2011Tessera Technologies Ireland LimitedForeground/Background Segmentation in Digital Images
US20110102638 *May 5, 2011Tessera Technologies Ireland LimitedRgbw sensor array
US20110102643 *Nov 8, 2010May 5, 2011Tessera Technologies Ireland LimitedPartial Face Detector Red-Eye Filter Method and Apparatus
US20110115928 *Oct 19, 2010May 19, 2011Tessera Technologies Ireland LimitedImage Acquisition Method and Apparatus
US20110115949 *Dec 4, 2010May 19, 2011Tessera Technologies Ireland LimitedTwo Stage Detection for Photographic Eye Artifacts
US20110129121 *Jan 3, 2011Jun 2, 2011Tessera Technologies Ireland LimitedReal-time face tracking in a digital image acquisition device
US20110134271 *Dec 1, 2010Jun 9, 2011Tessera Technologies Ireland LimitedDetecting Red Eye Filter and Apparatus Using Meta-Data
US20110157408 *Jun 30, 2011Tessera Technologies Ireland LimitedForeground/Background Segmentation in Digital Images with Differential Exposure Calculations
US20110193989 *Aug 11, 2011Tessera Technologies Ireland LimitedMethod of Notifying Users Regarding Motion Artifacts Based on Image Analysis
US20110199493 *Aug 18, 2011Tessera Technologies Ireland LimitedMethod of Notifying Users Regarding Motion Artifacts Based on Image Analysis
US20110205397 *Feb 24, 2010Aug 25, 2011John Christopher HahnPortable imaging device having display with improved visibility under adverse conditions
US20110211095 *Sep 1, 2011Tessera Technologies Ireland LimitedTwo Stage Detection For Photographic Eye Artifacts
US20110221936 *Sep 15, 2011Tessera Technologies Ireland LimitedMethod and Apparatus for Detection and Correction of Multiple Image Defects Within Digital Images Using Preview or Other Reference Images
US20110222730 *Sep 15, 2011Tessera Technologies Ireland LimitedRed Eye False Positive Filtering Using Face Location and Orientation
US20110234847 *Sep 29, 2011Tessera Technologies Ireland LimitedImage Processing Method and Apparatus
US20110235912 *Sep 29, 2011Tessera Technologies Ireland LimitedImage Processing Method and Apparatus
US20120148224 *Jun 14, 2012Csr Technology Inc.Control of artificial lighting of a scene to reduce effects of motion in the scene on an image being acquired
CN102483851A *Jun 22, 2010May 30, 2012株式会社S1Method and apparatus for prominent facial features recognition
DE112006001017B4 *Apr 27, 2006Apr 17, 2014Hewlett-Packard Development Company, L.P.Verfahren und Vorrichtung zum Eingliedern von Irisfarbe in eine Rotes-Auge-Korrektur
EP1984892A2 *Aug 21, 2006Oct 29, 2008Fotonation Vision LimitedForeground/background segmentation in digital images
EP1984892B1 *Aug 21, 2006Jul 29, 2015FotoNation LimitedForeground/background segmentation in digital images
EP2153374A1 *May 24, 2007Feb 17, 2010Fotonation Vision LimitedImage processing method and apparatus
WO2007057064A1 *Aug 25, 2006May 24, 2007Fotonation Vision LimitedTwo stage detection for photographic eye artifacts
WO2008078153A1 *Dec 22, 2006Jul 3, 2008Nokia CorporationRemoval of artifacts in flash images
WO2010017953A1Aug 11, 2009Feb 18, 2010Fotonation Ireland LimitedIn-camera based method of detecting defect eye with high accuracy
WO2010145910A1May 20, 2010Dec 23, 2010Tessera Technologies Ireland LimitedLow-light video frame enhancement
Classifications
U.S. Classification348/239, 348/E05.038
International ClassificationG06T1/00, G06T7/40, H04N5/235, G06T7/00, G06T5/00, G06T5/50, H04N1/62, G06K9/00
Cooperative ClassificationH04N5/2354, G06K9/00248, G06T7/408, G06T5/008, G06K9/0061, H04N1/624, H04N1/62, G06T2207/10024, G06T2207/30201, H04N1/6086, G06T2207/30216, H04N9/70, H04N5/23229, H04N5/23293, G06K9/00234
European ClassificationG06K9/00F1L, G06T1/00A, H04N1/60R2, G06T5/00M1, H04N1/62C, G06T5/00D, H04N5/235L, G06T7/00B, H04N1/62, G06K9/00S2, G06T7/40C
Legal Events
DateCodeEventDescription
Nov 2, 2004ASAssignment
Owner name: FOTONATION VISION LTD., IRELAND
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:STEINBERG, ERAN;PRILUTSKY, YURY;CORCORAN, PETER;AND OTHERS;REEL/FRAME:015324/0620;SIGNING DATES FROM 20040817 TO 20041016
Owner name: FOTONATION VISION LTD.,IRELAND
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:STEINBERG, ERAN;PRILUTSKY, YURY;CORCORAN, PETER;AND OTHERS;SIGNING DATES FROM 20040817 TO 20041016;REEL/FRAME:015324/0620
Nov 29, 2010ASAssignment
Owner name: TESSERA TECHNOLOGIES IRELAND LIMITED, IRELAND
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:FOTONATION VISION LIMITED;REEL/FRAME:025424/0623
Effective date: 20101001
Oct 30, 2011ASAssignment
Owner name: DIGITALOPTICS CORPORATION EUROPE LIMITED, IRELAND
Free format text: CHANGE OF NAME;ASSIGNOR:TESSERA TECHNOLOGIES IRELAND LIMITED;REEL/FRAME:027144/0293
Effective date: 20110713
Oct 23, 2013FPAYFee payment
Year of fee payment: 4
Dec 2, 2014ASAssignment
Owner name: FOTONATION LIMITED, IRELAND
Free format text: CHANGE OF NAME;ASSIGNOR:DIGITALOPTICS CORPORATION EUROPE LIMITED;REEL/FRAME:034512/0972
Effective date: 20140609