WO2003007240A1 - Procédé et système pour modifier la qualité d'image - Google Patents
Procédé et système pour modifier la qualité d'image (Method and system for modifying image quality)
- Publication number
- WO2003007240A1 WO2003007240A1 PCT/FR2002/001911 FR0201911W WO03007240A1 WO 2003007240 A1 WO2003007240 A1 WO 2003007240A1 FR 0201911 W FR0201911 W FR 0201911W WO 03007240 A1 WO03007240 A1 WO 03007240A1
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- formatted information
- chain
- devices
- faults
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T1/00—General purpose image data processing
- G06T1/0007—Image acquisition
-
- G06T3/10—
-
- G06T5/70—
-
- G06T5/73—
-
- G06T5/80—
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/00002—Diagnosis, testing or measuring; Detecting, analysing or monitoring not otherwise provided for
- H04N1/00007—Diagnosis, testing or measuring; Detecting, analysing or monitoring not otherwise provided for relating to particular apparatus or devices
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/00002—Diagnosis, testing or measuring; Detecting, analysing or monitoring not otherwise provided for
- H04N1/00026—Methods therefor
- H04N1/00045—Methods therefor using a reference pattern designed for the purpose, e.g. a test chart
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/00002—Diagnosis, testing or measuring; Detecting, analysing or monitoring not otherwise provided for
- H04N1/00071—Diagnosis, testing or measuring; Detecting, analysing or monitoring not otherwise provided for characterised by the action taken
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/387—Composing, repositioning or otherwise geometrically modifying originals
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/40—Picture signal circuits
- H04N1/40093—Modification of content of picture, e.g. retouching
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/46—Colour picture communication systems
- H04N1/56—Processing of colour picture signals
- H04N1/58—Edge or detail enhancement; Noise or error suppression, e.g. colour misregistration correction
Definitions
- the present invention relates to a method and a system for modifying the quality of at least one image originating from or intended for a chain of devices.
- the invention relates to a method for modifying the quality of at least one image originating from or intended for a given chain of devices.
- the determined device chain comprises at least one image capture device and / or at least one image rendering device; the image capture devices and / or image rendering devices, progressively put on the market by distinct economic players, belong to an indeterminate set of devices; the devices of the set of devices have faults which can be characterized by formatted information; the process comprises, for the image concerned, the following steps: - the step of listing sources of formatted information relating to the devices of the set of devices,
- the method is such that the automatic search is carried out by means of indexes obtained directly or indirectly from an analysis:
- the apparatuses of the chain of devices are identified by identifiers, in particular a bar code
- the analysis with a view to searching for specific formatted information includes the step of determining the identifiers.
- the method is such that the image, the index and / or the identifier are contained in the same file. It results from the combination of technical features that it is possible to implement a posteriori the method according to the invention in the case where certain apparatuses of the chain have been put on the market before the formatted information concerning them has been established.
- the method is such that the image and at least part of the specific formatted information are contained in the same image file. It results from the combination of technical features that it is possible to automatically search for the formatted information in the image file.
- the method further comprises the step of previously storing at least part of the formatted information in a database; the method further comprises the step of updating the database.
- the method is such that one of the devices in the device chain has at least one variable characteristic depending on the image, in particular the focal length, a fraction of the specific formatted information is linked to the defects of the apparatus having the variable characteristic, the method further comprising the following steps:
- the step of determining the value of the variable characteristics for said image, - the step of determining the fraction of said specific formatted information taking into account the values of the variable characteristics thus obtained.
- the image is contained in a file
- the method is such that, to determine the value of the variable characteristic, data present in the file are used, in particular data, for example the focal length, in a format such as the Exif standard. It results from the combination of technical features that it is possible to implement a posteriori the method according to the invention in the case where the device having the variable characteristic was put on the market before the formatted information concerning it had been established.
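This lookup of a variable characteristic can be illustrated with a small sketch. The tag constant below is the real Exif FocalLength tag (0x920A), but the dictionaries standing in for a parsed Exif block and for the fault database are hypothetical, as is the nearest-neighbour selection rule.

```python
# Illustrative sketch (not the patent's exact procedure): read the focal
# length from parsed Exif data and pick the formatted information measured
# at the nearest focal length. The dicts are hypothetical stand-ins for a
# real Exif parser and a real fault database.

EXIF_FOCAL_LENGTH = 0x920A  # standard Exif tag for focal length

def select_formatted_info(exif, info_by_focal):
    """Return the formatted information whose measured focal length is
    closest to the one recorded in the image's Exif data."""
    focal = exif[EXIF_FOCAL_LENGTH]
    nearest = min(info_by_focal, key=lambda f: abs(f - focal))
    return info_by_focal[nearest]

# Hypothetical distortion coefficients measured at two focal lengths:
db = {35.0: {"k1": -0.021}, 70.0: {"k1": -0.008}}
info = select_formatted_info({EXIF_FOCAL_LENGTH: 50.0}, db)  # nearest: 35.0
```

In practice the Exif block would be read from the image file itself, which is what allows the method to be applied a posteriori to devices already on the market.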
- the method is such that, to modify the quality of at least one image originating from or intended for a chain of devices: - a virtual device with faults equivalent to at least part of the faults of at least one device in the chain of devices, hereinafter referred to as the original faults, is determined, - the virtual formatted information linked to the faults of the virtual device is determined, - to determine the specific formatted information linked to all the devices in the chain of devices, the virtual formatted information is substituted for the specific formatted information relating to the original faults.
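The substitution step can be sketched as follows; combining the per-device faults by simple summation is an assumption made for illustration, not a rule stated by the patent.

```python
# Illustrative sketch: merge the fault parameters of every device in the
# chain into a single virtual device, so that one set of virtual formatted
# information can be substituted for the per-device sets. Faults are
# naively summed here; real fault composition would be model-specific.

def virtual_device(devices):
    """Combine per-device fault parameters into one equivalent virtual device."""
    combined = {}
    for dev in devices:
        for fault, value in dev.items():
            combined[fault] = combined.get(fault, 0.0) + value
    return combined

# A capture device and a rendering device, each with hypothetical faults:
chain = [{"k1": -0.02, "vignetting": 0.10}, {"k1": 0.005}]
virtual = virtual_device(chain)
```

The benefit suggested by the claim is that a single correction pass over the virtual device's faults replaces one pass per real device.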
- the method is intended to modify the quality of at least one color plane of a color image
- the color plane is characterized by a determined color
- the specific formatted information furthermore includes data relating to the determined color
- a color plane is calculated using the data relating to the determined color and to the image.
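The color-plane calculation above can be given a minimal sketch; representing the determined color as per-channel weights, and the exact combination formula, are assumptions for illustration only.

```python
# Illustrative sketch: compute one color plane of an RGB image as a
# weighted combination of its channels. The weight triple encodes the
# determined color; both the weights and the formula are hypothetical.

def color_plane(pixels, weights):
    """Weighted per-pixel combination of channels yielding one color plane."""
    return [sum(w * c for w, c in zip(weights, px)) for px in pixels]

# Extracting a pure green plane from two RGB pixels:
plane = color_plane([(10, 200, 30), (0, 50, 0)], (0.0, 1.0, 0.0))
```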
- the method further comprises, in the case where the process of searching for specific formatted information does not succeed for one of the devices in the chain of devices, the step of calculating the unknown formatted information.
- the method further comprises the step of calculating the unknown formatted information linked to an appliance of the appliance chain:
- the method further comprises, for an apparatus for capturing images of the chain of devices, the step of calculating unknown formatted information:
- the unknown formatted information is at least partly composed of the parameters of the chosen configurable transformation models.
- the method further comprises: the step of calculating the differences between the transformed image and the class of synthetic images,
- the method is such that one of the devices in the device chain has at least one variable characteristic depending on the image, in particular the focal length and / or the aperture; a fraction of the specific formatted information is linked to the faults of the device having the variable characteristic(s); each variable characteristic is likely to be associated with a value to form a combination made up of all the variable characteristics and values; the method further comprises the step of determining the fraction of the unknown formatted information: - by selecting predetermined combinations,
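Selecting among predetermined combinations of variable characteristics can be sketched as below; the distance metric, the parameter names and the stored values are all assumptions.

```python
# Illustrative sketch: formatted information is precomputed for a few
# predetermined (focal length, aperture) combinations; the fraction for a
# given image is taken from the combination closest to the image's values.
# The L1 distance and all numeric values are hypothetical choices.

def fraction_for(combinations, focal, aperture):
    """Select the predetermined combination nearest to the image's
    variable characteristics and return its formatted information."""
    key = min(combinations,
              key=lambda fa: abs(fa[0] - focal) + abs(fa[1] - aperture))
    return combinations[key]

combos = {(35.0, 2.8): {"k1": -0.021}, (70.0, 4.0): {"k1": -0.008}}
fraction = fraction_for(combos, 40.0, 2.8)  # nearest combination: (35.0, 2.8)
```

A real implementation would more likely interpolate between neighbouring combinations rather than pick the nearest one, but the lookup structure is the same.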
- the method further comprises, for a means of restoring images of the chain of devices, the step of producing data characterizing faults in the means of restoring the images, in particular the characteristics of distortion.
- the unknown formatted information is at least partly composed of the data characterizing defects in the restitution means.
- the method is such that the specific formatted information, linked to an appliance or to several appliances of the appliance chain, is determined so that it can be applied to similar appliances. It results from the combination of technical features that the implementation of the method involves only a limited quantity of formatted information.
- the method is such that the image includes associated information, in particular a digital signature, the steps of the method are implemented so that they conserve or modify the associated information.
- the method further comprises the step of associating information with the modified image, in particular the information indicating that it has been modified.
- the method is more particularly intended for modifying the visual quality of the image for an observer
- the formatted information linked to the faults of the apparatuses of the apparatus chain further comprises formatted information linked to the characteristics of the observer's vision, including abnormalities of the eyes and / or brain of the observer.
- the invention also relates to an application of the method described above.
- the purpose of the application is to improve the quality of the images processed by image processing software or image processing components, by correcting the effects of at least one of the faults of the devices in the device chain. It results from the combination of technical features that the quality of the processed images is improved, if not perfect, without resorting to expensive devices.
- the object of the application is that the quality of the images processed by the image processing software or the image processing components is comparable to that of images produced with a chain of reference devices.
- the application is such that, in order for the quality of the processed images to be comparable to that of images produced with a reference appliance chain, formatted information relating to the appliance chain is produced taking into account the faults of the reference device chain.
- the invention relates to a system for modifying the quality of at least one image originating from or intended for a given chain of devices.
- the determined device chain includes at least one image capture device and / or at least one image rendering device; the image capture devices and / or image rendering devices, progressively placed on the market by separate economic players, belong to an indeterminate set of devices; the devices of the set of devices have faults which can be characterized by formatted information.
- the system comprises computer processing means making it possible, for the image concerned, to: - list sources of formatted information relating to the devices of the set of devices, - automatically search, among the formatted information thus listed, for specific formatted information relating to the determined device chain,
- the system is such that the computer processing means carry out the search automatically by means of an index; the index is obtained directly or indirectly from an analysis: - of the image, and / or
- the devices in the device chain are identified by identifiers, in particular a bar code
- the analysis means include identification means for determining the identifiers.
- the system is such that the image, the index and / or the identifier are contained in the same file.
- the system is such that the image and at least part of the specific formatted information are contained in the same image file.
- the system further comprises storage means for previously storing at least part of the formatted information in a database; the system further comprises updating means for updating the database.
- the system is such that one of the devices in the device chain has at least one variable characteristic depending on the image, in particular the focal length, a fraction of the specific formatted information is linked to the defects of the device having the variable characteristic, the system further comprises calculation means for determining: the value of the variable characteristics, for the image concerned,
- the image is contained in a file
- the system is such that, to determine the value of the variable characteristic, the system comprises computer processing means for processing data present in the file, in particular data, for example the focal length, in a format such as the Exif standard.
- the system is such that, to modify the quality of at least one image coming from or intended for a chain of devices, the system comprises computer processing means for determining: a virtual device having faults equivalent to at least some of the faults of at least one device in the device chain, hereinafter referred to as the original faults,
- the system is such that, in order to determine the specific formatted information linked to all of the apparatuses in the chain of apparatuses, the computer processing means comprise substitution means for substituting the virtual formatted information for the specific formatted information relating to the original faults.
- the system is intended to modify the quality of at least one color plane of a color image
- the color plane is characterized by a determined color
- the specific formatted information also comprises data relating to the determined color
- the system comprises calculation means for calculating a color plane using the data relating to the determined color and to the image.
- the system further comprises, in the case where the process of searching for specific formatted information does not succeed for one of the apparatuses in the chain of apparatuses, calculation means for calculating the unknown formatted information.
- the system is such that the calculation means for calculating the unknown formatted information, linked to an appliance in the appliance chain, include processing means for measuring the faults of the appliance and / or for simulating the appliance.
- the system further comprises, for an apparatus for capturing images of the chain of apparatuses, calculation means calculating the unknown formatted information by producing a class of synthetic images by mathematical projections.
- the image capturing apparatus capturing at least one reference image of each reference scene
- the calculation means calculating the unknown formatted information by choosing, from a set of configurable transformation models, the one making it possible to transform the reference image into a transformed image close to the class of synthetic images of the reference scene, the transformed image exhibiting a deviation from the class of synthetic images; the unknown formatted information is at least partly composed of the parameters of the chosen configurable transformation models.
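The model-selection step can be sketched in miniature. The 1-D "images", the gain-style transformation model and the least-squares deviation are toy assumptions; real configurable models (e.g. polynomial distortion models) would have more parameters.

```python
# Illustrative sketch: among candidate parameters of a configurable
# transformation model, keep the one whose transform of the reference image
# deviates least (squared error) from the synthetic image of the scene.

def choose_model(reference, synthetic, candidates, transform):
    """Return the candidate parameters minimizing the squared deviation
    between the transformed reference image and the synthetic image."""
    def deviation(params):
        transformed = transform(reference, params)
        return sum((a - b) ** 2 for a, b in zip(transformed, synthetic))
    return min(candidates, key=deviation)

reference = [1.0, 2.0, 3.0]
synthetic = [2.0, 4.0, 6.0]   # the true model doubles every value
best = choose_model(reference, synthetic, [0.5, 1.0, 2.0, 3.0],
                    lambda img, gain: [gain * p for p in img])
```

The retained parameters (here, the gain) are exactly what the claim calls the unknown formatted information.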
- the system further comprises computer processing means for:
- the system is such that one of the devices in the device chain has at least one variable characteristic depending on the image, in particular the focal length and / or the aperture; a fraction of the specific formatted information is linked to the faults of the device having the variable characteristic(s); each variable characteristic is likely to be associated with a value to form a combination consisting of all the variable characteristics and the values; the system furthermore comprises computer processing means for determining the fraction of the unknown formatted information:
- the system further comprises, for a means of restoring images of the chain of devices, computer processing means for producing data characterizing faults in the means of restoring the images, in particular the distortion characteristics, the unknown formatted information is at least partly composed of the data characterizing faults in the restitution means.
- the system is such that the specific formatted information, linked to an appliance or to several appliances in the appliance chain, is determined so that it can be applied to similar appliances.
- the system is such that the image includes associated information, in particular a digital signature.
- the system is implemented in such a way that it retains or modifies the associated information.
- the system further comprises computer processing means for associating information with the modified image, in particular information indicating that it has been modified.
- the system is more particularly intended to modify the visual quality of the image for an observer.
- the formatted information linked to the faults of the apparatuses of said chain of apparatuses further comprises formatted information linked to the vision characteristics of said observer, in particular anomalies of functioning of the eyes and / or brain of said observer.
- FIG. 1 a schematic view of an image capture
- FIG. 2 a schematic view of an image restitution
- FIG. 3 a schematic view of the pixels of an image
- FIG. 5 the flow diagram of the method making it possible to calculate the difference between the mathematical image and the corrected image
- FIG. 6 the flow diagram of the method making it possible to obtain the best rendering transformation for an image rendering means
- FIG. 7 a schematic view of the elements making up the system to which the invention applies
- FIG. 9d a schematic profile view of a real point of an image
- FIG. 12 the flow diagram of the method making it possible to obtain the best transformation for an image capture device
- FIGS. 13a to 13c connection diagrams of example embodiments of systems making it possible to correct an image
- FIGS. 14a to 14c flow diagrams of example embodiments of methods making it possible to implement automatic image correction
- FIG. 15 a flow diagram of a method for substituting a virtual device for a chain of devices.
- FIG. 16.2 a diagram showing an apparatus having variable characteristics
- FIG. 16.3 a diagram involving one or more vision defects of an observer
- FIG. 16.4 a processing diagram of the characteristics of a virtual device
- FIG. 16.5 a diagram showing the addition of information associated with a corrected image
- FIG. 16 a diagram illustrating the fact that formatted information can relate to one or more batches of devices
- FIG. 17 a description of an example of implementation of the method and system according to the invention.
- the devices in the P3 device chain, in particular image capture devices and / or image rendering devices, are gradually being put on the market by distinct economic players and belong to an indeterminate set of devices, also called the P75 set of devices.
- a device can be in particular: - an image capture device, such as for example a disposable camera, a digital camera, a reflex camera, a scanner, a fax machine, an endoscope, a camcorder, a surveillance camera, a toy, a camera integrated into or connected to a telephone, a personal assistant or a computer, a thermal camera, an ultrasound machine,
- an image reproduction apparatus or image reproduction means 19, such as for example a screen, a projector, a television set, virtual reality glasses or a printer, or an apparatus including its installation, for example a projector, a screen and the way they are positioned,
- an apparatus which one wishes to emulate, to produce images having, for example, an appearance similar to those produced by a Leica brand apparatus,
- an image processing device, for example zoom software which has the side effect of adding blur,
- a more complex device, such as a scanner / fax / printer, a Minilab for photo printing, or a video conference device, which can be considered as one device or several devices.
- a set of devices is called a device chain P3.
- the notion of chain of apparatuses P3 can also include a notion of order.
- a camera, a scanner, a printer, for example in a photo printing Minilab,
- a digital camera for example in a photo printing Minilab
- a scanner for example in a photo printing Minilab
- a screen or a printer for example in a computer
- a fault P5 of a device of the set of devices P75 is a fault linked to the characteristics of the optics and / or the sensor and / or the electronics and / or the software integrated into a device; examples of P5 faults are distortion, blurring, vignetting, chromatic aberration, color rendering, flash uniformity, sensor noise, grain, astigmatism and spherical aberration.
- a P2 image is a digital image captured, modified or rendered by a device.
- the P2 image can come from a device in the P3 device chain.
- the P2 image can be intended for a device in the P3 device chain. More generally, the image P2 can come from and / or be intended for the chain of apparatuses P3.
- a still image of a sequence of images is also called image P2.
- Formatted information 15 is data linked to the P5 faults of one or more apparatuses of the apparatus chain P3, allowing image processing means to modify the quality of P2 images while taking into account the P5 faults of the devices.
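The notion of formatted information 15 lends itself to a concrete illustration; the layout below, including every device identifier, fault name and value, is a hypothetical sketch and not a structure prescribed by the patent.

```python
# Illustrative sketch of how formatted information 15 might be organized:
# fault data keyed by a device identifier. All names and values are
# hypothetical placeholders.

formatted_information = {
    "capture_device_A": {
        "distortion": {"k1": -0.02, "k2": 0.001},
        "vignetting": {"strength": 0.15},
    },
    "rendering_device_B": {
        "color_rendering": {"gamma": 2.2},
    },
}

def faults_for(info, device_id):
    """Return the fault data (formatted information) for one device,
    or an empty dict when the search does not succeed."""
    return info.get(device_id, {})
```

The empty-dict fallback mirrors the case, discussed above, where the search for specific formatted information does not succeed and unknown formatted information must be calculated.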
- various methods and systems can be used based on measurements, and / or captures or restitution of references, and / or simulations.
- the formatted information 15 is intended for image processing means, in particular software, in order to modify the quality of the images processed by the image processing means; the chain of apparatuses P3 notably comprises at least one image capture apparatus and / or at least one restitution means and / or at least one observer; the method comprises the step of producing data characterizing the P5 faults of the devices of the device chain P3; this data is the formatted information 15.
- to produce the formatted information 15, it is possible, for example, to use the method and the system described in the international patent application filed on the same day as the present application on behalf of the company Vision IQ and under the title: "Method and system for producing formatted information related to geometric distortions." In this application, there is described a method for producing formatted information related to the devices of an appliance chain P3.
- the chain of apparatuses P3 notably comprises at least one image capture apparatus and / or at least one image restitution apparatus.
- the method includes the step of producing formatted information 15 linked to the geometric distortions of at least one device in the chain.
- the device making it possible to capture or restore an image on a support
- the device comprises at least one fixed characteristic and / or one variable characteristic depending on the image
- the fixed characteristic and / or variable characteristic is likely to be associated with one or more characteristic values, in particular the focal length and / or the focus and their associated characteristic values
- the method comprises the step of producing measured formatted information linked to the geometric distortions of the camera from a measured field
- the formatted information 15 may include the measured formatted information.
- to produce the formatted information 15, it is possible, for example, to use the method and the system described in the international patent application filed on the same day as the present application on behalf of the company Vision IQ and under the title: "Method and system for producing formatted information linked to faults in at least one device in a chain, in particular blurring."
- a method is described for producing formatted information linked to the devices of a chain of devices P3.
- the chain of apparatuses P3 notably comprises at least one image capture apparatus and / or at least one image restitution apparatus; the method comprises the step of producing formatted information linked to the P5 defects of at least one device in the chain.
- the apparatus making it possible to capture or restore an image comprises at least one fixed characteristic and / or one variable characteristic depending on the image (I).
- the fixed and / or variable characteristics are likely to be associated with one or more characteristic values, in particular the focal length and / or the focusing and their associated characteristic values.
- the method comprises the step of producing measured formatted information linked to the faults P5 of the apparatus from a measured field, the formatted information can include the measured formatted information.
- to produce the formatted information 15, one can for example use the method and the system described in the international patent application filed on the same day as the present application in the name of the company Vision IQ and under the title: "Method and system for providing, in a standard format, formatted information to image processing means." In this application, a method is described for supplying, in a standard format, formatted information 15 to image processing means, in particular software and / or components; the formatted information 15 is linked to the P5 defects of a chain of P3 devices.
- the chain of apparatuses P3 notably comprises at least one image capturing apparatus and / or an image restitution apparatus, the image processing means use the formatted information 15 to modify the quality of at least one image P2 from or intended for the P3 appliance chain.
- the formatted information 15 includes data characterizing faults P5 of the image capturing apparatus, in particular the distortion characteristics, and / or data characterizing faults of the image restitution apparatus, in particular the distortion characteristics.
- the method comprises the step of filling in at least one field of the standard format with the formatted information 15.
- the field is designated by a field name, the field containing at least one field value.
- to exploit the formatted information 15, it is possible, for example, to use the method and the system described in the international patent application filed on the same day as the present application in the name of the company Vision IQ and under the title: "Method and system for reducing the frequency of updates to image processing means."
- in this application, a method is described for reducing the frequency of updates to image processing means, in particular software and / or a component, the image processing means making it possible to modify the quality of digital images originating from or intended for a chain of apparatuses P3.
- the apparatus chain P3 comprises at least one image capture apparatus and / or at least one image restitution apparatus; the image processing means implement formatted information linked to the defects P5 of at least one apparatus in the apparatus chain P3.
- the formatted information 15 depends on at least one variable.
- the formatted information makes it possible to establish a correspondence between a part of the variables and identifiers.
- the identifiers make it possible to determine the value of the variable corresponding to the identifier, taking account of the identifier and the image. It results from the combination of technical features that it is possible to determine the value of a variable, in particular in the case where the physical meaning and / or the content of the variable are known only after the dissemination of the image processing means. It also results from the combination of technical features that the time between two updates of the correction software can be spaced out.
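The correspondence between variables and identifiers described above can be sketched as follows; this is a minimal illustration in which the metadata keys, function names and example values are assumptions, not part of the application:

```python
# Minimal sketch: the formatted information associates some variables with
# identifiers; given an identifier and the image (represented here by its
# metadata, an assumption), the processing means can recover the value of
# a variable without knowing its physical meaning in advance.

# variable name -> identifier recorded in the formatted information
correspondence = {"focal": "FocalLength", "expo": "ExposureTime"}

def variable_value(variable, image_metadata, correspondence):
    """Determine the value of a variable from its identifier and the image."""
    identifier = correspondence[variable]
    return image_metadata[identifier]

metadata = {"FocalLength": 35.0, "ExposureTime": 0.01}
```

Because the lookup goes through the identifier, the correction software does not need an update when a new variable appears; only the correspondence carried by the formatted information changes.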
- a method for calculating a transformed image from a digital image and formatted information 15 relating to defects P5 of a chain of apparatuses P3.
- the device chain P3 includes image capture devices and / or image rendering devices.
- the appliance chain P3 comprises at least one appliance.
- the method comprises the step of automatically determining characteristic data from the formatted information and / or from the digital image. It results from the combination of technical features that the transformed image does not present any visible or annoying defect, in particular defects linked to noise, for its subsequent use.
- the following example illustrates one way to produce formatted information.
- in FIG. 1 there is shown: a scene 3 comprising an object 107, a sensor 101 and the surface of the sensor 110, an optical center 111, an observation point 105 on the surface of the sensor 110, an observation direction 106 passing through the observation point 105, the optical center 111 and the scene 3, and a surface 10 geometrically associated with the surface of the sensor 110.
- FIG. 2 shows an image 103, an image rendering means 19 and a restored image 191 obtained on the rendering support 190.
- FIG. 3 shows a scene 3, an image capture device 1 and an image 103 made up of pixels 104.
- in FIGS. 4a and 4b, two variants of a reference scene 9 are shown.
- in FIG. 5 there is shown a flowchart implementing a scene 3, a mathematical projection 8 giving a mathematical image 70 of the scene 3, a real projection 72 giving an image 103 of the scene 3 for the characteristics used 74, and a configurable transformation model 12 giving a corrected image 71 of the image 103, the corrected image 71 exhibiting a difference 73 with the mathematical image 70.
- in FIG. 6 is represented a flowchart implementing an image 103, a real restitution projection 90 giving a restored image 191 of the image 103 for the restitution characteristics used 95, a configurable restitution transformation model 97 giving a corrected restitution image 94 of the image 103, and a mathematical restitution projection 96 giving a mathematical restitution image 92 of the corrected restitution image 94 and exhibiting a restitution difference 93 with the restored image 191.
- in FIG. 7 is shown a system comprising an image capture apparatus 1 consisting of an optic 100, a sensor 101 and electronics 102.
- a memory area 16 is also shown containing an image 103, together with a database 22 containing formatted information 15 and means 18 for transmitting the completed image 120, consisting of the image 103 and the formatted information 15, to calculation means 17 containing image processing software 4.
- FIG. 8 shows formatted information 15 consisting of fields 90.
- FIGS. 9a to 9d represent a mathematical image 70, an image 103, the mathematical position 40 of a point and the mathematical form 41 of a point, compared to the real position 50 and the real form 51 of the corresponding point of the image.
- FIG. 10 shows a grid 80 of characteristic points.
- FIG. 11 shows a flow chart implementing an image 103, the characteristics used 74 and a database of characteristics 22.
- the formatted information 15 is obtained from the characteristics used 74 and stored in the database 22.
- the completed image 120 is obtained from image 103 and formatted information 15.
- FIG. 12 shows a flow diagram implementing a reference scene 9, a mathematical projection 8 giving a synthetic image class 7 of the reference scene 9, a real projection 72 giving a reference image 11 of the reference scene 9 for the characteristics used 74.
- This flowchart also implements a configurable transformation model 12 giving a transformed image 13 of the reference image 11.
- the transformed image 13 exhibits a difference 14 with the class of synthetic images 7.
- Scene 3 is a place in three-dimensional space, which includes objects 107 lit by light sources.
- Image capture apparatus, Image, Image capture. We will now describe, with reference to FIGS. 3 and 7, what is meant by image capture apparatus 1 and image 103.
- An image capture apparatus 1 is an apparatus consisting of an optic 100, one or more sensors 101, electronics 102 and a memory area 16.
- Said image capture apparatus 1 makes it possible, from a scene 3, to obtain fixed or animated digital images 103 recorded in memory area 16 or transmitted to an external device.
- Animated images consist of a succession over time of still images 103.
- Said image capture apparatus 1 can take the form in particular of a camera, a camcorder, a camera connected or integrated to a PC, a camera connected or integrated to a personal assistant, a camera connected or integrated to a telephone, a videoconferencing device, or a camera or measuring device sensitive to wavelengths other than those of visible light, such as for example a thermal camera.
- the method consisting in calculating the image 103 by the image capture apparatus 1 is called image capture.
- in the case where the apparatus comprises several interchangeable sub-assemblies, a particular configuration of the apparatus is also called an image capture apparatus 1.
- Image restitution means, Restored image, Image restitution
- image restitution means 19 can take the form in particular of a display screen, a television, flat screen, projector, virtual reality glasses, printer.
- Such an image restitution means 19 comprises: electronics, one or more sources of light, electrons or ink,
- one or more modulators: devices for modulating light, electrons or ink; a focusing device, in particular in the form of optics in the case of a light projector, or in the form of electron beam focusing coils in the case of a cathode ray tube screen, or in the form of filters in the case of a flat screen,
- a restitution medium 190 in particular in the form of a screen in the case of a cathode-ray tube screen, a flat screen or a projector, in the form of a printing medium on which printing is performed in the case of a printer, or in the form of a virtual surface in space in the case of a virtual image projector.
- Said image restitution means 19 makes it possible, from an image 103, to obtain a restored image 191 on the restitution support 190.
- Animated images consist of a succession over time of still images.
- the method consisting in displaying or printing the image by the image rendering means 19 is called image restitution.
- in the case where a restitution means 19 comprises several interchangeable sub-assemblies, or sub-assemblies that can move relative to one another, in particular the restitution support 190, a particular configuration is also called an image restitution means 19.
- the surface of the sensor 110 is defined as the shape in space drawn by the sensitive surface of the sensor 101 of the image capture apparatus 1 at the time of image capture. This surface is generally flat.
- An optical center 111 is a point in the space associated with image 103 at the time of image capture. The distance between this point 111 and the plane 110 is called the focal distance, in the case where the surface of the sensor 110 is plane.
- Pixel 104 is called an elementary area of the surface of sensor 110 obtained by creating a generally regular tiling of said surface of sensor 110. Pixel value is called a number associated with this pixel 104.
- An image capture consists in determining the value of each pixel 104. The set of these values constitutes the image 103.
- the pixel value is obtained by the integration on the surface of pixel 104, for a period of time called exposure time, of part of the light flux coming from scene 3 through optics 100 and by converting the result of this integration into a numerical value.
- the integration of the luminous flux and / or the conversion of the result of this integration into a numerical value are carried out by means of the electronics 102.
- the part of the light flux concerned is obtained in various ways: a) In the case of a color image 103, the surface of the sensor 110 generally comprises several types of pixels 104, respectively associated with light fluxes of different wavelengths, such as for example red, green and blue pixels. b) In the case of a color image 103, there may also be several juxtaposed sensors 101 which each receive part of the light flux. c) In the case of a color image 103, the colors used can be different from red, green and blue, as for example for American NTSC television, and can be in number greater than three. d) Finally, in the case of a so-called interlaced scanning television camera, the animated images produced consist of alternating images 103 comprising the even lines, and images 103 comprising the odd lines.
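The pixel-value formation just described (integration of part of the light flux over the exposure time, then conversion to a numerical value) can be sketched as follows; the mean-flux model, the function names and the 8-bit output range are illustrative assumptions:

```python
# Sketch of pixel-value formation: integrate the light flux reaching the
# pixel surface over the exposure time, then convert the result into a
# numerical value (here clipped to 8 bits, an assumption).

def pixel_value(flux_samples, exposure_time, gain=255.0):
    # Mean flux over the pixel surface, integrated over the exposure time.
    integrated = sum(flux_samples) / len(flux_samples) * exposure_time
    # Conversion of the integration result into a numerical value.
    return min(255, max(0, round(gain * integrated)))

# An image capture consists in determining the value of each pixel;
# the set of these values constitutes the image.
image = [[pixel_value([0.4, 0.5], exposure_time=1.0) for _ in range(4)]
         for _ in range(3)]
```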
- the configuration used is the list of removable sub-assemblies of the image capture apparatus 1, for example the optic 100 actually mounted on the image capture apparatus 1 if it is interchangeable.
- the configuration used is characterized in particular by:
- These adjustments can be made by the user, in particular using buttons, or calculated by the image capture device 1.
- These settings can be stored in the apparatus, in particular on a removable medium, or on any device connected to the apparatus. These settings may include, but are not limited to, focus, iris and focal length settings for the optics 100, exposure time settings, white balance settings, and built-in image processing settings such as digital zoom, compression and contrast.
- the characteristics used 74 or the set of characteristics used 74 are called: a) Parameters linked to the intrinsic technical characteristics of the image capture device 1, determined at the time of the design of the image capture device 1.
- these parameters can include the formula of the optics 100 of the configuration used impacting the geometric defects and the sharpness of the captured images; the optics 100 formula of the configuration used notably includes the shape, arrangement and material of the optics 100 lenses.
- These parameters may further include:
- the geometry of the sensor 101 namely the surface of the sensor 110 as well as the shape and the relative arrangement of the pixels 104 on this surface
- observation point 105 and observation direction 106.
- Mathematical surface 10 is the name given to a surface geometrically associated with the surface of the sensor 110. For example, if the surface of the sensor is planar, the mathematical surface 10 may coincide with it.
- the direction of observation 106 is called a straight line passing through at least one point of the scene 3 and through the optical center 111.
- the point of observation 105 is called the intersection of the direction of observation 106 and the surface 10.
- Observed color and observed intensity. The observed color is the color of the light emitted, transmitted or reflected by said scene 3 in said observation direction 106 at a given instant, observed from said observation point 105.
- The observed intensity is the intensity of the light emitted by said scene 3 in said observation direction 106 at the same instant, observed from said observation point 105.
- the color can be characterized in particular by a light intensity as a function of a wavelength, or also by two values as measured by a colorimeter.
- the intensity can be characterized by a value as measured with a photometer.
- Said observed color and said observed intensity depend in particular on the relative position of the objects 107 in scene 3 and on the lighting sources present as well as on the transparency and reflection characteristics of the objects 107 at the time of the observation.
- Mathematical color of a point Mathematical intensity of a point, Mathematical form of a point, Mathematical position of a point.
- a determined mathematical projection 8 associates, with a scene 3 at the time of an image capture, a mathematical image 70.
- a determined mathematical projection 8 is a transformation which makes it possible to determine the characteristics of each point of the mathematical image 70 from scene 3 at the time of the image capture and of the characteristics used 74.
- the mathematical projection 8 is defined in the manner which will be described below.
- the position of the observation point 105 on the mathematical surface 10 is called the mathematical position 40 of the point.
- the mathematical form 41 of the point is a punctual geometric shape associated with the observation point 105.
- the observed color is called the mathematical color of the point.
- Mathematical point is called the association of the mathematical position 40, the mathematical form 41, the mathematical color and the mathematical intensity for the observation point 105 considered.
- the mathematical image 70 consists of all of said mathematical points.
- the mathematical projection 8 of scene 3 is the mathematical image 70.
- the image capture apparatus 1 associated with the characteristics used 74 produces an image 103 of the scene 3.
- the light coming from the scene 3 in an observation direction 106 passes through the optics 100 and arrives on the surface of the sensor 110.
- the actual shape 51 associated with said direction of observation 106 is not a point on the surface of the sensor, but has a shape of a cloud in three-dimensional space, which has an intersection with one or more pixels 104.
- These differences originate in particular from coma, spherical aberration, astigmatism, grouping into pixels 104, chromatic aberration, depth of field, diffraction, stray reflections and the field curvature of the image capture apparatus 1. They give an impression of blur, of lack of sharpness, in the image 103.
- the actual position 50 associated with said observation direction 106 has a difference with respect to the mathematical position 40 of a point.
- This difference is due in particular to the geometric distortion, which gives an impression of distortion: for example, the vertical walls appear curved. It is also due to the fact that the number of pixels 104 is limited and that consequently the real position 50 can only take a finite number of values.
- the actual intensity associated with said observation direction 106 differs from the mathematical intensity of a point. These differences are due in particular to gamma and vignetting: for example, the edges of image 103 appear darker. In addition noise may be added to the signal.
- the real color associated with said observation direction 106 exhibits differences compared to the mathematical color of a point. These differences are due in particular to gamma and the dominant color. In addition, noise may be added to the signal.
- the real point is the association of the real position 50, the real shape 51, the real color and the real intensity for the observation direction 106 considered.
- the real projection 72 of scene 3 is made up of all the real points.
- a parameterizable transformation model 12 (or in a condensed fashion, parameterizable transformation 12) is called a mathematical transformation making it possible to obtain from an image 103, and from the value of parameters, a corrected image 71. Said parameters can in particular be calculated from the characteristics used 74 as indicated below.
- Said transformation makes it possible in particular to determine for each real point of the image 103, the corrected position of said real point, the corrected color of said real point, the corrected intensity of said real point, the corrected shape of said real point, from the value of the parameters, of the real position of said real point and of the values of the pixels of the image 103.
- the corrected position can for example be calculated using polynomials of degree fixed as a function of the real position, the coefficients of the polynomials depending on the value of the parameters.
- the corrected color and the corrected intensity can for example be weighted sums of the values of the pixels, the coefficients depending on the value of the parameters and the actual position, or else on non-linear functions of the values of the pixels of the image 103.
- the parameters can include in particular: the focal length of the optics 100 of the configuration used or a linked value such as the position of a group of lenses, the focusing of the optics 100 of the configuration used or a linked value such as the position of a group of lenses, the opening of the optics 100 of the configuration used or a linked value such as the position of the diaphragm.
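The parameterizable transformation 12 described above can be sketched for the corrected position; the degree-3 radial polynomial below is one possible choice of fixed-degree polynomial and is an assumption, since the text does not impose a particular formula:

```python
# Sketch of a parameterizable transformation (corrected position only):
# a polynomial of fixed degree in the real position, whose coefficients
# depend on the value of the parameters. A radial model is assumed here.

def corrected_position(x, y, params, center=(0.0, 0.0)):
    """params = (k0, k1, k2): coefficients of the radial polynomial."""
    cx, cy = center
    dx, dy = x - cx, y - cy
    r2 = dx * dx + dy * dy
    k0, k1, k2 = params
    scale = k0 + k1 * r2 + k2 * r2 * r2  # polynomial in the real position
    return cx + dx * scale, cy + dy * scale
```

With params = (1, 0, 0) the transformation is the identity; in practice the parameters would be calculated from the characteristics used 74, such as the focal length.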
- the difference 73 between the mathematical image 70 and the corrected image 71, for a given scene 3 and given characteristics used 74, is one or more values determined from the numbers characterizing the position, the color, the intensity and the shape of all or part of the corrected points and of all or part of the mathematical points.
- the difference 73 between the mathematical image 70 and the corrected image 71 for a given scene 3 and for the characteristics used 74 can be determined as follows:
- one chooses characteristic points, which can be for example the points of an orthogonal grid 80 of regularly arranged points as presented in FIG. 10.
- one calculates the difference 73, for example by performing, for each characteristic point, the sum of the absolute values of the differences between each number characterizing the position, the color, the intensity and the shape, respectively for the corrected point and for the mathematical point.
- the function sum of the absolute values of the differences can be replaced by another function such as the mean, the sum of the squares or any other function allowing the numbers to be combined.
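The computation of the difference 73 over characteristic points can be sketched as follows; representing each point as a flat tuple of numbers (position, color, intensity, shape) is an assumption made for illustration:

```python
# Sketch of the difference 73: for each characteristic point, sum the
# absolute values of the differences between the numbers characterizing
# the corrected point and the mathematical point, then combine the
# per-point results (sum by default; mean or sum of squares also work).

def mean(values):
    return sum(values) / len(values)

def difference(corrected_points, mathematical_points, combine=sum):
    per_point = [sum(abs(a - b) for a, b in zip(c, m))
                 for c, m in zip(corrected_points, mathematical_points)]
    return combine(per_point)

corrected = [(1.0, 0.5), (2.0, 0.5)]
mathematical = [(1.0, 0.4), (2.1, 0.5)]
```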
- FIG. 4a presents a reference scene 9 made up of a sheet of paper with circles filled with black and arranged regularly.
- Figure 4b shows another sheet of paper with the same circles to which are added lines and colored areas. The circles are used to measure the actual position 50 of a point, the lines the actual shape 51 of a point, the colored areas the actual color of a point and the actual intensity of a point.
- This reference scene 9 can be made of a material other than paper.
- A reference image 11 is an image of the reference scene 9 obtained with the image capture apparatus 1.
- a class of synthetic images 7 is a set of mathematical images 70 obtained by mathematical projection 8 of one or more reference scenes 9, this for one or more sets of characteristics used 74.
- in the case where a single reference scene 9 and a single set of characteristics used 74 are considered, the class of synthetic images 7 comprises only one synthetic image.
- The transformed image 13 is the corrected image obtained by applying a configurable transformation model 12 to a reference image 11.
- the configurable transformation 12 (and its parameters) is chosen so as to transform the reference image 11 into the transformed image 13 having the smallest difference with the class of synthetic images 7.
- the class of synthetic images 7 and the transformed image 13 are then said to be close.
- Deviation 14 is called said difference.
- the configurable transformation 12 (and its parameters) is chosen as a function of the differences between the transformed image 13 of each reference scene 9 and the synthetic image class 7 of each reference scene 9 considered.
- the sum function can be replaced by another function such as the product.
- the synthetic image class 7 and the transformed images 13 are then said to be close.
- Deviation 14 is called a value obtained from said differences, for example by calculating the average.
- the best transformation is the transformation which, among the configurable transformation models 12, makes it possible to transform each reference image 11 into a transformed image 13 close to the class of synthetic images 7 of the reference scene 9 corresponding to said reference image 11.
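The selection of the best transformation can be sketched as a search over candidate parameter values, keeping those that minimize the deviation 14; the grid search and the one-dimensional toy "images" are assumptions made to keep the example short:

```python
# Sketch of choosing the best transformation: among candidate parameters
# of a configurable transformation model, keep those giving the
# transformed image closest to the synthetic image (deviation 14).

def best_transformation(reference, synthetic, transform, candidates, distance):
    best_params, best_dev = None, float("inf")
    for params in candidates:
        transformed = [transform(v, params) for v in reference]
        dev = distance(transformed, synthetic)  # deviation 14
        if dev < best_dev:
            best_params, best_dev = params, dev
    return best_params, best_dev

# Toy example: a simple gain correction v -> g * v.
reference = [10.0, 20.0, 30.0]
synthetic = [11.0, 22.0, 33.0]

def l1(a, b):
    return sum(abs(x - y) for x, y in zip(a, b))

params, dev = best_transformation(reference, synthetic,
                                  lambda v, g: g * v,
                                  candidates=[0.9, 1.0, 1.1, 1.2],
                                  distance=l1)
```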
- Calibration. Calibration is a process which makes it possible to obtain data relating to the intrinsic characteristics of the image capture apparatus 1, for one or more configurations used, each consisting of an optic 100 associated with an image capture apparatus 1.
- Case 1: in the case where there is only one configuration, said method comprises the following steps:
- Case 3: in the case where all the configurations corresponding to a given optic 100 and to all the image capture apparatuses 1 of the same type are considered, said method comprises the following steps: - the step of mounting said optic 100 on an image capture apparatus 1 of the type in question,
- the calibration can be carried out, preferably, by the manufacturer of the image capture apparatus 1, for each apparatus and configuration in case 1. This method is more precise but more restrictive, and is well suited to the case where the optic 100 is not interchangeable.
- the calibration can alternatively be carried out by the manufacturer of the image capture device 1, for each type and configuration of device in case 2. This method is less precise but simpler.
- the calibration can alternatively be carried out by the manufacturer of the image capture device 1, for each lens 100 and type of device in case 3. This method is a compromise allowing the use of a lens 100 on all image capture devices 1 of a type without redoing the calibration for each combination of image capture device 1 and optics 100.
- the calibration can alternatively be carried out by the dealer or installer of the camera, for each image capture device 1 and configuration in case 1.
- the calibration can alternatively be carried out by the dealer or installer of the device, for each optic 100 and type of device in case 3.
- the calibration can alternatively be carried out by the user of the device, for each device and configuration in case 1.
- the calibration can alternatively be carried out by the user of the device, for each lens 100 and type of device in case 3.
- Said method comprises the following steps:
- Said method further comprises the iteration of the following steps:
- the step of calculating images 103 from the characteristics used 74 and in particular from the formulas of the optics 100 of the configuration used by implementing, for example, optical calculation software by ray tracing, or by performing measurements on a prototype
- Formatted information 15 associated with image 103 or formatted information 15 is called all or part of the following data: data relating to the intrinsic technical characteristics of the image capture apparatus 1, in particular the distortion characteristics, and / or data relating to the technical characteristics of the image capture device 1 at the time of image capture, in particular the exposure time, and / or - data relating to the preferences of said user, in particular the color temperature, and / or - data relating to deviations 14.
- A database of characteristics 22, or database 22, is a database comprising, for one or more image capture apparatuses 1 and for one or more images 103, formatted information 15.
- Said database of characteristics 22 can be stored centrally or distributed, and can in particular be:
- the formatted information 15 associated with image 103 can be recorded in several forms and structured in one or more tables but they logically correspond to all or part of the fields 90, including: (a) focal length, (b) depth of field,
- Said geometric faults include the geometry faults of the image 103, characterized by the parameters associated with the characteristics of the shooting 74 and a configurable transformation representing the characteristics of the image capture apparatus 1 at the time of the shooting.
- Said parameters and said configurable transformation make it possible to calculate the corrected position of a point of the image 103.
- Said geometric defects further include vignetting, characterized by the parameters associated with the characteristics of the shot 74 and a configurable transformation representing the characteristics of the image capture apparatus 1 at the time of shooting. Said parameters and said configurable transformation make it possible to calculate the corrected intensity of a point in the image 103.
- Said geometric defects further include the dominant color characterized by the parameters associated with the characteristics of the shot 74 and a configurable transformation representing the characteristics of the image capture device 1 at the time of the shot. Said parameters and said configurable transformation make it possible to calculate the corrected color of a point in image 103. Said fields 90 further include (d) the sharpness of image 103.
- Said sharpness includes the resolution blur of the image 103, characterized by the parameters associated with the characteristics of the shooting 74 and a configurable transformation representing the characteristics of the image capture apparatus 1 at the time of the shooting. Said parameters and said configurable transformation make it possible to calculate the corrected shape of a point of the image 103.
- the blur covers in particular coma, spherical aberration, astigmatism, grouping in pixels 104, chromatic aberration, depth of field, diffraction, stray reflections, field curvature.
- Said sharpness further includes the blurring of depth of field, in particular spherical aberrations, coma, astigmatism.
- Said blur depends on the distance of the points of the scene 3 relative to the image capture apparatus 1 and is characterized by the parameters associated with the characteristics of the shot 74 and a configurable transformation representing the characteristics of the image capture apparatus 1 at the time of shooting. Said parameters and said configurable transformation make it possible to calculate the corrected shape of a point of the image 103.
- Said fields 90 further comprise (e) parameters of the quantification method. Said parameters depend on the geometry and the physics of the sensor 101, the architecture of the electronics 102 and of possible processing software.
- Said parameters include a function representing the variations in the intensity of a pixel 104 as a function of the wavelength and of the light flux coming from said scene 3.
- Said function notably comprises the gamma information.
- Said parameters also include:
- Said fields 90 further comprise (f) parameters of the digital processing operations carried out by the image capture apparatus 1, in particular the digital zoom, the compression. These parameters depend on the processing software of the image capturing apparatus 1 and the settings of the user.
- Said fields 90 further comprise: (g) parameters representing the preferences of the user, in particular as regards the degree of blurring, the resolution of the image 103. (h) the deviations 14.
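The fields 90 enumerated in points (a) to (h) can be pictured as a structured record; the field names and example values below are illustrative assumptions, the text only requiring that each field have a name and at least one value:

```python
# Sketch of formatted information 15 structured as named fields 90.
# All names and values are illustrative assumptions.

formatted_information = {
    "focal_length": 50.0,                      # (a)
    "depth_of_field": 2.5,                     # (b)
    "geometric_faults": {                      # (c) geometry, vignetting, dominant color
        "distortion_coefficients": [1.0, -0.05, 0.001],
    },
    "sharpness": {                             # (d) resolution and depth-of-field blur
        "blur_coefficients": [0.2, 0.01],
    },
    "quantification": {"gamma": 2.2},          # (e)
    "digital_processing": {"digital_zoom": 1.0, "compression": "jpeg"},  # (f)
    "user_preferences": {"sharpness_level": "normal"},                   # (g)
    "deviations": [0.013],                     # (h) deviations 14
}

def field_value(info, name):
    """Each field is designated by a field name and contains at least one value."""
    return info[name]
```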
- the formatted information 15 can be calculated and saved in the database 22 in several steps. a) A step at the end of the design of the image capture device 1. This step makes it possible to obtain intrinsic technical characteristics of the image capture device 1, and in particular:
- This step makes it possible to obtain other intrinsic technical characteristics of the image capture apparatus 1, and in particular, for a certain number of characteristic values used, the best associated transformation and the associated deviation 14.
- c) A step of choosing the user's preferences using buttons, menus or removable media or connection to another device.
- Step (d) also makes it possible to obtain the focal distance.
- the focal length is calculated from:
- the focal length can finally be determined by analyzing the content of the image 103.
- Step (d) also makes it possible to obtain the depth of field.
- the depth of field is calculated from: a measurement of the position of the group of focusing lenses of the optics 100 of the configuration used, or
- Step (d) also makes it possible to obtain the geometry and sharpness faults.
- the geometry and sharpness faults correspond to a transformation calculated using a combination of the transformations of the characteristics database 22 obtained at the end of step (b). This combination is chosen to represent the parameter values corresponding to the characteristics used 74, in particular the focal distance.
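The combination of the calibrated transformations mentioned above can be sketched as an interpolation over the characteristic values stored in the database 22; linear interpolation over the focal distance is an assumption, the text only speaking of a "combination":

```python
# Sketch: the database 22 holds transformation parameters for a few
# calibrated focal distances; the parameters for the focal distance
# actually used are obtained by combining them (linear interpolation
# here, which is an assumption).

def combine_parameters(calibrated, focal):
    """calibrated: list of (focal_distance, parameters), sorted by focal distance."""
    for (f0, p0), (f1, p1) in zip(calibrated, calibrated[1:]):
        if f0 <= focal <= f1:
            t = (focal - f0) / (f1 - f0)
            return [a + t * (b - a) for a, b in zip(p0, p1)]
    raise ValueError("focal distance outside the calibrated range")

calibrated = [(28.0, [0.10, 0.01]), (50.0, [0.06, 0.02]), (85.0, [0.02, 0.04])]
params = combine_parameters(calibrated, 39.0)  # halfway between 28 mm and 50 mm
```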
- Step (d) also makes it possible to obtain the digital processing parameters carried out by the image capture apparatus 1. These parameters are determined by the manual or automatic adjustments carried out.
- the calculation of the formatted information 15 according to steps (a) to (d) can be carried out by:
- the data obtained in step (b) and in step (d) can be stored in the form: - of a general mathematical formula
- the mathematical formulas can be described by: - a list of coefficients, - a list of coefficients and coordinates.
- identifiers associated with the data are recorded in the database 22. These identifiers include in particular:
- The completed image 120 is the image 103 associated with the formatted information 15.
- This completed image 120 can take the form, preferably, of a file.
- the completed image 120 can also be distributed in several files.
- the completed image 120 can be calculated by the image capturing apparatus 1. It can also be calculated by an external computing device, for example a computer.
- Image processing software 4 is software that takes one or more completed images 120 as input and performs processing operations on these images. These processing operations may include, in particular:
- Said image processing software can be:
- digital optics is used to mean the combination of an image capture device 1, a database of characteristics 22 and a calculation means 17 allowing:
- the user directly obtains the corrected image 71. If desired, the user can request the deletion of the automatic correction.
- the characteristics database 22 can be: - integrated in the image capture device 1,
- the calculation means 17 can be:
- the reproduction characteristics used 95 denote the intrinsic characteristics of the image reproduction means 19, the characteristics of the image reproduction means 19 at the time of image reproduction, and the preferences of the user at the time of restitution of images. Particularly in the case of a projector, the rendering features used 95 include the shape and position of the screen used.
- the corrected restitution image 94 denotes the image obtained by applying the configurable restitution transformation 97 to the image 103.
- the mathematical restitution projection 96 denotes a projection which associates, with a corrected restitution image 94, a mathematical restitution image 92 on the mathematical restitution surface geometrically associated with the surface of the restitution support 190.
- the mathematical restitution points of the mathematical restitution surface have a shape, position, color and intensity calculated from the corrected restitution image 94.
- the actual restitution projection 90 designates a projection associating with an image 103 a restored image 191.
- the pixel values of the image 103 are converted by the electronics of the restitution means 19 into a signal which drives the modulator of the restitution means 19.
- Real restitution points are obtained on the restitution support 190. Said real restitution points have a shape, color, intensity and position.
- the phenomenon of grouping into pixels 104 previously described in the case of an image capture apparatus 1 does not occur in the case of an image restitution means. Instead, an opposite phenomenon occurs, which notably causes straight lines to appear as stair steps.
- the difference in restitution 93 denotes the difference between the restored image 191 and the mathematical restitution image 92. This difference in restitution 93 is obtained mutatis mutandis as the difference 73.
- the restitution reference denotes an image 103 whose pixel 104 values are known.
- the best restitution transformation for a restitution reference and for the restitution characteristics used 95 denotes the transformation which makes it possible to transform the image 103 into a corrected restitution image 94 such that its mathematical restitution projection 92 exhibits the smallest restitution difference 93 with the restored image 191.
- the restitution calibration and restitution digital optical design methods are comparable to the calibration and digital optical design methods in the case of an image capture apparatus 1. However, certain steps differ, in particular the following: - the step of choosing a restitution reference; - the step of restoring said restitution reference;
- the formatted information 15 linked to an image capturing apparatus 1 and that linked to an image restitution means 19 can be put end to end for the same image.
- the system according to the invention comprises computer processing means P76 for carrying out, for the image P2, the following steps of the method according to the invention: the step of listing the sources P50 of formatted information 15 relating to the devices of the set of devices P75; these sources P50 can in particular be, where appropriate, the image file P58 containing the image P2, the devices themselves, a local and/or remote database 22, or means P53 for loading the image P2 or the modified image, for example software conforming to the Twain standard used for a scanner; the sources thus listed are called the database 22,
- the method and system according to the invention will now be described as applied to the reproduction of an image from an image capture apparatus, whether it be a photographic camera, a video camera, an ultrasound machine, etc.
- the reproduction of an image from such a device may involve the characteristics of the camera (notably optical), of the sensor or photosensitive surface (CCD, photosensitive film, etc.), of the scanner, of the processing software, of the information transmission chain between the different devices, and of the printer.
- the image chain may involve other elements having their particular characteristics.
- Each device is characterized by an identifier 60 which makes it possible to identify the type of device and therefore to access known characteristics linked to this type of device and indirectly to obtain indexes P52 whose use will be described later.
- variable characteristics P55 which can be implemented in the context of the invention.
- variable characteristics P55 may have an influence on the fixed characteristics (or original characteristics) of the apparatus or of the chain of apparatuses P3.
- FIG. 13a represents a diagram of an exemplary embodiment of the system of the invention.
- variable characteristics 66: the variable characteristics can also be contained in the image itself or be calculated from the image.
- an image capture device APP1 which has its own characteristics and which may contain variable characteristics 66. The specific characteristics are linked to the type of device, or to each individual device, and can be known from the device and its origin.
- identifier 60 can be, for example, a bar code on the device or on a film, and to which the system will match these specific characteristics.
- the identifier 60 can be obtained in various ways from the image, from the data 62, and / or by interrogating the management software of the device, or the device itself, or the user, which are symbolized by LOG / MAT / IMPR in Figure 13a.
- variable characteristics 66 are generally linked to the image capture conditions; they can be those mentioned previously and can be contained in the image, in the image capture apparatus, or in both. Devices APP2 to APPn include in particular a device APP2 such as a printer or a scanner having characteristics linked to the type of device, in particular reflecting its faults, as well as variable characteristics 66 linked to the mode of use, for example the magnification ratio for a printer.
- a device such as APPn can also be a pseudo-device and be in the form of a file representing devices or functionalities and containing the characteristics corresponding to these apparatuses or these functionalities: -image capture device,
- variable characteristics 66, for example a recoding or a zoom,
- among the variable characteristics 66 that should be taken into account, let us cite for example:
- non-linear luminance correction, for example gamma correction,
- contour enhancement, for example the enhancement level applied by the device,
- another setting applied by the user of the device, for example an operating mode.
- the system has reception interfaces C.VAR1, C.VAR2, ... C.VARn intended to receive the variable characteristics 66 explained above.
- the characteristics coming from the image can be transmitted by the image itself or, as mentioned previously, by data associated with the image. It will be recalled that the variable characteristics 66 of the image can also be transmitted by the capture device.
- Interfaces ID1, ID2, ... IDn are intended to receive the identifiers 60 of the various peripherals APP1 to APPn.
- a peripheral can, as the case may be, correspond to one or more devices of the same type or of different types.
- the following examples correspond to several of these cases, with for each example, a possible implementation of the identifier 60 in the form of coding and of content:
- a given peripheral: for example, coding IA1: name of the manufacturer, type of peripheral, serial number of the peripheral,
- coding IA2: name of the manufacturer, type of device,
- coding IA3: name of the manufacturer, type of peripheral, type of interchangeable lens mounted,
- a device category: for example, coding IA4 suitable for disposable cameras: name of the manufacturer, type of camera, view number,
- a manufacturer: for example, coding IA5 for a manufacturer,
- coding IA8: name of the person or number, country,
- coding IA9: name of the manufacturer, type of device,
- a given device version: for example, coding IA10: name of the manufacturer, type of device, software version of the device,
- a protocol: for example, coding IA11: data from the Twain protocol,
- a generic peripheral: for example, coding IA12: list of data sources, field identifiers, values of fields.
- the system can then analyze the peripherals or apparatuses of the apparatus chain P3 and determine the identifiers 60 in various ways, depending on the case, in order to be able to interrogate the database.
- the database contains, for each type of device, at least one item of formatted information representing the faults and characteristics of this device.
- the formatted information 15 can be linked to the faults P5 of the apparatus in various ways: it can represent the device faults, the inverse of the device faults, only an approximation of the faults, or the difference in faults between two devices.
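One way to carry these alternative representations in a single record can be sketched as follows. This is purely illustrative: the class and field names are assumptions, not the patent's actual data format.

```python
from dataclasses import dataclass

# Illustrative sketch of formatted information: the "representation" field
# records which of the variants listed above the parameters describe
# (the fault itself, its inverse, an approximation, or a difference
# between two devices). All names are assumptions.
@dataclass
class FormattedInformation:
    device_id: str        # identifier 60, e.g. a coding such as "IA1"
    defect: str           # e.g. "distortion", "vignetting", "noise"
    representation: str   # "fault", "inverse", "approximation", "difference"
    parameters: list      # model coefficients

    def describes_correction(self) -> bool:
        # An "inverse" representation can be applied directly to correct.
        return self.representation == "inverse"
```

A processing operator could then branch on `representation` to decide whether the parameters must first be inverted before being applied to the image.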
- Each identifier 60 supplied by an interface such as the interface ID1 makes it possible to obtain formatted information such as 15.1, which is temporarily received in a circuit 20.1. Circuits 20.1, 20.2, ... 20.n make it possible to receive the formatted information relating to the apparatuses APP1, APP2, ... APPn.
- the database can be integrated into the system or be at least partially remote.
- FIG. 16.1 shows an apparatus having characteristics showing faults 704 which give rise to formatted information 15 as we have seen previously.
- Variable characteristics 66 representing for example a variable focal length 705 also give rise to formatted information (see FIG. 16.2).
- Image processing operators 23.1, 23.2, ... 23.n are each intended to receive modified formatted information.
- the first operator 23.1 receives an image to be processed, processes it using the modified formatted information 15.1' and provides a modified image. This is received by the following operator 23.2, which processes it using the modified formatted information 15.2' and provides a new modified image, and so on until the last operator 23.n, which provides a final modified image.
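The sequential chain of operators just described can be sketched minimally as follows. The "images" are plain lists of pixel values and each operator is a simple gain correction; both are illustrative stand-ins, not the patent's processing model.

```python
# Minimal sketch of the operator chain: each operator applies its modified
# formatted information to the image it receives and hands the result to
# the next operator (23.1 -> 23.2 -> ... -> 23.n). Purely illustrative.
def make_operator(gain):
    """Build an operator whose 'formatted information' is a single gain."""
    def operator(image):
        return [p * gain for p in image]
    return operator

def process_chain(image, operators):
    """Run the image through each operator in order; the output of one
    operator is the input of the next."""
    for op in operators:
        image = op(image)
    return image
```

An operator that received no modified formatted information would simply be the identity function, matching the pass-through behavior described below.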
- If an image processing operator does not receive modified formatted information, the image received by this operator is not modified and is retransmitted as such to the next operator or to the output; alternatively, default formatted information can be used.
- a central control unit 25 makes it possible to manage the entire operation of the system and in particular the exchange of information and data between the various elements of the system.
- the central control unit 25 is responsible for automatically searching, in the database 22, for the formatted information whose addresses are given by the interfaces ID1, ID2, ... IDn.
- the central unit 25 manages the search for this information and triggers the operation of the operators 21.1 to 21.n for processing the formatted information, then the image processing operators 23.1 to 23.n. The operators may be located on different remote systems linked together.
- FIG. 13b represents a variant of the system according to the invention.
- the modified formatted information is combined into a single formatted information and modifies the quality of the image to be processed.
- an operator 23.t replaces the operators 23.1 to 23.n.
- This operator receives the various items of modified formatted information, combines them, and modifies the quality of the image to be processed in order to provide a modified image.
- it is planned, as indicated in FIG. 13c, to combine the variable characteristics 66 of an apparatus and its identifier 60 in order to access modified formatted information directly in the database 22.
- variable characteristics 66 provided by C.VAR1 are combined with an identifier ID1 to form modified formatted information 15.1 transmitted to 22.1. It is obvious that in FIG. 13c this provision has been shown only for the modified formatted information 15.1', but it could be applied to all or part of the other formatted information.
- the system supplies a modified image at the output of operator 23.n for FIG. 13a and of operator 23.t for FIGS. 13b and 13c.
- FIG. 16.5 illustrates the case where a modified image 61c is associated with associated information 62c which can be:
- correction data, i.e. the modified formatted information or its equivalents, or simply an indicator, thus indicating that the image has been corrected or modified,
- data 62 or associated information P63 of the original image, possibly modified or updated to reflect the processing applied to the image, for example data in Exif or PIM format, - or both types of data.
- FIGS. 13a to 13c can be applied to all of the faults or to each defect.
- With reference to FIG. 14a, we will now describe a simplified embodiment of the method of the invention.
- This exemplary embodiment is applied to an image capture apparatus. It is assumed that the method only has to modify the faults due to a single device, for example the image capture apparatus and the settings of this apparatus.
- the image is made available for processing in digitized form by a digitizing device 400.1, a digital capture device 400.2 (digital camera, scanner, or other), or a compact disc 400.3, for example.
- a digitized image is available.
- the characteristics of the image capture device are known, or even the type of this device by any means of identification, for example a bar code.
- In step 402 of the method, the identifier 60 of the device is entered or calculated.
- the identifier 60 makes it possible to access, using indexes P52 for example, characteristics of the image capture apparatus in the database 22. Indeed, as mentioned previously, there is a database in which, in principle, the characteristics of each known device have been recorded. In the context of the invention, these characteristics represent the defects to be modified.
- the database 22 is therefore called.
- the call to the database can also take into account certain variable characteristics 66 obtained during step 405, in order to directly obtain formatted information relevant to the values of variable characteristics 66 thus obtained.
- In step 404, at an address obtained from the identifier 60, formatted information 15 representing the characteristics (faults) of the corresponding device is read from the database 22.
- the system may have variable characteristics 66 (the shooting conditions in particular).
- In step 405, these characteristics are made available. Then (step 406), the formatted information is modified accordingly.
- the variable characteristics 66, and in particular their values for the image to be processed, having been determined, serve to determine the fraction of the formatted information which takes account of these variable characteristics 66.
- In step 407, this modified formatted information is applied to the image in order to process it and correct or modify it. This processing is done using operators assisted by image processing software.
- In step 408, a modified image is thus obtained. Obviously, the above method can operate using only the characteristics inherent in the device, without using the variable characteristics 66; in this case the formatted information read from the database is used directly to process the image.
- the method provides for the capture by the system of the digitized image, identifiers 60 of the devices and variable characteristics 66.
- the identifier 60 of a device is taken into account and makes it possible to address the database 22 (step 502) to obtain one or more formatted information corresponding to this identifier 60.
- variable characteristics 66 linked to this device are sought (step 504).
- In step 505, the formatted information, or certain items of it, is modified according to the variable characteristics 66. As in the method described in relation to FIG. 14b, after the variable characteristics 66 have been determined, they make it possible to determine, among the formatted information, the items that are useful and that take the variable characteristics 66 into account. The formatted information thus determined is kept in memory.
- In step 506, a test determines whether another device must be taken into account for the modification of the quality of the image.
- the image is then processed during step 507 by the formatted information relating to the first device and gives rise to a first processed image.
- the system then takes into account the formatted information relating to the next device and processes the previously processed image, until it has applied all the formatted information, that is to say, in principle, when all the information relating to the different devices in the chain has been taken into account.
- As a variant of step 507, it is provided that, when all the formatted information of all the devices has been obtained, that is to say at the end of step 506, the different items of formatted information are combined during a step 510.
- the image is then processed in a single pass.
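This single-pass variant can be sketched as follows. As before, each device's formatted information is reduced to a single gain for illustration; the point is only that the per-device corrections are combined first and the image is then touched once.

```python
# Illustrative sketch of the single-pass variant: combine the per-device
# corrections (gains stand in for formatted information), then apply the
# combined correction to the image in one step.
def combine_gains(gains):
    """Fold the per-device gains into one combined gain."""
    combined = 1.0
    for g in gains:
        combined *= g
    return combined

def process_once(image, gains):
    """Apply the combined correction to every pixel in a single pass."""
    g = combine_gains(gains)
    return [p * g for p in image]
```

Compared with the sequential chain, this touches each pixel once instead of once per device, which is the source of the speed-up discussed later for the virtual apparatus.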
- processing an image may involve a plurality of devices, which may include the image capture device, a scanner, a printer, transmission systems, etc. Each device is capable of inducing faults in the processing chain.
- pseudo-peripherals may also be provided, whose role is to modify the image according to a style or to apply defects corresponding to these pseudo-peripherals.
- a variant of the method of the invention consists in considering that all the apparatuses of a chain of apparatuses P3 necessary for processing an image constitute a single apparatus, which will be called the virtual apparatus 708, and whose defects correspond to the equivalent of all or part of the faults of the various devices in the chain.
- devices such as an image capture device 706 (FIG. 16.4) and a printer 707 can be represented by a virtual device 708 to which correspond virtual formatted information 709.
- formatted information can be a mathematical expression of physical characteristics.
- the formatted information of a virtual device 708 corresponding to two devices can be the sum of two vectors corresponding to the characteristics of these two devices, and/or the convolution of two mathematical functions.
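Both combination rules just mentioned can be sketched in a few lines. This is an illustrative sketch only: the vectors stand in for device characteristics and the sampled sequences for mathematical functions such as blur kernels.

```python
# Illustrative sketch of forming virtual formatted information for two
# devices: the vector sum of their characteristics, and the discrete
# convolution of two sampled functions (e.g. two blur kernels whose
# combined effect is their convolution).
def vector_sum(a, b):
    """Element-wise sum of two characteristic vectors of equal length."""
    return [x + y for x, y in zip(a, b)]

def convolve(f, g):
    """Full discrete convolution of two sampled functions."""
    out = [0.0] * (len(f) + len(g) - 1)
    for i, fi in enumerate(f):
        for j, gj in enumerate(g):
            out[i + j] += fi * gj
    return out
```

The virtual formatted information 709 computed this way can then be stored once and reused, instead of recombining the per-device information for every image.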
- a virtual apparatus 708 is therefore determined.
- the virtual formatted information 709 corresponding to this virtual apparatus is determined, and the virtual formatted information obtained is recorded; alternatively, this virtual formatted information is substituted for the formatted information relating to the original faults.
- the virtual formatted information is accessible in the database directly using an identifier 60 corresponding to the device chain P3 represented by the virtual apparatus. Execution of the process can then be simpler and faster.
- An example of organization of the method making it possible to obtain virtual formatted information can be implemented according to the flow diagram of FIG. 15. The characteristics of two devices are taken into account (steps 510 and 511). In step 512, these characteristics are combined. In step 513, the corresponding virtual formatted information is calculated.
- In step 514, it is checked whether another device must enter the virtual device chain. If so, the process starts again; if not, the process is finished.
- the P3 device chain includes a scanner, a camera and a printer.
- the equivalent virtual device has the faults of the three devices and the time to modify the image quality can be divided substantially by three.
- the corresponding formatted information is determined according to the method described above.
- formatted information is deduced by interpolation, as a function of the focal length and of the aperture, so that the database contains the formatted information necessary in step 404.
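Such interpolation over two variable characteristics can be sketched with bilinear interpolation between measured combinations. The dictionary of measured values and the grid corners are hypothetical; the point is only how an in-between (focal length, aperture) setting is served from a small set of measurements.

```python
# Illustrative sketch: deduce a defect parameter for an arbitrary
# (focal, aperture) setting by bilinear interpolation between four
# measured corner combinations. Data and names are assumptions.
def interpolate_2d(measured, focal, aperture, f0, f1, a0, a1):
    """measured maps (focal, aperture) corner combinations to a scalar
    defect parameter; interpolate inside the rectangle [f0,f1] x [a0,a1]."""
    tf = (focal - f0) / (f1 - f0)       # normalized position in focal range
    ta = (aperture - a0) / (a1 - a0)    # normalized position in aperture range
    v00, v10 = measured[(f0, a0)], measured[(f1, a0)]
    v01, v11 = measured[(f0, a1)], measured[(f1, a1)]
    return (v00 * (1 - tf) * (1 - ta) + v10 * tf * (1 - ta)
            + v01 * (1 - tf) * ta + v11 * tf * ta)
```

Only the corner combinations need to be measured and stored in the database; every other setting is computed on demand.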
- the image to be processed is an image coming from an image capturing apparatus which it was desired to view or print.
- the invention is also applicable to any image processing chain and therefore also to a chain making it possible to project an image.
- the characteristics of the various devices of the image restitution chain must be obtained. These characteristics make it possible to obtain the formatted information for applying the method of the invention.
- the characteristics provided to obtain formatted information could be characteristics intended to correct vision defects (astigmatism for example) of an observer 701, or to induce special effects.
- the resulting formatted information 702 makes it possible to modify the visual, graphic, colorimetric, etc., quality of the image. As illustrated in FIG. 16.3, the formatted information 702 linked to the vision defects of the observer 701, for example, is treated as formatted information of the device chain P3, or is associated with this formatted information.
- so far, the processing of an image has been considered; it is also possible for the image to be in a file P57, together with the identifier 60 or an index P52, and possibly the variable characteristics 66 of the capture device and of any device which may have intervened in the processing of the image saved in the file; it is also possible for the image to be in an image file P58, together with some of the formatted information.
- the invention is therefore also applicable to the case where the image and the formatted information are in the database 22.
- the value of the variable characteristics 66 can be determined using information contained in the file P57 or the image file P58. Preferably, this information is recorded in the file in a standardized format such as that of the EXIF standard known in the art. In this way, the system and the method of the invention are applicable to the processing of images taken and/or already processed using devices placed on the market before the formatted information corresponding to these devices was established.
- the modification of the image quality can be simplified by taking into account only the defects of a limited number of devices in the chain, or even only one, and by correcting only these defects.
- the method of the invention can be applied by simulating other devices than those forming part of the chain of devices P3 in use.
- formatted information relating to a device, or to a type of device, may be applicable to another device or to another type of device, in particular similar devices. For example, as shown in FIG. 16.6, several batches of devices 710.0, 710.1, 710.2 are shown. Formatted information relates to a type of device 711, but this formatted information can also be applicable to a similar device 712, which makes it possible, for example, to produce formatted information relating to only a single device of each type.
- the invention is applicable to the modification, in particular the improvement, of images processed or provided by a P3 device or device chain.
- An interesting application will be to modify only faults or part of the faults of certain devices only.
- An application may be to modify a defect only in part, for example, in order to establish a compromise between image quality and calculation time.
- the invention is applicable to the design of integrated photo development laboratories. It can be implemented on a computer.
- a still camera or a video camera can be used to capture a target projected on the screen.
- device chains can include: - computer camera (WEBCAM),
- a camera, a scanner and a printer, for example in a photo-printing Minilab,
- a digital camera, for example in a photo-printing Minilab,
- a scanner, for example in a photo-printing Minilab,
- a screen or a printer, for example in a computer,
- definitions will now be given of a color image P21, a color plane P20, a determined color P22, and data relating to a determined color P23.
- the variant embodiment described below applies to the case where the image P2 is a color image P21.
- the color image P21 can be broken down into color planes P20 in various ways: number of planes (1, 3 or more), precision (8-bit unsigned, 16-bit signed, floating-point, etc.) and meaning of the planes (relative to a standard color space).
- for example, the color image P21 can be broken down into red, green and blue (RGB) planes, or into luminance, saturation and hue; moreover, there are color spaces such as PIM, and negative pixel values are possible in order to allow the representation of subtractive colors which cannot be represented in positive RGB; finally, a pixel value can be coded on 8 bits, 16 bits, or using floating-point values.
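The simplest of these decompositions, splitting interleaved RGB pixels into three color planes, can be sketched as follows. The flat list of (R, G, B) tuples is an illustrative stand-in for the image formats discussed above.

```python
# Minimal sketch of decomposing a color image P21 into color planes P20:
# the image is a list of interleaved (R, G, B) pixel tuples, and each
# plane collects one channel. Purely illustrative representation.
def split_planes(rgb_pixels):
    planes = {"red": [], "green": [], "blue": []}
    for r, g, b in rgb_pixels:
        planes["red"].append(r)
        planes["green"].append(g)
        planes["blue"].append(b)
    return planes
```

Each plane can then be corrected with the fraction of the formatted information relevant to its determined color, as described next.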
- the formatted information 15 includes data making it possible to decompose the image P2 into color planes P20 compatible with the various defects P5 to be processed, each color plane P20 being characterized by a determined color P22; said formatted information comprises data relating to said determined color P23, for example coordinates in a standard CIE, XYZ, LAB or sRGB color space; said data relating to said determined color P23 make it possible to calculate the color plane P20 of the image and to determine the fraction of said formatted information 15 which should be used to modify the quality of said color plane P20.
- the formatted information 15, or a fraction of it, can comprise measured formatted information P101 representing a raw measurement, for example a mathematical field linked to the geometric distortion defects at a certain number of characteristic points of a grid 80.
- the formatted information 15, or a fraction of it, can comprise extended formatted information P102, which can be calculated from the measured formatted information P101, for example by interpolation for real points other than the characteristic points of the grid 80.
- a combination P120 denotes a combination of variable characteristics 66 and of values of those characteristics, such as, for example, a combination P120 consisting of the focal length, the focus, the aperture, the capture speed, etc., and their associated values. It is difficult to calculate the formatted information 15 relating to the different combinations P120, especially since certain characteristics of the combination P120, such as in particular the focal length and the distance, can vary continuously.
- the invention provides for calculating the formatted information 15 in the form of extended formatted information P102 by interpolation from measured formatted information P101 relating to a predetermined selection of the combinations P120 of known variable characteristics 66.
- the measured formatted information P101 and the extended formatted information P102 may exhibit an interpolation difference P121.
- the invention may include the step of selecting zero, one or more of the variable characteristics 66, such that the interpolation difference P121, for the extended formatted information P102 obtained for the variable characteristics 66 thus selected, is below a predetermined interpolation threshold.
- certain variable characteristics 66 can have a weaker influence on the defect P5 than others and making the approximation that they are constant can introduce only a minimal error; for example, the focus adjustment may have only a slight influence on the vignetting defect and for this reason not be part of the variable characteristics 66 selected.
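The selection step above can be sketched as follows: a variable characteristic is treated as constant (and dropped from the selection) only if the worst-case error so introduced stays under the predetermined threshold. The measurement data, names, and the mean-based error estimate are all illustrative assumptions.

```python
# Illustrative sketch of selecting variable characteristics 66: estimate
# the error of holding one characteristic constant, and keep only those
# whose influence on the defect exceeds the interpolation threshold.
def max_error_if_constant(measured_by_value):
    """measured_by_value: defect parameter measured at several values of
    one variable characteristic (e.g. focus distance). Returns the
    worst-case error of replacing all measurements by their mean."""
    values = list(measured_by_value.values())
    mean = sum(values) / len(values)
    return max(abs(v - mean) for v in values)

def select_characteristics(candidates, threshold):
    """Keep only the characteristics whose influence exceeds the threshold;
    the others are treated as constant, as for focus and vignetting above."""
    return [name for name, data in candidates.items()
            if max_error_if_constant(data) > threshold]
```

In the vignetting example above, the focus measurements barely vary, so focus would fall under the threshold and be excluded from the selected variable characteristics.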
- the variable characteristics 66 can be selected at the time of the production of the formatted information 15.
- cost reduction denotes a method and system for reducing the cost of a device or chain of devices P3, in particular the cost of the optics of a device or of a chain of devices; the method consists in:
- the method and system according to the invention can be used to reduce the cost of an apparatus or a chain of apparatuses: it is possible to design digital optics, to produce formatted information relating to the faults P5 of the apparatus or of the chain of apparatuses, and to use this formatted information to allow image processing means, integrated or not, to modify the quality of the images coming from or intended for the apparatus or the chain of apparatuses, so that the combination of the apparatus or chain of apparatuses with the image processing means makes it possible to capture, modify or restore images of the desired quality at reduced cost.
Priority Applications (8)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US10/483,494 US7792378B2 (en) | 2001-07-12 | 2002-06-05 | Method and system for modifying image quality |
JP2003512928A JP4020262B2 (ja) | 2001-07-12 | 2002-06-05 | イメージの品質を修正する方法およびシステム |
DE60239061T DE60239061D1 (de) | 2001-07-12 | 2002-06-05 | Verfahren und system zur qualitätsverbesserung von bildern |
KR1020047000414A KR100940147B1 (ko) | 2001-07-12 | 2002-06-05 | 화질 변경 방법 및 시스템 |
EP02743349A EP1410326B1 (fr) | 2001-07-12 | 2002-06-05 | Procede et systeme pour modifier la qualite d'image |
AT02743349T ATE497224T1 (de) | 2001-07-12 | 2002-06-05 | Verfahren und system zur qualitätsverbesserung von bildern |
US12/838,198 US8559743B2 (en) | 2001-07-12 | 2010-07-16 | Method and system for modifying image quality |
US14/021,235 US9536284B2 (en) | 2001-07-12 | 2013-09-09 | Method and system for modifying image quality of an image |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
FR0109291A FR2827459B1 (fr) | 2001-07-12 | 2001-07-12 | Procede et systeme pour fournir a des logiciels de traitement d'image des informations formatees liees aux caracteristiques des appareils de capture d'image et/ou des moyens de restitution d'image |
FR01/09291 | 2001-07-12 | ||
FR0109292A FR2827460B1 (fr) | 2001-07-12 | 2001-07-12 | Procede et systeme pour fournir, selon un format standard, a des logiciels de traitement d'images des informations liees aux caracteristiques des appareils de capture d'image et/ou des moyens de resti |
FR01/09292 | 2001-07-12 |
Related Child Applications (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US10/483,494 A-371-Of-International US7792378B2 (en) | 2001-07-12 | 2002-06-05 | Method and system for modifying image quality |
US10483494 A-371-Of-International | 2002-06-05 | ||
US12/838,198 Division US8559743B2 (en) | 2001-07-12 | 2010-07-16 | Method and system for modifying image quality |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2003007240A1 true WO2003007240A1 (fr) | 2003-01-23 |
Family
ID=26213095
Family Applications (6)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/FR2002/001908 WO2003007243A2 (fr) | 2001-07-12 | 2002-06-05 | Procede et systeme pour modifier une image numerique en prenant en compte son bruit |
PCT/FR2002/001910 WO2003007236A2 (fr) | 2001-07-12 | 2002-06-05 | Procede et systeme pour produire la frequence des mises a jour de moyens |
PCT/FR2002/001911 WO2003007240A1 (fr) | 2001-07-12 | 2002-06-05 | Procede et systeme pour modifier la qualite d'image |
PCT/FR2002/001914 WO2003007241A1 (fr) | 2001-07-12 | 2002-06-05 | Procede et systeme pour produire des informations formatees liees aux defauts des appareils |
PCT/FR2002/001915 WO2003007242A2 (fr) | 2001-07-12 | 2002-06-05 | Procede et systeme pour produire des informations formatees liees aux defauts |
PCT/FR2002/001909 WO2003007239A1 (fr) | 2001-07-12 | 2002-06-05 | Procede et systeme pour fournir des informations formatees a des moyens de traitement d'images |
Family Applications Before (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/FR2002/001908 WO2003007243A2 (fr) | 2001-07-12 | 2002-06-05 | Procede et systeme pour modifier une image numerique en prenant en compte son bruit |
PCT/FR2002/001910 WO2003007236A2 (fr) | 2001-07-12 | 2002-06-05 | Procede et systeme pour produire la frequence des mises a jour de moyens |
Family Applications After (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/FR2002/001914 WO2003007241A1 (fr) | 2001-07-12 | 2002-06-05 | Procede et systeme pour produire des informations formatees liees aux defauts des appareils |
PCT/FR2002/001915 WO2003007242A2 (fr) | 2001-07-12 | 2002-06-05 | Procede et systeme pour produire des informations formatees liees aux defauts |
PCT/FR2002/001909 WO2003007239A1 (fr) | 2001-07-12 | 2002-06-05 | Procede et systeme pour fournir des informations formatees a des moyens de traitement d'images |
Country Status (11)
Country | Link |
---|---|
US (10) | US7724977B2 (fr) |
EP (7) | EP1410326B1 (fr) |
JP (6) | JP4452497B2 (fr) |
KR (4) | KR100879832B1 (fr) |
CN (6) | CN1305010C (fr) |
AT (4) | ATE310284T1 (fr) |
AU (3) | AU2002317219A1 (fr) |
CA (1) | CA2453423C (fr) |
DE (4) | DE60234207D1 (fr) |
ES (2) | ES2311061T3 (fr) |
WO (6) | WO2003007243A2 (fr) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2008533550A (ja) | 2005-01-19 | 2008-08-21 | Do Labs | Method for manufacturing an image recording and/or reproduction device, and device obtained by said method
US11379725B2 (en) | 2018-06-29 | 2022-07-05 | International Business Machines Corporation | Projectile extrapolation and sequence synthesis from video using convolution |
Families Citing this family (209)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6950211B2 (en) * | 2001-07-05 | 2005-09-27 | Corel Corporation | Fine moire correction in images |
DE60234207D1 (de) * | 2001-07-12 | 2009-12-10 | Do Labs | Method and system for reducing the update frequency
DE60224035D1 (de) * | 2002-08-23 | 2008-01-24 | St Microelectronics Srl | Method for noise filtering of a digital image sequence
US8294999B2 (en) | 2003-01-16 | 2012-10-23 | DigitalOptics Corporation International | Optics for an extended depth of field |
US7773316B2 (en) * | 2003-01-16 | 2010-08-10 | Tessera International, Inc. | Optics for an extended depth of field |
JP4377404B2 (ja) * | 2003-01-16 | 2009-12-02 | D-Blur Technologies Ltd. | Camera with image enhancement function
US7609425B2 (en) * | 2003-01-31 | 2009-10-27 | Canon Kabushiki Kaisha | Image data processing apparatus, method, storage medium and program |
US8471852B1 (en) | 2003-05-30 | 2013-06-25 | Nvidia Corporation | Method and system for tessellation of subdivision surfaces |
JP4096828B2 (ja) * | 2003-07-15 | 2008-06-04 | Seiko Epson Corp. | Image processing apparatus
US7369699B1 (en) * | 2003-08-29 | 2008-05-06 | Apple Inc. | Methods and apparatuses for restoring color and enhancing electronic images |
GB2406992A (en) * | 2003-10-09 | 2005-04-13 | Ta Vision Lab Ltd | Deconvolution of a digital image using metadata |
CN101373272B (zh) * | 2003-12-01 | 2010-09-01 | OmniVision CDM Optics Co., Ltd. | System and method for optimizing optical and digital system design
US7944467B2 (en) * | 2003-12-01 | 2011-05-17 | Omnivision Technologies, Inc. | Task-based imaging systems |
US7317843B2 (en) * | 2004-04-01 | 2008-01-08 | Microsoft Corporation | Luminance correction |
US7463296B2 (en) | 2004-04-01 | 2008-12-09 | Microsoft Corporation | Digital cameras with luminance correction |
US8285041B2 (en) * | 2004-09-14 | 2012-10-09 | Olympus Corporation | Image processing apparatus, image recording apparatus, and image processing method |
US7461331B2 (en) * | 2004-12-21 | 2008-12-02 | Fotomedia Technologies, Llc | Automated construction of print order for images capture during a session |
EP1679907A1 (fr) * | 2005-01-05 | 2006-07-12 | Dialog Semiconductor GmbH | Hexagonal color pixel structure with white pixels
US7683950B2 (en) * | 2005-04-26 | 2010-03-23 | Eastman Kodak Company | Method and apparatus for correcting a channel dependent color aberration in a digital image |
US20060274209A1 (en) * | 2005-06-03 | 2006-12-07 | Coretronic Corporation | Method and a control device using the same for controlling a display device |
US9583141B2 (en) * | 2005-07-01 | 2017-02-28 | Invention Science Fund I, Llc | Implementing audio substitution options in media works |
US9092928B2 (en) * | 2005-07-01 | 2015-07-28 | The Invention Science Fund I, Llc | Implementing group content substitution in media works |
US20090150444A1 (en) * | 2005-07-01 | 2009-06-11 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Media markup for audio content alteration |
US20070263865A1 (en) * | 2005-07-01 | 2007-11-15 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Authorization rights for substitute media content |
US9230601B2 (en) | 2005-07-01 | 2016-01-05 | Invention Science Fund I, Llc | Media markup system for content alteration in derivative works |
US20070266049A1 (en) * | 2005-07-01 | 2007-11-15 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Implementation of media content alteration
US8126938B2 (en) * | 2005-07-01 | 2012-02-28 | The Invention Science Fund I, Llc | Group content substitution in media works |
US9426387B2 (en) | 2005-07-01 | 2016-08-23 | Invention Science Fund I, Llc | Image anonymization |
US20090037243A1 (en) * | 2005-07-01 | 2009-02-05 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Audio substitution options in media works |
US20090300480A1 (en) * | 2005-07-01 | 2009-12-03 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Media segment alteration with embedded markup identifier |
US20080013859A1 (en) * | 2005-07-01 | 2008-01-17 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Implementation of media content alteration |
US20090151004A1 (en) * | 2005-07-01 | 2009-06-11 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Media markup for visual content alteration |
US20080052104A1 (en) * | 2005-07-01 | 2008-02-28 | Searete Llc | Group content substitution in media works |
US8203609B2 (en) * | 2007-01-31 | 2012-06-19 | The Invention Science Fund I, Llc | Anonymization pursuant to a broadcasted policy |
US20070005423A1 (en) * | 2005-07-01 | 2007-01-04 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Providing promotional content |
US20100154065A1 (en) * | 2005-07-01 | 2010-06-17 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Media markup for user-activated content alteration |
US20080052161A1 (en) * | 2005-07-01 | 2008-02-28 | Searete Llc | Alteration of promotional content in media works |
US20090235364A1 (en) * | 2005-07-01 | 2009-09-17 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Media markup for promotional content alteration |
US20080086380A1 (en) * | 2005-07-01 | 2008-04-10 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Alteration of promotional content in media works |
US20070276757A1 (en) * | 2005-07-01 | 2007-11-29 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Approval technique for media content alteration |
US20090210946A1 (en) * | 2005-07-01 | 2009-08-20 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Media markup for promotional audio content |
US20090204475A1 (en) * | 2005-07-01 | 2009-08-13 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Media markup for promotional visual content |
US9065979B2 (en) * | 2005-07-01 | 2015-06-23 | The Invention Science Fund I, Llc | Promotional placement in media works |
US20080028422A1 (en) * | 2005-07-01 | 2008-01-31 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Implementation of media content alteration |
US20070005651A1 (en) | 2005-07-01 | 2007-01-04 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Restoring modified assets |
US20090150199A1 (en) * | 2005-07-01 | 2009-06-11 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Visual substitution options in media works |
US20070294720A1 (en) * | 2005-07-01 | 2007-12-20 | Searete Llc | Promotional placement in media works |
EP2328006B1 (fr) * | 2005-09-19 | 2014-08-06 | OmniVision CDM Optics, Inc. | Task-based imaging systems
JP2007096405A (ja) * | 2005-09-27 | 2007-04-12 | Fujifilm Corp | Blur direction determination method, apparatus, and program
US8571346B2 (en) | 2005-10-26 | 2013-10-29 | Nvidia Corporation | Methods and devices for defective pixel detection |
US7750956B2 (en) | 2005-11-09 | 2010-07-06 | Nvidia Corporation | Using a graphics processing unit to correct video and audio data |
US8588542B1 (en) | 2005-12-13 | 2013-11-19 | Nvidia Corporation | Configurable and compact pixel processing apparatus |
FR2895104A1 (fr) * | 2005-12-19 | 2007-06-22 | Dxo Labs Sa | Method for supplying data to a digital processing means
FR2895103B1 (fr) * | 2005-12-19 | 2008-02-22 | Dxo Labs Sa | Method and system for processing digital data
FR2895102B1 (fr) * | 2005-12-19 | 2012-12-07 | Dxo Labs | Method for processing an object in a platform having one or more processors and memories, and platform using the method
US8295562B2 (en) * | 2006-01-13 | 2012-10-23 | Carl Zeiss Microimaging Ais, Inc. | Medical image modification to simulate characteristics |
US20070165961A1 (en) * | 2006-01-13 | 2007-07-19 | Juwei Lu | Method And Apparatus For Reducing Motion Blur In An Image |
US8737832B1 (en) | 2006-02-10 | 2014-05-27 | Nvidia Corporation | Flicker band automated detection system and method |
US8368749B2 (en) * | 2006-03-27 | 2013-02-05 | Ge Inspection Technologies Lp | Article inspection apparatus |
US20070239417A1 (en) * | 2006-03-31 | 2007-10-11 | D-Blur Technologies Ltd. | Camera performance simulation |
US20070269123A1 (en) * | 2006-05-16 | 2007-11-22 | Randall Don Briggs | Method and apparatus for performing image enhancement in an image processing pipeline |
JP4974586B2 (ja) * | 2006-05-24 | 2012-07-11 | Olympus Corp. | Imaging apparatus for microscope
US7612805B2 (en) | 2006-07-11 | 2009-11-03 | Neal Solomon | Digital imaging system and methods for selective image filtration |
JP4839148B2 (ja) * | 2006-07-12 | 2011-12-21 | Ricoh Co., Ltd. | Network device, terminal device, program, and recording medium
US8594441B1 (en) | 2006-09-12 | 2013-11-26 | Nvidia Corporation | Compressing image-based data using luminance |
DE102006057190A1 (de) * | 2006-12-05 | 2008-06-12 | Carl Zeiss Meditec Ag | Method for producing high-quality images of the anterior and/or posterior eye segments
US20080180539A1 (en) * | 2007-01-31 | 2008-07-31 | Searete Llc, A Limited Liability Corporation | Image anonymization |
US8723969B2 (en) | 2007-03-20 | 2014-05-13 | Nvidia Corporation | Compensating for undesirable camera shakes during video capture |
US20080244755A1 (en) * | 2007-03-30 | 2008-10-02 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Authorization for media content alteration |
US20080270161A1 (en) * | 2007-04-26 | 2008-10-30 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Authorization rights for substitute media content |
US9215512B2 (en) | 2007-04-27 | 2015-12-15 | Invention Science Fund I, Llc | Implementation of media content alteration |
US7936915B2 (en) * | 2007-05-29 | 2011-05-03 | Microsoft Corporation | Focal length estimation for panoramic stitching |
US8634103B2 (en) * | 2007-06-12 | 2014-01-21 | Qualcomm Incorporated | Print image matching parameter extraction and rendering on display devices |
US8724895B2 (en) | 2007-07-23 | 2014-05-13 | Nvidia Corporation | Techniques for reducing color artifacts in digital images |
US8570634B2 (en) | 2007-10-11 | 2013-10-29 | Nvidia Corporation | Image processing of an incoming light field using a spatial light modulator |
US9177368B2 (en) | 2007-12-17 | 2015-11-03 | Nvidia Corporation | Image distortion correction |
US8780128B2 (en) | 2007-12-17 | 2014-07-15 | Nvidia Corporation | Contiguously packed data |
US8698908B2 (en) | 2008-02-11 | 2014-04-15 | Nvidia Corporation | Efficient method for reducing noise and blur in a composite still image from a rolling shutter camera |
US9379156B2 (en) * | 2008-04-10 | 2016-06-28 | Nvidia Corporation | Per-channel image intensity correction |
US8280194B2 (en) * | 2008-04-29 | 2012-10-02 | Sony Corporation | Reduced hardware implementation for a two-picture depth map algorithm |
US8194995B2 (en) * | 2008-09-30 | 2012-06-05 | Sony Corporation | Fast camera auto-focus |
US8553093B2 (en) | 2008-09-30 | 2013-10-08 | Sony Corporation | Method and apparatus for super-resolution imaging using digital imaging devices |
US8373718B2 (en) | 2008-12-10 | 2013-02-12 | Nvidia Corporation | Method and system for color enhancement with color volume adjustment and variable shift along luminance axis |
US8290260B2 (en) * | 2008-12-15 | 2012-10-16 | Xerox Corporation | Method and system for creating integrated remote custom rendering profile |
US20100198876A1 (en) * | 2009-02-02 | 2010-08-05 | Honeywell International, Inc. | Apparatus and method of embedding meta-data in a captured image |
DE102009002393A1 (de) * | 2009-04-15 | 2010-11-04 | Arnold & Richter Cine Technik Gmbh & Co. Betriebs Kg | Verfahren und Vorrichtung zur Bearbeitung von Aufnahmebildern einer digitalen Videokamera |
US8749662B2 (en) | 2009-04-16 | 2014-06-10 | Nvidia Corporation | System and method for lens shading image correction |
CN101551661B (zh) * | 2009-05-12 | 2013-04-24 | Guangdong University of Technology | Control method for multi-robot systems
US9519814B2 (en) | 2009-06-12 | 2016-12-13 | Hand Held Products, Inc. | Portable data terminal |
FR2948521B1 (fr) | 2009-07-21 | 2012-01-27 | Dxo Labs | Method for estimating a defect of an image capture system, and associated systems
US8698918B2 (en) | 2009-10-27 | 2014-04-15 | Nvidia Corporation | Automatic white balancing for photography |
KR20110065997A (ko) * | 2009-12-10 | 2011-06-16 | Samsung Electronics Co., Ltd. | Image processing apparatus and image processing method
KR101451136B1 (ko) * | 2010-03-19 | 2014-10-16 | Samsung Techwin Co., Ltd. | Vignetting correction method and apparatus
US8335390B2 (en) * | 2010-03-22 | 2012-12-18 | Sony Corporation | Blur function modeling for depth of field rendering |
US8660372B2 (en) * | 2010-05-10 | 2014-02-25 | Board Of Regents Of The University Of Texas System | Determining quality of an image or video using a distortion classifier |
CN102338972A (zh) * | 2010-07-21 | 2012-02-01 | Altek Corp. | Assisted focusing method using multiple face blocks
US20120019709A1 (en) * | 2010-07-21 | 2012-01-26 | Altek Corporation | Assisting focusing method using multiple face blocks |
CH703996A2 (de) | 2010-10-24 | 2012-04-30 | Airlight Energy Ip Sa | Solar collector.
EP2447889A1 (fr) * | 2010-10-29 | 2012-05-02 | Siemens Aktiengesellschaft | Method for modeling defect management in a manufacturing process, and for handling the defect during the manufacturing process based on said defect management
CN102625043B (zh) | 2011-01-25 | 2014-12-10 | Canon Inc. | Image processing apparatus, imaging apparatus, and image processing method
US8842931B2 (en) * | 2011-02-18 | 2014-09-23 | Nvidia Corporation | System, method, and computer program product for reducing noise in an image using depth-based sweeping over image samples |
JP5367749B2 (ja) * | 2011-03-25 | 2013-12-11 | Toshiba Corp. | Server apparatus, communication method, and program
US10331658B2 (en) * | 2011-06-03 | 2019-06-25 | Gdial Inc. | Systems and methods for atomizing and individuating data as data quanta |
US8712181B2 (en) * | 2011-06-14 | 2014-04-29 | Apteryx, Inc. | Real-time application of filters based on image attributes |
EP2552099B1 (fr) | 2011-07-27 | 2013-08-28 | Axis AB | Method and camera for providing an estimate of the average signal-to-noise ratio value for an image
EP2692280B1 (fr) * | 2011-11-16 | 2019-01-09 | Olympus Corporation | Image signal processor for endoscope
JP2013123812A (ja) * | 2011-12-13 | 2013-06-24 | Canon Inc | Inspection apparatus, inspection method, and computer program
US20130329996A1 (en) * | 2012-06-10 | 2013-12-12 | Apple Inc. | Method and system for auto-enhancing photographs with tonal response curves |
JP5656926B2 (ja) | 2012-06-22 | 2015-01-21 | Canon Inc. | Image processing method, image processing apparatus, and imaging apparatus
US8976271B2 (en) | 2012-07-19 | 2015-03-10 | Canon Kabushiki Kaisha | Optical system and image pickup apparatus |
CN104619237B (zh) | 2012-07-26 | 2018-03-30 | DePuy Synthes Products, Inc. | YCbCr pulsed-modulation illumination scheme in a light-deficient environment
MX356890B (es) | 2012-07-26 | 2018-06-19 | Depuy Synthes Products Inc | Continuous video in a light-deficient environment.
IN2015MN00020A (fr) | 2012-07-26 | 2015-10-16 | Olive Medical Corp | |
US9798698B2 (en) | 2012-08-13 | 2017-10-24 | Nvidia Corporation | System and method for multi-color dilu preconditioner |
US9508318B2 (en) | 2012-09-13 | 2016-11-29 | Nvidia Corporation | Dynamic color profile management for electronic devices |
US8867817B1 (en) * | 2012-10-29 | 2014-10-21 | Amazon Technologies, Inc. | Display analysis using scanned images |
GB2507576A (en) * | 2012-11-05 | 2014-05-07 | British Broadcasting Corp | Focus detection |
US9307213B2 (en) | 2012-11-05 | 2016-04-05 | Nvidia Corporation | Robust selection and weighting for gray patch automatic white balancing |
US9026553B2 (en) * | 2012-11-29 | 2015-05-05 | Unisys Corporation | Data expanse viewer for database systems |
CA2906821A1 (fr) | 2013-03-15 | 2014-09-18 | Olive Medical Corporation | Scope sensing in a light-controlled environment
WO2014144947A1 (fr) | 2013-03-15 | 2014-09-18 | Olive Medical Corporation | Super-resolution and color motion artifact correction in a pulsed color imaging system
CN105246395B (zh) | 2013-03-15 | 2019-01-22 | DePuy Synthes Products, Inc. | Comprehensive fixed-pattern noise cancellation
BR112015022944A2 (pt) | 2013-03-15 | 2017-07-18 | Olive Medical Corp | Calibration using a distal cap
US9777913B2 (en) | 2013-03-15 | 2017-10-03 | DePuy Synthes Products, Inc. | Controlling the integral light energy of a laser pulse |
AU2014233518C1 (en) | 2013-03-15 | 2019-04-04 | DePuy Synthes Products, Inc. | Noise aware edge enhancement |
US9756222B2 (en) | 2013-06-26 | 2017-09-05 | Nvidia Corporation | Method and system for performing white balancing operations on captured images |
US9826208B2 (en) | 2013-06-26 | 2017-11-21 | Nvidia Corporation | Method and system for generating weights for use in white balancing an image |
US9167706B2 (en) | 2013-08-05 | 2015-10-20 | Steven J. Holmstrom | Electronic flight bag retention device |
WO2015131045A1 (fr) | 2014-02-28 | 2015-09-03 | The Board Of Trustees Of The Leland Stanford Junior University | Imaging providing a pixel intensity ratio
US10084944B2 (en) | 2014-03-21 | 2018-09-25 | DePuy Synthes Products, Inc. | Card edge connector for an imaging sensor |
US9396409B2 (en) | 2014-09-29 | 2016-07-19 | At&T Intellectual Property I, L.P. | Object based image processing |
CN104363986B (zh) * | 2014-10-31 | 2017-06-13 | Huawei Technologies Co., Ltd. | Image processing method and device
US10334216B2 (en) * | 2014-11-06 | 2019-06-25 | Sony Corporation | Imaging system including lens with longitudinal chromatic aberration, endoscope and imaging method |
JP6465752B2 (ja) * | 2015-05-29 | 2019-02-06 | Canon Inc. | Control apparatus, control method, and program
JP6113386B1 (ja) | 2015-08-13 | 2017-04-12 | Hoya Corp. | Evaluation value calculation device and electronic endoscope system
CN106687023B (zh) | 2015-08-13 | 2018-12-18 | Hoya Corp. | Evaluation value calculation device and electronic endoscope system
US9838646B2 (en) * | 2015-09-24 | 2017-12-05 | Cisco Technology, Inc. | Attenuation of loudspeaker in microphone array |
EP3516873A1 (fr) * | 2016-09-19 | 2019-07-31 | InterDigital VC Holdings, Inc. | Method and device for reconstructing a point cloud representative of a scene using light-field data
WO2018070793A1 (fr) * | 2016-10-12 | 2018-04-19 | Samsung Electronics Co., Ltd. | Image processing method, apparatus, and recording medium
CN110520691B (zh) * | 2017-04-03 | 2021-09-10 | Mitsubishi Electric Corp. | Map data generation device and method
US10733262B2 (en) * | 2017-10-05 | 2020-08-04 | Adobe Inc. | Attribute control for updating digital content in a digital medium environment |
US10657118B2 (en) | 2017-10-05 | 2020-05-19 | Adobe Inc. | Update basis for updating digital content in a digital medium environment |
US11551257B2 (en) | 2017-10-12 | 2023-01-10 | Adobe Inc. | Digital media environment for analysis of audience segments in a digital marketing campaign |
US10685375B2 (en) | 2017-10-12 | 2020-06-16 | Adobe Inc. | Digital media environment for analysis of components of content in a digital marketing campaign |
US11544743B2 (en) | 2017-10-16 | 2023-01-03 | Adobe Inc. | Digital content control based on shared machine learning properties |
US10795647B2 (en) | 2017-10-16 | 2020-10-06 | Adobe, Inc. | Application digital content control using an embedded machine learning module |
GB2570278B (en) * | 2017-10-31 | 2020-09-16 | Cambium Networks Ltd | Spectrum management for a point-to-multipoint wireless network |
US10853766B2 (en) | 2017-11-01 | 2020-12-01 | Adobe Inc. | Creative brief schema |
US10991012B2 (en) | 2017-11-01 | 2021-04-27 | Adobe Inc. | Creative brief-based content creation |
EP3731726A4 (fr) * | 2017-12-27 | 2021-10-27 | Ethicon LLC | Hyperspectral imaging in a light-deficient environment
CN108074241B (zh) * | 2018-01-16 | 2021-10-22 | Shenzhen University | Quality scoring method, apparatus, terminal, and storage medium for a target image
JP7278096B2 (ja) * | 2019-02-20 | 2023-05-19 | Canon Inc. | Image processing apparatus, image processing method, and program
US11931009B2 (en) | 2019-06-20 | 2024-03-19 | Cilag Gmbh International | Offset illumination of a scene using multiple emitters in a hyperspectral imaging system |
US11898909B2 (en) | 2019-06-20 | 2024-02-13 | Cilag Gmbh International | Noise aware edge enhancement in a pulsed fluorescence imaging system |
US11398011B2 (en) | 2019-06-20 | 2022-07-26 | Cilag Gmbh International | Super resolution and color motion artifact correction in a pulsed laser mapping imaging system |
US11237270B2 (en) | 2019-06-20 | 2022-02-01 | Cilag Gmbh International | Hyperspectral, fluorescence, and laser mapping imaging with fixed pattern noise cancellation |
US11412152B2 (en) | 2019-06-20 | 2022-08-09 | Cilag Gmbh International | Speckle removal in a pulsed hyperspectral imaging system |
US11925328B2 (en) | 2019-06-20 | 2024-03-12 | Cilag Gmbh International | Noise aware edge enhancement in a pulsed hyperspectral imaging system |
US11550057B2 (en) | 2019-06-20 | 2023-01-10 | Cilag Gmbh International | Offset illumination of a scene using multiple emitters in a fluorescence imaging system |
US11122968B2 (en) | 2019-06-20 | 2021-09-21 | Cilag Gmbh International | Optical fiber waveguide in an endoscopic system for hyperspectral imaging |
US11937784B2 (en) | 2019-06-20 | 2024-03-26 | Cilag Gmbh International | Fluorescence imaging in a light deficient environment |
US11134832B2 (en) | 2019-06-20 | 2021-10-05 | Cilag Gmbh International | Image rotation in an endoscopic hyperspectral, fluorescence, and laser mapping imaging system |
US11457154B2 (en) | 2019-06-20 | 2022-09-27 | Cilag Gmbh International | Speckle removal in a pulsed hyperspectral, fluorescence, and laser mapping imaging system |
US10841504B1 (en) | 2019-06-20 | 2020-11-17 | Ethicon Llc | Fluorescence imaging with minimal area monolithic image sensor |
US11516388B2 (en) | 2019-06-20 | 2022-11-29 | Cilag Gmbh International | Pulsed illumination in a fluorescence imaging system |
US11291358B2 (en) | 2019-06-20 | 2022-04-05 | Cilag Gmbh International | Fluorescence videostroboscopy of vocal cords |
US11218645B2 (en) | 2019-06-20 | 2022-01-04 | Cilag Gmbh International | Wide dynamic range using a monochrome image sensor for fluorescence imaging |
US11187658B2 (en) | 2019-06-20 | 2021-11-30 | Cilag Gmbh International | Fluorescence imaging with fixed pattern noise cancellation |
US11540696B2 (en) | 2019-06-20 | 2023-01-03 | Cilag Gmbh International | Noise aware edge enhancement in a pulsed fluorescence imaging system |
US11793399B2 (en) | 2019-06-20 | 2023-10-24 | Cilag Gmbh International | Super resolution and color motion artifact correction in a pulsed hyperspectral imaging system |
US11389066B2 (en) | 2019-06-20 | 2022-07-19 | Cilag Gmbh International | Noise aware edge enhancement in a pulsed hyperspectral, fluorescence, and laser mapping imaging system |
US11633089B2 (en) | 2019-06-20 | 2023-04-25 | Cilag Gmbh International | Fluorescence imaging with minimal area monolithic image sensor |
US11172810B2 (en) | 2019-06-20 | 2021-11-16 | Cilag Gmbh International | Speckle removal in a pulsed laser mapping imaging system |
US11288772B2 (en) | 2019-06-20 | 2022-03-29 | Cilag Gmbh International | Super resolution and color motion artifact correction in a pulsed fluorescence imaging system |
US11265491B2 (en) | 2019-06-20 | 2022-03-01 | Cilag Gmbh International | Fluorescence imaging with fixed pattern noise cancellation |
US11221414B2 (en) | 2019-06-20 | 2022-01-11 | Cilag Gmbh International | Laser mapping imaging with fixed pattern noise cancellation |
US11622094B2 (en) | 2019-06-20 | 2023-04-04 | Cilag Gmbh International | Wide dynamic range using a monochrome image sensor for fluorescence imaging |
US11533417B2 (en) | 2019-06-20 | 2022-12-20 | Cilag Gmbh International | Laser scanning and tool tracking imaging in a light deficient environment |
US11012599B2 (en) | 2019-06-20 | 2021-05-18 | Ethicon Llc | Hyperspectral imaging in a light deficient environment |
US11516387B2 (en) | 2019-06-20 | 2022-11-29 | Cilag Gmbh International | Image synchronization without input clock and data transmission clock in a pulsed hyperspectral, fluorescence, and laser mapping imaging system |
US11432706B2 (en) | 2019-06-20 | 2022-09-06 | Cilag Gmbh International | Hyperspectral imaging with minimal area monolithic image sensor |
US11671691B2 (en) | 2019-06-20 | 2023-06-06 | Cilag Gmbh International | Image rotation in an endoscopic laser mapping imaging system |
US11758256B2 (en) | 2019-06-20 | 2023-09-12 | Cilag Gmbh International | Fluorescence imaging in a light deficient environment |
US11147436B2 (en) | 2019-06-20 | 2021-10-19 | Cilag Gmbh International | Image rotation in an endoscopic fluorescence imaging system |
US11294062B2 (en) | 2019-06-20 | 2022-04-05 | Cilag Gmbh International | Dynamic range using a monochrome image sensor for hyperspectral and fluorescence imaging and topology laser mapping |
US11172811B2 (en) | 2019-06-20 | 2021-11-16 | Cilag Gmbh International | Image rotation in an endoscopic fluorescence imaging system |
US11624830B2 (en) | 2019-06-20 | 2023-04-11 | Cilag Gmbh International | Wide dynamic range using a monochrome image sensor for laser mapping imaging |
US11716533B2 (en) | 2019-06-20 | 2023-08-01 | Cilag Gmbh International | Image synchronization without input clock and data transmission clock in a pulsed fluorescence imaging system |
US11674848B2 (en) | 2019-06-20 | 2023-06-13 | Cilag Gmbh International | Wide dynamic range using a monochrome image sensor for hyperspectral imaging |
US11375886B2 (en) | 2019-06-20 | 2022-07-05 | Cilag Gmbh International | Optical fiber waveguide in an endoscopic system for laser mapping imaging |
US11076747B2 (en) | 2019-06-20 | 2021-08-03 | Cilag Gmbh International | Driving light emissions according to a jitter specification in a laser mapping imaging system |
US11700995B2 (en) | 2019-06-20 | 2023-07-18 | Cilag Gmbh International | Speckle removal in a pulsed fluorescence imaging system |
US20200397246A1 (en) | 2019-06-20 | 2020-12-24 | Ethicon Llc | Minimizing image sensor input/output in a pulsed hyperspectral, fluorescence, and laser mapping imaging system |
US11716543B2 (en) | 2019-06-20 | 2023-08-01 | Cilag Gmbh International | Wide dynamic range using a monochrome image sensor for fluorescence imaging |
US11360028B2 (en) | 2019-06-20 | 2022-06-14 | Cilag Gmbh International | Super resolution and color motion artifact correction in a pulsed hyperspectral, fluorescence, and laser mapping imaging system |
US11471055B2 (en) | 2019-06-20 | 2022-10-18 | Cilag Gmbh International | Noise aware edge enhancement in a pulsed fluorescence imaging system |
US10979646B2 (en) | 2019-06-20 | 2021-04-13 | Ethicon Llc | Fluorescence imaging with minimal area monolithic image sensor |
US11187657B2 (en) | 2019-06-20 | 2021-11-30 | Cilag Gmbh International | Hyperspectral imaging with fixed pattern noise cancellation |
US11412920B2 (en) | 2019-06-20 | 2022-08-16 | Cilag Gmbh International | Speckle removal in a pulsed fluorescence imaging system |
US20200397239A1 (en) | 2019-06-20 | 2020-12-24 | Ethicon Llc | Offset illumination of a scene using multiple emitters in a fluorescence imaging system |
US11233960B2 (en) | 2019-06-20 | 2022-01-25 | Cilag Gmbh International | Fluorescence imaging with fixed pattern noise cancellation |
US10952619B2 (en) | 2019-06-20 | 2021-03-23 | Ethicon Llc | Hyperspectral and fluorescence imaging and topology laser mapping with minimal area monolithic image sensor |
US11892403B2 (en) | 2019-06-20 | 2024-02-06 | Cilag Gmbh International | Image synchronization without input clock and data transmission clock in a pulsed fluorescence imaging system |
US11213194B2 (en) | 2019-06-20 | 2022-01-04 | Cilag Gmbh International | Optical fiber waveguide in an endoscopic system for hyperspectral, fluorescence, and laser mapping imaging |
US11903563B2 (en) | 2019-06-20 | 2024-02-20 | Cilag Gmbh International | Offset illumination of a scene using multiple emitters in a fluorescence imaging system |
US11276148B2 (en) | 2019-06-20 | 2022-03-15 | Cilag Gmbh International | Super resolution and color motion artifact correction in a pulsed fluorescence imaging system |
US11284785B2 (en) | 2019-06-20 | 2022-03-29 | Cilag Gmbh International | Controlling integral energy of a laser pulse in a hyperspectral, fluorescence, and laser mapping imaging system |
US11631202B2 (en) * | 2021-01-08 | 2023-04-18 | Samsung Electronics Co., Ltd. | System and method for obtaining and applying a vignette filter and grain layer |
US11829239B2 (en) | 2021-11-17 | 2023-11-28 | Adobe Inc. | Managing machine learning model reconstruction |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP0647921A2 (fr) * | 1993-10-08 | 1995-04-12 | Xerox Corporation | Structured image format for describing complex color raster images
EP0686945A2 (fr) * | 1994-05-26 | 1995-12-13 | Canon Kabushiki Kaisha | Image processing method and apparatus
JPH10319929A (ja) * | 1997-05-19 | 1998-12-04 | Matsushita Electric Ind Co Ltd | Display device
WO1999027470A1 (fr) * | 1997-11-26 | 1999-06-03 | Flashpoint Technology, Inc. | Method and system for extending the image file formats available in an image capture device
EP0964353A2 (fr) * | 1998-06-12 | 1999-12-15 | Canon Kabushiki Kaisha | Image processing apparatus and computer-addressable memory
WO2001035052A1 (fr) * | 1999-11-12 | 2001-05-17 | Armstrong Brian S | Robust landmarks for machine vision and method for detecting said landmarks
EP1104175A2 (fr) * | 1999-11-29 | 2001-05-30 | Xerox Corporation | Calibration system for a color imaging apparatus
Family Cites Families (73)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS6080374A (ja) * | 1983-10-11 | 1985-05-08 | Hitachi Denshi Ltd | Method for correcting imaging characteristics of a television camera apparatus
FR2652695B1 (fr) * | 1989-10-03 | 1993-04-16 | Thomson Csf | Method and device for displaying images, with automatic defect correction by feedback.
FR2661061B1 (fr) * | 1990-04-11 | 1992-08-07 | Multi Media Tech | Method and device for modifying image zones.
US5047861A (en) * | 1990-07-31 | 1991-09-10 | Eastman Kodak Company | Method and apparatus for pixel non-uniformity correction |
US5157497A (en) * | 1991-02-25 | 1992-10-20 | Matsushita Electric Industrial Co., Ltd. | Method and apparatus for detecting and compensating for white shading errors in a digitized video signal |
US5251271A (en) * | 1991-10-21 | 1993-10-05 | R. R. Donnelley & Sons Co. | Method for automatic registration of digitized multi-plane images |
JPH05176166A (ja) | 1991-12-25 | 1993-07-13 | Hitachi Ltd | Color reproduction method
DE69331719T2 (de) * | 1992-06-19 | 2002-10-24 | Agfa Gevaert Nv | Method and device for noise suppression
US5905530A (en) * | 1992-08-24 | 1999-05-18 | Canon Kabushiki Kaisha | Image pickup apparatus |
US5323204A (en) * | 1992-11-03 | 1994-06-21 | Eastman Kodak Company | Automatic optimization of photographic exposure parameters for non-standard display sizes and/or different focal length photographing modes through determination and utilization of extra system speed |
US5461440A (en) * | 1993-02-10 | 1995-10-24 | Olympus Optical Co., Ltd. | Photographing image correction system |
US5353362A (en) * | 1993-05-17 | 1994-10-04 | Tucci Robert R | Method of generation of two electromagnetic modes using squeezers |
JPH0715631A (ja) * | 1993-06-29 | 1995-01-17 | Nippon Telegr & Teleph Corp <Ntt> | 画像信号雑音除去方法および装置 |
US5499057A (en) | 1993-08-27 | 1996-03-12 | Sony Corporation | Apparatus for producing a noise-reduced image signal from an input image signal |
US6334219B1 (en) * | 1994-09-26 | 2001-12-25 | Adc Telecommunications Inc. | Channel selection for a hybrid fiber coax network |
JPH08116490A (ja) * | 1994-10-14 | 1996-05-07 | Olympus Optical Co Ltd | 画像処理装置 |
KR100203239B1 (ko) * | 1995-02-16 | 1999-06-15 | 윤종용 | 화이트쉐이딩 보정방법 및 장치 |
US5606365A (en) * | 1995-03-28 | 1997-02-25 | Eastman Kodak Company | Interactive camera for network processing of captured images |
US5694484A (en) | 1995-05-15 | 1997-12-02 | Polaroid Corporation | System and method for automatically processing image data to provide images of optimal perceptual quality |
JPH0998299A (ja) | 1995-10-02 | 1997-04-08 | Canon Inc | 画像処理装置及び方法 |
JP3409541B2 (ja) | 1995-11-14 | 2003-05-26 | 三菱電機株式会社 | 色補正方法及び色補正装置並びに色補正応用装置及びカラー画像システム |
EP0868690A1 (fr) * | 1995-12-19 | 1998-10-07 | TELEFONAKTIEBOLAGET L M ERICSSON (publ) | Programmation de travaux destinee a une unite de traitement d'instructions |
US5696850A (en) * | 1995-12-21 | 1997-12-09 | Eastman Kodak Company | Automatic image sharpening in an electronic imaging system |
JPH09214807A (ja) * | 1996-01-31 | 1997-08-15 | Canon Inc | 画像処理装置および画像処理方法 |
JP3950188B2 (ja) * | 1996-02-27 | 2007-07-25 | 株式会社リコー | 画像歪み補正用パラメータ決定方法及び撮像装置 |
JPH1083024A (ja) | 1996-09-09 | 1998-03-31 | Fuji Photo Film Co Ltd | カメラ及びプリンタ |
JP3791635B2 (ja) * | 1996-10-22 | 2006-06-28 | 富士写真フイルム株式会社 | 画像再生方法、画像再生装置、画像処理方法および画像処理装置 |
US6173087B1 (en) * | 1996-11-13 | 2001-01-09 | Sarnoff Corporation | Multi-view image registration with application to mosaicing and lens distortion correction |
US6100925A (en) * | 1996-11-27 | 2000-08-08 | Princeton Video Image, Inc. | Image insertion in video streams using a combination of physical sensors and pattern recognition |
US6094221A (en) * | 1997-01-02 | 2000-07-25 | Anderson; Eric C. | System and method for using a scripting language to set digital camera device features |
JPH10226139A (ja) * | 1997-02-14 | 1998-08-25 | Canon Inc | 画像形成システム及び画像形成装置及び媒体 |
US6249315B1 (en) * | 1997-03-24 | 2001-06-19 | Jack M. Holm | Strategy for pictorial digital image processing |
US6222583B1 (en) * | 1997-03-27 | 2001-04-24 | Nippon Telegraph And Telephone Corporation | Device and system for labeling sight images |
JP3225882B2 (ja) | 1997-03-27 | 2001-11-05 | 日本電信電話株式会社 | 景観ラベリングシステム |
US5990935A (en) * | 1997-04-04 | 1999-11-23 | Evans & Sutherland Computer Corporation | Method for measuring camera and lens properties for camera tracking |
JP3911354B2 (ja) * | 1997-09-02 | 2007-05-09 | 大日本スクリーン製造株式会社 | 画像処理方法および装置、並びにその処理を実行するためのプログラムを記録した記録媒体 |
JPH11146308A (ja) | 1997-11-13 | 1999-05-28 | Fuji Photo Film Co Ltd | 画像情報記録装置および画像プリントシステム |
DE19855885A1 (de) * | 1997-12-04 | 1999-08-05 | Fuji Photo Film Co Ltd | Bildverarbeitungsverfahren und -vorrichtung |
US6069982A (en) * | 1997-12-23 | 2000-05-30 | Polaroid Corporation | Estimation of frequency dependence and grey-level dependence of noise in an image |
JPH11220687A (ja) * | 1998-01-30 | 1999-08-10 | Fuji Photo Film Co Ltd | 画像処理方法および装置 |
US6381375B1 (en) * | 1998-02-20 | 2002-04-30 | Cognex Corporation | Methods and apparatus for generating a projection of an image |
DE19812028A1 (de) * | 1998-03-19 | 1999-09-23 | Heidelberger Druckmasch Ag | Verfahren zur Koordinatenumrechnung von Bilddaten mit zufälligem Offset der Bildpunkte |
JP3926918B2 (ja) | 1998-03-20 | 2007-06-06 | 富士通株式会社 | 画像補正処理装置及びそのプログラム記録媒体 |
US6603885B1 (en) * | 1998-04-30 | 2003-08-05 | Fuji Photo Film Co., Ltd. | Image processing method and apparatus |
JP4187830B2 (ja) | 1998-07-03 | 2008-11-26 | 東芝医用システムエンジニアリング株式会社 | 医用画像合成装置 |
EP1097431B1 (fr) * | 1998-07-15 | 2003-12-10 | Kodak Polychrome Graphics LLC | Systeme et procede d'imagerie |
JP4095184B2 (ja) * | 1998-10-29 | 2008-06-04 | キヤノン株式会社 | 画像処理装置及びその方法 |
JP2000165647A (ja) * | 1998-11-26 | 2000-06-16 | Seiko Epson Corp | 画像データ処理方法および画像データ印刷装置並びに画像データ処理プログラムを記録した記録媒体 |
JP4154053B2 (ja) * | 1998-12-25 | 2008-09-24 | キヤノン株式会社 | 画像記録・再生システム、画像記録装置及び画像再生装置 |
US6538691B1 (en) * | 1999-01-21 | 2003-03-25 | Intel Corporation | Software correction of image distortion in digital cameras |
JP4072302B2 (ja) * | 1999-04-13 | 2008-04-09 | キヤノン株式会社 | データ処理方法及び装置及び記憶媒体 |
US6856427B1 (en) * | 1999-05-20 | 2005-02-15 | Eastman Kodak Company | System for printing correct exposure in a rendered digital image |
US6693668B1 (en) * | 1999-06-04 | 2004-02-17 | Canon Kabushiki Kaisha | Self-diagnostic image sensor |
US6707950B1 (en) * | 1999-06-22 | 2004-03-16 | Eastman Kodak Company | Method for modification of non-image data in an image processing chain |
US6470151B1 (en) * | 1999-06-22 | 2002-10-22 | Canon Kabushiki Kaisha | Camera, image correcting apparatus, image correcting system, image correcting method, and computer program product providing the image correcting method |
JP2001016449A (ja) | 1999-06-25 | 2001-01-19 | Ricoh Co Ltd | 画像入力装置 |
US6633408B1 (en) | 1999-06-29 | 2003-10-14 | Kodak Polychrome Graphics, Llc | Spectral modeling of photographic printing based on dye concentration |
DE20080319U1 (de) | 1999-06-30 | 2002-05-16 | Logitech Inc | Videokamera, bei der die Hauptfunktionen in der Hauptrechnersoftware implementiert werden |
JP4822571B2 (ja) * | 1999-08-03 | 2011-11-24 | キヤノン株式会社 | デジタルx線撮影システム及び方法 |
DE19943183A1 (de) * | 1999-09-09 | 2001-03-15 | Heimann Systems Gmbh & Co | Verfahren zur Farbanpassung eines Bildes, insbesondere eines Röntgenbildes |
JP2001094848A (ja) | 1999-09-20 | 2001-04-06 | Canon Inc | モニター付カメラ |
KR100414083B1 (ko) * | 1999-12-18 | 2004-01-07 | 엘지전자 주식회사 | 영상왜곡 보정방법 및 이를 이용한 영상표시기기 |
US6816625B2 (en) * | 2000-08-16 | 2004-11-09 | Lewis Jr Clarence A | Distortion free image capture system and method |
JP3429280B2 (ja) * | 2000-09-05 | 2003-07-22 | 理化学研究所 | 画像のレンズ歪みの補正方法 |
JP4399133B2 (ja) * | 2000-09-08 | 2010-01-13 | カシオ計算機株式会社 | 撮影条件提供装置、撮影条件設定システム、撮影条件提供方法 |
US6956966B2 (en) * | 2001-04-03 | 2005-10-18 | Electronics For Imaging, Inc. | Method and apparatus for automated image correction for digital image acquisition |
DE60234207D1 (de) * | 2001-07-12 | 2009-12-10 | Do Labs | Verfahren und system zur verringerung der aktualisierungs-häufigkeit |
CN100345158C (zh) * | 2001-07-12 | 2007-10-24 | 杜莱布斯公司 | 用于产生涉及几何失真的格式化信息的方法和系统 |
FR2827459B1 (fr) * | 2001-07-12 | 2004-10-29 | Poseidon | Procede et systeme pour fournir a des logiciels de traitement d'image des informations formatees liees aux caracteristiques des appareils de capture d'image et/ou des moyens de restitution d'image |
US6873727B2 (en) * | 2001-07-23 | 2005-03-29 | Hewlett-Packard Development Company, L.P. | System for setting image characteristics using embedded camera tag information |
FR2895102B1 (fr) * | 2005-12-19 | 2012-12-07 | Dxo Labs | Procede pour traiter un objet dans une plateforme a processeur(s) et memoire(s) et plateforme utilisant le procede |
FR2895104A1 (fr) * | 2005-12-19 | 2007-06-22 | Dxo Labs Sa | Procede pour fournir des donnees a un moyen de traitement numerique |
FR2895103B1 (fr) * | 2005-12-19 | 2008-02-22 | Dxo Labs Sa | Procede et systeme de traitement de donnees numeriques |
2002
- 2002-06-05 DE DE60234207T patent/DE60234207D1/de not_active Expired - Lifetime
- 2002-06-05 CN CNB028139577A patent/CN1305010C/zh not_active Expired - Fee Related
- 2002-06-05 CN CNB028139526A patent/CN1316426C/zh not_active Expired - Fee Related
- 2002-06-05 EP EP02743349A patent/EP1410326B1/fr not_active Expired - Lifetime
- 2002-06-05 AU AU2002317219A patent/AU2002317219A1/en not_active Abandoned
- 2002-06-05 JP JP2003512931A patent/JP4452497B2/ja not_active Expired - Fee Related
- 2002-06-05 EP EP02745485.9A patent/EP1410331B1/fr not_active Expired - Lifetime
- 2002-06-05 WO PCT/FR2002/001908 patent/WO2003007243A2/fr active Application Filing
- 2002-06-05 JP JP2003512924A patent/JP4614657B2/ja not_active Expired - Fee Related
- 2002-06-05 AT AT02747506T patent/ATE310284T1/de not_active IP Right Cessation
- 2002-06-05 US US10/483,497 patent/US7724977B2/en not_active Expired - Lifetime
- 2002-06-05 WO PCT/FR2002/001910 patent/WO2003007236A2/fr active Application Filing
- 2002-06-05 JP JP2003512927A patent/JP4295612B2/ja not_active Expired - Fee Related
- 2002-06-05 US US10/483,496 patent/US7343040B2/en not_active Expired - Fee Related
- 2002-06-05 KR KR1020047000412A patent/KR100879832B1/ko not_active IP Right Cessation
- 2002-06-05 CN CNB028139534A patent/CN100361153C/zh not_active Expired - Fee Related
- 2002-06-05 US US10/483,322 patent/US7760955B2/en active Active
- 2002-06-05 AU AU2002317900A patent/AU2002317900A1/en not_active Abandoned
- 2002-06-05 EP EP02748933A patent/EP1415275B1/fr not_active Expired - Lifetime
- 2002-06-05 ES ES02748933T patent/ES2311061T3/es not_active Expired - Lifetime
- 2002-06-05 DE DE60207417T patent/DE60207417T2/de not_active Expired - Lifetime
- 2002-06-05 JP JP2003512930A patent/JP4367757B2/ja not_active Expired - Fee Related
- 2002-06-05 CA CA2453423A patent/CA2453423C/fr not_active Expired - Fee Related
- 2002-06-05 AT AT02747504T patent/ATE447216T1/de not_active IP Right Cessation
- 2002-06-05 EP EP02747506A patent/EP1410327B1/fr not_active Expired - Lifetime
- 2002-06-05 CN CNB028139569A patent/CN1316427C/zh not_active Expired - Fee Related
- 2002-06-05 WO PCT/FR2002/001911 patent/WO2003007240A1/fr active Application Filing
- 2002-06-05 US US10/483,495 patent/US7346221B2/en not_active Expired - Lifetime
- 2002-06-05 EP EP02747504A patent/EP1444651B1/fr not_active Expired - Lifetime
- 2002-06-05 AT AT02748933T patent/ATE400040T1/de not_active IP Right Cessation
- 2002-06-05 DE DE60227374T patent/DE60227374D1/de not_active Expired - Lifetime
- 2002-06-05 US US10/482,413 patent/US8675980B2/en not_active Expired - Fee Related
- 2002-06-05 EP EP02751241.7A patent/EP1442425B1/fr not_active Expired - Lifetime
- 2002-06-05 KR KR1020047000413A patent/KR100957878B1/ko not_active IP Right Cessation
- 2002-06-05 JP JP2003512928A patent/JP4020262B2/ja not_active Expired - Fee Related
- 2002-06-05 US US10/483,494 patent/US7792378B2/en active Active
- 2002-06-05 WO PCT/FR2002/001914 patent/WO2003007241A1/fr active Application Filing
- 2002-06-05 JP JP2003512929A patent/JP4295613B2/ja not_active Expired - Fee Related
- 2002-06-05 WO PCT/FR2002/001915 patent/WO2003007242A2/fr active IP Right Grant
- 2002-06-05 DE DE60239061T patent/DE60239061D1/de not_active Expired - Lifetime
- 2002-06-05 AU AU2002317902A patent/AU2002317902A1/en not_active Abandoned
- 2002-06-05 EP EP08100657.9A patent/EP2015247B1/fr not_active Expired - Lifetime
- 2002-06-05 AT AT02743349T patent/ATE497224T1/de not_active IP Right Cessation
- 2002-06-05 CN CNB028139518A patent/CN1305006C/zh not_active Expired - Fee Related
- 2002-06-05 ES ES02747506T patent/ES2253542T3/es not_active Expired - Lifetime
- 2002-06-05 KR KR1020047000417A patent/KR100940148B1/ko not_active IP Right Cessation
- 2002-06-05 CN CNB028139542A patent/CN1273931C/zh not_active Expired - Lifetime
- 2002-06-05 WO PCT/FR2002/001909 patent/WO2003007239A1/fr active IP Right Grant
- 2002-06-05 KR KR1020047000414A patent/KR100940147B1/ko active IP Right Grant
2010
- 2010-07-16 US US12/838,184 patent/US20100278415A1/en not_active Abandoned
- 2010-07-16 US US12/838,198 patent/US8559743B2/en not_active Expired - Lifetime
2012
- 2012-06-06 US US13/489,892 patent/US20120308160A1/en not_active Abandoned
2013
- 2013-09-09 US US14/021,235 patent/US9536284B2/en not_active Expired - Lifetime
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP0647921A2 (fr) * | 1993-10-08 | 1995-04-12 | Xerox Corporation | Format structure d'image pour la description d'images en trame complexes en couleur |
EP0686945A2 (fr) * | 1994-05-26 | 1995-12-13 | Canon Kabushiki Kaisha | Méthode et appareil de traitement d'images |
JPH10319929A (ja) * | 1997-05-19 | 1998-12-04 | Matsushita Electric Ind Co Ltd | 表示装置 |
WO1999027470A1 (fr) * | 1997-11-26 | 1999-06-03 | Flashpoint Technology, Inc. | Procede et systeme d'extension de formats de fichiers image disponibles dans un dispositif de capture d'image |
EP0964353A2 (fr) * | 1998-06-12 | 1999-12-15 | Canon Kabushiki Kaisha | Appareil de traitement d'images et mémoire adressable par ordinateur |
WO2001035052A1 (fr) * | 1999-11-12 | 2001-05-17 | Armstrong Brian S | Points de repere robustes pour vision artificielle et procede de detection desdits points de repere |
EP1104175A2 (fr) * | 1999-11-29 | 2001-05-30 | Xerox Corporation | Système de calibration d'un appareil de formation en couleur |
Non-Patent Citations (4)
Title |
---|
CHANG S-K ET AL: "SMART IMAGE DESIGN FOR LARGE IMAGE DATABASES", JOURNAL OF VISUAL LANGUAGES AND COMPUTING, LONDON, GB, vol. 3, no. 4, 1 December 1992 (1992-12-01), pages 323 - 342, XP000472773 * |
CLUNIE, D.: "Medical image format FAQ - Part 3. Proprietary formats", NONAME, 3 June 2001 (2001-06-03), XP002178669, Retrieved from the Internet <URL:http://www.dclunie.com/medical-image-faq/html/part3.html> [retrieved on 20010927] * |
PATENT ABSTRACTS OF JAPAN vol. 1999, no. 03, 31 March 1999 (1999-03-31) * |
WATANABE M ET AL: "AN IMAGE DATA FILE FORMAT FOR DIGITAL STILL CAMERA", FINAL PROGRAM AND ADVANCE PRINTING OF PAPERS. ANNUAL CONFERENCE. IMAGING ON THE INFORMATION SUPERHIGHWAY, XX, XX, 1995, pages 421 - 424, XP000618775 * |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2008533550A (ja) | 2005-01-19 | 2008-08-21 | ドゥ ラブズ | イメージ記録および/または再現デバイスを製造するための方法、および前記方法によって得られるデバイス |
US11379725B2 (en) | 2018-06-29 | 2022-07-05 | International Business Machines Corporation | Projectile extrapolation and sequence synthesis from video using convolution |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP1410326B1 (fr) | Procede et systeme pour modifier la qualite d'image | |
EP1412918B1 (fr) | Procede et systeme pour produire des informations formatees liees aux distorsions geometriques | |
FR2827459A1 (fr) | Procede et systeme pour fournir a des logiciels de traitement d'image des informations formatees liees aux caracteristiques des appareils de capture d'image et/ou des moyens de restitution d'image | |
FR2827460A1 (fr) | Procede et systeme pour fournir, selon un format standard, a des logiciels de traitement d'images des informations liees aux caracteristiques des appareils de capture d'image et/ou des moyens de restitution d'image | |
WO2023187170A1 (fr) | Procédé de correction d'aberrations optiques introduites par un objectif optique dans une image, appareil et système mettant en œuvre un tel procédé |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A1 Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ OM PH PL PT RO RU SD SE SG SI SK SL TJ TM TN TR TT TZ UA UG US UZ VN YU ZA ZM ZW |
AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG |
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
DFPE | Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101) | ||
WWE | Wipo information: entry into national phase |
Ref document number: 2003512928 Country of ref document: JP |
WWE | Wipo information: entry into national phase |
Ref document number: 1020047000414 Country of ref document: KR |
WWE | Wipo information: entry into national phase |
Ref document number: 20028139542 Country of ref document: CN |
WWE | Wipo information: entry into national phase |
Ref document number: 2002743349 Country of ref document: EP |
WWP | Wipo information: published in national office |
Ref document number: 2002743349 Country of ref document: EP |
REG | Reference to national code |
Ref country code: DE Ref legal event code: 8642 |
WWE | Wipo information: entry into national phase |
Ref document number: 10483494 Country of ref document: US |