CA2158988C - Method and system for image processing - Google Patents

Method and system for image processing

Info

Publication number
CA2158988C
CA2158988C CA002158988A CA2158988A
Authority
CA
Canada
Prior art keywords
image
user
original
parameters
resolution independent
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CA002158988A
Other languages
French (fr)
Other versions
CA2158988A1 (en)
Inventor
Bruno Delean
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Intellectual Ventures I LLC
Original Assignee
Live Picture Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Family has litigation
First worldwide family litigation filed: https://patents.darts-ip.com/?family=26230196&utm_source=google_patent&utm_medium=platform_link&utm_campaign=public_patent_search&patent=CA2158988(C). "Global patent litigation dataset" by Darts-ip is licensed under a Creative Commons Attribution 4.0 International License.
Priority claimed from FR9303455A (FR2703170B1)
Application filed by Live Picture Inc
Priority claimed from PCT/US1994/003266 (WO1994022101A1)
Publication of CA2158988A1
Application granted
Publication of CA2158988C
Anticipated expiration
Current status: Expired - Fee Related

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T1/00 General purpose image data processing
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00 Geometric image transformation in the plane of the image
    • G06T3/40 Scaling the whole image or part thereof
    • G06T3/4007 Interpolation-based scaling, e.g. bilinear interpolation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/00 2D [Two Dimensional] image generation
    • G06T11/60 Editing figures and text; Combining figures or text
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00 Geometric image transformation in the plane of the image
    • G06T3/02
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00 Geometric image transformation in the plane of the image
    • G06T3/40 Scaling the whole image or part thereof
    • G06T3/4092 Image resolution transcoding, e.g. client/server architecture

Abstract

An image processing system containing apparatus and a concomitant method for rapid image processing using a computer system. The system generates a special image format containing a plurality of subimages. When an image is edited, the system uses the information in the subimages to calculate an accurate depiction of the image on the display at the display resolution, given the current pan and zoom. Image alterations are stored within a computer file as an expression tree. Each node in the expression tree defines a particular editing operation that resulted in modifications to the image. To display the image upon a computer monitor display, the expression tree defining the alterations is evaluated to form a composite output image having any resolution necessary to display an accurate rendition of the edited image. Once edited, the output image can be printed on a high resolution printer by combining the expression tree with a full resolution image to produce a high resolution output image.

Description

METHOD AND SYSTEM FOR IMAGE PROCESSING
BACKGROUND OF THE INVENTION
1. Field of the Invention
This invention relates to image processing in general, and, more particularly, to an image processing method and system that uses a computer system to provide rapid image processing capabilities.
2. Description of the Background Art
The present invention was created in response to the shortcomings of the current generation of image retouching systems. Prior art retouching systems typically use one of two methods for handling images: (1) virtual image and (2) high resolution/low resolution. In addition, a text-based retouching method relying on user programming can be found in the prior art. Each of these approaches overcomes some of the major obstacles confronting image retouching systems. However, none of the prior art systems fulfills today's need for a computer system that provides rapid editing of high quality color images at an affordable cost to the system consumer.

The virtual image approach, commonly used by desktop image editing packages, e.g., Macintosh or Windows based programs, manipulates a copy of the actual image held in memory. Macintosh is a trademark of Apple Computer, Inc. of Cupertino, California and Windows is a trademark of Microsoft Corporation of Redmond, Washington. Typically, the original image is stored, unedited, in memory. In the course of editing, the virtual image method constructs one or more copies of intermediate drafts of the edited image. As such, if an error is introduced during editing, the user or operator may revert to a previous copy of the image to correct the error. Using the virtual image approach, the image itself is transformed as retouching effects are applied.
The virtual image approach suffers from two important shortcomings: first, large amounts of memory are required to store the various intermediate edited images, and second, each effect is applied immediately to the entire image, so that complex manipulations, such as large airbrushing, scaling and rotation, incur long processing delays.
Because prior art image retouching systems based on the virtual image approach did not yield acceptable performance when handling large images (over 10M or 10 million bytes), editing systems using the high resolution/low resolution method were developed. These systems operate on a smaller, i.e., low resolution image, to achieve improved response times for the operator. Using this approach, any retouching actions performed by the operator upon an image are sequentially stored in a script. When retouching is complete, the script is typically passed to a more powerful, and expensive, server and "executed".
As a result of the execution of the script, the retouching actions contained in the script are applied to a high resolution image from which the editing system originally derived the low resolution image. Consequently, the high resolution/low resolution method results in a high quality final image that contains the retouching performed upon the low resolution image.
A problem with this approach is that the operator does not retouch and manipulate the actual (high resolution) image. As a result, it is not always possible to perform highly detailed retouching actions such as silhouetting and masking. One example of a high resolution/low resolution approach executing upon a mainframe computer is disclosed in U.S. Patent No. 5,142,616, issued August 25, 1992 to Kellas et al.
An alternative approach to image processing is the expression tree method, where the images are specified by a set of operations -- either by a computer program or by a mathematical formula. The use of textual expressions to specify an equation that defines image modifications is disclosed by G. J. Holzmann in BEYOND PHOTOGRAPHY: THE DIGITAL DARKROOM 31-41 (Prentice Hall, 1988). When these textual expressions are written in executable code, a programmer creates an expression tree to facilitate evaluation of the expression on a computer system. The weakness of this method, as used in the prior art, is that the user must be very skilled not only in the creative aspects of image creation and editing, but also in programming and mathematics. Such skill is necessary both to initially generate the expression tree and to subsequently modify the tree to accomplish image modifications.
In summary, current methods of computerized image processing for editing high resolution images require too much processing power, too much memory or too much programming and mathematical skill from the operator to address the full needs of image retouchers. In addition, much of the prior art imposes unacceptable limitations on the quality of the final result.
Consequently, there is a need in the art for a computerized image processing method and apparatus that enables an operator unskilled in mathematics or programming to accomplish advanced graphic operations rapidly, and to reverse image editing decisions without, in any way, affecting the definition or precision of the final image.
SUMMARY OF THE INVENTION
This invention advantageously overcomes the disadvantages heretofore associated with prior art image processing systems. In particular, the invention is an image processing system for rapidly editing images using a desktop computer. Using a graphical user interface, the invention defines editing operations using an expression tree. With each image view change or image modification, the expression tree is either modified or incorporated into a new expression tree.
Ultimately, the expression tree is evaluated and an output image containing the image modifications is displayed or printed. By using an expression tree, the invention operates independently from the resolution of an original image that forms the input to the invention and independently from the resolution of the output image generated by the invention. Thus, the invention provides significant flexibility with respect to input and output image quality and format.
The subject invention advantageously uses what is termed a Functional Interpolating Transfer System (FITS) to greatly enhance image editing speed using a conventional desktop computer. FITS divides image processing into three steps that are implemented as three distinct computer software routines:
(1) a preprocessing routine, (2) an image editing routine and (3) a FITS raster image processing (RIP) routine. This three step process results in each user image manipulation and editing being virtually instantaneously portrayed on a screen of a computer display monitor.
Specifically, the preprocessing routine creates a specially formatted version of an image that allows image editing to rapidly progress. The special format includes a full resolution image (input image) as well as a series of subimages derived from the full resolution image. Each subimage in the series has a resolution that is less than a preceding subimage in the series, i.e., the series forms an image pyramid of specially formatted subimages. From this pyramid, the invention automatically selects a subimage, or portion thereof, for display that contains an appropriate resolution for accurately displaying the image to a user at a selected image magnification level. As such, if the magnification level is subsequently changed, a new subimage that best depicts the image on the screen is quickly recalled from memory and displayed.
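The choice of pyramid level from the current zoom factor can be sketched briefly. The sketch below is not taken from the patent itself; the function and its names are hypothetical, and it assumes the convention described above that each subimage halves the resolution of the next finer one.

    #include <algorithm>
    #include <cmath>

    // Hypothetical helper: pick the subimage (pyramid level) to display for
    // a given zoom factor. Level 0 is the full-resolution image; each higher
    // level halves the resolution in each dimension.
    int selectPyramidLevel(double zoom, int levelCount) {
        // zoom >= 1.0 means one or more screen pixels per image pixel, so
        // only the full-resolution image is accurate enough.
        if (zoom >= 1.0) return 0;
        int level = static_cast<int>(std::floor(std::log2(1.0 / zoom)));
        return std::min(level, levelCount - 1);   // clamp to coarsest level
    }

At a zoom factor of 0.25, for example, this sketch selects level 2, whose resolution is one quarter of the original in each dimension.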
The image editing routine applies any image editing effects to an image (or subimage). The present invention is either implemented as a user operated image processing software package or as a computer controlled stand-alone routine. In either instance, the image editing routine of the invention applies the image modifications (editing effects) to an image.
Typical image editing effects include combining images, sharpening, blurring, brightening, darkening, distortion, and modifications to the color or appearance of all or part of the presently displayed image. The modifications to the image are stored as portions of an expression tree within a so-called FITS
file. The expression tree contains parameters that define the editing effects to ultimately be applied to the image. The content of the FITS file or the expression tree before it is saved to a FITS file is used by the raster image processing routine to generate an output image for display on a computer monitor, for printing on a high resolution printer, or for export to a desktop publishing system.
The FITS raster image processing (RIP) routine is executed in two instances: (1) each time a new screen view is generated for display on a monitor, i.e., after every image modification or image view change, and (2) when an output page is generated to print an edited image or export the edited image to another system such as a desktop publishing system. The FITS
RIP routine combines the input image (subimage) with the modifications (FITS file layers) generated during image editing to create either a screen or print image. The output image generated by the FITS RIP routine can have any resolution; thus it is said to be resolution independent.
The FITS RIP routine takes the ensemble of image manipulations (represented by the expression tree) that are performed during the image editing process, combines that ensemble with the unedited image, and computes a single image for purposes of printing or display on a monitor. Modifications to the image, made during image editing, are characterized by an expression tree that is independent of the resolution of the input image or final output image. During execution of the FITS RIP routine, nodes within the expression tree are first combined mathematically. The result is mapped to a pixel location in the output image. As such, for each pixel in the output image, a single mathematical function is generated that describes the color, in an arbitrary color space, at that point within the output image. This function includes information concerning the original input image as well as any applicable modifications contained in the expression tree.
There is thus provided in accordance with a preferred embodiment of the present invention a method for non-destructive image compositing of an original digital image residing in an original digital file, including displaying a first user-selected region of the original image at a first user-selected resolution, modifying the first user-selected region of the original image according to input received via a user interface, in response to input from a user, so as to define a modified original digital image, expressing the modifying step as parameters of at least one spatially resolution independent transformation, recording the parameters separate from the original digital image, and outputting a second user-selected region of the modified original image at a second user-selected resolution onto an output device, by applying the at least one spatially resolution independent transformation to the original digital image so as to produce the second user-selected region of the modified original image, and rendering the second user-selected region of the modified original image on the output device.
There is further provided in accordance with a preferred embodiment of the present invention a system for non-destructive image compositing of an original digital image residing in an original digital file, including a display processor displaying a first user-selected region of the original image at a first user-selected resolution, on a display device, a user interface processor modifying the first user-selected region of the original image, in response to input from a user, so as to define a modified original digital image, an interpreter expressing modification of the original image as parameters of at least one spatially resolution independent transformation, a storage device recording the parameters separate from the original digital image, and an output processor outputting a second user-selected region of the modified original image at a second user-selected resolution, onto an output device, including an image processor applying the at least one spatially resolution independent transformation to the original digital image so as to produce the second user-selected region of the modified original image, and an image renderer rendering the second user-selected region of the modified original image on the output device.
BRIEF DESCRIPTION OF THE DRAWINGS
The teachings of the present invention can be readily understood by considering the following detailed description in conjunction with the accompanying drawings, in which:
FIG. 1 is a schematic illustration of the general form of an expression tree;
FIG. 2 is a special case of the expression tree of FIG. 1 wherein each node references either zero or one subnode;
FIG. 3 is a high level block diagram of a computer system within which the invention is implemented;
FIG. 4 is a flow chart of a FITS main routine as executed on the computer system depicted in FIG. 3;
FIG. 5 is an illustrative depiction of a tiled image pyramid created by a preprocessing routine in accordance with the invention;
FIG. 6 is a flow chart of a preprocessing routine;
FIG. 7 is a flow chart of an image editing routine;
FIG. 8 is a flow chart of a pixel update routine;
FIG. 9 is a flow chart of a FITS raster image processing (FITS RIP) routine; and
FIG. 10 is a general illustration of an output grid used by the FITS RIP routine to interpolate intermediate parameter values.
To facilitate understanding, identical reference numerals have been used, where possible, to designate identical elements that are common to the figures.
DETAILED DESCRIPTION OF THE INVENTION
The present invention is an image editing method and system for manipulating, altering, and retouching computerized, i.e., digital images, and for displaying those images upon either a computer monitor or a printer. This invention uses a functional interpolating transformation system (FITS) in which the underlying image, e.g., an input image from an image scanner, from a drawing program or some other image source, is preserved, and changes thereto are recorded as expressions within an expression tree. By processing only changes to an image shown on the current display screen, FITS computes only what is needed to modify the image and only what is needed to accurately display those modifications. Further, all modifications are resolution independent and can be used to generate output images having any level of resolution (commonly measured in dots per inch or dpi). Thus, once editing has been performed using the computer screen as a relatively low resolution display, the system can print a high resolution image using a high resolution printer.
Specifically, the image processing method and system of the present invention is for creating and editing images that are resolution independent, where the images are characterized by an expression tree. The expression tree can be evaluated by a computer subroutine at any point (x,y) within continuous, two-dimensional space, as needed, to display, print or export an image.
The expression tree is a conventional tree data structure familiar to those skilled in the art of computer programming.
Each node in the tree defines an image (possibly an intermediate result during an editing operation) which can be computed from data stored in the node itself, data stored in any subtrees referenced by the node and data in any preprocessed image files which may be referenced by the node or its subtrees. In the preferred embodiment, each node is an object (in an object-oriented programming language such as C++) which contains an object method (a subroutine associated with the object) to compute the color F(x,y) of an image point given the coordinates x and y. F(x,y) represents a color with a vector of coordinates in some color space such as RGB, HSV or CMYK.
Each node (object) in the expression tree contains the following data:
- Pointers to zero or more child nodes Ci with associated functions Fi(x,y).
- Zero or more pointers to objects representing external input images Ij(x,y).
- Zero or more pointers to objects defining position dependent terms D1(x,y).
- Zero or more position independent terms Gk.
In general, the position independent terms are parameters of the modifications which do not depend on the position of the modification within the image. On the other hand, position dependent terms are parameters of the modifications which vary as a function of the position of the modification within the image.
A typical object method for computing F(x,y) could be expressed as follows:

F(x,y) = H(F1(x,y), F2(x,y) . . . Fr(x,y),
           I1(x,y), I2(x,y) . . . Is(x,y),
           D1(x,y), D2(x,y) . . . Dt(x,y),
           G1, G2 . . . Gu)

where H is a function defining the desired editing operation at the given node in the expression tree. Alternatively, the function H could depend on values of the Fi, Ii and Di at points in a neighborhood of (x,y), i.e., the value F(x,y) is a function of parameters not located exactly at location (x,y). FIG. 1 schematically depicts a general form of an expression tree.
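Since the description above names C++ as a representative object-oriented language, the node structure can be illustrated with the following minimal sketch. The class and member names are hypothetical; only the ingredients -- child subtrees Fi, external input images Ij, position dependent terms Di, position independent terms Gk, and an object method implementing H -- come from the text.

    #include <memory>
    #include <vector>

    struct Color { float c[4]; };   // a point in an arbitrary color space

    class InputImage;               // external input image Ij(x,y)
    class PDP;                      // position dependent term Di(x,y)

    // One node of the expression tree. evaluate(x,y) plays the role of the
    // object method that computes F(x,y) from the node's own data, its
    // children and any referenced preprocessed image files.
    class ExprNode {
    public:
        virtual ~ExprNode() = default;
        // Implements H applied to the child results Fi(x,y), the images
        // Ij(x,y), the terms Di(x,y) and the constants Gk.
        virtual Color evaluate(double x, double y) const = 0;
    protected:
        std::vector<std::shared_ptr<ExprNode>> children;   // subtrees Fi
        std::vector<std::shared_ptr<InputImage>> images;   // images Ij
        std::vector<std::shared_ptr<PDP>> pdps;            // terms Di
        std::vector<float> constants;                      // terms Gk
    };

A concrete editing operation would subclass ExprNode and define evaluate() to apply its particular function H; the layered case of FIG. 2, discussed next, corresponds to nodes holding at most one child.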
FIG. 2 depicts a special case of expression tree organization that occurs when no node has more than one subnode (child). In this case, the tree describes a series of "layers."
The root node (layer r) describes the topmost layer, its child describes the layer below and so on down to the bottom layer (layer 1) which has no children. For image retouching, this is a very important special expression tree structure. Many of the kinds of modifications desired by retouching artists can be described through a set of layers, without need for the additional generality provided by an expression tree such as that shown in FIG. 1.
To facilitate rapid processing, in the preferred embodiment, the objects representing external input images Ij(x,y) compute their results by accessing images stored in a preprocessed IVUE format (described below). Generally, however, the external input images may be stored in any format and accessed through these objects.
FIG. 3 is a block diagram of a conventional computer system 100 capable of executing a FITS computer program 118 to enable a user to apply modifications to an image and display the edited image on a printer 110 and/or a display monitor 108.
Specifically, the computer system 100 contains an image input device 102, a computer 104, a user input device 106, a display monitor 108 and a printer 110. In operation, an original image is created by an image input device 102 such as a conventional image scanner. The scanned image is formatted by the scanner into an image file 120 that is stored in the memory 116 within computer 104. Alternatively, the original image could be generated by a drawing or drafting program (shown generally as image processing program 122) executed on the computer 104 or another computer and stored, as image file 120, in memory 116.
The computer 104 contains at least one central processing unit (CPU) 112, memory 116, and various well-known CPU support circuits 114. An illustrative computer is a Macintosh Quadra model 900 manufactured by Apple Computer, Inc. of Cupertino,
California. The FITS program 118 and the image editing program 122 as well as one or more images are stored in the memory 116. Images that are altered by the FITS program can be displayed, at user discretion, on a display monitor 108 and/or a printer 110.
In operation, a user typically manipulates the user command input device 106 such as a mouse, trackball, light pen, and/or keyboard, to control, via the computer 104, the image input device 102, e.g., an image scanner. The image scanner, in a conventional manner, scans a hardcopy of an image and stores a digitized representation of the hardcopy in the memory 116 as an image file 120. Subsequently, the user can instruct the CPU 112 to execute the FITS program 118 and also to recall an image file (original image) from the memory. The FITS program 118 is either a stand-alone program that is interfaced to a conventional image processing program 122 or is an embedded routine within the image processing program. For simplicity, the FITS program is described herein as a stand-alone program.
Through manipulation of the user command input device, the user sends instructions to the FITS program 118 to facilitate manipulation, retouching, alteration and otherwise editing of the original image, i.e., a graphical user interface (GUI) is used. Generally speaking, the FITS program 118 applies the various modifications to the image and displays an edited image on the display monitor. Once the desired image editing is complete, the user can have the edited image printed on the printer. Of course, the printer may be substituted with any form of device that places the image on a tangible medium, e.g., film.

As those skilled in the art will realize from the foregoing discussion, the computer system can, of course, be a part of a computer network. As such, the memory 116 could reside in a server computer connected to the network bus. Additionally, the printer and image input device could be connected to nodes of a network bus while the computer, user input device and display monitor could be a computer workstation or microcomputer connected to the network bus such that the printer, image input device, and memory are accessible to the user's computer via the bus.
It should be understood that the various routines of the FITS program may be executed on different computers and/or peripherals. As such, the various computer files generated by the routines may be stored on disk or some other transportable storage medium. Consequently, the stored files can be transported between the various computers or peripherals and utilized therein.
FIG. 4 is a high level flow chart of the computer software routine (FITS main routine 118) used to implement the FITS
process. The FITS main routine 118 contains an image preprocessing routine 204, an image editing routine 208 and a FITS raster image processing (FITS RIP) routine 212. Forming an input to the FITS main routine is a digitized image contained in an image file 202. The input image file is typically produced by a conventional image scanner or a conventional drafting or drawing program. In either instance, the input image file can be formatted in any one of the well-known image file formats such as Tagged Image File Format (TIFF), Encapsulated PostScript Format (EPSF) and the like. PostScript is a registered trademark of Adobe Systems Incorporated of Mountain View, California. Another input contains image editing information that the FITS main routine applies to the input image. The image editing information is produced by the image processing program that operates in conjunction with the FITS program. The FITS main routine produces an image file 216 as an output that contains the modifications, as defined by the image editing information, applied to the input image. The FITS main routine generates this output image file in any resolution necessary to accurately depict the image (or a portion thereof) on a printed page or a screen of a display monitor.
Generally, the image preprocessing routine 204 performs a preprocessing function that transforms the input image file format into a special file format that can be rapidly displayed on a display monitor by the FITS RIP routine and interactively edited by the image editing routine. This particular image file is known as an "IVUE" file. The specific nature of the IVUE
file format is discussed in detail below.
The IVUE file is sent, along path 206, to the image editing routine 208 and, along path 214, to the FITS RIP routine 212.
As will be explained in detail below, the IVUE file on path 206 can either be a "full IVUE" file or a "compressed IVUE" file.
Furthermore, those skilled in the art will realize that the preprocessing routine does not have to reside within the FITS
program itself. Alternatively, this routine could reside within the image input device. As such, the image input device, e.g., an image scanner, would directly generate the IVUE file representing the image generated by the device.
The image editing routine 208 applies, in an interactive manner, any modifications to an image, i.e., an image represented by an IVUE file. This routine applies modifications to the image as a layer, or image object, and stores these modifications, representing them as nodes in an expression tree saved in a FITS file.
Consequently, the FITS file contains one or more interconnected trees defining modifications to the original image. Since the modifications are stored as parameters for equations that represent only the modifications to the image, rather than as a complete copy of the modified image, the invention provides a significant reduction in memory use over the amount of memory required by the prior art, e.g., the virtual image method.
Furthermore, by storing only the information necessary to represent an alteration to the image, the image can be updated more quickly. The FITS file is sent along path 210 for further processing by the FITS RIP routine 212.
Lastly, the FITS RIP routine 212 combines all the modifications represented in an expression tree with the original image, as represented in the IVUE file, to produce a single output image 216. This image incorporates each modification performed by the image editing routine 208.
Additionally, the FITS RIP routine formats the output image in any one of the many standard image formats available, e.g., TIFF, EPSF, and the like. As such, the output image can be exported to any one of the many desktop publishing systems that are available. The FITS RIP routine evaluates the expression tree to generate pixel values for the output image.
Alternatively, the FITS RIP routine could reside within the image display device, e.g., a printer. As such, a printer, or other device for generating images in a tangible medium, could be designed to utilize the rapid display generation capabilities of the FITS RIP routine.
Importantly, this FITS program is designed for editing images that are resolution independent where the images are characterized by a series of nodes on an expression tree that can be combined to yield an output image, at any resolution, for display or print.
Furthermore, unlike many high-end and mid-range color image editing systems that oblige the operator to modify a low resolution image, the FITS program operates upon high-resolution images, i.e., the operator may at any time access any information contained in the original, full resolution image without being limited by the FITS processing approach.
Additionally, as will be discussed in detail below, by using the FITS program, the operator is not limited to working at a fixed image resolution. As such, image editing effects can be applied, and new original images inserted into an image being edited, at any level of magnification (image resolution).

To provide the reader with an in-depth understanding of the
FITS program, each individual routine, i.e., the preprocessing routine, the image editing routine and the FITS RIP routine, will be discussed separately below.
A. Preprocessing Routine 204
Initially, the input image, in TIFF or another standard format, is preprocessed to create a specially formatted new file, termed IVUE. The IVUE file is used during image editing and also during the FITS RIP. It is preprocessed in such a way that a new screen full of image data may be quickly constructed.
The screen's best resolution can be used, both for the full image and for close-up details.
The first step in creating an IVUE file from a sampled image is to construct a pyramid of smaller sampled images following techniques well known in the prior art. In the preferred embodiment of the present invention, the finest image in the pyramid is the original image and each coarser image has half as much resolution in each dimension as the next finer image. Two examples of computation methods for generating these pyramids can be found in L. Williams, "Pyramidal Parametrics", Proceedings of Siggraph '83, pp. 1-11 (1983) and the "Burt Pyramid" described in U.S. patent number 4,718,104 issued to Anderson.
After computing the pyramid, the pyramid of images is written to a storage device (memory) in sequence from coarsest to finest. Each image is divided into a series of p by q pixel rectangular tiles, and the tiles are written to the storage device in raster scan order. Within each tile, the pixels are written to the storage device in raster scan order. A simple illustration of a tiled image pyramid is shown in FIG. 5.
The preprocessing routine 204 used to create an IVUE file can be described more precisely as the sequence of steps shown in FIG. 6 and described below.

1. Reserve space at the beginning of the IVUE file for a pointer to the file header information and offset table. (step 402)
2. Input a discrete sampled image I(i,j). (step 404)
3. Construct a series of images I1(i,j), I2(i,j), I3(i,j) . . . In(i,j) where I1(i,j) = I(i,j) and where Ik(i,j) for k>1 is derived from Ik-1(i,j) by low-pass filtering and decimation by a factor of two in accordance with the prior art for computing image pyramids. (step 406)
4. For k from 1 to n, do steps 5 to 8 below such that each image is tiled into p by q pixel tiles. (steps 407, 408, 410, 411, 412, 413)
5. For r from 1, increasing by p each time while r is less than or equal to the maximum value of i in Ik(i,j), do steps 6 and 7.
6. For s from 1, increasing by q each time while s is less than or equal to the maximum value of j in Ik(i,j), do step 7. The p by q rectangular region where i goes from r to r+p-1 and where j goes from s to s+q-1 is a tile.
7. Write the pixels of image Ik that fall within this tile to the storage device in raster-scan order. (step 410)
8. Add the address of the start of each tile within the storage device to the offset table. (step 411)
9. Write the offset table to the storage device. (step 414)
10. Write the address of the offset table and any header information to the storage device in the space reserved in step 1. (step 415) For most purposes, the header information should include the size of the original image, the values of p and q, and any other information deemed useful by a programmer.
11. Return to the FITS main routine. (step 416)
It will be clear to a programmer skilled in the art that the offset table may be embodied in many different ways. Generally speaking, the offset table is a data structure on the storage device which allows a program using the IVUE file to calculate the location within the storage device of a tile in a subimage. As such, a simple embodiment is to use a fixed size table of pointers based on the number of subimages and tiles. A more complicated embodiment could use variable size data structures to facilitate modifying the IVUE file after it was written.
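Steps 4 through 9 can be read as the following C++ sketch. It is an illustration rather than the IVUE format itself: it assumes one byte per pixel, a pyramid already ordered from coarsest to finest, and hypothetical names throughout; header writing and error handling are omitted.

    #include <algorithm>
    #include <cstdint>
    #include <fstream>
    #include <vector>

    struct Level { int width, height; std::vector<uint8_t> pixels; };

    // Write each pyramid level's p-by-q tiles to the file in raster-scan
    // order, recording the offset of every tile so that a reader can later
    // seek directly to any tile of any subimage (the offset table).
    std::vector<std::streamoff> writeTiledPyramid(std::ofstream& out,
                                                  const std::vector<Level>& pyramid,
                                                  int p, int q) {
        std::vector<std::streamoff> offsetTable;
        for (const Level& lv : pyramid) {                  // step 4: each image
            for (int r = 0; r < lv.height; r += p) {       // step 5: tile rows
                for (int s = 0; s < lv.width; s += q) {    // step 6: tile columns
                    offsetTable.push_back(out.tellp());    // step 8: record start
                    int rows = std::min(p, lv.height - r);
                    int cols = std::min(q, lv.width - s);
                    for (int i = 0; i < rows; ++i)         // step 7: raster order
                        out.write(reinterpret_cast<const char*>(
                                      &lv.pixels[(r + i) * lv.width + s]), cols);
                }
            }
        }
        return offsetTable;   // step 9 writes this table to the storage device
    }
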
In the preferred embodiment, p = q, but there may be computer systems in which different values of p and q provide performance advantages. The size of the tile is chosen to provide the best possible performance for the image retouching system. Experiments on the Apple Macintosh Quadra 900 using a hard disk as the storage device and a screen retouching area of about 700 by 700 pixels indicate that the best value is a tile size between 100K and 300K bytes. Macintosh and Quadra are trademarks of Apple Computer Inc., Cupertino, CA. The best tile size on other machines or storage media may be substantially different.
Step 7 of the method for preprocessing IVUE files may be modified as follows in order to build smaller files, called IVUE/C files, which may be more efficient for network access:
The p by q rectangular region where i goes from r to r+p-1 and where j goes from s to s+q-1 is a tile. Compress the pixels of Ik that fall within this tile and write the compressed result to the storage device. Add the address of the start of the tile on the storage device to the offset table.
The compression method may be JPEG, vector quantization, or any other method for image compression. The compression may be lossy or lossless. There are two principal advantages of using this compression: (1) only the IVUE/C file is used during image editing; thus, use of a compressed file decreases the disk requirement on a given retouching station within a network, and (2) during image editing, if the image is on a network image server, use of the compression option will greatly reduce operator wait times induced by network delay while retrieving an image from the network server.
The compressed image may be used during the screen editing step, where the quality of a lossy compressed image may be perfectly acceptable. In order to obtain the highest quality image for final output, however, the user may choose to use the full image in the uncompressed IVUE format during the FITS RIP.
Thus, while lossy compression may be used during editing operations to improve processing speed and reduce memory requirements, it need not lessen the final image quality.
Programmers skilled in the art will understand that additional data may be included in an IVUE or IVUE/C file, such as the creator, the name of the compression method or anything else which the programmer finds useful to include. This may be done most conveniently at the beginning or end of the file, but could also be interspersed with the image data in a regular way if necessary for some reason. Programmers skilled in the art will also understand that the same data can be written to several files (for example, one file per image in the pyramid) without fundamentally altering the invention.
B. Image Editing Routine 208
Image editing refers to the process of retouching, creation and composition of images. Using the image editing routine, the operator successively applies effects such as blur, smooth, and transformations such as rotation and scaling. Additional images can be inserted at any time and, if desired, with transparency and masking.
Each editing action is represented in the expression tree which is recorded in a file named FITS. The FITS file can be considered as a database of commands or modifications, and is a compact representation thereof.
FITS implements a set of operation types, each associated with a particular function in the expression tree. During interactive editing, the user has available a series of actions which can modify the position dependent and the position independent parameters of an editing operation. FITS operations include: image insertion (insertion of a scanned image), painting, lighting effects, and mirror, among others. During image editing operations, an operator may specify the location and degree of a desired image modification with the invention by using a computer pointing device such as a mouse or digitizing tablet. The use of such pointing devices in image retouching systems is standard in the prior art. However, the method by which the gestures of the operator are transformed into image modifications in the FITS system is innovative.
In the prior art, pointing devices are used to "brush"
effects onto sampled image grids. The operator chooses a size and shape of brush (typically circular), and the computer program repeatedly updates all the pixels in the image being modified which lie in the brush region. For a pleasing and effective interaction between the user and the appearance of the brush strokes on the screen, the pixels must be updated at least 10 times per second. Since the number of pixels to be updated is proportional to the square of the diameter of a circular brush, the amount of computation required to update a display increases rapidly with brush diameter. Thus, using the techniques of the prior art, the computational demands of an acceptable update rate limit practical brushes to relatively small diameters. In contrast, using the invention to effectively improve the image display update rate, retouching operators can interactively make modifications using extremely large brushes without the need for overly expensive computing devices to accelerate the computation.
FIG. 7 depicts a flow chart of the image editing routine 208. Specifically, the routine starts at step 500, then queries, at step 502, whether the user wishes to recall from memory a FITS file containing a previously edited image. If the query is answered affirmatively, the appropriate FITS file is recalled at step 504. If the query is negatively answered or after the FITS file is recalled, the routine displays, at step 506, an existing image using the FITS RIP process. The image is displayed in a designated area or window on the display device. If the existing expression tree is null, e.g., no previous FITS file or unedited IVUE image was recalled, the FITS
RIP process generates a black area on the display device. Upon the black area known as a canvas, a new image can be created.
For simplicity, the remainder of this discussion assumes that a FITS file has been recalled. Consequently, the FITS RIP uses the expression tree in the FITS file to generate the required pixels (the details of this process are described below in the section "FITS RIP"). As such, at step 506, the designated image area is displayed on the display device.
Next, the user may either select a modification operation or select a new view of the presently displayed image. At decision block 508, the user may start a modification of the presently displayed image or not. If not, the routine proceeds along "NO" path 510 to decision block 516 where the user may select a new view of the image. By selecting well-known commands such as pan and zoom, the operator can specify a portion of the image to be edited and simultaneously shown on the display. These commands can be used to specify an affine transformation between the image coordinate system and the coordinate system of the designated area (window) on the display device. The use of affine transformations for this purpose is well known in the prior art and can be found in Foley, van Dam, Feiner, and Hughes, COMPUTER GRAPHICS: PRINCIPLES AND PRACTICE, Addison Wesley (1990).
If, at step 516, a new view is selected, the routine proceeds to step 518. At step 518, the PDPs are created for this layer based on data in the screen arrays. However, since an edit operation has not been selected yet, the screen arrays have not been allocated. As such, at this time, nothing occurs at this step. At step 520, the FITS RIP routine (described below) is used to generate the new image view on the screen.
Thereafter, the routine returns to step 508 to query whether a modification has been selected and begun. Using the foregoing process steps, the user may move about within an image as well as zoom in and out of an image until a desired view is achieved.
Thereafter, the query at step 516 will be answered negatively and the routine proceeds along the "NO" path to step 522.
At step 522, the user can select an editing operation (or new editing operation), typically from a menu of available operations. When the operation is selected, the PDPs are modified, at step 523, based on the present screen array values, and a new node (object) in the expression tree is created, at step 524, which references zero or more subnodes.
In the most common case, the new operation is regarded as a layer on top of the previous image, so the new expression tree has the expression tree representing the previous image as its only child and, therefore, references only that image. If a previous editing operation was being used, i.e., screen arrays are presently allocated, then the values of the screen arrays must be mapped into the expression tree before new screen arrays are allocated. The mapping is accomplished in step 523. The specific nature of this mapping is described below.
Each editing operation depends on a set of zero or more position dependent parameters (PDP's) which characterize the editing effect. For example, the "Painting One Color" is characterized by a single PDP which specifies the opacity of the single color at each point. The single color itself is specified by three position independent parameters when an RGB
color space is used. The "Multicolor Painting" operation is characterized by four PDP's: three to specify the color at each point and one that specifies the opacity.
More specifically, after the user selects an operation and a view (through pan and zoom, for instance) and attempts to perform an image editing operation, the query at step 508 is answered in the affirmative. Consequently, at step 512, two-dimensional screen arrays are allocated and filled with values from existing PDPs for the selected editing operation or default PDPs. One screen array is allocated for each PDP
associated with the presently selected editing operation. The arrays are called screen arrays because they represent the position dependent parameters at a resolution sufficient for the current view on the screen. Typically, the mesh resolution of the present screen arrays is the same as the pixel resolution for the displayed image. However, in some instances the screen arrays have a coarser mesh than the displayed image. In such instances, the ultimately displayed pixel values are calculated from PDP values interpolated from the screen arrays. This interpolation process is a function of the FITS RIP routine and is described further below. At step 514, the contents of these arrays are modified during the image editing operations. Using the modified contents of the screen array, the FITS RIP routine performs what is known as a "screen RIP" to rapidly redisplay the image. In a screen RIP, the FITS RIP routine only calculates new values for the pixels that are altered by the editing operation. As such, the FITS RIP routine, as described below, is only applied to those pixels altered by the editing operation, i.e., those pixels with altered PDP values in the screen array. Thus, as edits are performed on the image, the modifications quickly appear in the displayed image.
When the user chooses a new view (step 516), the contents of the screen array are used in step 518 to modify the resolution independent representation of the PDP's, i.e., screen array values are mapped into PDP values within an expression tree. Importantly, only the difference between the present screen array values and the previous PDP data structures (before any editing with this editing operation) is stored. As such, if a screen array was not altered by the editing operation, then the corresponding PDP is not altered. Thereafter, new screen arrays may be created (step 512) to store further modifications, and so on, until the editing session is terminated by the user.
Periodically during the editing session, the user can save the results of the previous edits to a FITS file via steps 525 and 526. Also, via steps 528 and 530, the image can be processed by the FITS RIP routine to generate a high resolution output for printing or for storing in memory. Finally, the user, at steps 532 and 534, may quit the editing session and return to the main routine.
To rapidly depict the modifications on the screen, the screen arrays may have the same number of elements as the RIP area. Alternatively, for PDP's which are known to be smooth, such as those created with Gaussian brushes, the screen arrays may be stored at lower resolution and interpolated as needed to produce a display. This second method reduces storage requirements and increases processing speed.
Further, in order to provide rapid updating when the user paints in an effect with a brush, the FITS method makes use of a precomputed table for each brush size and for each possible brush displacement within a limited range of motion. The idea is as follows: As the brush is moved, instead of recalculating every pixel in the display under the brush, the FITS method only recalculates the pixels touched by the brush in its new position which were not touched in its previous position.
More precisely, let D be a set of allowed displacement vectors in pixels describing the translation of the brush between updates. In the preferred embodiment, D is a set of vectors (i,j) whose entries are integers. In other embodiments, one could instead use fractional or floating point vectors for additional flexibility at the cost of more computation. Let A(0,0) be the set of pixels affected by a brush centered on a particular pixel P1. Let A(i,j) be the set of pixels affected by the same brush centered on another image point P2 where the displacement from P1 to P2 in pixels is the vector (i,j). In this aspect of the invention, the invention precomputes a brush data structure for each allowed (i,j) which yields a list of all pixels in A(i,j) but not A(0,0), i.e., the list of pixels (di,dj) are the pixels newly affected by the brush after the brush is moved along vector (i,j). The brush data structure defines the shape of the brush and, as such, the data structure defines the pixels that are contained in the list of affected pixels. In general, the brush can have any shape; however, typically, the brush has a circular shape. If the brush has a symmetrical shape, a smaller data structure can be used and accessed in a way that takes advantage of the symmetry. For example, if the brush is circular, the lists for (i,j), (-i,j), (i,-j), (-i,-j), (j,i), (-j,i), (j,-i), (-j,-i) can be calculated from one another very quickly, so only one need be stored explicitly. Using the data structure, the invention only recalculates values for the new pixels affected by the brush.
Specifically, if the brush has a circular shape, the new pixels affected when the brush is moved generally lie in a crescent shaped area.
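A sketch of this precomputation for a circular brush follows. It is an illustration of the idea, assuming a simple disc footprint and hypothetical names; the offsets (di,dj) are expressed relative to the old brush center P1, matching the way the list is used in step 5 of the brush routine below.

    #include <vector>

    struct Offset { int di, dj; };

    // For a move of the brush by (i,j), list the pixels the brush covers at
    // its new position that it did not cover at the old one -- the crescent.
    // Offsets are relative to the old center P1, so the affected screen
    // pixel is (di + r1, dj + s1), as in step 5 below.
    std::vector<Offset> newlyAffected(int radius, int i, int j) {
        std::vector<Offset> crescent;
        const int r2 = radius * radius;
        for (int dj = j - radius; dj <= j + radius; ++dj) {
            for (int di = i - radius; di <= i + radius; ++di) {
                bool inNew = (di - i) * (di - i) + (dj - j) * (dj - j) <= r2;
                bool inOld = di * di + dj * dj <= r2;   // old footprint A(0,0)
                if (inNew && !inOld)
                    crescent.push_back({di, dj});       // needs a screen RIP
            }
        }
        return crescent;
    }

Such lists would be built once per brush size for every allowed displacement in D and cached; for symmetric brushes only one representative of each symmetry class needs to be stored, as noted above.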
More specifically, as shown in FIG. 8, every time the pointing device is moved while painting with a brush in an image editing operation, the invention executes the brush routine 800 which performs the following steps:
1. Access a brush data structure (step 804) and access the new position P2 = (r2, s2) and subtract it from the old position P1 = (r1, s1), yielding the vector (i,j) (step 806).
2. If the vector (i,j) is too large to be found in the brush data structure, approximate it as the sum of vectors (i1,j1), (i2,j2) . . . (in,jn), all found in the pixel update data structure, such that i = i1 + i2 + . . . + in and j = j1 + j2 + . . . + jn. Otherwise, if (i,j) itself is found in the brush data structure, let n = 1, i1 = i, j1 = j.
3. For k from 1 to n, do steps 4 through 7. (step 812)
4. For each element (di,dj) on the list of affected pixels accessed from the brush data structure for offset (ik,jk), do steps 5 through 7.
5. Update the display at pixel location (di + r1, dj + s1), i.e., perform a screen RIP to update the value of the affected pixel. (step 808)
6. Update the screen arrays with this modification. (step 810)
7. Set the new value of r1 to be r1 + ik and the new value of s1 to be s1 + jk.
8. Return to the image editing routine. (step 814)
We turn now to the specific details of the PDPs and their handling within the image editing routine. In general, each PDP within a given layer (or node) of the expression tree is defined by a collection of so-called sub-PDPs. Each sub-PDP is derived from a particular screen array that is generated whenever a new image view is modified or a new editing operation is accomplished on the present image view. As such, for a given PDP D(x,y), a number of sub-PDPs taken together form that PDP.
Mathematically this PDP structure is represented as follows:

D(x,y) = D1(T1(x,y)) + D2(T2(x,y)) + D3(T3(x,y)) + . . . + Dn(Tn(x,y))   (1)
where:
T1, T2, T3 . . . Tn represent affine transformations; and D1, D2, D3 . . . Dn represent sub-PDP values.
The affine transformations transform a PDP from location (x, y) to location (x',y') using the following equation.
(x',y') = Ti(x,y) = Mi (x,y,1)^t   (2)
where:
Mi is a matrix having three columns and two rows; and superscript t denotes a matrix transpose.
The presence of the affine transformation in the PDPs gives the editing routine the ability to do rapid translations and affine transformations of image editing operations. For example, if the operator uses the painting tools to create a PDP
representing a series of brush strokes, all the brush strokes can be transformed together by modifying the entries in the matrices Mi.
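Equations 1 and 2 translate directly into code. The following C++ sketch uses hypothetical names and leaves the mesh lookup abstract (it is sketched after the next paragraph); the rectangle test mirrors the region optimization described further below.

    #include <functional>
    #include <vector>

    struct Affine {            // the matrix Mi of equation 2: two rows, three columns
        double m[2][3];
        void apply(double x, double y, double& xp, double& yp) const {
            xp = m[0][0] * x + m[0][1] * y + m[0][2];   // (x',y') = Mi (x,y,1)^t
            yp = m[1][0] * x + m[1][1] * y + m[1][2];
        }
    };

    struct SubPDP {
        Affine T;                                     // Ti of equation 1
        double x0, y0, x1, y1;                        // rectangle outside which Di = 0
        std::function<double(double, double)> sample; // mesh lookup (sketched below)
        double eval(double x, double y) const {
            if (x < x0 || x > x1 || y < y0 || y > y1) return 0.0;
            double xp, yp;
            T.apply(x, y, xp, yp);
            return sample(xp, yp);                    // Di(Ti(x,y))
        }
    };

    // Equation 1: the PDP is the sum of its sub-PDPs.
    double evalPDP(const std::vector<SubPDP>& subs, double x, double y) {
        double d = 0.0;
        for (const SubPDP& s : subs) d += s.eval(x, y);
        return d;
    }

Translating or transforming all the brush strokes of a layer then amounts to editing the matrices Mi, exactly as described above.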
Furthermore, the functions Di are stored as a table of values having a uniform mesh grid, e.g., forming a screen array.
To determine a sub-PDP value that falls between the mesh points, a bilinear interpolation is used. Alternatively, to generate the sub-PDP more accurately, a pyramid representation can be used to represent the mesh grid and the sub-PDP values can be found using trilinear interpolation.
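A bilinear mesh lookup of the kind just described might look as follows; the grid layout and names are assumptions, and gx, gy are continuous coordinates expressed in grid units.

    #include <algorithm>
    #include <vector>

    // Bilinearly interpolate a sub-PDP value from a table of values stored
    // on a uniform w-by-h mesh grid.
    double bilinear(const std::vector<double>& mesh, int w, int h,
                    double gx, double gy) {
        int i = std::clamp(static_cast<int>(gx), 0, w - 2);
        int j = std::clamp(static_cast<int>(gy), 0, h - 2);
        double fx = std::clamp(gx - i, 0.0, 1.0);
        double fy = std::clamp(gy - j, 0.0, 1.0);
        double v00 = mesh[j * w + i],       v10 = mesh[j * w + i + 1];
        double v01 = mesh[(j + 1) * w + i], v11 = mesh[(j + 1) * w + i + 1];
        return (1 - fy) * ((1 - fx) * v00 + fx * v10)
             + fy       * ((1 - fx) * v01 + fx * v11);
    }

The trilinear variant mentioned above would perform this lookup on two adjacent levels of a mesh pyramid and blend the two results.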

In an object-oriented implementation of the invention, D(x,y) is computed by an object which contains a list of sub-PDPs corresponding to the editing operations D1, D2, . . .
Dn. The object method to compute D(x,y) calls the object methods for D1, D2, . . . Dn, in turn, summing the results. The objects for Di may store rectangular regions Ri which specify regions outside of which Di is zero. With these regions, the object method for Di first checks to see whether or not (x,y) is inside the rectangular region. If (x,y) is outside, the object returns zero. If (x,y) is inside the region, the object method computes the transformed point (x',y') by using equation 2 and then applies the internal function Di to the transformed point.
By way of example, after a specific editing operation is complete within a certain image region and the screen arrays for that edit operation are computed, the values in a given screen array are subtracted from the present PDP value D(x,y) at each point in the array. This produces what is known as a differential update array which is to be applied to the PDP
D(x,y). To generate a new sub-PDP Dn+1, the differential update array is stored as an object of the sub-PDP along with a transformation matrix representing the current view, e.g., pan and zoom. From this differential update array, a pyramid of arrays is produced in the same manner that an image pyramid is produced, e.g., by repetitively low-pass filtering the array values and decimating the array by a factor of two. Consequently, if this sub-PDP
must be used in a subsequent calculation, a trilinear interpolation can be performed to generate a sub-PDP value at any (x, y) location.
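The differential update and the pyramid built from it can be sketched as follows. The names are hypothetical, and a crude 2x2 box filter stands in for the low-pass filter the text leaves unspecified.

    #include <cstddef>
    #include <vector>

    // The new sub-PDP stores only the difference between the edited screen
    // array and the PDP values that existed before the edit, so untouched
    // regions contribute exactly zero.
    std::vector<double> differentialUpdate(const std::vector<double>& screenArray,
                                           const std::vector<double>& oldPDP) {
        std::vector<double> diff(screenArray.size());
        for (std::size_t k = 0; k < screenArray.size(); ++k)
            diff[k] = screenArray[k] - oldPDP[k];
        return diff;
    }

    // One level of the array pyramid: low-pass filter and decimate by two.
    std::vector<double> downsample(const std::vector<double>& a, int w, int h) {
        std::vector<double> out((w / 2) * (h / 2));
        for (int j = 0; j + 1 < h; j += 2)
            for (int i = 0; i + 1 < w; i += 2)
                out[(j / 2) * (w / 2) + i / 2] =
                    0.25 * (a[j * w + i] + a[j * w + i + 1]
                          + a[(j + 1) * w + i] + a[(j + 1) * w + i + 1]);
        return out;
    }
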
C. FITS RASTER IMAGE PROCESSING (FITS RIP) 212
The invention provides a computerized procedure for creating a raster image. This procedure is used to create a new view of the image on a computer monitor, to create a sampled image for export, or to create a high resolution output image for printing. The objective is to compute the color value at a particular pixel location resulting from the application of all the operations in the expression tree. The color value is in an arbitrary color space. Commonly, this is in either the colorspace named RGB, defined by the three primaries red, green, blue, or in CMYK, defined by the three colors cyan, magenta, yellow and an additional value for black. Note that throughout the following discussion, pixel coordinates are defined as integers (i,j), while the image space is defined by continuous values (x,y). In general, the FITS RIP routine is utilized to determine a pixel color value at an arbitrary pixel location (i,j) defined in terms of a color value at a location (x,y) within the image space.
The simplest way to RIP (known as full RIP) is to compute directly, for each output pixel, the functional composition described by the expression tree. This procedure is shown in FIG. 9 and can be described as follows:
1. For each pixel at position (i,j) in the output image, use a conventional inverse transform to find the corresponding point (x,y) in the edited image and do steps 2 through 5. (step 902)
2. Compute the continuous parameters (x,y) which correspond to the desired output pixels (i,j). (step 904)
3. Call the object method for computing the pixel color value F(x,y) of the root node of the expression tree, providing it with the parameters (x,y) computed in step 2. (step 906)
4. Store the pixel value F(x,y) computed in step 3 in a buffer. (step 908)
5. If the buffer is full, display the buffered pixels on the display or write them to a storage device. (step 910)
6. Return. (step 912)
For a screen RIP, the PDP values are still contained in a screen array;
therefore, whenever a screen RIP is conducted, i.e., to display the changes to an image during the editing process, during step 904, the FITS RIP routine first must convert t he screen array value at the pixel being altered into a WO 94/22101 PCTlUS94/03266 _?7_ sub-PDP value as described above. This sub-PDP value is combined with the appropriate other sub-PDP values to produce the appropriate pixel value. The method by which the expression tree is evaluated to produce that pixel value is described below.
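The loop below is a minimal Python rendering of steps 902 through 912; inverse_transform, eval_root and flush are hypothetical callables standing in for the inverse transform, the root-node object method, and the display/storage step:

    import numpy as np

    def full_rip(width, height, inverse_transform, eval_root, flush):
        """Full RIP: for each output pixel (i, j), inverse-transform to
        the continuous point (x, y) and evaluate the expression tree's
        root node there (steps 902-912)."""
        buffer = np.zeros((height, width, 3))
        for j in range(height):
            for i in range(width):
                x, y = inverse_transform(i, j)   # steps 902-904
                buffer[j, i] = eval_root(x, y)   # step 906: F(x, y)
        # steps 908-910: here the whole frame is one buffer; a real
        # implementation flushes smaller buffers as they fill
        flush(buffer)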
An alternative RIP procedure uses interpolation to achieve greater speed. A subset of points on the output grid are chosen for exact processing as described below, and the remainder are interpolated. The subset that is evaluated exactly may conveniently be one out of every four-by-four block of output pixels. At these points, the general expression F(x,y) for the color value of that point is computed. In practice, a simplified form of the general expression is generally used that can describe most image editing actions. This form is termed an "elementary operation" and it has the advantage of being relatively simple to compute.
If the expression tree is such that each node has no more than one subtree as a child, then the individual nodes in the expression tree can be described as layers (see FIG. 2 and its associated discussion). The root node is referred to as the topmost layer, its child as the next layer below, etc. The bottom layer is the node which has no other nodes as children. When we have this type of expression tree, we will let Li denote the ith layer and number the layers starting from the bottom to the top (root) node.
The elementary operations are broken down in turn into three terms which are summed to create the new result (layer i), based on the result of the previous elementary operation (layer i-1). The three terms are:
- first, the color of the previous layer (i-1) at point (x,y), with a weighting αi(x,y) ranging from -1 to 1;
- second, the color of an external image (Ii) at the geometrically transformed point Pi(x,y), multiplied by a scalar βi(x,y) with values from -1 to 1;
- third, an additional color term γi(x,y) applied to the point (x,y) of the layer (i). This term may take into account painting or other chromatic effects.
The values of α, β and γ depend on the particular retouching operation desired. The values of αi and βi control the degree to which their associated functions are incorporated into the present function. For example, if an image is brought in to cover all layers below it, then βi=1, αi=0 and γi=0.
Consequently, each elementary operation in layer (i) is defined by an equation that takes into account the previous layer or editing operation (i-1). This equation is as follows:

Fi(x,y) = αi(x,y)·Fi-1(x,y) + βi(x,y)·Ii(Pi(x,y)) + γi(x,y)    (3)

where:
αi(x,y) is a scalar function of (x,y) corresponding to the presence at this position of the image resulting from the previous elementary operation Fi-1(x,y);
Fi-1(x,y) is a function representing the image defined by the previous layer Li-1;
βi(x,y) is a scalar function corresponding to the presence at (x,y) of the color of the imported image Ii;
Ii(x,y) represents an imported image represented in IVUE format;
Pi(x,y) represents geometric transforms, including rotation, scaling, and distortion of the imported image Ii; and
γi(x,y) is an additional position dependent term that can affect the color value of pixel (x,y).
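As a sketch, equation (3) can be evaluated at a single point along the following lines, assuming each parameter function is supplied by the layer as a callable:

    def elementary_op(alpha, beta, gamma, P, image, F_prev, x, y):
        """Evaluate equation (3) at one point:
        Fi(x,y) = alpha(x,y)*F_prev(x,y) + beta(x,y)*image(P(x,y)) + gamma(x,y).
        Every argument except (x, y) is a callable supplied by the layer."""
        result = alpha(x, y) * F_prev(x, y) + gamma(x, y)
        b = beta(x, y)
        if b != 0.0:                 # sample the import only where present
            xp, yp = P(x, y)         # geometric transform of the import
            result = result + b * image(xp, yp)
        return result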
Due to the "nested" form of the elementary operations, i.e., that each elementary equation Fi(x,y) includes another elementary equation Fi-1(x,y) until the bottom layer, these equations can be combined to yield a global function that represents the entire expression tree. The global function, defined below, defines the color value at any point (x,y) within an image composed of a number of layers Li:
F(x,y) = γ'(x,y) + Σj=1..q α'j(x,y)·Ik(j)(Pk(j)(x,y))    (4)

where γ' and α'j play a role similar to their role in the elementary operations, but are computed from βi, αi and γi of the elementary operations as discussed below. q is the number of imported scanned images which contribute to the final result at (x,y).
The combination of the layers into the above form is accomplished at any desired point (x,y) with the following steps:
1. Set γ' = 0 and q = 0.
2. For each layer i from 1 to the number of layers, do steps 3 through 8.
3. Set γ' = αi·γ' + γi.
4. For j from 1 to q, set α'j = αi·α'j.
5. If βi is not zero, then execute steps 6 through 8.
6. Set q = q+1.
7. Set α'q = βi.
8. Set k(q) = i.

In this procedure, the global function can be generated, and computed for a subset of the points (Z) in the output grid (depicted in Figure 10). Since the grid represents a subset of the pixels required for the RIP, it is necessary to generate the remaining points (O) within each grid. If the global function is calculated at every fourth pixel horizontally and vertically, the invention interpolates the 15 remaining pixels in every 4 by 4 block by interpolating from the four surrounding pixels where the global function has been calculated. Let these four surrounding pixels be known as the interpolation knots.
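A Python sketch of steps 1 through 8, assuming each layer supplies its (alpha, beta, gamma) values already evaluated at the point of interest:

    def combine_layers(layers):
        """Collapse the nested elementary operations into the global form
        of equation (4), following steps 1 through 8. 'layers' is a list
        of (alpha, beta, gamma) tuples, bottom layer first."""
        gamma_acc = 0.0                 # step 1: gamma' = 0 (and q = 0)
        coeffs, image_ids = [], []      # alpha'_j and k(j); q = len(coeffs)
        for i, (alpha, beta, gamma) in enumerate(layers, start=1):  # step 2
            gamma_acc = alpha * gamma_acc + gamma                   # step 3
            coeffs = [alpha * c for c in coeffs]                    # step 4
            if beta != 0.0:             # step 5: layer imports an image
                coeffs.append(beta)     # steps 6-7: q = q+1, alpha'_q = beta_i
                image_ids.append(i)     # step 8: k(q) = i
        return gamma_acc, coeffs, image_ids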

Note that the invention is not interpolating from four pixel values to produce interpolated pixel values. Such an interpolation would not produce a high quality image. In contrast, the invention uses the known parameters for the global equation at the four circled points (Z) to interpolate the parameters at the points (O). These interpolated parameters are then used in a global equation at each location to determine its corresponding pixel value. Consequently, the invention produces a far better quality interpolated image than would be produced by interpolating the pixel values.
More specifically, for maximum speed while maintaining image quality, the parameters γ'(x,y), α'j(x,y) and Pk(j)(x,y) can be interpolated bilinearly at the intermediate points from the values calculated at four surrounding grid points. The value of the image term Ik(j)(Pk(j)(x,y)) can be evaluated with the following steps:
1. Evaluate (x',y') = Pk(j)(x,y) by bilinear interpolation.
2. Identify the proper level V of the IVUE pyramid from the values of (x',y') at the four interpolation knots using standard texture mapping techniques (Foley & Van Dam).
3. The output color is the trilinear interpolation of the IVUE pyramid at (x',y') and level V.
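The two ingredients might be sketched as follows; the knot ordering and the level formula (a standard texture-mapping approximation) are choices of this sketch rather than details quoted from the patent:

    import numpy as np

    def interp_param(knots, fx, fy):
        """Bilinearly interpolate one parameter (gamma', an alpha'_j, or
        a component of Pk(j)) from its values at the four knots, ordered
        top-left, top-right, bottom-left, bottom-right; fx, fy in [0, 1]."""
        v00, v10, v01, v11 = knots
        top = v00 * (1 - fx) + v10 * fx
        bot = v01 * (1 - fx) + v11 * fx
        return top * (1 - fy) + bot * fy

    def pyramid_level(knot_xy, block=4):
        """Choose the IVUE pyramid level V from how far the transformed
        knot coordinates (x', y') step per output pixel."""
        (x00, y00), (x10, y10), (x01, y01), _ = knot_xy
        du = np.hypot(x10 - x00, y10 - y00) / block   # step along one axis
        dv = np.hypot(x01 - x00, y01 - y00) / block   # step along the other
        return max(0.0, float(np.log2(max(du, dv, 1e-9))))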
The subject method is particularly efficient for image processing for two reasons: the global function has a relatively simple form and thus can be easily computed, and very little computation is required to generate the interpolated functions. Use of functional interpolation provides a major time saving. For example, when 4 x 4 grids of 16 pixels are used, the global function is generated only for 1/16 of the total pixels.
It is because of this that high speed image processing can be achieved with relatively inexpensive hardware.

The changes to the image caused by the operator actions are carried out and displayed interactively. The operator may, at any moment, return and redo an elementary operation. This is because different actions and their results (i.e., the layers or expression tree) are defined by simple elementary equations. These can be easily modified.
In this way, the invention allows for many image effects, such as airbrushing, blurring, contrasting, dissolving effects, and color modifications. The invention also enables geometrical transformations or modifications, such as rotation, changes of scale, etc. Using FITS, a computer system can follow the actions of the operator, using input means such as a mouse or digital stylus, at interactive speed. This input (e.g., pen) provides two types of command signals: one is a position signal giving the coordinates (x,y) of the point concerned and, if necessary, its environment (for example the path of an airbrush stroke); the other uses the pressure of the pen on the tablet to create a second type of signal. In the airbrush example, it would govern the density of the color being "sprayed".
The number of points at which the global function need be generated during image editing within a layer is relatively small when the function varies slowly. Conversely, when the function varies rapidly, a greater number of evaluations of the global function is required to achieve the same error.
Even if the final image is unsatisfactory, e.g., the control run has been carried out and a proof image printed, it is still possible to go back and correct any intermediate stage to yield a better result.
The present invention has been described as a method and system to create and edit images operated by a human being. It should be understood, however, that the invention includes the possibility that the method and system could be operated under control of another computer program.

Furthermore, the preferred embodiment of the invention has been described as being implemented using object-oriented programming; however, those skilled in the art will realize that the programs can also be embodied in a more traditional programming style.
D. EXAMPLES OF ELEMENTARY FUNCTION USAGE
Below are discussed a variety of image editing effects that are expressed as elementary functions in the form required for rapid simplification and interpolation, as generally described above.
1) Airbrushing:
The airbrushing effect involves making a line or curve with a color. As this line imitates that made by an airbrush, it can be treated as a colored area created by the airbrush spray. The distribution of the color density in an airbrush dot is approximately a Gaussian function. This means that the intensity of the color is at its greatest in the center of the dot, diminishing towards the edges as a Gaussian function. In a real airbrush, the intensity depends on the pressure exerted on the trigger, which widens or otherwise changes the ink spray within the air jet. Such a pressure can be simulated in a computerized system by representing (as explained above) a dot by a circle of color with a density variation between the center and edge expressed as a Gaussian function. The saturation at the center can vary between 0 and 1 (or zero and 100%).
Based on the elementary equation (3) and the airbrush characteristics, this equation becomes the following:
Fi(x,y) = αi(x,y)·Fi-1(x,y) + γi(x,y)    (5)

The airbrush effect applies a new color on top of the existing composition, so it does not introduce a new imported image. Thus the coefficient of presence βi of an external image is nil at all points of the layer. The application of the airbrush consists in replacing, partially or totally, the previous shade of a dot by the shade of the color "projected" by the airspray. Because of this, the function γi(x,y) is expressed as a function of the color C and as a complement to the coefficient of presence of the previous image αi, that is:

γi(x,y) = [1 - αi(x,y)]·C    (6)

The choice of the scalar αi(x,y) at each dot translates the density of color left by the airbrush.
The function of color presence αi(x,y) can be represented by a Gaussian function centered on one dot, limited for example to 10% at the edge of the disk. In other words, the two extreme ends of the Gaussian curve beyond 10% (or any other value which may be selected) are suppressed. This means that the Gaussian function will not be applied beyond the disk radius chosen.
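One plausible rendering of such a truncated-Gaussian airbrush dot in Python; the mapping of pen pressure to peak density is an assumption of this sketch:

    import numpy as np

    def airbrush_dot(x, y, cx, cy, radius, pressure, color, cutoff=0.1):
        """One airbrush dot: color density falls off as a Gaussian,
        truncated at 'cutoff' (e.g. 10%) so it is exactly zero outside
        the disk. Returns (alpha_i, gamma_i) for equation (5), with
        gamma_i = (1 - alpha_i) * color as in equation (6)."""
        r2 = ((x - cx) ** 2 + (y - cy) ** 2) / (radius ** 2)
        density = pressure * np.exp(-np.log(1.0 / cutoff) * r2)
        if density < pressure * cutoff:   # suppress the Gaussian tails
            density = 0.0
        alpha = 1.0 - density             # presence of the previous layer
        gamma = density * np.asarray(color, dtype=float)
        return alpha, gamma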
2) Image insertion:
This operation imports an external image into an existing one. Based on the elementary equation, this importation operation is defined as follows:
Fi(x,y) = αi(x,y)·Fi-1(x,y) + βi(x,y)·Ii(Pi(x,y))    (7)

The function γi is zero and the coefficients αi and βi are complementary coefficients, i.e., βi(x,y) = [1 - αi(x,y)]. The function Pi(x,y) for this operation is the two-parameter identity function, i.e., Ii(Pi(x,y)) = Ii(x,y). If βi is one, the imported image completely obscures the composition behind it. If βi is less than one, the result will be a blend between the imported image and the composition behind it.
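In code, the insertion reduces to a plain blend; this sketch assumes beta_i and the imported image are supplied as callables:

    def insert_image(beta, image, F_prev, x, y):
        """Image insertion: gamma = 0, alpha = 1 - beta, Pi = identity, so
        Fi(x,y) = (1 - beta)*F_prev(x,y) + beta*image(x,y); beta = 1
        covers the composition completely, beta < 1 blends."""
        b = beta(x, y)
        return (1.0 - b) * F_prev(x, y) + b * image(x, y)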

3) Lightening/darkening:

To lighten or darken an image, we can use the function γi(x,y). With αi = 1 and βi = 0, the general equation becomes:

Fi(x,y) = Fi-1(x,y) + γi(x,y)    (8)

If γi(x,y) is positive, the net effect will be lightening. If γi(x,y) is negative, the effect will be darkening. The color γi(x,y) should have the same hue as Fi-1(x,y) if no hue shift is desired in this effect.
4) Deformation/anamorphosis:
This operation can be applied to an inserted (imported) image. The deformation/anamorphosis of an image consists of linking to each node a vector of deformation with a direction and size corresponding to the desired deformation. To achieve such a deformation, the general function of the layer i becomes as follows through the use of the equation defining image import:
Fi(x,y) = αi(x,y)·Fi-1(x,y) + βi(x,y)·Ii(Pi(x,y))    (9)

The deformation or anamorphosis consists in working on the import function Pi(x,y).
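One illustrative way to build such a Pi is to interpolate the node deformation vectors; the inverse-distance weighting used here is a choice of this sketch, not a scheme specified by the patent:

    import numpy as np

    def deformed_P(mesh_points, displacement_vectors):
        """Build an import function Pi(x, y) that adds an interpolated
        displacement to each point; mesh_points and displacement_vectors
        are (N, 2) arrays defining the deformation at the nodes."""
        def P(x, y):
            # inverse-distance weighting of the node vectors
            d = np.hypot(mesh_points[:, 0] - x, mesh_points[:, 1] - y) + 1e-6
            w = 1.0 / d ** 2
            dx, dy = (w[:, None] * displacement_vectors).sum(0) / w.sum()
            return x + dx, y + dy
        return P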
5) Leveling:
Leveling a color in part of an image, for example in a portrait, enables the operator to remove local skin defects, such as birthmarks. To achieve this, the average intensity of the color is calculated in a disk centered on the point of color evaluation. Depending on the radius selected, the color will be made more or less uniform. This operation combines the normal image with another which has been averaged out. For leveling operations, βi(x,y) = 0 because there is no new imported image.
Let S be the average color of the previous composition in the region surrounding the point (x,y). Then the operation of leveling can be expressed by the elementary operation:

Fi(x,y) = αi(x,y)·Fi-1(x,y) + γi(x,y)    (10)

where γi(x,y) = [1 - αi(x,y)]·S(x,y) and αi(x,y) is between 0 and 1.
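A sketch of leveling at one point; estimating the local average S by random sampling over the disk is an implementation choice of this sketch:

    import numpy as np

    def leveling(F_prev, x, y, alpha, radius, samples=16):
        """Leveling: blend the composition with its local average S, per
        Fi = alpha*F_prev + (1 - alpha)*S (equation 10 with beta = 0).
        S is estimated by sampling F_prev over a disk of the given radius."""
        rng = np.random.default_rng(0)
        angles = rng.uniform(0, 2 * np.pi, samples)
        radii = radius * np.sqrt(rng.uniform(0, 1, samples))
        S = np.mean([F_prev(x + r * np.cos(t), y + r * np.sin(t))
                     for r, t in zip(radii, angles)], axis=0)
        return alpha * F_prev(x, y) + (1.0 - alpha) * S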
6) Contrasting:
Opposite to the previous type of processing, contrasting involves accentuating the fineness of the lines in a drawing or photograph. In a portrait, for example, it would bring out individual hairs of a hairstyle.
To achieve this, it is necessary to increase the high-frequency components of the image relative to the low-frequency ones. This can be achieved by using the same elementary operation as for leveling, but with αi(x,y) between -1 and 0.
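Using the leveling sketch above, contrasting amounts to a different choice of coefficient; the values here are illustrative:

    # Toy previous composition: a gray ramp as stand-in content.
    F_prev = lambda x, y: 0.2 + 0.01 * x

    # Leveling uses alpha_i in [0, 1]; per the text, contrasting re-uses
    # the same elementary operation with alpha_i between -1 and 0.
    smoothed = leveling(F_prev, 5.0, 5.0, alpha=0.5, radius=3.0)
    contrasted = leveling(F_prev, 5.0, 5.0, alpha=-0.5, radius=3.0)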
The subject invention has been described in terms of its preferred embodiments. Upon reading the disclosure, various alternatives will become obvious to those skilled in the art.
These variations are to be considered within the scope and spirit of the subject invention, which is only to be limited by the claims which follow and their equivalents.

Claims (36)

CLAIMS:
1. In an image editing system containing an input device, a computer, and a display device, a method of editing an image using a brush editing effect comprising the steps of:
defining an area at a first location in a displayed image within which a brush editing effect is to be performed;
moving, by manipulation of said input device, said area in said displayed image to a second location;
defining a displacement vector in response to the direction and distance of the movement of said area from said first location to said second location;
computing pixel values within said area at said second location that are not within said area at said first location;
and displaying the computed pixel values.
2. The method of claim 1 wherein said area is of unlimited size.
3. The method of claim 1 wherein a velocity of motion between said first and second locations for said area is unlimited.
4. The method of claim 1 wherein said pixel value computing step further comprises a step of updating elements within a screen array in response to the computed pixel values.
5. The method of claim 4 wherein said displaying step further comprises the steps of raster image processing said screen array elements and said displayed image to display an edited version of said displayed image.
6. The method of claim 5 wherein said displaying step further comprises the steps of raster image processing said screen array elements and said displayed image to print an edited version of said displayed image on a printer.
7. The method of claim 5 or 6 wherein said step of raster image processing further comprises the steps of:
updating position dependent parameters using said updated elements of said screen array;
performing an inverse transform to map a plurality of discrete pixel locations within said display to a plurality of positions in continuous space;
calculating a pixel color for each position in continuous space using the updated position dependent parameters;
buffering in a buffer said calculated pixel colors; and displaying, when said buffer is full, said pixel colors on said display device.
8. The method of claim 7 wherein said calculating step further comprises the step of:
solving a series of elementary operations having the form:
Fi(x,y) = αi(x,y)·Fi-1(x,y) + βi(x,y)·Ii(Pi(x,y)) + γi(x,y)
where:
Fi(x,y) is a pixel color at a position (x, y) in continuous space;
αi(x,y) is a scalar function of (x,y) representing a degree of presence at this position of an image resulting from a previously calculated elementary operation Fi-1(x,y);
Fi-1(x,y) is a function representing an image defined by a previous layer Li-1 in an expression tree;
βi(x,y) is a scalar function representing a degree of presence at position (x,y) of a color of an imported image Ii;
Ii(x,y) represents an imported image;

Pi(x,y) represents geometric transforms, including one or more of rotation, scaling, or distortion of the imported image Ii; and γi(x,y) is a position dependent parameter that can affect the color value at location (x,y).
9. The method of claim 8 wherein said calculating step further comprises the step of:
solving a global operation having the form:
F(x,y) = γ'(x,y) + Σj=1..q α'j(x,y)·Ik(j)(Pk(j)(x,y))
where:
γ' and α'j are a function of the parameters βi, αi and γi; and q is a number of imported images which contribute to the pixel color at location (x,y).
10. The method of claim 8 wherein said calculating step further comprises the steps of:
calculating for a subset of said plurality of locations in said displayed image said pixel color;
interpolating the parameters βi, αi and γi at selected locations not within the subset of locations from the parameters associated with the subset of locations; and calculating pixel color at the locations not in the subset using the global operation and the interpolated parameters.
11. A method for non-destructive image compositing of an original digital image residing in an original digital file, comprising:
displaying a first user-selected region of said original image at a first user-selected resolution;

modifying said first user-selected region of said original image according to input received via a user interface, in response to input from a user, so as to define a modified original digital image;
expressing said modifying step as parameters of at least one spatially resolution independent transformation;
recording said parameters separate from said original digital image; and outputting a second user-selected region of said modified original image at a second user-selected resolution onto an output device, by:
applying said at least one spatially resolution independent transformation to said original digital image so as to produce said second user-selected region of said modified original image; and rendering said second user-selected region of said modified original image on said output device.
12. A method according to claim 11 and wherein said original digital file also contains original parameter data for at least one original spatially resolution independent transformation, and wherein said recording step records said original parameter data with said parameters, and wherein said applying step combines said at least one spatially resolution independent transformation with said at least one original spatially resolution independent transformation.
13. A method according to claim 11 and wherein said recording step represents the parameters as a tree data structure.
14. A method according to claim 11 and wherein said original digital image is represented as a pyramid of sub-images, each sub-image having a lower pixel resolution than its predecessor in the pyramid.
15. A method according to claim 14 and wherein said sub-images are partitioned into individually accessible rectangular image tiles.
16. A method according to claim 11 and wherein at least one of said spatially resolution independent transformations corresponds to an air-brushing effect.
17. A method according to claim 11 and wherein at least one of said spatially resolution independent transformations corresponds to a color leveling effect.
18. A method according to claim 11 and wherein at least one of said spatially resolution independent transformations corresponds to a color contrasting effect.
19. A method according to claim 11 and wherein at least one of said spatially resolution independent transformations corresponds to image insertion.
20. A method according to claim 11 and wherein at least one of said spatially resolution independent transformations corresponds to a lightening/darkening effect.
21. A method according to claim 11 and wherein at least one of said spatially resolution independent transformations corresponds to an image deformation.
22. A method according to claim 11 and wherein said expressing step determines a subset of said parameters of at least one spatially resolution independent transformation, and wherein said applying step includes determining missing parameters by interpolation.
23. A method according to claim 11 and wherein a user successively modifies said original digital image, producing repeated modifications which are expressed as spatially resolution independent transformations, organized into a tree data-structure, and recorded.
24. A system for non-destructive image compositing of an original digital image residing in an original digital file, comprising:
a display processor displaying a first user-selected region of said original image at a first user-selected resolution, on a display device;
a user interface processor modifying said first user-selected region of said original image, in response to input from a user, so as to define a modified original digital image;
an interpreter expressing modification of the original image as parameters of at least one spatially resolution independent transformation;
a storage device recording said parameters separate from said original digital image; and an output processor outputting a second user-selected region of said modified original image at a second user-selected resolution, onto an output device, comprising:
an image processor applying said at least one spatially resolution independent transformation to said original digital image so as to produce said second user-selected region of said modified original image; and an image renderer rendering said second user-selected region of said modified original image on said output device.
25. A system according to claim 24 and wherein said original digital file also contains original parameter data for at least one original spatially resolution independent transformation, and wherein said storage device records said original parameter data with said parameters, and wherein said image processor combines said at least one spatially resolution independent transformation with said at least one original spatially resolution independent transformation.
26. A system according to claim 24 and wherein said storage device uses a representation of the parameters as a tree data structure.
27. A system according to claim 24 and wherein said original digital image is represented as a pyramid of sub-images, each sub-image having a lower pixel resolution than its predecessor in the pyramid.
28. A system according to claim 24 and wherein said sub-images are partitioned into individually accessible rectangular image tiles.
29. A system according to claim 24 and wherein at least one of said spatially resolution independent transformations corresponds to an air-brushing effect.
30. A system according to claim 24 and wherein at least one of said spatially resolution independent transformations corresponds to a color leveling effect.
31. A system according to claim 24 and wherein at least one of said spatially resolution independent transformations corresponds to a color contrasting effect.
32. A system according to claim 24 and wherein at least one of said spatially resolution independent transformations corresponds to image insertion.
33. A system according to claim 24 and wherein at least one of said spatially resolution independent transformations corresponds to a lightening/darkening effect.
34. A system according to claim 24 and wherein at least one of said spatially resolution independent transformations corresponds to an image deformation.
35. A system according to claim 24 and wherein said interpreter determines a subset of said parameters of at least one spatially resolution independent transformation, and wherein said image processor determines missing parameters by interpolation.
36. A system according to claim 24 and wherein a user successively modifies said original digital image, producing repeated modifications which are expressed as spatially resolution independent transformations, organized into a tree data-structure, and recorded.
CA002158988A 1993-03-25 1994-03-25 Method and system for image processing Expired - Fee Related CA2158988C (en)

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
FR9303455A FR2703170B1 (en) 1993-03-25 1993-03-25 Method for processing an image in a computerized system.
FR93/03455 1993-03-25
US8553493A 1993-06-30 1993-06-30
US08/085,534 1993-06-30
PCT/US1994/003266 WO1994022101A1 (en) 1993-03-25 1994-03-25 Method and system for image processing

Publications (2)

Publication Number Publication Date
CA2158988A1 CA2158988A1 (en) 1994-09-29
CA2158988C true CA2158988C (en) 2000-06-13

Family

ID=26230196

Family Applications (1)

Application Number Title Priority Date Filing Date
CA002158988A Expired - Fee Related CA2158988C (en) 1993-03-25 1994-03-25 Method and system for image processing

Country Status (9)

Country Link
US (6) US5907640A (en)
EP (1) EP0691011B1 (en)
JP (1) JPH08510851A (en)
KR (1) KR100320298B1 (en)
CN (1) CN1147822C (en)
AT (1) ATE223601T1 (en)
AU (1) AU690551B2 (en)
CA (1) CA2158988C (en)
DE (1) DE69431294T2 (en)

Families Citing this family (159)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100320298B1 (en) * 1993-03-25 2002-04-22 마크 에이. 버거 Image processing method and system
US5485568A (en) 1993-10-08 1996-01-16 Xerox Corporation Structured image (Sl) format for describing complex color raster images
EP0773503B1 (en) * 1995-11-10 2004-03-31 Kabushiki Kaisha Toshiba File transfer method, method for a file requesting client device, and file server device
US5764235A (en) 1996-03-25 1998-06-09 Insight Development Corporation Computer implemented method and system for transmitting graphical images from server to client at user selectable resolution
JPH10210269A (en) * 1997-01-20 1998-08-07 Canon Inc Device and method for image processing
DE19705536A1 (en) * 1997-02-13 1998-08-20 Mannesmann Vdo Ag Clock face
JP2919428B2 (en) * 1997-04-10 1999-07-12 日本電気株式会社 Image transformation device
CN1062362C (en) * 1997-04-15 2001-02-21 英业达股份有限公司 Dynamic linking method for computer image
GB2325834B (en) 1997-05-30 2002-03-27 Quantel Ltd An electronic graphic system
US6483540B1 (en) * 1997-06-16 2002-11-19 Casio Computer Co., Ltd. Image data processing apparatus method and program storage medium for processing image data
US6741255B1 (en) * 1997-08-14 2004-05-25 Sun Microsystems, Inc. Method, apparatus and computer program product for using deferred execution for a tiling pull model in a tiled image processing architecture
US6252583B1 (en) 1997-11-14 2001-06-26 Immersion Corporation Memory and force output management for a force feedback system
JPH11161780A (en) * 1997-11-27 1999-06-18 Fujitsu Ltd Picture size converting device and method therefor and computer readable recording medium for recording picture size conversion program
US6144772A (en) * 1998-01-29 2000-11-07 Canon Kabushiki Kaisha Variable compression encoding of digitized images
US6239807B1 (en) * 1998-03-13 2001-05-29 Mgi Software Corporation Method and system for multi-resolution texture mapping
US6215485B1 (en) * 1998-04-03 2001-04-10 Avid Technology, Inc. Storing effects descriptions from a nonlinear editor using field chart and/or pixel coordinate data for use by a compositor
US6195101B1 (en) * 1998-04-06 2001-02-27 Mgi Software Corporation Method and system for image templates
US6970176B1 (en) * 1998-06-23 2005-11-29 Van Der Meulen Pieter Sierd Video processing in PC uses statistically tuned color cube
US6549212B1 (en) * 1998-07-16 2003-04-15 Silicon Graphics, Inc. System for user customization of attributes associated with a three-dimensional surface
US7010177B1 (en) * 1998-08-27 2006-03-07 Intel Corporation Portability of digital images
US6674485B2 (en) 1998-08-31 2004-01-06 Hitachi Software Engineering Co., Ltd. Apparatus and method for image compositing
JP3937028B2 (en) * 1998-09-18 2007-06-27 富士フイルム株式会社 Image conversion method, image conversion apparatus, and recording medium recording image conversion program
US6380934B1 (en) * 1998-11-30 2002-04-30 Mitsubishi Electric Research Laboratories, Inc. Estimating targets using statistical properties of observations of known targets
US6715127B1 (en) * 1998-12-18 2004-03-30 Xerox Corporation System and method for providing editing controls based on features of a raster image
US6304277B1 (en) * 1999-01-15 2001-10-16 Colorcentric.Com, Inc. Remote modification of digital images using scripts
US6762791B1 (en) * 1999-02-16 2004-07-13 Robert W. Schuetzle Method for processing digital images
US6532311B1 (en) 1999-02-26 2003-03-11 Lockheed Martin Corporation Image browser
FR2792441B1 (en) * 1999-04-14 2002-07-26 Iodp MEDICAL IMAGING SYSTEM
US6895557B1 (en) * 1999-07-21 2005-05-17 Ipix Corporation Web-based media submission tool
AU6757000A (en) * 1999-08-02 2001-02-19 Iviewit Holdings, Inc. System and method for providing an enhanced digital image file
US6473094B1 (en) * 1999-08-06 2002-10-29 Avid Technology, Inc. Method and system for editing digital information using a comparison buffer
US6987584B1 (en) * 1999-08-09 2006-01-17 Ether Visuals Llc Method and system for preventing artifacts that may be product when bottling PDL files converted from raster images
US6657702B1 (en) 1999-08-31 2003-12-02 Shutterfly, Inc. Facilitating photographic print re-ordering
US20050264832A1 (en) * 1999-08-31 2005-12-01 Baum Daniel R Printing images in an optimized manner
US7016059B1 (en) 1999-08-31 2006-03-21 Shutterfly, Inc. Printing images in an optimized manner
US6839803B1 (en) 1999-10-27 2005-01-04 Shutterfly, Inc. Multi-tier data storage system
US6817289B1 (en) * 1999-11-15 2004-11-16 Gateway, Inc. Method and apparatus for modifying and controlling print information
US6732162B1 (en) 1999-11-15 2004-05-04 Internet Pictures Corporation Method of providing preprocessed images for a plurality of internet web sites
US6583799B1 (en) 1999-11-24 2003-06-24 Shutterfly, Inc. Image uploading
DE60005404T2 (en) * 1999-12-15 2004-07-15 Sun Microsystems, Inc., Santa Clara SYSTEM AND METHOD FOR CREATING A GRAPHIC USER INTERFACE FROM A FILTER EXPRESSION TREE
US6690396B1 (en) * 1999-12-27 2004-02-10 Gateway, Inc. Scannable design of an executable
US6891550B1 (en) * 2000-03-10 2005-05-10 Paul Anthony John Nolan Image manipulation software
JP4029253B2 (en) * 2000-04-10 2008-01-09 富士フイルム株式会社 Image resizing apparatus and method
US6704712B1 (en) 2000-04-14 2004-03-09 Shutterfly, Inc. Remote film scanning and image transfer system, protocol and method
US20030079184A1 (en) * 2000-05-05 2003-04-24 International Business Machines Corporation Dynamic image storage using domain-specific compression
US6650790B1 (en) * 2000-06-09 2003-11-18 Nothshore Laboratories, Inc. Digital processing apparatus for variable image-size enlargement with high-frequency bandwidth synthesis
US6643410B1 (en) * 2000-06-29 2003-11-04 Eastman Kodak Company Method of determining the extent of blocking artifacts in a digital image
US7116843B1 (en) * 2000-07-24 2006-10-03 Quark, Inc. Method and system using non-uniform image blocks for rapid interactive viewing of digital images over a network
US7599854B1 (en) 2000-08-02 2009-10-06 Shutterfly, Inc. Method and system for collecting images from a plurality of customers
EP1314083A2 (en) * 2000-08-04 2003-05-28 Copan Inc. Method and system for presenting digital media
US7973970B2 (en) * 2000-08-09 2011-07-05 Ether Visuals Llc Preventing artifacts that may be produced when bottling PDL files converted from raster images
US6484101B1 (en) * 2000-08-16 2002-11-19 Imagelinks, Inc. 3-dimensional interactive image modeling system
US6940518B2 (en) * 2000-08-16 2005-09-06 Quark Media House Sarl System and method for editing digital images using inductive image generation with cached state-specific image tiles
US6639684B1 (en) * 2000-09-13 2003-10-28 Nextengine, Inc. Digitizer using intensity gradient to image features of three-dimensional objects
AU1342502A (en) * 2000-09-20 2002-04-02 Nik Multimedia Inc Digital image sharpening system
US6823089B1 (en) * 2000-09-28 2004-11-23 Eastman Kodak Company Method of determining the extent of blocking and contouring artifacts in a digital image
US7127380B1 (en) * 2000-11-07 2006-10-24 Alliant Techsystems Inc. System for performing coupled finite analysis
US6704467B2 (en) 2000-12-21 2004-03-09 Canon Kabushiki Kaisha Image editing with block selection
SE519884C2 (en) * 2001-02-02 2003-04-22 Scalado Ab Method for zooming and producing a zoomable image
US7085774B2 (en) * 2001-08-30 2006-08-01 Infonox On The Web Active profiling system for tracking and quantifying customer conversion efficiency
GB2379293B (en) * 2001-08-31 2005-07-06 Discreet Logic Inc Processing Data in an Application comprising a plurality of Application Modules
WO2003024090A1 (en) * 2001-09-07 2003-03-20 Koninklijke Philips Electronics N.V. Image device having camera and image perspective correction and possibly rotation and staggering correction
WO2003042923A1 (en) * 2001-11-13 2003-05-22 New York University Logic arrangements, storage mediums, and methods for generating digital images using brush strokes
US6844885B2 (en) * 2001-11-30 2005-01-18 Hewlett-Packard Development Company, L.P. Image editing via grid elements
US7751628B1 (en) * 2001-12-26 2010-07-06 Reisman Richard R Method and apparatus for progressively deleting media objects from storage
US7093202B2 (en) 2002-03-22 2006-08-15 Xerox Corporation Method and system for interpreting imprecise object selection paths
US7345782B2 (en) * 2002-05-13 2008-03-18 Texas Instruments Incorporated Efficient implementation of raster operations flow
US7082211B2 (en) * 2002-05-31 2006-07-25 Eastman Kodak Company Method and system for enhancing portrait images
US6888569B2 (en) * 2002-10-02 2005-05-03 C3 Development, Llc Method and apparatus for transmitting a digital picture with textual material
US20040083430A1 (en) * 2002-10-29 2004-04-29 Boonen Paul J. J. Method and apparatus to process portable document format data containing transparency
US20040093432A1 (en) * 2002-11-07 2004-05-13 Eastman Kodak Company Method and system for conducting image processing from a mobile client device
JP2004242290A (en) * 2003-01-14 2004-08-26 Ricoh Co Ltd Image processing apparatus and image processing method, image edit processing system, image processing program, and storage medium
US7269800B2 (en) * 2003-02-25 2007-09-11 Shutterfly, Inc. Restartable image uploading
US7039222B2 (en) * 2003-02-28 2006-05-02 Eastman Kodak Company Method and system for enhancing portrait images that are processed in a batch mode
JP2004297772A (en) * 2003-03-12 2004-10-21 Ricoh Co Ltd Image processing system, image forming apparatus, image processing method, program and recording medium
US7333238B2 (en) * 2003-03-28 2008-02-19 Hewlett-Packard Development Company, L.P. Rendering a printing device pixel map
US20040202356A1 (en) * 2003-04-10 2004-10-14 Stookey George K. Optical detection of dental caries
TWI228913B (en) * 2003-05-16 2005-03-01 Benq Corp Editing and display controller used in portable digital image capture device and method of the same
US7966499B2 (en) * 2004-01-28 2011-06-21 Irdeto Canada Corporation System and method for obscuring bit-wise and two's complement integer computations in software
US7609894B2 (en) * 2004-02-17 2009-10-27 Corel Corporation Adaptive sampling region for a region editing tool
US7711179B2 (en) 2004-04-21 2010-05-04 Nextengine, Inc. Hand held portable three dimensional scanner
US7270543B2 (en) * 2004-06-29 2007-09-18 Therametric Technologies, Inc. Handpiece for caries detection
CN100377171C (en) * 2004-08-13 2008-03-26 富士通株式会社 Method and apparatus for generating deteriorated numeral image
US20060033737A1 (en) * 2004-08-16 2006-02-16 Old William M Methods and system for visualizing data sets
US20060045174A1 (en) * 2004-08-31 2006-03-02 Ittiam Systems (P) Ltd. Method and apparatus for synchronizing a transmitter clock of an analog modem to a remote clock
US7173631B2 (en) * 2004-09-23 2007-02-06 Qualcomm Incorporated Flexible antialiasing in embedded devices
US20060117268A1 (en) * 2004-11-30 2006-06-01 Micheal Talley System and method for graphical element selection for region of interest compression
EP2256687A1 (en) * 2005-02-16 2010-12-01 Adobe Systems Incorporated Non-modal real-time interface
US7809215B2 (en) * 2006-10-11 2010-10-05 The Invention Science Fund I, Llc Contextual information encoded in a formed expression
US7873243B2 (en) 2005-03-18 2011-01-18 The Invention Science Fund I, Llc Decoding digital information included in a hand-formed expression
US8640959B2 (en) * 2005-03-18 2014-02-04 The Invention Science Fund I, Llc Acquisition of a user expression and a context of the expression
US8340476B2 (en) 2005-03-18 2012-12-25 The Invention Science Fund I, Llc Electronic acquisition of a hand formed expression and a context of the expression
US20060212430A1 (en) * 2005-03-18 2006-09-21 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Outputting a saved hand-formed expression
US20070273674A1 (en) * 2005-03-18 2007-11-29 Searete Llc, A Limited Liability Corporation Machine-differentiatable identifiers having a commonly accepted meaning
US8232979B2 (en) * 2005-05-25 2012-07-31 The Invention Science Fund I, Llc Performing an action with respect to hand-formed expression
US7791593B2 (en) * 2005-03-18 2010-09-07 The Invention Science Fund I, Llc Machine-differentiatable identifiers having a commonly accepted meaning
US8599174B2 (en) * 2005-03-18 2013-12-03 The Invention Science Fund I, Llc Verifying a written expression
US8229252B2 (en) * 2005-03-18 2012-07-24 The Invention Science Fund I, Llc Electronic association of a user expression and a context of the expression
US8290313B2 (en) * 2005-03-18 2012-10-16 The Invention Science Fund I, Llc Electronic acquisition of a hand formed expression and a context of the expression
US20090015869A1 (en) * 2005-03-22 2009-01-15 Su Mien Quek Image collage builder
WO2006138723A2 (en) * 2005-06-16 2006-12-28 Biolase Technology, Inc. Tissue coverings bearing cutomized tissue images
US7619628B2 (en) * 2005-06-24 2009-11-17 Microsoft Corporation Caching digital image data
US7519233B2 (en) * 2005-06-24 2009-04-14 Microsoft Corporation Accumulating transforms through an effect graph in digital image processing
US9082199B1 (en) * 2005-07-14 2015-07-14 Altera Corporation Video processing architecture
US7995834B1 (en) 2006-01-20 2011-08-09 Nextengine, Inc. Multiple laser scanner
US7706629B2 (en) * 2006-02-24 2010-04-27 Apple Inc. Methods and apparatuses for pixel transformations
US7778486B2 (en) * 2006-02-24 2010-08-17 The Go Daddy Group, Inc. Online image processing systems and methods
JP2007271940A (en) * 2006-03-31 2007-10-18 Toshiba Corp Video displaying device and video displaying method
US8504932B2 (en) * 2006-04-13 2013-08-06 Shutterfly, Inc. Image collage builder
US7656415B2 (en) * 2006-04-28 2010-02-02 Microsoft Corporation Aggregation of curve effects
US20070285720A1 (en) * 2006-06-09 2007-12-13 Guglielmi Joe M Flexible system for producing photo books
EP1873721A1 (en) * 2006-06-26 2008-01-02 Fo2PIX Limited System and method for generating an image document with display of an edit sequence tree
US8885208B2 (en) * 2006-07-21 2014-11-11 Adobe Systems Incorporated Progressive refinement of an edited image using secondary high resolution image processing
US7911627B2 (en) * 2006-09-19 2011-03-22 Shutterfly, Inc. Data structure for personalized photo-book products
US20080068665A1 (en) * 2006-09-19 2008-03-20 Kenneth Ray Niblett Manufacturing system for personalized photo-book products
US7974486B2 (en) * 2006-09-22 2011-07-05 Apple Inc. Plug-in architecture for exporting digital images
US7873233B2 (en) * 2006-10-17 2011-01-18 Seiko Epson Corporation Method and apparatus for rendering an image impinging upon a non-planar surface
US9063950B2 (en) * 2006-10-27 2015-06-23 Avenza Systems Inc. Methods and systems for modifying raster graphics software programs
US20080129033A1 (en) * 2006-12-01 2008-06-05 Sean Kevin Anderson Manufacturing system for personalized photo book kit
US7941002B2 (en) * 2006-12-01 2011-05-10 Hewlett-Packard Development Company, L.P. Apparatus and methods of producing photorealistic image thumbnails
US7614837B2 (en) * 2006-12-01 2009-11-10 Shutterfly, Inc. Manufacturing system for personalized photo books
US8360771B2 (en) * 2006-12-28 2013-01-29 Therametric Technologies, Inc. Handpiece for detection of dental demineralization
US8078969B2 (en) * 2007-03-05 2011-12-13 Shutterfly, Inc. User interface for creating image collage
EP2223239A4 (en) * 2007-11-07 2012-08-22 Skinit Inc Customizing print content
US20090202179A1 (en) * 2008-02-13 2009-08-13 General Electric Company method and system for providing region based image modification
US8487963B1 (en) 2008-05-30 2013-07-16 Adobe Systems Incorporated Preview representation of pixels effected by a brush tip area
US8280187B1 (en) 2008-07-31 2012-10-02 Adobe Systems Incorporated Seam carving and expansion of images with color frequency priority
US8265424B1 (en) 2008-07-31 2012-09-11 Adobe Systems Incorporated Variable seam replication in images with energy-weighted priority
US8280186B1 (en) 2008-07-31 2012-10-02 Adobe Systems Incorporated Seam-based reduction and expansion of images with table-based priority
US8270765B1 (en) 2008-07-31 2012-09-18 Adobe Systems Incorporated Hybrid seam carving and scaling of images with configurable energy threshold
US8290300B2 (en) 2008-07-31 2012-10-16 Adobe Systems Incorporated Seam-based reduction and expansion of images with color-weighted priority
US8218900B1 (en) 2008-07-31 2012-07-10 Adobe Systems Incorporated Non-linear image scaling with seam energy
US8270766B1 (en) 2008-07-31 2012-09-18 Adobe Systems Incorporated Hybrid seam carving and scaling of images with configurable carving tolerance
US8280191B1 (en) 2008-07-31 2012-10-02 Abode Systems Incorporated Banded seam carving of images with pyramidal retargeting
US8625932B2 (en) 2008-08-28 2014-01-07 Adobe Systems Incorporated Seam carving using seam energy re-computation in seam neighborhood
US8180177B1 (en) 2008-10-13 2012-05-15 Adobe Systems Incorporated Seam-based reduction and expansion of images using parallel processing of retargeting matrix strips
US8581937B2 (en) 2008-10-14 2013-11-12 Adobe Systems Incorporated Seam-based reduction and expansion of images using partial solution matrix dependent on dynamic programming access pattern
US8363888B2 (en) * 2009-03-18 2013-01-29 Shutterfly, Inc. Proactive creation of photobooks
US8437575B2 (en) * 2009-03-18 2013-05-07 Shutterfly, Inc. Proactive creation of image-based products
US8358876B1 (en) 2009-05-20 2013-01-22 Adobe Systems Incorporated System and method for content aware in place translations in images
US8963960B2 (en) 2009-05-20 2015-02-24 Adobe Systems Incorporated System and method for content aware hybrid cropping and seam carving of images
US8659622B2 (en) 2009-08-31 2014-02-25 Adobe Systems Incorporated Systems and methods for creating and editing seam carving masks
US20110097011A1 (en) * 2009-10-27 2011-04-28 Suk Hwan Lim Multi-resolution image editing
US8655893B2 (en) 2010-07-16 2014-02-18 Shutterfly, Inc. Organizing images captured by multiple image capture devices
US8712194B1 (en) 2011-12-06 2014-04-29 Google Inc. System for non-destructive image processing
US9396518B2 (en) * 2012-05-15 2016-07-19 Salvadore Ragusa System of organizing digital images
US9514157B2 (en) 2012-08-22 2016-12-06 Adobe Systems Incorporated Multi-dimensional browsing of content
US8983237B2 (en) 2012-08-22 2015-03-17 Adobe Systems Incorporated Non-destructive collaborative editing
US9390155B2 (en) 2012-08-22 2016-07-12 Adobe Systems Incorporated Accessing content in a content-aware mesh
US11455737B2 (en) * 2012-12-06 2022-09-27 The Boeing Company Multiple-scale digital image correlation pattern and measurement
US9002105B2 (en) 2013-03-06 2015-04-07 Xerox Corporation Automated contour detection methods, systems and processor-readable media
US9942426B2 (en) * 2014-04-29 2018-04-10 Hewlett-Packard Development Company, L.P. Editing an electronic document on a multipurpose peripheral device
US11210455B2 (en) 2014-06-11 2021-12-28 Red Hat, Inc. Shareable and cross-application non-destructive content processing pipelines
TWI544449B (en) * 2014-08-05 2016-08-01 三緯國際立體列印科技股份有限公司 Storing method for edited image file
US10127634B2 (en) 2015-04-08 2018-11-13 Google Llc Image editing and repair
US9712845B2 (en) * 2015-07-31 2017-07-18 Ecole Polytechnique Federale De Lausanne (Epfl) Media content processing method
US10565966B2 (en) * 2016-10-06 2020-02-18 Arm Limited Display controllers
US9607365B1 (en) * 2016-10-31 2017-03-28 HDRlog SARL Systems and methods for enhancing quality of image media
CN111724448A (en) * 2019-03-18 2020-09-29 华为技术有限公司 Image super-resolution reconstruction method and device and terminal equipment

Family Cites Families (57)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE2920058C2 (en) * 1979-05-18 1983-09-29 Dr.-Ing. Rudolf Hell Gmbh, 2300 Kiel Method and circuit arrangement for partial electronic retouching in color image reproduction
US4288821A (en) * 1980-06-02 1981-09-08 Xerox Corporation Multi-resolution image signal processing apparatus and method
US4656467A (en) * 1981-01-26 1987-04-07 Rca Corporation TV graphic displays without quantizing errors from compact image memory
US4447886A (en) * 1981-07-31 1984-05-08 Meeker G William Triangle and pyramid signal transforms and apparatus
EP0111026B1 (en) * 1982-12-11 1986-03-05 DR.-ING. RUDOLF HELL GmbH Process and device for the copying retouching in the electronic colour picture reproduction
US4546385A (en) * 1983-06-30 1985-10-08 International Business Machines Corporation Data compression method for graphics images
JPS60148279A (en) 1983-12-28 1985-08-05 インタ−ナショナル ビジネス マシ−ンズ コ−ポレ−ション Image processing system
US4578713A (en) * 1984-07-20 1986-03-25 The Mead Corporation Multiple mode binary image processing
GB8425531D0 (en) * 1984-10-10 1984-11-14 Quantel Ltd Video image creation
GB8429879D0 (en) * 1984-11-27 1985-01-03 Rca Corp Signal processing apparatus
US4712141A (en) * 1985-03-30 1987-12-08 Dainippon Screen Mfg. Co., Ltd. Method and apparatus for interpolating image signals
GB2189106B (en) * 1986-04-14 1990-02-14 Philips Electronic Associated Image display
US4833625A (en) * 1986-07-09 1989-05-23 University Of Arizona Image viewing station for picture archiving and communications systems (PACS)
JP2508673B2 (en) * 1986-12-17 1996-06-19 ソニー株式会社 Display device
JP2745406B2 (en) * 1988-03-11 1998-04-28 株式会社リコー Control method of image filing system
US5070534A (en) * 1988-10-17 1991-12-03 International Business Machines Corporation Simplified cad parametric macroinstruction capability including variational geometrics feature
JPH02118680A (en) 1988-10-28 1990-05-02 Fuji Xerox Co Ltd Base color removing method for image forming device
JPH06101799B2 (en) 1988-10-28 1994-12-12 富士ゼロックス株式会社 Gray balance control method for image forming apparatus
US5179651A (en) * 1988-11-08 1993-01-12 Massachusetts General Hospital Apparatus for retrieval and processing of selected archived images for display at workstation terminals
US4910611A (en) * 1989-01-05 1990-03-20 Eastman Kodak Company Method for doing interactive image processing operations on large images
US5113251A (en) 1989-02-23 1992-05-12 Fuji Xerox Co. Editing control system and area editing system for image processing equipment
JPH02230383A (en) * 1989-03-03 1990-09-12 Hitachi Ltd Image processing device
JP2756301B2 (en) * 1989-04-10 1998-05-25 キヤノン株式会社 Image editing method and apparatus
US5249263A (en) 1989-06-16 1993-09-28 International Business Machines Corporation Color palette display interface for a computer-based image editor
US5245432A (en) * 1989-07-31 1993-09-14 Imageware Research And Development Inc. Apparatus and method for transforming a digitized signal of an image to incorporate an airbrush effect
GB2235856B (en) * 1989-09-01 1993-11-17 Quantel Ltd Improvements in or relating to electronic graphic systems
US5278950A (en) 1989-09-20 1994-01-11 Fuji Photo Film Co., Ltd. Image composing method
GB8923091D0 (en) 1989-10-13 1989-11-29 Quantel Ltd Improvements in or relating to electrical graphic systems
JPH03172075A (en) 1989-11-30 1991-07-25 Mita Ind Co Ltd Digital picture forming device
GB9007136D0 (en) * 1990-03-30 1990-05-30 Spaceward Ltd Video image creation
US5179639A (en) * 1990-06-13 1993-01-12 Massachusetts General Hospital Computer display apparatus for simultaneous display of data of differing resolution
GB2245460B (en) * 1990-06-18 1994-04-06 Link Miles Ltd Apparatus for generating a visual display
US5307452A (en) * 1990-09-21 1994-04-26 Pixar Method and apparatus for creating, manipulating and displaying images
WO1992006557A1 (en) 1990-09-28 1992-04-16 Eastman Kodak Company Color image processing system for preparing a composite image transformation module for performing a plurality of selected image transformations
US5208911A (en) 1990-09-28 1993-05-04 Eastman Kodak Company Method and apparatus for storing and communicating a transform definition which includes sample values representing an input/output relation of an image transformation
US5289570A (en) 1990-10-10 1994-02-22 Fuji Xerox Co., Ltd. Picture image editing system for forming boundaries in picture image data in a page memory device
US5119442A (en) * 1990-12-19 1992-06-02 Pinnacle Systems Incorporated Real time digital video animation using compressed pixel mappings
US5239625A (en) * 1991-03-05 1993-08-24 Rampage Systems, Inc. Apparatus and method to merge images rasterized at different resolutions
US5572499A (en) 1991-04-10 1996-11-05 Canon Kabushiki Kaisha Image processing apparatus for storing image data in storage medium and/or for reproducing image stored in storage medium
US5384899A (en) * 1991-04-16 1995-01-24 Scitex Corporation Ltd. Apparatus and method for emulating a substrate
US5263136A (en) * 1991-04-30 1993-11-16 Optigraphics Corporation System for managing tiled images using multiple resolutions
GB9109999D0 (en) * 1991-05-09 1991-07-03 Quantel Ltd Improvements in or relating to keying systems and methods for television image processing
US5157488A (en) * 1991-05-17 1992-10-20 International Business Machines Corporation Adaptive quantization within the jpeg sequential mode
EP0528631B1 (en) * 1991-08-13 1998-05-20 Xerox Corporation Electronic image generation
US5251271A (en) * 1991-10-21 1993-10-05 R. R. Donnelley & Sons Co. Method for automatic registration of digitized multi-plane images
CA2076687A1 (en) * 1991-11-27 1993-05-28 Thomas A. Pandolfi Photographic filter metaphor for control of digital image processing software
US5469536A (en) 1992-02-25 1995-11-21 Imageware Software, Inc. Image editing system including masking capability
JP3139831B2 (en) * 1992-05-27 2001-03-05 キヤノン株式会社 Image editing method and apparatus
US5384862A (en) * 1992-05-29 1995-01-24 Cimpiter Corporation Radiographic image evaluation apparatus and method
US5272760A (en) * 1992-05-29 1993-12-21 Cimpiter Corporation Radiographic image evaluation apparatus and method
US5475803A (en) * 1992-07-10 1995-12-12 Lsi Logic Corporation Method for 2-D affine transformation of images
US5367388A (en) 1992-07-27 1994-11-22 Scitex Corporation Ltd. Electronic separation scanner
US5270836A (en) * 1992-11-25 1993-12-14 Xerox Corporation Resolution conversion of bitmap images
FR2702861B1 (en) * 1993-03-15 1995-06-09 Sunline Method for processing an image in a computerized system.
KR100320298B1 (en) 1993-03-25 2002-04-22 마크 에이. 버거 Image processing method and system
JP3172075B2 (en) 1995-12-04 2001-06-04 新日本製鐵株式会社 Graphite uniformly dispersed steel excellent in toughness and method for producing the same
JP2845857B2 (en) * 1997-04-01 1999-01-13 コナミ株式会社 Translucent display device for image, translucent display method, and machine-readable recording medium recording computer program

Also Published As

Publication number Publication date
WO1994022101A2 (en) 1994-09-29
ATE223601T1 (en) 2002-09-15
US5907640A (en) 1999-05-25
US20030025921A1 (en) 2003-02-06
KR960701407A (en) 1996-02-24
US6763146B2 (en) 2004-07-13
US5790708A (en) 1998-08-04
EP0691011A1 (en) 1996-01-10
US6181836B1 (en) 2001-01-30
AU690551B2 (en) 1998-04-30
USRE43747E1 (en) 2012-10-16
EP0691011B1 (en) 2002-09-04
CN1147822C (en) 2004-04-28
CA2158988A1 (en) 1994-09-29
DE69431294D1 (en) 2002-10-10
US6512855B1 (en) 2003-01-28
DE69431294T2 (en) 2003-04-17
JPH08510851A (en) 1996-11-12
CN1124530A (en) 1996-06-12
KR100320298B1 (en) 2002-04-22
AU6697894A (en) 1994-10-11

Similar Documents

Publication Publication Date Title
CA2158988C (en) Method and system for image processing
JP4074000B2 (en) Image composition method by computer illustration system
KR100891428B1 (en) System and method for generating color gradients, interpolation method for facilitating color gradients, computer readable recording medium storing data packet for facilitating color gradients
US5835099A (en) Representing a region of a color image using a space-color separable model
EP0423930B1 (en) Electronic graphic system using low resolution control image
EP0727076B1 (en) Object-oriented graphic system and method
US5369739A (en) Apparatus and method for generating point sample masks in a graphics display system
JPH08508353A (en) Polymorphic graphics device
JP2004102998A (en) Method and device for rendering graphical object into chunk image, and for combining image layer into display image
US5222206A (en) Image color modification in a computer-aided design system
Naiman et al. Rectangular convolution for fast filtering of characters
JP4100765B2 (en) Simplified method of scene image synthesis by computer illustration system
US4910611A (en) Method for doing interactive image processing operations on large images
JP2957511B2 (en) Graphic processing unit
US6611632B1 (en) Device and method for interpolating image data and medium on which image data interpolating program is recorded
US7734118B2 (en) Automatic image feature embedding
JPS6282472A (en) Picture processing system
WO1994022101A1 (en) Method and system for image processing
JP3560124B2 (en) Image data interpolation device, image data interpolation method, and medium recording image data interpolation program
JP3170419B2 (en) Image shadowing method
JPH0955881A (en) Picture shading method
CA2256970A1 (en) Method for accessing and rendering an image

Legal Events

Date Code Title Description
EEER Examination request
MKLA Lapsed