WO1992021096A1 - Image synthesis and processing - Google Patents

Image synthesis and processing

Info

Publication number
WO1992021096A1
Authority
WO
WIPO (PCT)
Prior art keywords
data
lines
image
processing apparatus
image processing
Prior art date
Application number
PCT/GB1992/000928
Other languages
French (fr)
Inventor
Andrew Louis Charles Berend
Mark Jonathan Williams
Michael John Brocklehurst
Original Assignee
Cambridge Animation Systems Limited
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from GB909026120A external-priority patent/GB9026120D0/en
Priority claimed from GB919100632A external-priority patent/GB9100632D0/en
Priority claimed from GB919102125A external-priority patent/GB9102125D0/en
Priority claimed from GB9110945A external-priority patent/GB2256118A/en
Priority claimed from GB9117409A external-priority patent/GB2258790A/en
Application filed by Cambridge Animation Systems Limited filed Critical Cambridge Animation Systems Limited
Priority to US08/150,100 priority Critical patent/US5598182A/en
Priority to JP4510509A priority patent/JPH06507743A/en
Publication of WO1992021096A1 publication Critical patent/WO1992021096A1/en
Priority to US08/643,322 priority patent/US5754183A/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T13/00 Animation
    • G06T13/80 2D [Two Dimensional] animation, e.g. using sprites
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/00 2D [Two Dimensional] image generation
    • G06T11/001 Texturing; Colouring; Generation of texture or colour

Definitions

  • This invention relates to an image processing apparatus for generating visible output images, including visually distinct objects.
  • This invention also relates to a method of generating visible output images, including visually distinct objects.
  • Image processing methods and apparatus are disclosed in copending PCT application nos. GB91/02122, GB91/02124, GB92/( 5130699), assigned to the present Assignee, and are included herein as part of the present disclosure.
  • images may now be represented in electronic form, in which the image is displayed on a cathode ray tube device or similar apparatus.
  • the cathode ray itself may be manipulated in response to vectors which trace the outline of an object, producing images which may be referred to as skeletons or wire frames.
  • Vector systems of this type have advantages in that they are memory efficient and may generate image data at a definition which is independent of the definition of the display device.
  • images may be produced on a cathode ray tube by raster scanning techniques, in which the whole of the screen is scanned periodically and the intensity of the cathode ray is modified in response to image data.
  • Raster scanning in this way is employed in television systems and raster scan monitors are readily available.
  • An advantage of the raster scanning approach is that pixel values may be stored in a frame store representing not only the outline of the image but also the overall colour and texture of the image. Thus, very realistic images may be produced and the amount of storage required to store an image is not dependent upon the level of detail within the image itself. Images stored in framestores in forms compatible with television standards are attractive to the video industry and allow artists to generate graphics for use in television programs. Systems of this type are disclosed in British patent 2059625, which relates to machines manufactured and sold by Quantel Limited under the trade mark "PAINTBOX". The art of manipulating pixel values within a framestore is commonly referred to as "video graphics".
  • an operator selects the characteristics of the graphics tool he wishes to imitate and then manipulates a pressure sensitive stylus over a digitising pad to input a desired line.
  • the apparatus senses the stylus position and the pressure applied thereto, reads image data from a corresponding mapped area of an image store (e.g. a frame buffer), modifies the data in accordance with the sensed pressure, and writes it back into the store.
  • the system is arranged and intended to simulate conventional graphics tools, such as pencil, paintbrush or airbrush, and the artist exerts control over the parameters of the line "drawn" in the image store in the same way, so that the width and other attributes of the line are controlled as the stylus moves.
  • the stored image data comprises a direct representation of the line itself, corresponding to a manually painted line.
  • a problem with video graphics systems is that much of the image manipulation can only be done in response to operator commands.
  • the image data is stored in a form which can be understood by a human operator and not in a form which can be easily understood by a machine. Consequently, machine manipulation of the image data is difficult.
  • image data is stored in machine readable form and final images are produced on a frame by frame basis in a process known as rendering.
  • Complex computer graphics algorithms are known, capable of producing very impressive images.
  • the rendering process may take several hours to complete, even when running on very fast machines. Such a situation may be acceptable when producing glossy one off images but in many applications such a demand on computer time is unacceptable and cannot compete with human operation.
  • An example of an operation in which a large number of images must be produced at reasonable cost is animation.
  • an animator produces a series of key frames as very rough pencil sketches.
  • the animator's skill is that of being able to work in the temporal domain.
  • Other artists are then required to complete each of the key frames and, furthermore, to create inbetween frames, thereby providing sufficient frames such that, when shown as a sequence at sufficient speed, smooth movement of characters is perceived by a viewer.
  • a brush stroke is first defined by data specifying the linear trajectory of the brush (as point positions and tangents), and the pressure applied to the brush (which in turn specifies the width of the stroke normal to the line running along the trajectory), and then the colour profile laterally across the brush stroke is specified by the user defining a profile of individual bristle colours laterally across the brush stroke. It is suggested that profiles could be defined at the start and end of the stroke, and the colour profile along the stroke be interpolated from the end values.
  • WO84/02993 shows a system for generating images, in which an image path is dictated by Bezier control points. It is possible to vary the width of the entire stroke defined by the path, as a whole, but not to provide a varying width along the stroke; the object of that proposal is to create a stroke of uniform width.
  • an image processing apparatus for generating visible output images, including visually distinct objects, characterised by processing means for generating displayable image data; input means for supplying operator defined input signals to the processing means, said input signals defining boundary lines for said objects and a sectional line identifying two boundary lines as defining an object; and storage means for storing data generated in response to said input signals and for storing attribute data defining visual attributes of the object.
  • the processing means includes a programmable processing unit and video frame storage means, wherein said displayable image data are generated by repeatedly reading pixel data from the frame storage means.
  • the memory means stores data relating to the position of selected points defining points on boundary lines and sectional lines and the processing means calculates the position of the boundary lines and the sectional lines in response to data read from the memory means.
  • a plurality of sectional lines connect the boundary lines, each sectional line having operator defined attribute data associated therewith.
  • end sectional lines connect the ends of the boundary lines defining a closed object.
  • the attribute data may define the colour and transparency of an object, each of which is variable across a sectional line.
  • pixel values within an object area are rendered by interpolating attribute data between sectional lines.
  • the position of boundary lines may be calculated at a greater definition than that of pixel positions within the frame storage means and values accorded to pixels of boundaries may depend upon the degree to which a pixel region is occupied by an object.
  • the definition of the boundary lines may be eight times that of the pixel positions, providing smooth anti-aliased lines, after a rendering process.
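  • As a hedged illustration of the foregoing (a sketch only, in Python; the inside() test and all names are our assumptions rather than the patent's), a boundary pixel may be valued according to the fraction of an 8 x 8 sub-pixel grid occupied by the object:

        SUBDIV = 8  # boundary definition is eight times the pixel definition

        def pixel_coverage(px, py, inside):
            """Fraction of pixel (px, py) occupied by the object;
            'inside' is a hypothetical point-in-object predicate."""
            hits = 0
            for i in range(SUBDIV):
                for j in range(SUBDIV):
                    # sample at the centre of each sub-pixel cell
                    if inside(px + (i + 0.5) / SUBDIV, py + (j + 0.5) / SUBDIV):
                        hits += 1
            return hits / float(SUBDIV * SUBDIV)

        def shade_boundary_pixel(px, py, object_rgb, background_rgb, inside):
            k = pixel_coverage(px, py, inside)
            return tuple(k * o + (1.0 - k) * b
                         for o, b in zip(object_rgb, background_rgb))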
  • a method of generating visible output images including visually distinct objects characterised by the steps of supplying operator defined input signals to a processing means, defining boundary lines and a sectional line, said sectional line identifying the boundary lines as defining an object, storing data generated in response to said input signals, storing attribute data defining visual attributes of the object and generating displayable image data in response to said stored data.
  • the boundary lines are defined as Bezier curves and data is stored representing the position of the fixed ends of the curves, along with tangent points.
  • the sectional lines may be straight lines.
  • said sectional lines may also be Bezier curves, defined by data similar to that for defining the boundary lines.
  • Figure 1a shows a curved line;
  • Figures 1b and 1c show straight line approximations to the curved line of Figure 1a, and Figures 1d and 1e show Bezier curves;
  • Figure 2a shows the effect of modifying the length of a tangent vector to a Bezier curve;
  • Figure 2b shows the effect of modifying the direction of a tangent vector for a Bezier curve;
  • Figure 3 shows an image processing apparatus, including processing elements, storage elements and interface elements;
  • Figure 4 schematically shows the arrangement of data in a memory forming part of the apparatus shown in Figure 3;
  • Figures 5a and 5b show displays produced by the apparatus shown in Figure 3;
  • Figure 6 shows a schematic representation of data stored in a memory forming part of the apparatus shown in Figure 3;
  • Figure 7 shows a schematic representation of the functional elements of an apparatus for generating a display;
  • Figure 8 shows a schematic representation of the process by which the apparatus produces a display;
  • Figure 9 shows a schematic representation of the functional elements of the apparatus of Figure 3, for allowing the input of data to produce a display;
  • Figure 10 shows a process performed by the apparatus for receiving input data;
  • Figure 11 shows a schematic representation of the functional elements of the apparatus for producing a display;
  • Figure 12 illustrates boundary lines and sectional lines defining the position of an object, including a plurality of regions;
  • Figure 13 details one of the regions shown in Figure 12, including a plurality of strips;
  • Figure 14 details one of the strips shown in Figure 13.
  • Figure 15 shows a schematic representation of an arrangement forming part of the apparatus.
  • Figure 16 schematically shows a process performed by the apparatus.
  • Parametric curves are referred to in, for example, "Interactive Computer Graphics", P Burger and D Gillies, 1989, Addison Wesley, ISBN 0-201-17439-1, and "An Introduction to Splines for Use in Computer Graphics and Geometric Modelling", by R H Bartels, J C Beatty and B A Barsky, published by Morgan Kaufmann, ISBN 0-934613-27-3 (both incorporated herein by reference).
  • a fairly smooth freehand curve is shown.
  • one way of representing the curve would be to draw a series of straight line segments, meeting at points.
  • the number of straight line segments has to be large, as illustrated in Figure 1c, before the simulation is at all convincing.
  • the curve may be represented as a series of curve segments running between points. If, as in Figure 1d, adjacent curve segments have the same slope at the point at which they join, the curve can be made smooth.
  • x = a_x t^3 + b_x t^2 + c_x t + d_x   (1)
  • y = a_y t^3 + b_y t^2 + c_y t + d_y   (2)
  • so that, at t = 1, x_1 = a_x + b_x + c_x + d_x   (3)
  • and y_1 = a_y + b_y + c_y + d_y   (4)
  • the slope of the curved segment is also fixed or predetermined so that each segment can be matched to its neighbours to provide a continuous curve, if desired.
  • the shape of the curve between the end points is defined by the slopes at the end points and by a further item of information at each point, which is conveniently visualised as the length of a tangent vector at each point.
  • the curve between the two points may be thought of as clamped at its end points, with fixed slopes thereat, while the tangent vectors exercise a pull on the direction of the curve, proportional to their length, so that, if the tangent vector is long, the curve tends to follow the tangent over much of its length.
  • the data used to define a curve segment is the coordinates of the end points, the slope of the tangent vector at each end point and the length of each tangent vector.
  • in the Bezier format, the data used to define a curve segment are the coordinates of the end points, and the coordinates of the end of each tangent vector. Conversion between the Hermite and Bezier formats is merely a matter of converting between polar and rectangular coordinates.
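  • The relationship between these representations and equations (1) to (4) may be sketched as follows (Python; a minimal illustration with names of our choosing, not the patent's own code). For each axis, the two end values and the two tangent end values of a Bezier segment yield the cubic coefficients, after which the segment may be evaluated at any t between 0 and 1:

        def bezier_coefficients(p0, p1, p2, p3):
            """Coefficients (a, b, c, d) for one axis, so that
            value(t) = a*t**3 + b*t**2 + c*t + d,
            with value(0) = p0 and value(1) = p3."""
            d = p0
            c = 3.0 * (p1 - p0)
            b = 3.0 * (p0 - 2.0 * p1 + p2)
            a = p3 - p0 + 3.0 * (p1 - p2)
            return a, b, c, d

        def point_on_segment(t, x_ctrl, y_ctrl):
            """x_ctrl and y_ctrl each hold the four Bezier control
            values (end point, tangent end, tangent end, end point)."""
            ax, bx, cx, dx = bezier_coefficients(*x_ctrl)
            ay, by, cy, dy = bezier_coefficients(*y_ctrl)
            x = ((ax * t + bx) * t + cx) * t + dx
            y = ((ay * t + by) * t + cy) * t + dy
            return x, y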
  • Figure 2a shows the effect of varying the magnitude or lengths of the tangent vectors, while keeping their angle constant. It will be seen that the effect is to "pull" the curve towards the tangent vector, more or less strongly depending on the length of the tangent vector.
  • Figure 2b shows the effect of varying the angle of the tangent vector while keeping its magnitude fixed.
  • a smooth curve is defined by a number of such end points, and two adjacent segments will share a common end point. If the curve is to be smooth, the tangent angles defined at the end point in relation to each curve segment will be equal, although the tangent vector lengths will in general not be equal.
  • a significant advantage of this form of curve representation is that a smooth, bold curve can be defined using only a small number of coefficients or control points, and parts of it can be amended without extensive recalculation of the whole line.
  • apparatus comprises a computer 100 comprising a central processing unit 110, a memory device 120 for storing the program sequence for the CPU 110 and providing working read/write memory, a frame store 130 comprising a series of memory locations each associated with, or mapped to, a point in an image to be generated or processed, and an input/output controller 140 providing input and output ports for reading from and writing to external devices, all intercoupled through common parallel data and address buses 150.
  • a monitor 160 is connected to the computer 100 and its display updated from the frame store 130 under control of the CPU 110.
  • At least one user input device 170a, 170b is provided; typically a keyboard 170b for inputting commands or control signals for controlling peripheral operations such as starting, finishing and storing the results of an image generation or image processing operation, and a position sensitive input device 170a such as, in combination, a stylus and digitising tablet, or a "mouse", or a touch sensitive screen on the monitor 160, or a "trackerball" device or a joystick.
  • a cursor symbol is generated by the computer 100 for display on the monitor 160 in dependence upon the signal from the position sensitive input device 170a to allow a user to inspect an image on the monitor 160 and select or designate a point or region of the image during image generation or processing.
  • a mass storage device 180 such as, for instance, a hard disk device is preferably provided as a long term image store, since the amount of data associated with a single image stored as a frame at an acceptable resolution is high.
  • the mass storage device 180 also or alternatively comprises a removable medium storage device such as a floppy disk drive, to allow data to be transferred into and out from the computer 100.
  • printer 190 for producing a permanent visual output record of the image generated.
  • the output may be provided on a transparency or on a sheet of paper.
  • a picture input device 195 such as a scanner for scanning an image on, for example, a slide, and inputting a corresponding video image signal to the computer 100 may also be provided.
  • a suitable computer 100 is the NeXTCUBE computer including the NeXTdimension colour board, available from NeXTComputer, Inc., USA. This arrangement provides direct formatted outputs for connection to a videocassette recorder or other video storage device, and accepts video input signals. Further, it includes means for compressing images for storage on a disk store 180, and for decompressing such stored images for display.
  • the frame store device 130 comprises a pair of image stores 130a, 130b.
  • the image store 130a stores the image point, or pixel, data for the image to be generated or processed.
  • the second area, 130b stores a supervisory or control image displayed during generation or processing of the image stored in the store 130a.
  • the supervisory image may be represented at a lower resolution than the generated image and/or in monochrome and hence the image store 130b may be smaller than the store 130a.
  • the appearance of the contents of the generated image store 130a, when displayed on the monitor 160 or output by the printer 190, comprises, as shown, objects a,b, each having a trajectory, and a background c; the objects and the background also possess colour (or, in a monochrome system, brightness).
  • the system of the preferred embodiment is arranged to generate visible output images, including visually distinct objects. These objects may be perceived as representing strokes of a graphic implement, such as a pen, pencil, brush or air brush. However, the strokes or objects may have sophisticated characteristics, such as variations in colour across their widths, which are not attainable by a single stroke of a traditional implement.
  • Image data may be full colour pixel related data displayable on a video monitor.
  • the pixel data is held in image store 130a and the definition of this data, that is to say, the number of pixel locations present within image store 130a, is dictated by the definition required for the final output image.
  • image data may be produced for broadcast television purposes, high definition television or film, requiring higher levels of definition.
  • the processing system for generating displayable image data also includes a programmable processor for generating the displayable image data in response to operator defined data.
  • an input device which may consist of a mouse, a trackerball or a stylus and touch tablet combination, for example, or any other suitable user interface for defining two dimensional coordinate locations.
  • unlike with the aforesaid video graphics equipment, the operator does not define image data directly; instead, the operator is provided with a system for defining objects in a machine readable form, by defining boundary lines and a sectional line.
  • Display data relating to boundary lines and sectional lines is stored in a supervisory frame store, 130b, which, typically, has a lower definition than the image store 130a.
  • Boundary lines define the outer edge of an object and a sectional line connects boundary lines, thereby identifying the connected boundary lines as outlining an object.
  • In addition to frame store 130b, for storing positional data generated by an operator, storage is also provided for storing attribute data, which defines visual attributes of the object; these attributes define the colour of the object and its transparency. Colour and transparency values are then generated for each pixel location, therefore image store 130a has pixel locations of sufficient depth for storing colour and transparency. Colour in the image store may be stored as luminance plus colour difference signals or, preferably, as full colour RGB signals.
  • Supervisory store 130b holds supervisory data which may be represented in a plurality of colours, although full colour storage is not required.
  • the contents of the generated image frame store 130a therefore comprise a plurality of point data defining colour and/or intensity of each of a plurality of points to be displayed to form the display shown in FIG 5A; for example, 500 × 500 image point data, each comprising colour or brightness information as a multi-bit digital number.
  • several bits representing each of Red (R), Green (G) and Blue (B) are provided.
  • the frame store 130a is of the type which stores additionally a transparency value for each image point to allow the generated image to be merged with another image.
  • the address within the store 130a of given point data is related, or mapped, to its position in the display of FIG 5A, which will hereafter be referred to in X (horizontal position) and Y (vertical position) Cartesian co-ordinates.
  • the contents of the supervisory image store 130b comprise point data for a plurality of points making up the image of FIG 5B; in this case, however, the display may comprise only a monochrome line and the point data may for each point merely comprise a single bit set to indicate either a dark point or a light point.
  • a line shown on the supervisory display image in FIG 5B is therefore represented by a plurality of pixel values at corresponding X,Y positions within the supervisory image store area 130b.
  • this representation of the line is difficult to manipulate if the line is to be amended.
  • a second representation of the line is therefore concurrently maintained in the working memory area 121 of the memory device 120.
  • This representation comprises a plurality of data defining the curve in vector form.
  • the curve 13 is represented by the position of points ("control points") between which intervening curve values can be derived by calculation.
  • display frames consisting of line drawings of objects, are created and/or edited with reference to stored control point data, preferably data stored in the Bezier format referred to above.
  • a stored representation of a display frame comprises a plurality of control points which define line segments which make up a line representation.
  • a table 122 is provided in the working memory 121 storing the control point data for that line as shown in FIG 4.
  • the data are stored as a plurality of point data each comprising point x,y co-ordinates, and data representing slope value for the tangent to the curve at those coordinates, and a tangent magnitude parameter indicating (broadly speaking) the extent to which the curve follows the tangent.
  • This format is used, for example, in control of laser printer output devices.
  • the data may be stored as point coordinates x and y, and tangent angle and length (Hermite form), but is conventionally and conveniently stored as point coordinates x and y and tangent end coordinates.
  • the term 'Bezier' format will be used to describe both. Full details will be found in "An Introduction to Splines For Use in Computer Graphics and Geometric Modelling", R H Bartels et al, especially at pages 211-245, published by Morgan Kaufmann, ISBN 0-934613-27-3.
  • Each control point is thus represented by data comprising positional data (x_i, y_i), representing the position within the area of the display of that control point, and tangent data (x_ei, y_ei, x_fi, y_fi) defining two tangent end points associated with the curved segments on either side of the control point.
  • the tangent extent point data (x_ei, y_ei, x_fi, y_fi) are stored as position data X, Y defining the position of the tangent end point. It would also be possible to store instead the x,y offsets from the control point position.
  • Complex curved lines can be represented by a number of such control points, two (at least) for each inflexion in the line.
  • the control points stored in the line table 122 each define, between adjacent points, a line segment described by a corresponding cubic equation; the control points are the points at which the parameter t in that equation takes the values 0 and 1.
  • intervening points in the line (e.g. POINT 2) are each effectively two control points and consequently have data defining two stored tangents.
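  • The line table just described might be laid out as below (Python; the field names are illustrative assumptions, not taken from the patent): each entry holds a control point position together with the two tangent end points used by the curve segments on either side:

        from dataclasses import dataclass
        from typing import List

        @dataclass
        class ControlPoint:
            x: float    # position of the control point
            y: float
            xe: float   # tangent end point for the preceding segment
            ye: float
            xf: float   # tangent end point for the following segment
            yf: float

        # A line table (cf. table 122): control points ordered along the
        # curve; each adjacent pair bounds one cubic Bezier segment.
        LineTable = List[ControlPoint]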
  • although the above described Bezier format is particularly convenient, other parametric ways of representing a curve by control points may be employed, such as the B-spline form, in which the curve control points are not required to lie upon the curve which they characterise.
  • supervisory display generating means 111 reads the control point data from the corresponding line table 122 in the memory 120, and calculates the values of intervening points on the curve. It then accesses the supervisory display image store 130b and sets the values of the corresponding image points therein, to cause display of the generated line on the monitor 160.
  • the line generating means 111 comprises the CPU 110 operating under control of a program stored in a program store area 129 of the memory 120.
  • alternatively, the supervisory display image store 130b may be unnecessary, the generating means 111 supplying vector information from the table 122 directly to the display, for example as a command in the "Postscript" graphics computer language.
  • Separate monitor devices 160a,160b could be provided, one for each of the supervisory display and generated display; for instance, the supervisory display monitor may be a monochrome personal computer monitor provided with the computer 100, and the monitor 160b for the generated image may be a high resolution colour monitor.
  • the computer 100 may be arranged to alternately select one of the supervisory display and generated image display for display on the monitor 160, by alternately connecting the frame stores 130a or 130b thereto. Normally, the supervisory display would be shown, except where it is desired to view the effect of editing an object in the generated image.
  • a single monitor 160 could be arranged to display both displays adjacent or one overlaying the other as a window.
  • the outputs of both the frame stores 130a, 130b may be connected so that the supervisory display overlies the generated image; in this case, the supervisory display may be indicated by dashed lines or in any other convenient manner so as not to be confusable with the generated image.
  • the position and tangent data for a pair of adjacent control points are read from the table 122, and the parameters a,b,c,d of equation 1 are derived therefrom.
  • a large number of intervening values of the parameter t between 0 and 1 are then sequentially calculated to provide x,y coordinates of intervening points along the line, and these are quantised to reflect the number of image points available in the supervisory display, and corresponding point data in the supervisory image store 130b are set.
  • the supervisory display generator 111 accesses the next pair of points in the table 122. This method is relatively slow, however; faster methods will be found in the above Bartels reference.
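  • The plotting method just described may be sketched as follows (Python; deliberately the slow, direct method of the text, reusing point_on_segment from the earlier sketch and modelling the supervisory store as a simple set of pixel coordinates):

        def plot_line(table, store, steps=256):
            """Rasterise each segment of a line table into 'store'."""
            for p, q in zip(table, table[1:]):
                x_ctrl = (p.x, p.xf, q.xe, q.x)  # Bezier control values, x
                y_ctrl = (p.y, p.yf, q.ye, q.y)  # Bezier control values, y
                for i in range(steps + 1):
                    t = i / float(steps)
                    x, y = point_on_segment(t, x_ctrl, y_ctrl)
                    # quantise to the image points of the supervisory display
                    store.add((int(round(x)), int(round(y))))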
  • the curve or path vector data held within the line tables 122 may have been stored therein from different sources.
  • the data may be read from a file within the mass storage (for example disk) device 180.
  • they could be derived by fitting a spline approximation to an input curve represented by data derived, for instance, from a scanner or from a user operated digitising pad and stylus.
  • a particularly preferred method of allowing the input and modification of the point data will now be described.
  • Editing may involve either modifying the trajectory of existing lines or (more rarely) adding new lines. It is therefore necessary both to amend the data held in the frame table 122, and desirably to amend the image data in the image store 130 so as to enable the user to view the effects of the change. It is found that the best way of providing the user with means for amending the frame data stored in the table 122 is to allow him to employ a position sensitive input device 170a, so as to appear to directly amend the displayed representation of the frame on the screen monitor 160.
  • a user manipulates the position sensing input device 170a, such as a mouse, by moving the device 170a so as to generate a signal indicating the direction and extent of the movement.
  • This signal is sensed by the device input/output controller 140, which provides a corresponding signal to a cursor position controller 112 (in practice, provided by the CPU 110 operating under stored program control) which maintains stored current cursor position data in x,y co-ordinates and updates the stored cursor position in accordance with the signal from the device input/output controller 140.
  • the cursor position controller 112 accesses the supervisory display store area 130b and amends the image data corresponding to the stored cursor position to cause the display of a cursor position symbol D on the supervisory display shown on the monitor 160. The user may thus, by moving the input device 170a, move the position of the displayed cursor position symbol D.
  • the supervisory display line generator 111 is arranged not only to write data corresponding to the line A into the supervisory display store 130b, but also to generate a display of the control point data. Accordingly, for each control point A1, A2, the supervisory image generator 111 writes data representing a control point symbol (for example, a small dark disc) into the image store 130b at address locations corresponding to the control point co-ordinates x,y.
  • the supervisory image generator 111 preferably, for each control point, correspondingly generates a second control point symbol E1, located relative to the point A1 along a line defined by the control point tangent data, at a length determined by the control point magnitude data; preferably, a line between the two points A1 and E1 is likewise generated to show the tangent itself.
  • when the user signals an intention so to do, the cursor position controller 112 supplies the current cursor position data to the table 122 as control point position co-ordinates, and the supervisory display generator 111 correspondingly writes data representing a control point symbol into the image store 130b at address locations corresponding to the control point co-ordinates.
  • the user then inputs tangent information, for example via the keyboard 170b, or in the manner described below.
  • the supervisory image generator 111 will correspondingly generate the line segment therebetween on the supervisory display by writing the intervening image points into the supervisory display store 130b.
  • a user manipulates the input device 170a to move the cursor position symbol D to coincide with one of the control point symbols A1 or E1 on the display 160.
  • the user then generates a control signal, for example by clicking the mouse input device 170a.
  • the device input/ output controller 140 responds by supplying a control signal to the cursor position controller 112.
  • the cursor position controller 112 supplies the cursor position data to a supervisory display editor 113, (comprising in practice the CPU 110 operating under stored program control) which compares the stored cursor position with, for each point, the point position (X,Y) and the position E of the end of the tangent.
  • the display editor 113 is thereafter arranged to receive the updated cursor position from the cursor controller 112 and to amend the point data corresponding to the point with which the cursor symbol coincides, so as to move that point to track subsequent motion of the cursor.
  • the supervisory display generator 111 regenerates the line segment affected by the control point in question within the supervisory display image store 130b so as to change the representation of the line on the supervisory display.
  • Figure 5b represents only lines, and corresponds to, for example, the output of an animation program or a PostScript (TM) page design program.
  • the objects a,b shown in Figure 5a correspond to the lines A,B shown in Figure 5b insofar as their general trajectory or path is concerned, but differ therefrom by displaying one or more of the following additional attributes:
  • Each object a,b in Figure 5a may be coloured, and the colour may vary along the line of each object.
  • the profile of colour across the width of the object may also be non-constant.
  • An object a,b may be given the appearance of a semitransparent object positioned in front of the background c, by providing that the colour of a part of the object a be influenced by the colour of the background c, to an extent determined by an opacity parameter varying between 0 (for a transparent or invisible object, the colour of which is entirely dictated by the colour of the background c) and unity (for an entirely opaque object, the colour of which does not depend on that of the background c).
  • the effect of the opacity of the object is significant when the object is moved, since parts of the object exhibiting some transparency will show an altered appearance depending upon the colour of the background c.
  • the objects a,b to be displayed are represented within the frame store 130a, in a similar form to that in which they would be represented by a computer painting system, that is, as an array of pixel data.
  • changing the representation of attributes in this form requires a very large amount of data processing, since a large number of pixel values must be amended. Further, it is not possible to change the position or shape of a line while leaving the other above listed attributes unaffected.
  • Rather than storing colour information for every image pixel in the object a or b, this embodiment accordingly stores information corresponding to attributes of the object associated with predetermined points along the lines shown in the supervisory display. The corresponding values at intervening points in the object are generated by an image generator device 114, shown in Figure 11 (comprising, in practice, the CPU 110 operating under stored program control), and the generated image data are stored at corresponding positions within the generated image store 130a for display on the generated image display on the monitor 160.
  • the generation of an image may be considered as taking place in two stages. Firstly, in response to operator commands, a control image is generated, control data is stored and a representation of this data is supplied to the supervisory store, allowing the data to be interactively viewed by the operator.
  • the operator works in relation to a Cartesian coordinate reference, the definition of which is user selectable up to the operating constraints of the system.
  • a suitable operating environment consists of a work station based around an Intel 486 microprocessor, providing 32 bit floating point arithmetic. Thus, any point position may be calculated in 32 bit floating point arithmetic, providing a very high spatial definition.
  • Upon start-up, a notional x/y frame is provided with coordinate locations separated by 1/72 inch points, providing 500 identifiable locations in both the x and y directions. To facilitate detailed work, this area may be zoomed by up to a factor of 64. However, these values are only arbitrarily defined and may be adjusted to suit a particular environment, if necessary.
  • x/y coordinates defining lines are recalculated, thereby retaining line smoothness irrespective of zoom magnification.
  • An object generated by the system may be considered as a brush stroke although, given the level of sophistication provided by the system, details such as colouring and transparency may be modified across the stroke and a whole object or a significant part of an object may be represented by a single stroke.
  • data representing a single stroke may be sufficient to define a complete cherry stalk, with colouring and shading creating the illusion of depth.
  • the extent of a stroke or object is defined by boundary lines.
  • Each boundary line is a Bezier curve, generated in response to control points.
  • a control point fixes the position of a Bezier curve within the coordinate frame.
  • Individual transformations may be performed on the control points, in response to user operations, or affine transformations may be performed on a collection of points.
  • Such modifications are stored by modifying data representing the Bezier descriptions and the actual lines resulting from these Bezier descriptions are redrawn on a frame by frame basis.
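  • An affine transformation of a collection of control points might be sketched as follows (our own naming; positions and tangent end points pass through the same mapping, after which the lines are simply redrawn from the modified Bezier descriptions):

        def transform_control_point(cp, m):
            """Apply the affine map m = (a, b, c, d, tx, ty), i.e.
            x' = a*x + b*y + tx and y' = c*x + d*y + ty."""
            a, b, c, d, tx, ty = m
            def f(x, y):
                return a * x + b * y + tx, c * x + d * y + ty
            cp.x, cp.y = f(cp.x, cp.y)      # the control point itself
            cp.xe, cp.ye = f(cp.xe, cp.ye)  # and both tangent end points
            cp.xf, cp.yf = f(cp.xf, cp.yf)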
  • Two lines are defined as boundary lines for an object by connecting them with a sectional line.
  • a sectional line may connect with a boundary line at a control point, thereby fixing the position of the sectional line or, alternatively, the connecting point need not be a control point and is referred to as an attribute point, in which case the sectional line will be modified in accordance with movements to the boundary line.
  • An example of a stroke is shown in Figure 12.
  • supervisory store 130b within frame store 130 ( Figure 3) is active and generates image signals which are supplied to monitor 160.
  • a cursor is displayed on monitor 160 which responds, spatially, to operation of the mouse 170a.
  • an operator identifies a desire to create a control point C1 and moves the mouse so that the cursor is displayed at the desired position of control point C1.
  • the x and y locations of the control point C1 are recorded in working memory, in a table of the form shown in Figure 6.
  • the operator now moves the mouse so that the cursor is displayed at the position required for a second control point C2 and on clicking the mouse again, point C2 is created.
  • a subroutine for generating Bezier curves is provided and on selecting the second control point C2, a call is made to this subroutine.
  • x/y coordinate positions along the Bezier curve are generated and this data is written to the supervisory store 130b, resulting in a boundary line B1 being displayed on the monitor 160.
  • boundary lines are drawn with a width of one pixel and are not anti-aliased.
  • this line is only a representation of the mathematical definition of the boundary line, which can be rendered at any desired definition.
  • control points C3 and C4 are generated, which are then connected by a boundary line B2, again a Bezier curve.
  • a two dimensional object is identified by connecting the boundary lines B1 and B2 together by a sectional line.
  • control point C1 is connected to control point C3 by a sectional line S1 and control point C2 is connected to control point C4 by a sectional line S2, thus defining a closed region or object C1, C2, C4, C3.
  • the sectional lines S1 and S2 are also Bezier curves and an additional line table, of the type shown in Figure 6, is required to define the position of such sectional lines.
  • a third sectional line S3 is generated between control points C5 and C6, again generated by placing the cursor, in response to movements of the mouse, at the required positions along the boundary lines B1 and B2.
  • control points C5 and C6 not only identify the position of sectional line S3 but also control the shape of the Bezier curve segments C1-C5, C5-C2, C3-C6 and C6-C4.
  • Sectional lines define attribute data consisting of colour and transparency.
  • the colour data stored comprises the colour value for each of a small number of points along the cross-section S3, and the image generating means 114 correspondingly generates the colour values at the intervening points by interpolation therebetween. Colour values are set at a colour control point.
  • the positions of the intervening points could be predetermined but are preferably selectable by the user, in which case an indication of the position along the sectional line is stored with each value.
  • the position data stored comprises a fraction of the distance along the sectional line.
  • Opacity or transparency data specifying, as a fraction, the relative dependence of the colour of the object a on the colour data for the object a relative to the colour data for the background, is likewise stored in the line table 122, corresponding to opacity control points H1, H2, in the same manner as described above for colour data, except that an opacity value is stored rather than a colour value. It is therefore possible to vary the degree of transparency of the object across its section, as well as along its length.
  • the image generator 114 is therefore arranged preferably to derive colour data values by interpolating between colour control points, and to do likewise to derive transparency values and, finally, to set the colours of image points stored in the generated image store 130a by reading the stored background colour and forming, for each image point, the interpolated colour value multiplied by the interpolated opacity value, together with the background colour value multiplied by unity less the interpolated opacity value.
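  • The rule just stated can be sketched as follows (Python; shown single-channel for brevity, though in practice it is applied to each of R, G and B; the names are our assumptions). Attribute values are interpolated between control points placed at fractional positions along a sectional line, and each image point is then formed from the interpolated colour and opacity together with the background:

        def interpolate_attribute(points, s):
            """points: (fraction_along_line, value) pairs sorted by
            fraction; returns the value interpolated at fraction s."""
            if s <= points[0][0]:
                return points[0][1]
            for (s0, v0), (s1, v1) in zip(points, points[1:]):
                if s0 <= s <= s1:
                    u = 0.0 if s1 == s0 else (s - s0) / (s1 - s0)
                    return v0 + u * (v1 - v0)
            return points[-1][1]

        def composite(colour_pts, opacity_pts, background, s):
            colour = interpolate_attribute(colour_pts, s)
            alpha = interpolate_attribute(opacity_pts, s)
            # interpolated colour times interpolated opacity, plus the
            # background colour times unity less the interpolated opacity
            return colour * alpha + background * (1.0 - alpha)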
  • To set up the attribute values for an object a to be displayed on the monitor 160, the user generates a control signal (typically by typing an appropriate command on the keyboard 170b, or by positioning the cursor symbol on a specified part of the screen of the monitor 160 and clicking the mouse), indicating that an attribute is to be input or added to the object.
  • the supervisory display editor 113 receives the cursor position from the cursor controller 112, and writes a corresponding attribute control point symbol into a corresponding position in the supervisory display image store 130b, which is consequently subsequently displayed on the monitor 160.
  • the stored cursor position indicating the position along the line at which the control point is placed by the user is then processed for storage in the attribute line data within the line table 122 in the memory 120.
  • the cursor position is not directly stored since, if the user subsequently repositioned the line as discussed above, the attribute control point would no longer lie on the line. Instead, an indication of the relative position along the line, between its two neighbouring curve control points, is derived and this indication is stored so that the position of the attribute control point is defined relative to the position of the line, regardless of subsequent redefinitions of the line position.
  • the value of t at the closest point on the line is derived, for example so as to minimise (x - x_t)^2 + (y - y_t)^2.
  • the value of the parameter t is then stored as an entry in the attribute data within the line table 122.
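  • Deriving the parameter t of the closest point on the line may be sketched by direct sampling (a simple illustration reusing point_on_segment from the earlier sketch; a practical implementation might refine the estimate, for instance by subdivision):

        def nearest_t(x, y, x_ctrl, y_ctrl, samples=1024):
            """Return the t minimising (x - x_t)**2 + (y - y_t)**2."""
            best_t, best_d = 0.0, float("inf")
            for i in range(samples + 1):
                t = i / float(samples)
                xt, yt = point_on_segment(t, x_ctrl, y_ctrl)
                d = (x - xt) ** 2 + (y - yt) ** 2
                if d < best_d:
                    best_t, best_d = t, d
            return best_t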
  • a profile generating means 118 (comprising, conveniently, the CPU 110 acting under stored program control) causes the display 160 to display the contents of a profile display store 130c (as, for example, a display "window" overlying the supervisory display).
  • the contents of the profile display store 130c comprise image data defining horizontal and vertical axes.
  • the display represents the profile of opacity across the brush, corresponding to a cross-section along the line A.
  • the horizontal axis represents position across the line A between the two lateral extents e1, e2.
  • the vertical axis represents opacity. Both axes are conveniently scaled between 0 and 1.
  • the cursor position controller 112 is arranged to write data into the profile display store 130c to cause the display of a cursor symbol D at a position therein defined by movements of the position sensitive input device 170a.
  • By positioning the cursor symbol at a point between the axes, and generating a control signal, the user signals an opacity value at a given distance across the object a, transverse to the line A.
  • the corresponding position between the extents e1, e2 and the opacity value thereat are derived by the profile generator 118 from the current cursor position supplied by the cursor tracker 112 and are written into the attribute data held within the line data store 122.
  • the profile generator 118 likewise causes the generation, at the current cursor position, of a point symbol. The cursor may then be repositioned, but the point symbol remains.
  • the profile generator 118 preferably calculates, by interpolation, the coordinates of image data within the profile display store corresponding to intervening points along an interpolated line between the points for which opacity data is stored, and sets the value of those image points within the profile display store 130c so that, when displayed on the display device 160, they represent the profile which would be followed at that point.
  • Generating a schematic cross-section display of this type is found to be of assistance to a user in visualising the transparency of, for example, an object corresponding to an airbrush stroke.
  • the interpolation performed by the profile generator 118 is preferably the same as that which will be performed by the image generator 114.
  • the line table 122 is dimensioned to allow storage of two attribute values for each such lateral position C2, C3; as shown in FIG 12, one value is used to perform interpolation to one neighbouring point and the other to the other neighbouring point.
  • a corresponding profile display could be provided to allow the input and amendment of other attributes; for instance, brightness (of a monochrome object) or colour (of a coloured object).
  • predefined attribute data specifying colour profiles and opacity profiles are also stored on the mass storage device 180 corresponding, for example, to particular paintbrushes or airbrushes, or to particular previously defined objects.
  • the user may enter an appropriate command (via the keyboard 170b) to read such predetermined data from the mass storage device 180 into the line data table 122.
  • the data stored for each attribute control point can specify all, or only a subset, of the available attributes; for instance, it may be used to specify only colour or opacity.
  • the variations across the object a of these attributes may be separately controlled, and independently edited.
  • the default predetermined value assigned to each attribute is a flag indicating that the attribute is not set.
  • additional sectional lines S4 and S5 are generated, again in the form of Bezier curves.
  • the points at which the sectional lines S4 and S5 intersect with boundary lines B1 and B2 are not control points, in that they do not define the shapes of B1 and B2.
  • Such attribute points A1, A2 are selected from the menu and positionally located by means of the mouse, as previously described.
  • Before sectional line S4 is created, it would be common for attributes to have been set for sectional lines S1 and S3. After S4 is created, attributes of colour and transparency are generated for sectional line S4 by interpolating values between sectional lines S1 and S3. Thereafter, the colour and transparency values set along line S4 may be modified by the operator.
  • the second stage may be initiated in order to render the stroke into a full colour image stored in the image store 130a of Figure 3.
  • Sectional lines S1, S4, S3, S5 and S2 define regions R1, R2, R3 and R4.
  • the first step is to determine the colour and opacity profile for each of these regions.
  • Each region between each pair of adjacent sectional lines can be rendered independently and region R2 is detailed in Figure 13.
  • the region R2 is divided into a plurality of strips SS1, SS2, SS3 etc between sectional lines S4 and S3.
  • the strips SS1 etc are defined by forming additional sectional lines, generated by interpolating sectional lines S4 and S3.
  • the additional sectional lines are interpolated by transforming sectional lines S4 and S3 so that their end points are at, for example, (0,0) and (0,1) in Cartesian space.
  • the curve is translated so that its first end point lies at the origin.
  • the translated curve is then rotated about the origin so that its other end point lies along the y axis.
  • the curve is then scaled to bring the other end point to coordinate (0,1).
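  • The translate, rotate and scale steps just listed may be sketched as follows (Python; 'points' is assumed to be a polyline sampling of the sectional line, with its end points first and last):

        import math

        def normalise(points):
            """Map a sectional line so its end points lie at (0,0) and (0,1)."""
            x0, y0 = points[0]
            moved = [(x - x0, y - y0) for x, y in points]  # translate to origin
            x1, y1 = moved[-1]
            theta = math.atan2(x1, y1)  # angle of the far end from the +y axis
            c, s = math.cos(theta), math.sin(theta)
            rotated = [(x * c - y * s, x * s + y * c) for x, y in moved]
            r = math.hypot(x1, y1)      # distance between the end points
            return [(x / r, y / r) for x, y in rotated]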
  • Strip SS2 is shown in greater detail in Figure 14.
  • the strip SS2 is defined by two additional sectional lines ASL1 and ASL2.
  • Additional sectional lines ASL1 and ASL2 have no opacity and colour profiles, these parameters being generated by interpolating the values stored for sectional lines S4 and S3.
  • the strip is now divided into quadrilaterals Q1, Q2, Q3, Q4 and Q5 as shown in Figure 14.
  • the colour and opacity of each corner of each quadrilateral is determined, from which colour and opacity values for each pixel within a quadrilateral may be calculated by linearly interpolating these values.
  • the strip is divided into quadrilaterals by placing notional control points along lines ASL1 and ASL2. Such control points are positioned at places where colours are defined, and then additional points are identified such that adjacent points may be connected by a straight line without crossing pixel boundaries. Thus, a quadrilateral may be formed if its edges do not cross a pixel boundary.
  • smaller quadrilaterals are formed such that, for example, straight lines defining quadrilaterals may be drawn to a definition of one eighth of the pixel spacing.
  • Pixels falling at the boundary of a quadrilateral are rendered separately, so that contributions from both quadrilateral regions may be combined before the image is placed against a background.
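  • Shading within one quadrilateral can then be sketched as bilinear interpolation of the four corner values (a hedged sketch; the corner ordering and local (u, v) coordinates are our assumptions):

        def bilinear(c00, c10, c01, c11, u, v):
            """Interpolate corner values at fractional position (u, v),
            with c00, c10, c01, c11 at (0,0), (1,0), (0,1) and (1,1)."""
            top = c00 + u * (c10 - c00)
            bottom = c01 + u * (c11 - c01)
            return top + v * (bottom - top)

        # Colour and opacity at each corner are interpolated in this way
        # for every pixel; pixels cut by a quadrilateral edge are rendered
        # separately and the contributions combined before the image is
        # placed against the background.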
  • the attribute data could be arranged to store attributes other than colour and opacity.
  • although linear interpolation is performed here, other forms of interpolation, such as Hermite interpolation, could be employed if required.

Abstract

An image processing apparatus is disclosed for generating visible output images, including visually distinct objects. A processor generates displayable image data by writing data to a frame store which is repeatedly read therefrom. Input means, preferably in the form of a mouse, supplies operator defined input signals to the processing means for defining boundary lines (B1, B2) of an object and a sectional line (S1), which identifies two boundary lines as defining the object. In addition, storage means are provided for storing data generated in response to the input signals and for storing attribute data defining visual attributes of the objects.

Description

IMAGE SYNTHESIS AND PROCESSING
Field of the Invention
This invention relates to an image processing apparatus for generating visible output images, including visually distinct objects. This invention also relates to a method of generating visible output images, including visually distinct objects. Image processing methods and apparatus are disclosed in copending PCT application nos. GB91/02122, GB91/02124, GB92/( 5130699), assigned to the present Assignee, and are included herein as part of the present disclosure.
Description of Background Art
Traditional tools available to graphic artists include pencils and pens (generally producing substantially uniform lines) and brushes or airbrushes, producing a controllably non- uniform line, variable by varying the pressure and/or speed applied to the tool.
In addition to being represented in traditional form, images may now be represented in electronic form, in which the image is displayed on a cathode ray tube device or similar apparatus. The cathode ray itself may be manipulated in response to vectors which trace the outline of an object, producing images which may be referred to as skeletons or wire frames. Vector systems of this type have advantages in that they are memory efficient and may generate image data at a definition which is independent of the definition of the display device. Alternatively, images may be produced on a cathode ray tube by raster scanning techniques, in which the whole of the screen is scanned periodically and the intensity of the cathode ray is modified in response to image data. Raster scanning in this way is employed in television systems and raster scan monitors are readily available. An advantage of the raster scanning approach is that pixel values may be stored in a frame store representing not only the outline of the image but also the overall colour and texture of the image. Thus, very realistic images may be produced and the amount of storage required to store an image is not dependent upon the level of detail within the image itself. Images stored in framestores in forms compatible with television standards are attractive to the video industry and allow artists to generate graphics for use in television programs. Systems of this type are disclosed in British patent 2059625, which relates to machines manufactured and sold by Quantel Limited under the trade mark "PAINTBOX". The art of manipulating pixel values within a framestore is commonly referred to as "video graphics".
With a video graphics system, an operator selects the characteristics of the graphics tool he wishes to imitate and then manipulates a pressure sensitive stylus over a digitising pad to input a desired line. As the stylus is moved over the tablet, the apparatus senses the stylus position and the pressure applied thereto, reads image data from a corresponding mapped area of an image store (e.g. a frame buffer), modifies the data in accordance with the sensed pressure, and writes it back into the store. The system is arranged and intended to simulate conventional graphics tools, such as pencil, paintbrush or airbrush, and the artist exerts control over the parameters of the line "drawn" in the image store in the same way, so that the width and other attributes of the line are controlled as the stylus moves. Thus, the stored image data comprises a direct representation of the line itself, corresponding to a manually painted line. A problem with video graphics systems is that much of the image manipulation can only be done in response to operator commands. The image data is stored in a form which can be understood by a human operator and not in a form which can be easily understood by a machine. Consequently, machine manipulation of the image data is difficult.
In computer graphics, as distinct from video graphics, image data is stored in machine readable form and final images are produced on a frame by frame basis in a process known as rendering. Complex computer graphics algorithms are known, capable of producing very impressive images. However, the rendering process may take several hours to complete, even when running on very fast machines. Such a situation may be acceptable when producing glossy one off images but in many applications such a demand on computer time is unacceptable and cannot compete with human operation.
An example of an operation in which a large number of images must be produced at reasonable cost is animation. Conventionally, an animator produces a series of key frames as very rough pencil sketches. The animator's skill is that of being able to work in the temporal domain. Other artists are then required to complete each of the key frames and, furthermore, to create inbetween frames, thereby providing sufficient frames such that, when shown as a sequence at sufficient speed, smooth movement of characters is perceived by a viewer.
It is known in computer graphics to represent objects as parametric curves, the curve shape being specified and controlled by data representing the positions of points on the curve and the tangents thereat, as disclosed in, for example, "Interactive Computer Graphics", P Burger and D Gillies, 1989, Addison Wesley, ISBN 0-201-17439-1.
In "Hairy Brushes", Strassman, 1986 Siggraph Conference Proceedings (Vol 20, No 4, Page 225-232), a system for emulating paintbrushes of a particular kind is described in which a brush stroke is first defined by data specifying the linear trajectory of the brush (as point positions and tangents), and the pressure applied to the brush (which in turn specifies the width of the stroke normal to the line running along the trajectory), and then the colour profile laterally across the brush stroke is specified by the user defining a profile of individual bristle colours laterally across the brush stroke. It is suggested that profiles could be defined at the start and end of the stroke, and the colour profile along the stroke be interpolated from the end values.
As that system is intended to simulate particular types of existing brush, it makes a distinction between properties of the stroke (its trajectory and its pressure-dictated width) and those of the brush (its colour profile).
WO84/02993 shows a system for generating images, in which an image path is dictated by Bezier control points. It is possible to vary the width of the entire stroke defined by the path, as a whole, but not to provide a varying width along the stroke; the object of that proposal is to create a stroke of uniform width.
US 4897638 shows an image processing system in which it is possible to specify a varying stroke width, but only at the control points used to specify the curvature of the path.

Summary of the Invention
According to a first aspect of the present invention, there is provided an image processing apparatus for generating visible output images, including visually distinct objects, characterised by processing means for generating displayable image data; input means for supplying operator defined input signals to the processing means, said input signals defining boundary lines for said objects and a sectional line identifying two boundary lines as defining an object; and storage means for storing data generated in response to said input signals and for storing attribute data defining visual attributes of the object.

In a preferred embodiment, the processing means includes a programmable processing unit and video frame storage means, wherein said displayable image data are generated by repeatedly reading pixel data from the frame storage means. Preferably, the memory means stores data relating to the position of selected points defining points on boundary lines and sectional lines, and the processing means calculates the position of the boundary lines and the sectional lines in response to data read from the memory means.

In a preferred embodiment, a plurality of sectional lines connect the boundary lines, each sectional line having operator defined attribute data associated therewith. Preferably, end sectional lines connect the ends of the boundary lines defining a closed object. The attribute data may define the colour and transparency of an object, each of which is variable across a sectional line. Preferably, pixel values within an object area are rendered by interpolating attribute data between sectional lines.

The position of boundary lines may be calculated at a greater definition than that of pixel positions within the frame storage means, and the values accorded to pixels at boundaries may depend upon the degree to which a pixel region is occupied by an object. The definition of the boundary lines may be eight times that of the pixel positions, providing smooth anti-aliased lines after a rendering process.
According to a second aspect of the present invention, there is provided a method of generating visible output images, including visually distinct objects, characterised by the steps of supplying operator defined input signals to a processing means, defining boundary lines and a sectional line, said sectional line identifying the boundary lines as defining an object, storing data generated in response to said input signals, storing attribute data defining visual attributes of the object, and generating displayable image data in response to said stored data.
Preferably, the boundary lines are defined as Bezier curves and data is stored representing the position of the fixed ends of the curves, along with tangent points.
The sectional lines may be straight lines. Alternatively, said sectional lines may also be Bezier curves, defined by data similar to that for defining the boundary lines.
Other aspects and preferred embodiments of the invention will be apparent from the following description and claims.
Brief Description of the Drawings
The invention will now be illustrated, by way of example only, with reference to the accompanying drawings, in which:
Figure 1a shows a curved line, Figures 1b and 1c show straight line approximations to the curved line of Figure 1a, and Figures 1d and 1e show Bezier curves;
Figure 2a shows the effect of modifying the length of a tangent vector to a Bezier curve, and Figure 2b shows the effect of modifying the direction of a tangent vector for a Bezier curve;
Figure 3 shows an image processing apparatus, including processing elements, storage elements and interface elements;
Figure 4 schematically shows the arrangement of data in a memory forming part of the apparatus shown in Figure 3;
Figures 5a and 5b show displays produced by the apparatus shown in Figure 3;
Figure 6 shows a schematic representation of data stored in a memory forming part of the apparatus shown in Figure 3;
Figure 7 shows a schematic representation of the functional elements of an apparatus for generating a display;
Figure 8 shows a schematic representation of the process by which the apparatus produces a display;
Figure 9 shows a schematic representation of the functional elements of the apparatus of Figure 3, for allowing the input of data to produce a display;
Figure 10 shows a process performed by the apparatus for receiving input data;
Figure 11 shows a schematic representation of the functional elements of the apparatus for producing a display;
Figure 12 illustrates boundary lines and sectional lines defining the position of an object, including a plurality of regions;
Figure 13 details one of the regions shown in Figure 12, including a plurality of strips; Figure 14 details one of the strips shown in Figure 13;
Figure 15 shows a schematic representation of an arrangement forming part of the apparatus; and,
Figure 16 schematically shows a process performed by the apparatus.

Parametric curves are referred to in, for example, "Interactive Computer Graphics", P Burger and D Gillies, 1989, Addison Wesley, ISBN 0-201-17439-1, and "An Introduction to Splines for Use in Computer Graphics and Geometric Modelling", by R H Bartels, J C Beatty and B A Barsky, published by Morgan Kaufmann, ISBN 0-934613-27-3 (both incorporated herein by reference).
Referring to Figure 1a, a fairly smooth freehand curve is shown. Referring to Figure 1b, one way of representing the curve would be to draw a series of straight line segments, meeting at points. However, the number of straight line segments has to be large, as illustrated in Figure 1c, before the simulation is at all convincing. Alternatively, the curve may be represented as a series of curve segments running between points. If, as in Figure 1d, adjacent curve segments have the same slope at the point at which they join, the curve can be made smooth.
One known type of curve approximation technique employs a cubic curve, in which the coordinate variables x and y are each represented as a third order (cubic) polynomial of some parameter t. Commonly, the value of the parameter is constrained to lie between 0 and 1. Thus, each curve segment is described as:
x = ax·t³ + bx·t² + cx·t + dx (1)
y = ay·t³ + by·t² + cy·t + dy (2)
Each segment has two end points, at which t = 0 and t = 1. The coordinates of the t = 0 end point are therefore x0 = dx, y0 = dy, and those of the t = 1 point are given by:

x1 = ax + bx + cx + dx (3)
y1 = ay + by + cy + dy (4)
At the end points, the slope of the curved segment is also fixed or predetermined so that each segment can be matched to its neighbours to provide a continuous curve, if desired.
The shape of the curve between the end points is defined by the slopes at the end points and by a further item of information at each point, which is conveniently visualised as the length of a tangent vector at each point. The curve between the two points may be thought of as clamped at its end points, with fixed slopes thereat, while each tangent vector exercises a pull on the direction of the curve, proportional to its length, so that, if the tangent vector is long, the curve tends to follow the tangent over much of its length. The tangent vectors may be derived from the above equations (1)-(4) and vice versa; for example, where the end of the Bezier tangent vector at the t = 0 point has coordinates x2,y2, and that at the t = 1 point has coordinates x3,y3, the coefficients a, b, c, d are given by:

dx = x0 (and likewise dy = y0) (5)
bx = 3(x0 - 2x2 + x3) (and likewise by) (6)
cx = 3(x2 - x0) (and likewise cy) (7)
ax = 3x2 - x0 - 3x3 + x1 (and likewise ay) (8)
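Equations (1) to (8) translate directly into code. The following Python fragment is an illustrative sketch only (the function names are assumptions): it computes the cubic coefficients from the two curve end points and the two Bezier tangent end points, and evaluates equations (1) and (2) at a given parameter value.

```python
# Sketch translating equations (1)-(8) into code; the function names are
# assumptions.  Following the text, (x0, y0) and (x1, y1) are the t = 0
# and t = 1 end points; (x2, y2) and (x3, y3) are the corresponding
# Bezier tangent end points.

def bezier_coefficients(end0, end1, tan0, tan1):
    (x0, y0), (x1, y1), (x2, y2), (x3, y3) = end0, end1, tan0, tan1
    ax = 3*x2 - x0 - 3*x3 + x1     # equation (8)
    bx = 3*(x0 - 2*x2 + x3)        # equation (6)
    cx = 3*(x2 - x0)               # equation (7)
    dx = x0                        # equation (5)
    ay = 3*y2 - y0 - 3*y3 + y1     # and likewise for y
    by = 3*(y0 - 2*y2 + y3)
    cy = 3*(y2 - y0)
    dy = y0
    return (ax, bx, cx, dx), (ay, by, cy, dy)

def point_at(coeffs, t):
    (ax, bx, cx, dx), (ay, by, cy, dy) = coeffs
    x = ax*t**3 + bx*t**2 + cx*t + dx   # equation (1)
    y = ay*t**3 + by*t**2 + cy*t + dy   # equation (2)
    return x, y

coeffs = bezier_coefficients((0, 0), (3, 0), (1, 2), (2, 2))
assert point_at(coeffs, 0.0) == (0.0, 0.0) and point_at(coeffs, 1.0) == (3.0, 0.0)
```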
The differential of the curve equation with respect to the variable t is:

c + 2b·t + 3a·t² (9)

The differential values at the t = 0 and t = 1 points are, respectively:

3(x2 - x0) = cx; 3(y2 - y0) = cy;
3(x1 - x3) = cx + 2bx + 3ax;
3(y1 - y3) = cy + 2by + 3ay
From these equations, by inspection, it will be seen that the length of the tangent to the Bezier control points (x2,y2), (x3,y3) is 1/3 that of the actual tangent vector. Although the actual tangent vector could be employed, it is mathematically more convenient to employ the Bezier tangent vector (which has the same direction but 1/3 the magnitude).
In the so-called Hermite form of a cubic equation, the data used to define a curve segment are the coordinates of the end points, the slope of the tangent vector at each end point and the length of each tangent vector. In the Bezier format, the data used to define a curve segment are the coordinates of the end points and the coordinates of the end of each tangent vector. Conversion between the Hermite and Bezier formats is merely a matter of converting between polar and rectangular coordinates.
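That polar/rectangular conversion can be sketched as follows; the function names are assumptions, and the length handled here is taken to be the Bezier tangent length (the 1/3 scaling noted above would apply if the actual tangent magnitude were stored instead).

```python
import math

# Sketch of the polar/rectangular conversion between the Hermite form
# (tangent angle and length at an end point) and the Bezier form
# (tangent end point coordinates).  Function names are assumptions.

def hermite_to_bezier(end_point, angle, length):
    """Tangent end point from an end point plus tangent angle and length."""
    x, y = end_point
    return (x + length * math.cos(angle), y + length * math.sin(angle))

def bezier_to_hermite(end_point, tangent_end):
    """Tangent angle and length from an end point and its tangent end."""
    x, y = end_point
    ex, ey = tangent_end
    return (math.atan2(ey - y, ex - x), math.hypot(ex - x, ey - y))

angle, length = bezier_to_hermite((0.0, 0.0), (1.0, 1.0))
assert math.isclose(angle, math.pi / 4) and math.isclose(length, math.sqrt(2))
```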
Figure 2a shows the effect of varying the magnitude or lengths of the tangent vectors, while keeping their angle constant. It will be seen that the effect is to "pull" the curve towards the tangent vector, more or less strongly depending on the length of the tangent vector.
Figure 2b shows the effect of varying the angle of the tangent vector while keeping its magnitude fixed.
Other types of cubic curve are known, for example the B-spline, which is defined by two end points and a plurality of intervening control points through which the curve does not pass. However, the Bezier curve description is attractive because it is relatively easy to manipulate. For example, in matching an approximated curve to an existing curve, the coordinates and tangent angles at points along the curve can be directly measured and employed. The PostScript command language used to control many laser printers employs this curve description, accepting values defining the coordinates of curve segment end points and the coordinates of corresponding tangent end points.
In general, a smooth curve is defined by a number of such end points, and two adjacent segments will share a common end point. If the curve is to be smooth, the tangent angles defined at the end point in relation to each curve segment will be equal, although the tangent vector lengths will in general not be equal.
However, as shown in Figure 1e, it is possible to represent a line with a curvature discontinuity by providing that the tangent angle at an end point is different for each of the two segments that it defines.
A significant advantage of this form of curve representation is that a smooth, bold curve can be defined using only a small number of coefficients or control points, and parts of it can be amended without extensive recalculation of the whole line.
Referring to Figure 3, apparatus according to an embodiment of the invention comprises a computer 100 comprising a central processing unit 110, a memory device 120 for storing the program sequence for the CPU 110 and providing working read/write memory, a frame store 130 comprising a series of memory locations each associated with, or mapped to, a point in an image to be generated or processed, and an input/output controller 140 providing input and output ports for reading from and writing to external devices, all intercoupled through common parallel data and address buses 150. A monitor 160 is connected to the computer 100 and its display is updated from the frame store 130 under control of the CPU 110. At least one user input device 170a, 170b is provided; typically a keyboard 170b for inputting commands or control signals for controlling peripheral operations such as starting, finishing and storing the results of an image generation or image processing operation, and a position sensitive input device 170a such as, in combination, a stylus and digitising tablet, or a "mouse", or a touch sensitive screen on the monitor 160, or a "trackerball" device or a joystick. A cursor symbol is generated by the computer 100 for display on the monitor 160 in dependence upon the signal from the position sensitive input device 170a to allow a user to inspect an image on the monitor 160 and select or designate a point or region of the image during image generation or processing.
A mass storage device 180 such as, for instance, a hard disk device is preferably provided as a long term image store, since the amount of data associated with a single image stored as a frame at an acceptable resolution is high. Preferably, the mass storage device 180 also or alternatively comprises a removable medium storage device such as a floppy disk drive, to allow data to be transferred into and out of the computer 100.
Also preferably provided, connected to input/output device 140, is a printer 190 for producing a permanent visual output record of the image generated. The output may be provided on a transparency or on a sheet of paper.
A picture input device 195 such as a scanner for scanning an image on, for example, a slide, and inputting a corresponding video image signal to the computer 100 may also be provided.

One example of a suitable computer 100 is the NeXTCUBE computer including the NeXTdimension colour board, available from NeXTComputer, Inc., USA. This arrangement provides direct formatted outputs for connection to a videocassette recorder or other video storage device, and accepts video input signals. Further, it includes means for compressing images for storage on the disk store 180, and for decompressing such stored images for display.

Referring to Figure 4, the frame store device 130 comprises a pair of image stores 130a, 130b. The image store 130a stores the image point, or pixel, data for the image to be generated or processed. The second area, 130b, stores a supervisory or control image displayed during generation or processing of the image stored in the store 130a. The supervisory image may be represented at a lower resolution than the generated image and/or in monochrome, and hence the image store 130b may be smaller than the store 130a.
Referring to Figure 5a, the appearance of the contents of the generated image store 130a, when displayed on the monitor 160 or output by the printer 190, comprises, as shown, objects a, b, each having a trajectory, and a background c; the objects and background also possess colour (or, in a monochrome system, brightness).
The system of the preferred embodiment is arranged to generate visible output images, including visually distinct objects. These objects may be perceived as representing strokes of a graphic implement, such as a pen, pencil, brush or air brush. However, the strokes or objects may have sophisticated characteristics, such as variations in colour across their widths, which are not attainable by a single stroke of a traditional implement.
Image data may be full colour pixel related data displayable on a video monitor. The pixel data is held in frame store 130a and the definition of this data, that is to say, the number of pixel locations present within frame store 130a, is dictated by the definition required for the final output image. Thus, image data may be produced for broadcast television purposes, or for high definition television or film, requiring higher levels of definition. The processing system for generating displayable image data also includes a programmable processor for generating the displayable image data in response to operator defined data.
In order to generate operator defined data, an input device is provided, which may consist of a mouse, a trackerball or a stylus and touch tablet combination, for example, or any other suitable user interface for defining two dimensional coordinate locations.
The operator does not define image data directly, unlike with the aforesaid video graphics equipment; instead, a system is provided for defining objects in a machine readable form by defining boundary lines and a sectional line. Display data relating to boundary lines and sectional lines are stored in a supervisory frame store, 130b, which typically has a lower definition than the image store 130a.
In use, data from both stores are displayed on a monitor viewed by the operator. Each image may be selected independently or, if required, one image may be overlaid with the other. Boundary lines define the outer edge of an object and a sectional line connects boundary lines, thereby identifying the connected boundary lines as outlining an object.
In addition to frame store 130b, for storing positional data generated by an operator, storage is also provided for storing attribute data, which defines visual attributes of the object. Thus, these attributes define the colour of the object and its transparency. Colour and transparency values are then generated for each pixel location, therefore image store 130a has pixel locations of sufficient depth for storing colour and transparency. Colour in the image store may be stored as luminance plus colour difference signals or, preferably, as full colour RGB signals.
Data stored in supervisory store 130b define the supervisory image, which may be represented in a plurality of colours, although full colour storage is not required.
The contents of the generated image frame store 130a therefore comprise a plurality of point data defining colour and/or intensity of each of a plurality of points to be displayed to form the display shown in FIG 5A, for example, 500 × 500 image point data, each comprising colour or brightness information as a multi-bit digital number. Typically, several bits representing each of Red (R), Green (G) and Blue (B) are provided. Preferably, the frame store 130a is of the type which stores additionally a transparency value for each image point to allow the generated image to be merged with another image. The address within the store 130a of given point data is related, or mapped, to its position in the display of FIG 5A, which will hereafter be referred to in X (horizontal position) and Y (vertical position) Cartesian co-ordinates. Likewise, the contents of the supervisory image store 130b comprise point data for a plurality of points making up the image of FIG 5B; in this case, however, the display may comprise only a monochrome line and the point data may for each point merely comprise a single bit set to indicate either a dark point or a light point.
Referring to FIG 5B, a line shown on the supervisory display is therefore represented by a plurality of pixel values at corresponding X,Y positions within the supervisory image store area 130b. However, this representation of the line is difficult to manipulate if the line is to be amended. A second representation of the line is therefore concurrently maintained in the working memory area 121 of the memory device 120. This representation comprises a plurality of data defining the curve in vector form. Conveniently, the curve is represented by the position of points ("control points") between which intervening curve values can be derived by calculation.
In this embodiment of the invention, display frames, consisting of line drawings of objects, are created and/or edited with reference to stored control point data, preferably data stored in the Bezier format referred to above. In other words, a stored representation of a display frame comprises a plurality of control points which define line segments which make up a line representation.
For each line A, B, a table 122 is provided in the working memory 121 storing the control point data for that line, as shown in FIG 4. Conveniently, the curve connecting the points is a spline curve, as discussed above, and preferably a Bezier curve; it is therefore defined by

x = ax·t³ + bx·t² + cx·t + dx
y = ay·t³ + by·t² + cy·t + dy

where a, b, c, d are constants and t is a parameter allocated values between 0 and 1.
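Such a line table might be modelled as below. This is an illustrative sketch only; the field names and the use of Python dataclasses are assumptions, not the layout of the actual table 122.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

# Sketch of a line table of the kind described above: each control point
# stores its position and the two tangent end points associated with the
# curve segments on either side.  Field names are illustrative only.

Point = Tuple[float, float]

@dataclass
class ControlPoint:
    position: Point        # (xi, yi)
    tangent_before: Point  # (xei, yei): tangent end for the preceding segment
    tangent_after: Point   # (xfi, yfi): tangent end for the following segment

@dataclass
class LineTable:
    points: List[ControlPoint] = field(default_factory=list)

    def segments(self):
        """Yield (start, end, start tangent end, end tangent end) per segment."""
        for p, q in zip(self.points, self.points[1:]):
            yield p.position, q.position, p.tangent_after, q.tangent_before
```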
In the Bezier curve format the data are stored as a plurality of point data, each comprising point x,y co-ordinates, data representing the slope of the tangent to the curve at those coordinates, and a tangent magnitude parameter indicating (broadly speaking) the extent to which the curve follows the tangent. This format is used, for example, in the control of laser printer output devices. The data may be stored as point coordinates x and y and tangent angle and length (the Hermite form), but are conventionally and conveniently stored as point coordinates x and y and tangent end coordinates. In the following, 'Bezier' format will be used to describe both. Full details will be found in "An Introduction to Splines For Use in Computer Graphics and Geometric Modelling", R H Bartels et al, especially at pages 211-245, published by Morgan Kaufmann, ISBN 0-934613-27-3.
Each control point is thus represented by data comprising positional data (xi,yi) representing the position within the area of the display of that control point, and tangent data (xei,yei, xfi,yfi) defining two tangent end points associated with the curved segments on either side of the control point. The tangent end point data (xei,yei, xfi,yfi) are stored as position data X, Y defining the position of the tangent end point; it would also be possible to store instead the x,y offsets from the control point position. Complex curved lines can be represented by a number of such control points, two (at least) for each inflexion in the line. The control points stored in the line table 122 each define, between adjacent points, a line segment described by a corresponding cubic equation, and are the values at which the parameter t in that equation is 0 and 1. As intervening points in the line (e.g. POINT 2) play a part in defining two neighbouring line segments, each is effectively two control points and consequently has data defining two stored tangents. Although the above described Bezier format is particularly convenient, other parametric ways of representing a curve by control points may be employed, such as the B-spline form, in which the curve control points are not required to lie upon the curve which they characterise.
Referring to FIG 7, to generate the supervisory display shown in FIG 5B, supervisory display generating means 111 reads the control point data from the corresponding line table 122 in the memory 120, and calculates the values of intervening points on the curve. It then accesses the supervisory display image store 130b and sets the values of the corresponding image points therein, to cause display of the generated line on the monitor 160. In practice, the line generating means 111 comprises the CPU 110 operating under control of a program stored in a program store area 129 of the memory 120. If the display device onto which the supervisory display is to be shown is arranged to accept a vector input signal, the supervisory display image store 130b is unnecessary and the generating means 111 supplies vector information from the table 122 to the display, for example as a command in the "PostScript" graphics computer language.
Separate monitor devices 160a, 160b could be provided, one for each of the supervisory display and the generated display; for instance, the supervisory display monitor may be a monochrome personal computer monitor provided with the computer 100, and the monitor 160b for the generated image may be a high resolution colour monitor. Alternatively, the computer 100 may be arranged to alternately select one of the supervisory display and generated image display for display on the monitor 160, by alternately connecting the frame stores 130a or 130b thereto. Normally, the supervisory display would be shown, except where it is desired to view the effect of editing an object in the generated image.
Alternatively, a single monitor 160 could be arranged to display both displays adjacent or one overlaying the other as a window. In a further alternative, the outputs of both the frame stores 130a, 130b may be connected so that the supervisory display overlies the generated image; in this case, the supervisory display may be indicated by dashed lines or in any other convenient manner so as not to be confusable with the generated image.
The general flow of operation in generating the path lines shown in the supervisory display on the display device 160, from the data held in the table 122, is shown in Figure 8.
In one method of generating the line, the position and tangent data for a pair of adjacent control points are read from the table 122, and the parameters a, b, c, d of equation (1) are derived therefrom. A large number of intervening values of the parameter t between 0 and 1 are then sequentially calculated to provide x,y coordinates of intervening points along the line; these are quantised to reflect the number of image points available in the supervisory display, and the corresponding point data in the supervisory image store 130b are set. Once all intervening points between that pair of control points have been calculated, the supervisory display generator 111 accesses the next pair of points in the table 122. This method is relatively slow, however; faster methods will be found in the above Bartels reference. A sketch of this simple method is given below.

The curve or path vector data held within the line tables 122 may have been stored therein from different sources. For instance, the data may be read from a file within the mass storage (for example disk) device 180. Alternatively, they could be derived by fitting a spline approximation to an input curve represented by data derived, for instance, from a scanner or from a user operated digitising pad and stylus. However, a particularly preferred method of allowing the input and modification of the point data will now be described.
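Returning to the simple point-plotting method referred to above, it might be sketched as follows; the store interface (a set of quantised points) and the step count are assumptions.

```python
# Sketch of the simple point-plotting method described above: for a pair
# of adjacent control points, the parameter t is stepped through many
# values between 0 and 1, each (x, y) is quantised to the resolution of
# the supervisory display, and the corresponding point is set.

def plot_segment(store, coeffs, steps=200):
    (ax, bx, cx, dx), (ay, by, cy, dy) = coeffs
    for i in range(steps + 1):
        t = i / steps
        x = ax*t**3 + bx*t**2 + cx*t + dx    # equation (1)
        y = ay*t**3 + by*t**2 + cy*t + dy    # equation (2)
        store.add((round(x), round(y)))      # quantise to image points

store = set()
plot_segment(store, ((0, 0, 8, 0), (0, 0, 0, 4)))   # a straight segment
```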
Editing may involve either modifying the trajectory of existing lines or (more rarely) adding new lines. It is therefore necessary both to amend the data held in the frame table 122, and desirably to amend the image data in the image store 130 so as to enable the user to view the effects of the change. It is found that the best way of providing the user with means for amending the frame data stored in the table 122 is to allow him to employ a position sensitive input device 170a, so as to appear to directly amend the displayed representation of the frame on the screen monitor 160. In this embodiment, a user manipulates the position sensing input device 170a, such as a mouse, by moving the device 170a so as to generate a signal indicating the direction and extent of the movement. This signal is sensed by the device input/output controller 140, which provides a corresponding signal to a cursor position controller 112 (in practice, provided by the CPU 110 operating under stored program control) which maintains stored current cursor position data in x,y co-ordinates and updates the stored cursor position in accordance with the signal from the device input/output controller 140. The cursor position controller 112 accesses the supervisory display store area 130b and amends the image data corresponding to the stored cursor position to cause the display of a cursor position symbol D on the supervisory display shown on the monitor 160. The user may thus, by moving the input device 170a, move the position of the displayed cursor position symbol D.
In a preferred embodiment, the supervisory display line generator 111 is arranged not only to write data corresponding to the line A into the supervisory display store 130b, but also to generate a display of the control point data. Accordingly, for each control point A1, A2, the supervisory image generator 111 writes data representing a control point symbol (for example, a small dark disc) into the image store 130b at address locations corresponding to the control point co-ordinates x,y. Further, the supervisory image generator 111 preferably, for each control point, correspondingly generates a second control point symbol E1 located relative to the point A1 along a line defined by the control point tangent data, at a length determined by the control point magnitude data; preferably, a line between the two points A1 and E1 is likewise generated to show the tangent itself. To enter a line A, the user signals an intention so to do
(for example by typing a command on the keyboard 170b, or by positioning the cursor symbol at a designated area of a displayed control menu), positions the cursor symbol D at a desired point on the display 160 by manipulating the position sensitive input device 170a, and generates a control signal to indicate that the desired point has been reached. The cursor position controller 112 supplies the current cursor position data to the table 122 as control point position co-ordinates, and the supervisory display generator 111 correspondingly writes data representing a control point symbol into the image store 130b at address locations corresponding to the control point co-ordinates. The user then inputs tangent information, for example via the keyboard 170b, or in the manner described below. When a second path control point has been thus defined and stored in the table 122, the supervisory image generator 111 will correspondingly generate the line segment therebetween on the supervisory display by writing the intervening image points into the supervisory display store 130b.
To amend the shape or path of the line A displayed on the supervisory display, a user manipulates the input device 170a to move the cursor position symbol D to coincide with one of the control point symbols A1 or E1 on the display 160. To indicate that the cursor is at the desired position, the user then generates a control signal, for example by clicking the mouse input device 170a. The device input/output controller 140 responds by supplying a control signal to the cursor position controller 112. The cursor position controller 112 supplies the cursor position data to a supervisory display editor 113 (comprising, in practice, the CPU 110 operating under stored program control), which compares the stored cursor position with, for each point, the point position (X,Y) and the position E of the end of the tangent.
When the cursor position is determined to coincide with any point position A or tangent end position E, the display editor 113 is thereafter arranged to receive the updated cursor position from the cursor controller 112 and to amend the point data corresponding to the point with which the cursor symbol coincides, so as to move that point to track subsequent motion of the cursor.
If the cursor is located at the point A1 on the curve A, manipulation by a user of the input device 170a amends the position data (X1,Y1) in the line table 122, but leaves the tangent data unaffected. If, on the other hand, the cursor is located at an end of tangent point E1, manipulation by a user of the input device 170a alters the tangent end point data in the line table 122 within the memory 120, leaving the position data (x,y) unaffected. In either case, after each such amendment to the contents of the line table 122, the supervisory display generator 111 regenerates the line segment affected by the control point in question within the supervisory display image store 130b so as to change the representation of the line on the supervisory display.
Once a line has been amended to a desired position, the user generates a further control signal, by clicking the mouse input device 170a, and the supervisory display editor 113 thereafter ceases to amend the contents of the memory 120. The cursor controller 112 continues to update the stored cursor position. This method of amending the line representation is found to be particularly simple and quick to use.
The relationship between the contents of the supervisory image store 130b and the generated image store 130a will now be discussed in greater detail.
Referring to Figures 5a and 5b, the display of Figure 5b represents only lines, and corresponds to, for example, the output of an animation program or a PostScript (TM) page design program. The objects a, b shown in Figure 5a correspond to the lines A, B shown in Figure 5b insofar as their general trajectory or path is concerned, but differ therefrom by displaying one or more of the following additional attributes:
Colour : Each object a,b in Figure 5a may be coloured, and the colour may vary along the line of each object. The profile of colour across the width of the object may also be non-constant.
Opacity : An object a,b may be given the appearance of a semitransparent object positioned in front of the background c, by providing that the colour of a part of the object a be influenced by the colour of the background c, to an extent determined by an opacity parameter varying between 0 (for a transparent or invisible object, the colour of which is entirely dictated by the colour of the background c) and unity (for an entirely opaque object, the colour of which does not depend on that of the background c). The effect of the opacity of the object is significant when the object is moved, since parts of the object exhibiting some transparency will show an altered appearance depending upon the colour of the background c.
The manner in which these attributes of the generated line shown in the display of Figure 5a may be manipulated by a user will now be discussed in general terms.
The objects a, b to be displayed are represented within the frame store 130a in a similar form to that in which they would be represented by a computer painting system, that is, as an array of pixel data. However, changing the representation of attributes in this form requires a very large amount of data processing, since a large number of pixel values must be amended. Further, it is not possible to change the position or shape of a line while leaving the other above listed attributes unaffected.
Rather than storing colour information for every image pixel in the object a or b, this embodiment accordingly stores information corresponding to attributes of the object associated with predetermined points along the lines shown in the supervisory display. The corresponding values at intervening points in the object are generated by an image generator device 114, shown in Figure 11, comprising, in practice, the CPU 110 operating under stored program control, and the generated image data are stored at corresponding positions within the generated image store 130a for display on the generated image display on the monitor 160.

The generation of an image may be considered as taking place in two stages. Firstly, in response to operator commands, a control image is generated: control data are stored and a representation of these data is supplied to the supervisory store, allowing the data to be interactively viewed by the operator.
The operator works in relation to a Cartesian coordinate reference, the definition of which is user selectable up to the operating constraints of the system.
A suitable operating environment consists of a workstation based around an Intel 486 microprocessor, providing 32 bit floating point arithmetic. Thus, any point position may be calculated in 32 bit floating point arithmetic, providing a very high spatial definition.
Upon start-up, a notional x/y frame is provided with coordinate locations separated by 1/72 inch points, providing 500 identifiable locations in both the x and y directions. To facilitate detailed work, this area may be zoomed by up to a factor of 64. However, these values are only arbitrarily defined and may be adjusted to suit a particular environment, if necessary. Upon each zooming operation, the x/y coordinates defining lines are recalculated, thereby retaining line smoothness irrespective of zoom magnification.

An object generated by the system may be considered as a brush stroke although, given the level of sophistication provided by the system, details such as colouring and transparency may be modified across the stroke, and a whole object or a significant part of an object may be represented by a single stroke. For example, data representing a single stroke may be sufficient to define a complete cherry stalk, with colouring and shading creating the illusion of depth.

The extent of a stroke or object is defined by boundary lines. Each boundary line is a Bezier curve, generated in response to control points. A control point fixes the position of a Bezier curve within the coordinate frame. Individual transformations may be performed on the control points, in response to user operations, or affine transformations may be performed on a collection of points. Such modifications are stored by modifying the data representing the Bezier descriptions, and the actual lines resulting from these Bezier descriptions are redrawn on a frame by frame basis.
Two lines are defined as boundary lines for an object by connecting them with a sectional line. A sectional line may connect with a boundary line at a control point, thereby fixing the position of the sectional line or, alternatively, the connecting point need not be a control point and is referred to as an attribute point, in which case the sectional line will be modified in accordance with movements to the boundary line.
An example of a stroke is shown in Figure 12. In the first mode of operation, supervisory store 130b (Figure 4) within frame store 130 (Figure 3) is active and generates image signals which are supplied to monitor 160. In addition, a cursor is displayed on monitor 160 which responds, spatially, to operation of the mouse 170a.
From a menu, or similar means, an operator identifies a desire to create a control point C1 and moves the mouse so that the cursor is displayed at the desired position of control point C1. On clicking the mouse, the x and y locations of the control point C1 are recorded in working memory, in a table of the form shown in Figure 6. The operator now moves the mouse so that the cursor is displayed at the position required for a second control point C2 and, on clicking the mouse again, point C2 is created.
Within the operating software of the system, a subroutine for generating Bezier curves is provided and, on selecting the second control point C2, a call is made to this subroutine. In response to this call, x/y coordinate positions along the Bezier curve are generated and this data is written to the supervisory store 130b, resulting in a boundary line B1 being displayed on the monitor 160.
For the sake of clarity, boundary lines are drawn with a width of one pixel and cannot be antialiased. However, this line is only a representation of the mathematical definition of the boundary line, which can be rendered at any desired definition. In a similar operation to that described above, control points C3 and C4 are generated, which are then connected by a boundary line B2, using Bezier curves.
A two dimensional object is identified by connecting the boundary lines B1 and B2 together by a sectional line. In a preferred mode of operation, control point C1 is connected to control point C3 by a sectional line S1 and control point C2 is connected to control point C4 by a sectional line S2, thus defining a closed region or object C1, C2, C4, C3.
The sectional lines S1 and S2 are also Bezier curves, and an additional line table, of the type shown in Figure 6, is required to define the position of such sectional lines. A third sectional line S3 is generated between control points C5 and C6, again generated by placing the cursor, in response to movements of the mouse, at the required positions along the boundary lines B1 and B2. Thus, control points C5 and C6 not only identify the position of sectional line S3 but also control the shape of the Bezier curves C1-C5, C5-C2, C3-C6 and C6-C4.
Sectional lines define attribute data consisting of colour and transparency.
Rather than storing colour data for each point along the section through the line, the colour data stored preferably comprise the colour value for each of a small number of points along the cross-section S3, and the image generating means 114 correspondingly generates the colour values at the intervening points by interpolation therebetween. Colour values are set at colour control points. The positions therebetween of the intervening points could be predetermined, but are preferably selectable by the user, in which case an indication of the position along the sectional line is stored with each value C2, C3. Preferably, the position data stored comprise a fraction of the distance along the sectional line.
Opacity or transparency data, specifying, as a fraction, the relative dependence of the colour of the object a on the colour data for the object a relative to the colour data for the background, are likewise stored in the line table 122, corresponding to opacity control points H1, H2, in the same manner as described above for colour data, except that an opacity value is stored rather than a colour value. It is therefore possible to vary the degree of transparency of the object across its section, as well as along its length. The image generator 114 is therefore arranged preferably to derive colour data values by interpolating between colour control points, and to do likewise to derive transparency values and, finally, to set the colours of image points stored in the generated image store 130a by reading the stored background colour and forming, for each image point, the interpolated colour value multiplied by the interpolated opacity value, together with the background colour value multiplied by unity less the interpolated opacity value. The process of setting and amending the values of the above attributes will now be discussed in greater detail.
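The compositing rule just described reduces to a few lines of code; the following sketch assumes RGB colour tuples and an opacity value between 0 and 1.

```python
# Sketch of the compositing rule described above: each image point takes
# the interpolated object colour weighted by the interpolated opacity,
# plus the background colour weighted by unity less that opacity.

def lerp(v0, v1, f):
    """Linear interpolation between two attribute values, 0 <= f <= 1."""
    return v0 + f * (v1 - v0)

def composite(object_colour, opacity, background_colour):
    return tuple(o * opacity + b * (1.0 - opacity)
                 for o, b in zip(object_colour, background_colour))

# A point halfway between a red and a blue colour control point,
# rendered at opacity 0.5 over a white background:
colour = tuple(lerp(c0, c1, 0.5) for c0, c1 in zip((1, 0, 0), (0, 0, 1)))
print(composite(colour, 0.5, (1.0, 1.0, 1.0)))   # -> (0.75, 0.5, 0.75)
```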
Referring to FIG 13, to set up the attribute values for an object a to be displayed on the monitor 160, the user generates a control signal (typically by typing an appropriate command on the keyboard 170b, or by positioning the cursor symbol on a specified part of the screen of the monitor 160 and clicking the mouse), indicating that an attribute is to be input or added to the object.
An operator positions the cursor symbol at a point on the line A shown on the supervisory display and generates a further control signal by clicking the mouse 170a. The supervisory display editor 113 receives the cursor position from the cursor controller 112, and writes a corresponding attribute control point symbol into a corresponding position in the supervisory display image store 130b, which is subsequently displayed on the monitor 160.
The stored cursor position indicating the position along the line at which the control point is placed by the user is then processed for storage in the attribute line data within the line table 122 in the memory 120. The cursor position is not directly stored since, if the user subsequently repositioned the line as discussed above, the attribute control point would no longer lie on the line. Instead, an indication of the relative position along the line, between its two neighbouring curve control points, is derived and this indication is stored so that the position of the attribute control point is defined relative to the position of the line, regardless of subsequent redefinitions of the line position.
This may be achieved, for example, by accessing the line table, reading the Bezier control point information, deriving therefrom the cubic spline equation (1) above and solving for a value of t at the cursor X,Y coordinates. If the cursor is not exactly on the line, the value of t at the closest point on the line is derived, for example by finding the value of t which sets (x - x(t))² + (y - y(t))² to a minimum. The value of the parameter t is then stored as an entry in the attribute data within the line table 122.
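A coarse-sampling stand-in for that minimisation is sketched below; the sample count is an assumption, and an actual implementation might refine the estimate by Newton iteration. Here coeffs is the coefficient pair derived via equations (5)-(8).

```python
# Coarse-sampling sketch of the minimisation described above: the value
# of t whose curve point lies closest to the cursor is found by testing
# many candidate values of t between 0 and 1.

def closest_t(coeffs, cursor, samples=1000):
    (ax, bx, cx, dx), (ay, by, cy, dy) = coeffs
    px, py = cursor
    best_t, best_d2 = 0.0, float("inf")
    for i in range(samples + 1):
        t = i / samples
        x = ax*t**3 + bx*t**2 + cx*t + dx
        y = ay*t**3 + by*t**2 + cy*t + dy
        d2 = (x - px)**2 + (y - py)**2       # (x - x(t))^2 + (y - y(t))^2
        if d2 < best_d2:
            best_t, best_d2 = t, d2
    return best_t
```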
Referring to FIGS 15 and 16, an alternative and preferred method of inputting opacity data is illustrated. When the user generates a control signal indicating a desire to input opacity data, a profile generating means 118 (comprising, conveniently, the CPU 110 acting under stored program control) causes the display 160 to display the contents of a profile display store 130c (as, for example, a display "window" overlying the supervisory display). The contents of the profile display store 130c comprise image data defining horizontal and vertical axes. The display represents the profile of opacity across the brush, corresponding to a cross-section along the line A. The horizontal axis represents position across the line A between the two lateral extents e1, e2.
The vertical axis represents opacity. Both axes are conveniently scaled between 0 and 1. The cursor position controller 112 is arranged to write data into the profile display store 130c to cause the display of a cursor symbol D at a position therein defined by movements of the position sensitive input device 170a. By positioning the cursor symbol at a point between the axes, and generating a control signal, the user signals an opacity value at a given distance across the object a transverse to the line A. The corresponding position between the extents e1, e2 and the opacity value thereat are derived by the profile generator 118 from the current cursor position supplied by the cursor controller 112, and are written into the attribute data held within the line data store 122. The profile generator 118 likewise causes the generation, at the current cursor position, of a point symbol. The cursor may then be repositioned, but the point symbol remains.

When two or more point symbols are displayed and, correspondingly, two or more opacity data values and positions are stored within the line table 122, the profile generator 118 preferably calculates, by interpolation, the coordinates of image data within the profile display store corresponding to intervening points along an interpolated line between the points for which opacity data are stored, and sets the values of those image points within the profile display store 130c so that, when displayed on the display device 160, they represent the profile which would be followed at that point. Generating a schematic cross-section display of this type is found to be of assistance to a user in visualising the transparency of, for example, an object corresponding to an airbrush stroke. The interpolation performed by the profile generator 118 is preferably the same as that which will be performed by the image generator 114.

To permit discontinuities in the colour or opacity across the extent of the object to be defined, the line table 122 is preferably dimensioned to allow the storage of two attribute values for each such lateral position C2, C3; as shown in FIG 12, one value is used to perform interpolation to one neighbouring point and the other to the other neighbouring point.
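A lateral profile permitting such discontinuities might be looked up as in the following sketch, where each stop stores a fractional position across the stroke together with the two values; the representation is an assumption.

```python
import bisect

# Sketch of a lateral profile permitting discontinuities: each stop is
# (position, value_before, value_after), where value_before is used when
# interpolating from the previous stop and value_after towards the next.

def profile_value(stops, f):
    """Evaluate the profile at fractional position f across the stroke."""
    positions = [s[0] for s in stops]
    i = bisect.bisect_right(positions, f)
    if i == 0:
        return stops[0][2]
    if i == len(stops):
        return stops[-1][2]
    (p0, _, v0), (p1, v1, _) = stops[i - 1], stops[i]
    u = (f - p0) / (p1 - p0)
    return v0 + u * (v1 - v0)

# Opacity 1.0 at the edges with a hard step down to 0.2 at the midline:
stops = [(0.0, 1.0, 1.0), (0.5, 1.0, 0.2), (1.0, 0.2, 0.2)]
print(profile_value(stops, 0.25), profile_value(stops, 0.75))  # 1.0 0.2
```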
A corresponding profile display could be provided to allow the input and amendment of other attributes; for instance, brightness (of a monochrome object) or colour (of a coloured object).
Preferably, predefined attribute data specifying colour profiles and opacity profiles are also stored on the mass storage device 180 corresponding, for example, to particular paintbrushes or airbrushes, or to particular previously defined objects. Rather than manually enter and edit the attribute control data, the user may enter an appropriate command (via the keyboard 170b) to read such predetermined data from the mass storage device 180 into the line data table 122. Preferably, the data stored for each attribute control point can specify all, or only a subset, of the available attributes; for instance, it may be used to specify only colour or opacity. Thus, the variations across the object a of these attributes may be separately controlled and independently edited. In such a case, the default predetermined value assigned to each attribute is a flag indicating that the attribute is not set.
Referring to Figure 12, additional sectional lines S4 and S5 are generated, again in the form of Bezier curves. However, the points at which the sectional lines S4 and S5 intersect with boundary lines B1 and B2 are not control points, in that they do not define the shapes of B1 and B2. Such attribute points A1, A2 are selected from the menu and positionally located by means of the mouse, as previously described.
Before sectional line S4 is created, it would be common for attributes to have been set for sectional lines S1 and S3. After S4 is created, attributes of colour and transparency are generated for sectional line S4 by interpolating values between sectional lines S1 and S3. Thereafter, the colour and transparency values set along line S4 may be modified by the operator.
Once a stroke of the type shown in Figure 12 has been defined, the second stage may be initiated in order to render the stroke into a full colour image stored in the image store 130a of Figure 4. Sectional lines S1, S4, S3, S5 and S2 define regions R1, R2, R3 and R4. In order to render the image, the first step is to determine the colour and opacity profile for each of these regions. Each region between each pair of adjacent sectional lines can be rendered independently; region R2 is detailed in Figure 13. The region R2 is divided into a plurality of strips SS1, SS2, SS3 etc between sectional lines S4 and S3. The strips SS1 etc are defined by forming additional sectional lines, generated by interpolating sectional lines S4 and S3.
The additional sectional lines are interpolated by transforming sectional lines S4 and S3 so that their end points are at, for example, (0,0) and (0,1) in Cartesian space. The curve is translated so that its first end point lies at the origin. The translated curve is then rotated about the origin so that its other end point lies along the y axis. The curve is then scaled to bring the other end point to coordinate (0,1).
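A sketch of that translate-rotate-scale normalisation follows, applied here to a polyline approximation of a sectional line (applying the same transform to the Bezier control points is equivalent); all names are illustrative.

```python
import math

# Sketch of the normalisation described above: translate the curve so
# its first end point lies at the origin, rotate about the origin so the
# other end point lies on the y axis, then scale that end point to (0, 1).

def normalise(points):
    (x0, y0), (x1, y1) = points[0], points[-1]
    dx, dy = x1 - x0, y1 - y0
    angle = math.atan2(dx, dy)               # rotation taking (dx, dy) to +y
    scale = 1.0 / math.hypot(dx, dy)
    sin_a, cos_a = math.sin(angle), math.cos(angle)
    out = []
    for x, y in points:
        tx, ty = x - x0, y - y0              # translate to the origin
        rx = tx * cos_a - ty * sin_a         # rotate onto the y axis
        ry = tx * sin_a + ty * cos_a
        out.append((rx * scale, ry * scale)) # scale end point to (0, 1)
    return out

# A horizontal section from (1, 1) to (3, 1) maps onto (0, 0)..(0, 1):
print(normalise([(1.0, 1.0), (2.0, 1.2), (3.0, 1.0)]))
```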
It is possible that interpolation will occur between two curves having a different number of control points. A process for performing such an operation is described in "Automatic Curve Fitting with Quadratic B-spline Functions and its Applications to Computer-Assisted Animation", Yang et al, Computer Vision, Graphics and Image Processing, 33, pages 346-363. The end points of the sectional lines are determined by interpolating along the boundary lines B1 and B2.
The interval between successive interpolated sectional lines must be small enough that, when corresponding points are joined by straight lines, the maximum distance between the straight line approximation and the true path at that point is less than one pixel width in image space or, if the generated image is anti-aliased, to a definition of less than half a pixel width; a sketch of such a flatness test is given below. Strip SS2 is shown in greater detail in Figure 14. The strip SS2 is defined by two additional sectional lines ASL1 and ASL2. Additional sectional lines ASL1 and ASL2 have no opacity and colour profiles of their own, these parameters being generated by interpolating the values stored for sectional lines S4 and S3.
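The flatness test referred to above might be sketched as follows, estimating the maximum deviation between the curve and its chord by sampling; the sample count and the curve interface are assumptions.

```python
import math

# Sketch of the flatness criterion described above: a parameter interval
# [t0, t1] is acceptable when the curve's maximum deviation from the
# chord joining its ends is below the tolerance (one pixel width, or
# half a pixel when anti-aliasing).  Deviation is estimated by sampling.

def flat_enough(curve, t0, t1, tolerance, samples=8):
    """curve maps t to (x, y); True if the chord approximates the curve."""
    (x0, y0), (x1, y1) = curve(t0), curve(t1)
    chord = math.hypot(x1 - x0, y1 - y0) or 1.0
    worst = 0.0
    for i in range(1, samples):
        t = t0 + (t1 - t0) * i / samples
        x, y = curve(t)
        # perpendicular distance from (x, y) to the chord, by cross product
        worst = max(worst,
                    abs((x1 - x0) * (y0 - y) - (x0 - x) * (y1 - y0)) / chord)
    return worst <= tolerance

# A shallow parabolic arc is flat to within half a pixel:
print(flat_enough(lambda t: (t * 10.0, 0.2 * t * (1 - t)), 0.0, 1.0, 0.5))
```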
The strip is now divided into quadrilaterals Q1, Q2, Q3, Q4 and Q5 as shown in Figure 14. The colour and opacity of each corner of each quadrilateral is determined, from which colour and opacity values for each pixel within a quadrilateral may be calculated by linearly interpolating these values. The strip is divided into quadrilaterals by placing notional control points along lines ASL1 and ASL2. Such control points are positioned at places where colours are defined, and then additional points are identified such that adjacent points may be connected by a straight line without crossing pixel boundaries. Thus, a quadrilateral may be formed if its edges do not cross a pixel boundary. When rendering in anti-aliased form, smaller quadrilaterals are formed such that, for example, straight lines defining quadrilaterals may be drawn to a definition of one eighth of the pixel spacing.
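The corner interpolation just described reduces to bilinear interpolation of the corner attributes; the following sketch assumes attribute tuples at the four corners and fractional coordinates (u, v) within the quadrilateral, and all names are illustrative.

```python
# Sketch of the per-quadrilateral rendering described above: colour (or
# opacity) is fixed at the four corners and linearly interpolated for
# points inside, with (u, v) the fractional position across and along
# the quadrilateral.

def lerp_tuple(a, b, f):
    return tuple(x + f * (y - x) for x, y in zip(a, b))

def bilinear(c00, c10, c01, c11, u, v):
    """Interpolate corner attributes at fractional position (u, v)."""
    top = lerp_tuple(c00, c10, u)
    bottom = lerp_tuple(c01, c11, u)
    return lerp_tuple(top, bottom, v)

# Centre of a quadrilateral grading from red to blue across its width:
print(bilinear((1, 0, 0), (0, 0, 1), (1, 0, 0), (0, 0, 1), 0.5, 0.5))
# -> (0.5, 0.0, 0.5)
```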
Pixels falling at the boundary of a quadrilateral are rendered separately, so that contributions from both quadrilateral regions may be combined before the image is placed against a background.
The attribute data could be arranged to store attributes other than colour and opacity. Thus, it would be possible to allow an image to be mapped onto the stroke by using the interpolated positions along and across the stroke for transferring data from another image.
Although in a preferred embodiment linear interpolation is performed, other forms of interpolation, such as Hermite interpolation could be performed if required.

Claims

1. Image processing apparatus for generating visible output images, including visually distinct objects, characterised by
processing means (110, 130, 160) for generating displayable image data;
input means (170a, 170b, 140) for supplying input signals to the processing means, said input signals defining boundary lines (B1, B2) for said objects and a sectional line (S1) identifying two boundary lines as defining an object; and
storage means (120, 130) for storing data generated in response to said input signals and for storing attribute data defining visual attributes of the object.
2. Image processing apparatus according to claim 1, wherein the input signals supplied to the processing means are generated in response to manual operations made by an operator.
3. Image processing apparatus according to claim 1, wherein the input means is a mouse, a tracker ball or a stylus and touch tablet combination.
4. Image processing apparatus according to claim 1, wherein the memory means stores data relating to the position of selected points defining points on the boundary lines and sectional lines, and
the processing means calculates the position of boundary lines and sectional lines in response to data read from the memory means.
5. Image processing apparatus according to claim 4, wherein the processing means includes a programmable processor unit and video frame storage means, and
said displayable image data are generated by repeatedly reading pixel data from the frame storage means.
6. Image processing apparatus according to claim 5, wherein data defining boundary lines and sectional lines, calculated by the processing means, are written to the frame storage means.
7. Image processing apparatus according to claim 6, wherein the processing means is arranged to modify data written to the frame storage means, defining boundary lines and sectional lines, in response to operations of the input means.
8. Image processing apparatus according to claim 1, wherein said storage means stores attribute data for positions along a sectional line.
9. Image processing apparatus according to claim 8, wherein a plurality of sectional lines connect boundary lines, each sectional line having operator defined attribute data associated therewith.
10. Image processing apparatus according to claim 1, wherein end sectional lines connect the ends of boundary lines to define a closed object.
11. Image processing apparatus according to claim 1, wherein attribute data defines the colour and transparency/opacity of an object, each of which is variable across a sectional line.
12. Image processing apparatus according to claim 1, wherein pixel values within an object area are rendered by interpolating attribute data between sectional lines.
13. Image processing apparatus according to claim 1, wherein the positions of boundary lines are calculated at a greater definition than pixel positions defined by the frame storage means, and the values accorded to pixels at boundaries depend upon the degree to which a pixel region is occupied by an object.
14. Image processing apparatus according to claim 1, wherein boundary lines are defined by curved lines calculated in response to stored point locations.
15. Image processing apparatus according to claim 14, wherein said curved lines are splines.
16. Image processing apparatus according to claim 15, wherein said curved lines are Bezier curves defined by fixed end points and tangents.
17. A method of generating visible output images, including visually distinct objects, characterised by the steps of
supplying input signals to a processing means, defining boundary lines and sectional lines;
said sectional line identifying two boundary lines as defining an object;
storing data generated in response to said input signals;
storing attribute data defining visual attributes of the object; and
generating displayable image data in response to said stored data.
18. A method according to claim 17, wherein the processing means includes a programmable processing unit and video frame storage means, and said displayable image data are generated by repeatedly reading pixel data from the frame storage means.
19. A method according to claim 17, wherein attribute data represents the attributes of the image at positions along the sectional line.
20. A method according to claim 19, wherein a plurality of sectional lines connect boundary lines, and each sectional line has operator defined attribute data associated therewith.
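By way of illustration only (this sketch forms no part of the claims, and the names are hypothetical), the pixel data recited in claims 19 and 20 could be produced by bilinearly interpolating the attribute data stored along the two sectional lines that bound a region of the object.

def attribute_at(c00, c10, c01, c11, u, v):
    # c00..c11: attribute values at the four corners of the region, i.e.
    # at the two ends of each of the two bounding sectional lines;
    # u: 0..1 position across the stroke, v: 0..1 position between the
    # two sectional lines along the stroke.
    return (c00 * (1 - u) * (1 - v) + c10 * u * (1 - v)
            + c01 * (1 - u) * v + c11 * u * v)

# Opacity falling from 1.0 on one sectional line to 0.0 on the next,
# uniform across the stroke: attribute_at(1.0, 1.0, 0.0, 0.0, 0.5, 0.25)
# -> 0.75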
PCT/GB1992/000928 1990-11-30 1992-05-21 Image synthesis and processing WO1992021096A1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
US08/150,100 US5598182A (en) 1991-05-21 1992-05-21 Image synthesis and processing
JP4510509A JPH06507743A (en) 1991-05-21 1992-05-21 Image synthesis and processing
US08/643,322 US5754183A (en) 1991-05-21 1996-05-06 Image processing apparatus and method for producing pixel data in dependence upon the shape of a sectional line extending between boundary lines of an object

Applications Claiming Priority (9)

Application Number Priority Date Filing Date Title
GB909026120A GB9026120D0 (en) 1990-11-30 1990-11-30 Computer animation and graphics systems
GB919100632A GB9100632D0 (en) 1991-01-11 1991-01-11 Animation system
GB919102125A GB9102125D0 (en) 1991-01-31 1991-01-31 Animation system
GB9110945A GB2256118A (en) 1991-05-21 1991-05-21 Image synthesis and processing
GB9110945.4 1991-05-21
GB9117409A GB2258790A (en) 1991-08-12 1991-08-12 Animation
GB9117409.4 1991-08-12
GBPCT/GB91/02124 1991-11-29
GBPCT/GB91/02122 1991-11-29

Publications (1)

Publication Number Publication Date
WO1992021096A1 true WO1992021096A1 (en) 1992-11-26

Family

ID=27517011

Family Applications (4)

Application Number Title Priority Date Filing Date
PCT/GB1991/002122 WO1992009965A1 (en) 1990-11-30 1991-11-29 Animation
PCT/GB1991/002124 WO1992009966A1 (en) 1990-11-30 1991-11-29 Image synthesis and processing
PCT/GB1992/000928 WO1992021096A1 (en) 1990-11-30 1992-05-21 Image synthesis and processing
PCT/GB1992/000927 WO1992021095A1 (en) 1990-11-30 1992-05-21 Animation

Family Applications Before (2)

Application Number Title Priority Date Filing Date
PCT/GB1991/002122 WO1992009965A1 (en) 1990-11-30 1991-11-29 Animation
PCT/GB1991/002124 WO1992009966A1 (en) 1990-11-30 1991-11-29 Image synthesis and processing

Family Applications After (1)

Application Number Title Priority Date Filing Date
PCT/GB1992/000927 WO1992021095A1 (en) 1990-11-30 1992-05-21 Animation

Country Status (5)

Country Link
US (2) US5692117A (en)
EP (3) EP0559708A1 (en)
JP (2) JPH06503663A (en)
AU (2) AU9015891A (en)
WO (4) WO1992009965A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0702332A2 (en) * 1994-09-13 1996-03-20 Canon Kabushiki Kaisha Edge to edge blends
AU706423B2 (en) * 1994-09-13 1999-06-17 Canon Kabushiki Kaisha Edge to edge blends

Families Citing this family (183)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
IL106410A (en) * 1992-08-06 1996-09-12 Hughes Training Inc Interactive computerized witness interrogation recording tool
EP0589658B1 (en) * 1992-09-21 2002-07-17 Matsushita Electric Industrial Co., Ltd. Superimposing of graphic data with graphic parameter store
JPH07146931A (en) * 1993-03-08 1995-06-06 Canon Inf Syst Res Australia Pty Ltd Picture generating method
GB2277856A (en) * 1993-04-05 1994-11-09 Cambridge Animation Syst Computer generating animated sequence of pictures
JP3359401B2 (en) * 1993-12-02 2002-12-24 富士通株式会社 Figure editing apparatus and method
JPH08202850A (en) * 1995-01-26 1996-08-09 Sony Corp Paper fiber structure data generating method and device, paper fiber structure data and blotting plotting method and device
CA2167237A1 (en) * 1995-02-17 1996-08-18 Steven Charles Dzik Line smoothing techniques
WO1996036945A1 (en) 1995-05-19 1996-11-21 Sega Enterprises, Ltd. Picture processing device, picture processing method, game device using same, and memory medium
AUPN360295A0 (en) * 1995-06-16 1995-07-13 Canon Information Systems Research Australia Pty Ltd Blend control system
US6028593A (en) * 1995-12-01 2000-02-22 Immersion Corporation Method and apparatus for providing simulated physical interactions within computer generated environments
JP3785700B2 (en) * 1995-12-18 2006-06-14 ソニー株式会社 Approximation method and apparatus
US5854634A (en) 1995-12-26 1998-12-29 Imax Corporation Computer-assisted animation construction system using source poses within a pose transformation space
FR2743241B1 (en) * 1995-12-28 1998-02-13 Sagem METHOD FOR MODIFYING THE RESOLUTION OF A DIGITAL IMAGE
US5764814A (en) * 1996-03-22 1998-06-09 Microsoft Corporation Representation and encoding of general arbitrary shapes
JPH09326990A (en) * 1996-06-07 1997-12-16 Matsushita Electric Ind Co Ltd Video editor
US5982389A (en) * 1996-06-17 1999-11-09 Microsoft Corporation Generating optimized motion transitions for computer animated objects
US5889532A (en) * 1996-08-02 1999-03-30 Avid Technology, Inc. Control solutions for the resolution plane of inverse kinematic chains
US6115051A (en) * 1996-08-07 2000-09-05 Adobe Systems Incorporated Arc-length reparameterization
JP3211679B2 (en) * 1996-09-25 2001-09-25 松下電器産業株式会社 Editing device and editing method
US5977319A (en) * 1996-10-21 1999-11-02 Cambridge Antibody Technology Limited Specific binding members for estradiol; materials and methods
US6252604B1 (en) * 1997-01-10 2001-06-26 Tom Snyder Productions, Inc. Method of animating an image by squiggling the edges of image features
US7616198B2 (en) * 1998-02-20 2009-11-10 Mental Images Gmbh System and computer-implemented method for modeling the three-dimensional shape of an object by shading of a two-dimensional image of the object
US6400368B1 (en) * 1997-03-20 2002-06-04 Avid Technology, Inc. System and method for constructing and using generalized skeletons for animation models
US5999195A (en) * 1997-03-28 1999-12-07 Silicon Graphics, Inc. Automatic generation of transitions between motion cycles in an animation
CA2233380A1 (en) * 1997-04-04 1998-10-04 Microsoft Corporation Parametric function curve editing
US6128001A (en) * 1997-04-04 2000-10-03 Avid Technology, Inc. Methods and apparatus for changing a color of an image
US6351264B1 (en) * 1997-05-20 2002-02-26 Adam S. Iga Method for computer image color shading painting or recreation
JPH1118071A (en) * 1997-06-25 1999-01-22 Nec Corp Slow reproduction system
US6072502A (en) * 1997-06-25 2000-06-06 Adobe Systems Incorporated Characterization of corners of curvilinear segment
US6271864B1 (en) * 1997-06-30 2001-08-07 Sun Microsystems, Inc. Representing a path as an object with transformation capability
JP3047007B2 (en) * 1997-09-26 2000-05-29 株式会社島精機製作所 Image processing device
US6070167A (en) * 1997-09-29 2000-05-30 Sharp Laboratories Of America, Inc. Hierarchical method and system for object-based audiovisual descriptive tagging of images for information retrieval, editing, and manipulation
US5977965A (en) * 1997-09-29 1999-11-02 Intergraph Corporation Automatic frame accumulator
US6307576B1 (en) * 1997-10-02 2001-10-23 Maury Rosenfeld Method for automatically animating lip synchronization and facial expression of animated characters
US6119123A (en) * 1997-12-02 2000-09-12 U.S. Philips Corporation Apparatus and method for optimizing keyframe and blob retrieval and storage
US6260044B1 (en) 1998-02-04 2001-07-10 Nugenesis Technologies Corporation Information storage and retrieval system for storing and retrieving the visual form of information from an application in a database
US6404435B1 (en) * 1998-04-03 2002-06-11 Avid Technology, Inc. Method and apparatus for three-dimensional alphanumeric character animation
WO1999052063A1 (en) * 1998-04-05 1999-10-14 Automedia Ltd. Feature motivated tracking and processing
US6240198B1 (en) * 1998-04-13 2001-05-29 Compaq Computer Corporation Method for figure tracking using 2-D registration
US6269172B1 (en) * 1998-04-13 2001-07-31 Compaq Computer Corporation Method for tracking the motion of a 3-D figure
US6256418B1 (en) 1998-04-13 2001-07-03 Compaq Computer Corporation Method and system for compressing a sequence of images including a moving figure
US6323879B1 (en) * 1998-05-14 2001-11-27 Autodesk, Inc. Method and system for determining the spacing of objects
US6317125B1 (en) 1998-06-19 2001-11-13 Interplay Entertainment Corp. Saxs video object generation engine
JP3432149B2 (en) * 1998-07-13 2003-08-04 株式会社島精機製作所 Image processing method and apparatus
US6377259B2 (en) * 1998-07-29 2002-04-23 Inxight Software, Inc. Presenting node-link structures with modification
US7536706B1 (en) 1998-08-24 2009-05-19 Sharp Laboratories Of America, Inc. Information enhanced audio video encoding system
AUPP557898A0 (en) * 1998-08-28 1998-09-24 Canon Kabushiki Kaisha Method and apparatus for orientating a character stroke
GB2342026B (en) * 1998-09-22 2003-06-11 Luvvy Ltd Graphics and image processing system
US6535213B1 (en) * 1998-09-22 2003-03-18 Sony Corporation Curve edition system, curve-loop detecting system, curve-loop removing system
US6201551B1 (en) * 1998-09-30 2001-03-13 Xerox Corporation PDL operator overloading for line width management
US6246419B1 (en) * 1998-09-30 2001-06-12 Xerox Corporation PDL operator overloading for line width management
US6331854B1 (en) * 1998-10-05 2001-12-18 Azi International Srl Method and apparatus for accelerating animation in a video graphics system
JP3427973B2 (en) * 1998-12-09 2003-07-22 日本電気株式会社 Object display description document conversion device and browser
JP4288449B2 (en) * 1999-02-16 2009-07-01 株式会社セガ Image display device, image processing device, and image display system
US7188353B1 (en) 1999-04-06 2007-03-06 Sharp Laboratories Of America, Inc. System for presenting synchronized HTML documents in digital television receivers
US6512522B1 (en) * 1999-04-15 2003-01-28 Avid Technology, Inc. Animation of three-dimensional characters along a path for motion video sequences
WO2000063843A1 (en) * 1999-04-16 2000-10-26 Avid Technology, Inc. A method and apparatus for hierarchically combining regions
US6870550B1 (en) * 1999-04-26 2005-03-22 Adobe Systems Incorporated Digital Painting
US6681043B1 (en) * 1999-08-16 2004-01-20 University Of Washington Interactive video object processing environment which visually distinguishes segmented video object
US6633300B1 (en) * 1999-12-22 2003-10-14 Adobe Systems Incorporated Method and apparatus for painting groups of objects
US7082436B1 (en) 2000-01-05 2006-07-25 Nugenesis Technologies Corporation Storing and retrieving the visual form of data
GB2360919A (en) * 2000-01-20 2001-10-03 Anthropics Technology Ltd Appearance modelling
TWI282957B (en) * 2000-05-09 2007-06-21 Sharp Kk Drive circuit, and image display device incorporating the same
US7647340B2 (en) 2000-06-28 2010-01-12 Sharp Laboratories Of America, Inc. Metadata in JPEG 2000 file format
US7079158B2 (en) * 2000-08-31 2006-07-18 Beautyriot.Com, Inc. Virtual makeover system and method
EP1187066A3 (en) * 2000-09-01 2004-04-21 Sony Computer Entertainment Inc. Method and apparatus for image enlargement/reduction
AU2001292202A1 (en) * 2000-09-19 2002-04-02 Technion Research And Development Foundation Ltd. Method and apparatus for shape deformation and placement
US7006694B1 (en) * 2000-10-05 2006-02-28 Coreco Imaging, Inc. System and method for pattern identification
EP1207498A3 (en) * 2000-11-15 2003-10-15 Sega Corporation Display object generation method in information processing equipment
US6765589B1 (en) * 2000-11-16 2004-07-20 Adobe Systems Incorporated Brush for warping and water reflection effects
CN1537300A (en) * 2000-12-22 2004-10-13 Communication system
US20040135788A1 (en) * 2000-12-22 2004-07-15 Davidson Colin Bruce Image processing system
GB2370709A (en) * 2000-12-28 2002-07-03 Nokia Mobile Phones Ltd Displaying an image and associated visual effect
NO313477B1 (en) * 2001-01-08 2002-10-07 Simsurgery As Method and system for simulating a thread in computer-based graphical simulations
US8750382B2 (en) 2001-01-23 2014-06-10 Kenneth Martin Jacobs System and method for calculating 3Deeps action specs motion estimation from the motion vectors in an MPEG file
US9781408B1 (en) 2001-01-23 2017-10-03 Visual Effect Innovations, Llc Faster state transitioning for continuous adjustable 3Deeps filter spectacles using multi-layered variable tint materials
US10742965B2 (en) 2001-01-23 2020-08-11 Visual Effect Innovations, Llc Faster state transitioning for continuous adjustable 3Deeps filter spectacles using multi-layered variable tint materials
US20020130872A1 (en) * 2001-03-15 2002-09-19 Elena Novikova Methods and systems for conflict resolution, summation, and conversion of function curves
US6963350B1 (en) * 2001-07-03 2005-11-08 Adobe Systems Incorporated Painting interface to computer drawing system curve editing
KR20010113584A (en) * 2001-11-08 2001-12-28 (주)시스튜디오 a method for providing comics-animation by computers and a computer-readable medium storing data of comics-animation
US7026960B2 (en) * 2001-11-27 2006-04-11 Samsung Electronics Co., Ltd. Method and apparatus for encoding and decoding key data
KR100480787B1 (en) * 2001-11-27 2005-04-07 삼성전자주식회사 Encoding/decoding method and apparatus for key value of coordinate interpolator node
US7336713B2 (en) * 2001-11-27 2008-02-26 Samsung Electronics Co., Ltd. Method and apparatus for encoding and decoding data
KR100426313B1 (en) * 2001-12-28 2004-04-06 한국전자통신연구원 Method for modifying posture of an articulated object in manufacturing picture
AU2003200347C1 (en) * 2002-02-07 2005-04-21 Canon Kabushiki Kaisha A Method for Stroking Flattened Paths
US7199805B1 (en) * 2002-05-28 2007-04-03 Apple Computer, Inc. Method and apparatus for titling
US6822653B2 (en) * 2002-06-28 2004-11-23 Microsoft Corporation Methods and system for general skinning via hardware accelerators
US6970169B1 (en) 2002-09-24 2005-11-29 Adobe Systems Incorporated Digitally synthesizing seamless texture having random variations
US7809204B2 (en) 2002-10-18 2010-10-05 Samsung Electronics Co., Ltd. Method and apparatus for encoding and decoding key value data of coordinate interpolator
US7319764B1 (en) * 2003-01-06 2008-01-15 Apple Inc. Method and apparatus for controlling volume
AU2003900809A0 (en) * 2003-02-24 2003-03-13 Aristocrat Technologies Australia Pty Ltd Gaming machine transitions
US7333111B2 (en) * 2003-04-25 2008-02-19 Honda Giken Kogyo Kabushiki Kaisha Joint component framework for modeling complex joint behavior
US7164423B1 (en) * 2003-04-30 2007-01-16 Apple Computer, Inc. Method and apparatus for providing an animated representation of a reorder operation
US7259764B2 (en) * 2003-05-14 2007-08-21 Pixar Defrobulated angles for character joint representation
GB2418475B (en) 2003-06-09 2007-10-24 Immersion Corp Interactive gaming systems with haptic feedback
US7372464B2 (en) * 2003-07-21 2008-05-13 Autodesk, Inc. Processing image data
US7317457B2 (en) * 2003-07-21 2008-01-08 Autodesk, Inc. Processing image data
US7164410B2 (en) * 2003-07-28 2007-01-16 Sig G. Kupka Manipulating an on-screen object using zones surrounding the object
GB2406028A (en) * 2003-09-11 2005-03-16 Autodesk Canada Inc Tangent handle adjustment for Bezier curves
US7593015B2 (en) * 2003-11-14 2009-09-22 Kyocera Wireless Corp. System and method for sequencing media objects
US8237712B2 (en) * 2004-03-18 2012-08-07 Apple Inc. Manipulation of image content using various image representations
US8121338B2 (en) * 2004-07-07 2012-02-21 Directsmile Gmbh Process for generating images with realistic text insertion
US20060093309A1 (en) * 2004-10-05 2006-05-04 Magix Ag System and method for creating a photo movie
US7376894B2 (en) * 2004-11-18 2008-05-20 Microsoft Corporation Vector path merging into gradient elements
US20060130679A1 (en) 2004-12-20 2006-06-22 Dubois Radford E Iii Automated cutting system for customized field stencils
US7920144B2 (en) * 2005-01-18 2011-04-05 Siemens Medical Solutions Usa, Inc. Method and system for visualization of dynamic three-dimensional virtual objects
JP4866013B2 (en) * 2005-03-31 2012-02-01 富士通株式会社 Character image generation program, system thereof, and method thereof
US7830384B1 (en) * 2005-04-27 2010-11-09 Image Metrics Limited Animating graphical objects using input video
US8260056B2 (en) * 2005-08-19 2012-09-04 Telefonaktiebolaget Lm Ericsson (Publ) Resizing video and aligning video image edges to block boundaries and picture boundaries
US20070091112A1 (en) * 2005-10-20 2007-04-26 Pfrehm Patrick L Method system and program for time based opacity in plots
US7782324B2 (en) * 2005-11-23 2010-08-24 Dreamworks Animation Llc Non-hierarchical unchained kinematic rigging technique and system for animation
JP5111772B2 (en) * 2006-03-24 2013-01-09 株式会社沖データ Printing device
US8281281B1 (en) * 2006-06-07 2012-10-02 Pixar Setting level of detail transition points
US8902233B1 (en) 2006-06-09 2014-12-02 Pixar Driving systems extension
US7965294B1 (en) * 2006-06-09 2011-06-21 Pixar Key frame animation with path-based motion
US8147315B2 (en) * 2006-09-12 2012-04-03 Aristocrat Technologies Australia Ltd Gaming apparatus with persistent game attributes
US8081187B2 (en) * 2006-11-22 2011-12-20 Autodesk, Inc. Pencil strokes for vector based drawing elements
US7697002B2 (en) * 2007-01-25 2010-04-13 Ricoh Co. Ltd. Varying hand-drawn line width for display
US7884834B2 (en) * 2007-04-13 2011-02-08 Apple Inc. In-context paint stroke characteristic adjustment
US20080291212A1 (en) * 2007-05-23 2008-11-27 Dean Robert Gary Anderson As Trustee Of D/L Anderson Family Trust Software for creating engraved images
US20080310747A1 (en) * 2007-05-23 2008-12-18 Dean Robert Gary Anderson As Trustee Of D/L Anderson Family Trust Software for creating engraved images
KR100917887B1 (en) * 2007-06-11 2009-09-16 삼성전자주식회사 Graphic processing method and apparatus for supporting line acceleration function
US8000529B2 (en) * 2007-07-11 2011-08-16 Hewlett-Packard Development Company, L.P. System and method for creating an editable template from a document image
US20090058863A1 (en) * 2007-09-04 2009-03-05 Apple Inc. Image animation with transitional images
US20090079743A1 * 2007-09-20 2009-03-26 Flowplay, Inc. Displaying animation of graphic object in environments lacking 3d rendering capability
US8310483B2 (en) * 2007-11-20 2012-11-13 Dreamworks Animation Llc Tinting a surface to simulate a visual effect in a computer generated scene
US8134558B1 (en) 2007-12-06 2012-03-13 Adobe Systems Incorporated Systems and methods for editing of a computer-generated animation across a plurality of keyframe pairs
US8253728B1 (en) * 2008-02-25 2012-08-28 Lucasfilm Entertainment Company Ltd. Reconstituting 3D scenes for retakes
US20090284532A1 (en) * 2008-05-16 2009-11-19 Apple Inc. Cursor motion blurring
US20090295791A1 (en) * 2008-05-29 2009-12-03 Microsoft Corporation Three-dimensional environment created from video
JP4561883B2 (en) * 2008-06-19 2010-10-13 コニカミノルタビジネステクノロジーズ株式会社 Image forming apparatus, program, and image forming processing method
US8788963B2 (en) * 2008-10-15 2014-07-22 Apple Inc. Scrollable preview of content
DE102008057512A1 (en) * 2008-11-15 2010-07-01 Diehl Aerospace Gmbh Method for displaying line trains
WO2010083272A1 (en) * 2009-01-15 2010-07-22 Simquest Llc Interactive simulation of biological tissue
WO2010129263A2 (en) * 2009-04-27 2010-11-11 Sonoma Data Solutions Llc A method and apparatus for character animation
US8566721B2 (en) * 2009-04-30 2013-10-22 Apple Inc. Editing key-indexed graphs in media editing applications
US8286081B2 (en) 2009-04-30 2012-10-09 Apple Inc. Editing and saving key-indexed geometries in media editing applications
KR101080255B1 (en) * 2009-07-21 2011-11-08 (주)펜앤프리 Apparatus and method for inputting handwriting in accordance with the handwriting pattern
US9672646B2 (en) * 2009-08-28 2017-06-06 Adobe Systems Incorporated System and method for image editing using visual rewind operation
JP5476103B2 (en) 2009-11-27 2014-04-23 富士フイルム株式会社 Page description data processing apparatus, method and program
US20110276891A1 (en) * 2010-05-06 2011-11-10 Marc Ecko Virtual art environment
US8860734B2 (en) 2010-05-12 2014-10-14 Wms Gaming, Inc. Wagering game object animation
JP5494337B2 (en) 2010-07-30 2014-05-14 ソニー株式会社 Information processing apparatus, information processing method, and information processing program
JP5552947B2 (en) 2010-07-30 2014-07-16 ソニー株式会社 Information processing apparatus, display control method, and display control program
US9305398B2 (en) * 2010-10-08 2016-04-05 City University Of Hong Kong Methods for creating and displaying two and three dimensional images on a digital canvas
US8861890B2 (en) * 2010-11-24 2014-10-14 Douglas Alan Lefler System and method for assembling and displaying individual images as a continuous image
US8988461B1 (en) 2011-01-18 2015-03-24 Disney Enterprises, Inc. 3D drawing and painting system with a 3D scalar field
US9142056B1 (en) * 2011-05-18 2015-09-22 Disney Enterprises, Inc. Mixed-order compositing for images having three-dimensional painting effects
JP5741282B2 (en) * 2011-07-26 2015-07-01 カシオ計算機株式会社 Image processing apparatus, image processing method, and program
US8907957B2 (en) 2011-08-30 2014-12-09 Apple Inc. Automatic animation generation
US9164576B2 (en) 2011-09-13 2015-10-20 Apple Inc. Conformance protocol for heterogeneous abstractions for defining user interface behaviors
US8819567B2 (en) 2011-09-13 2014-08-26 Apple Inc. Defining and editing user interface behaviors
US9478058B2 (en) * 2012-08-06 2016-10-25 CELSYS, Inc. Object correcting apparatus and method and computer-readable recording medium
US8947216B2 (en) * 2012-11-02 2015-02-03 Immersion Corporation Encoding dynamic haptic effects
US9898084B2 (en) 2012-12-10 2018-02-20 Immersion Corporation Enhanced dynamic haptic effects
US20140198080A1 (en) * 2013-01-11 2014-07-17 Research In Motion Limited Method and Apparatus Pertaining to Pre-Associated Stylus-Input User Preferences
JP6472171B2 (en) * 2013-08-27 2019-02-20 キヤノン株式会社 Image processing apparatus and method
US8986691B1 (en) 2014-07-15 2015-03-24 Kymab Limited Method of treating atopic dermatitis or asthma using antibody to IL4RA
US8980273B1 (en) 2014-07-15 2015-03-17 Kymab Limited Method of treating atopic dermatitis or asthma using antibody to IL4RA
JP6307873B2 (en) * 2013-12-24 2018-04-11 富士通株式会社 Object line detection apparatus, method, and program
USD906348S1 (en) * 2014-11-26 2020-12-29 Intergraph Corporation Computer display screen or portion thereof with graphic
US10089291B2 (en) 2015-02-27 2018-10-02 Microsoft Technology Licensing, Llc Ink stroke editing and manipulation
US9792723B2 (en) * 2015-04-07 2017-10-17 Disney Enterprises, Inc. Method and system for progressively sculpting three-dimensional geometry
EP3286718A4 (en) 2015-04-23 2018-12-05 Hasbro, Inc. Context-aware digital play
US9740310B2 (en) * 2015-05-22 2017-08-22 Adobe Systems Incorporated Intuitive control of pressure-sensitive stroke attributes
US9741133B2 (en) * 2015-09-29 2017-08-22 Adobe Systems Incorporated Identifying shapes in an image by comparing Bézier curves
US20170236318A1 (en) * 2016-02-15 2017-08-17 Microsoft Technology Licensing, Llc Animated Digital Ink
JP6062589B1 (en) 2016-04-28 2017-01-18 株式会社Live2D Program, information processing apparatus, influence derivation method, image generation method, and recording medium
US10299750B2 (en) * 2016-08-05 2019-05-28 Toshiba Medical Systems Corporation Medical image processing apparatus and X-ray CT apparatus
CN106476479B * 2016-09-20 2017-10-17 北京理工大学 A variable drawing-ratio drawing trolley supporting freehand drawing and SVG file import
JP6930091B2 (en) * 2016-11-15 2021-09-01 富士フイルムビジネスイノベーション株式会社 Image processing equipment, image processing methods, image processing systems and programs
WO2018115469A1 (en) * 2016-12-22 2018-06-28 Episurf Ip-Management Ab System and method for optimizing an implant position in an anatomical joint
US10712840B2 (en) * 2017-10-13 2020-07-14 Dell Products L.P. Active pen system
US10424086B2 (en) * 2017-11-16 2019-09-24 Adobe Inc. Oil painting stroke simulation using neural network
US10510186B2 (en) 2017-12-22 2019-12-17 Adobe Inc. Digital media environment for intuitive modifications of digital graphics
US10388045B2 (en) 2018-01-04 2019-08-20 Adobe Inc. Generating a triangle mesh for an image represented by curves
BR102018004967A2 (en) * 2018-03-13 2019-10-01 Samsung Eletrônica da Amazônia Ltda. METHOD FOR PROCESSING MOVEMENT OF VIRTUAL POINTERS
US10410317B1 (en) * 2018-03-26 2019-09-10 Adobe Inc. Digital image transformation environment using spline handles
RU2702498C1 (en) * 2018-05-15 2019-10-08 Юрий Александрович Акименко Method of converting main types into sets of axonometric views
US10832446B2 (en) 2019-01-07 2020-11-10 Adobe Inc. Bone handle generation
US10943375B2 (en) 2019-04-17 2021-03-09 Adobe Inc. Multi-state vector graphics
US11630504B2 (en) * 2021-03-16 2023-04-18 Htc Corporation Handheld input device and electronic system
US11631207B2 (en) 2021-09-09 2023-04-18 Adobe Inc. Vector object stylization from raster objects

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE3821322A1 (en) * 1988-06-24 1990-01-04 Rolf Prof Dr Walter Method of controlling a graphic output device
EP0453044A1 (en) * 1990-04-20 1991-10-23 Neurones Cartoon A method and apparatus for storing and animation of data

Family Cites Families (34)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3364382A (en) * 1967-01-03 1968-01-16 Control Image Corp Automatic generation and display of animated figures
BE793543A (en) * 1971-12-30 1973-04-16 Ibm MECHANISM POSITION CODING METHODS
US3898438A (en) * 1972-09-28 1975-08-05 Walt Disney Prod Programmable method for digital animation apparatus for assembling animation data
GB1437795A (en) * 1973-07-04 1976-06-03 Computer Image Corp Digitally controlled computer animation generating system
US4189743A (en) * 1976-12-20 1980-02-19 New York Institute Of Technology Apparatus and method for automatic coloration and/or shading of images
US4189744A (en) * 1976-12-20 1980-02-19 New York Institute Of Technology Apparatus for generating signals representing operator-selected portions of a scene
DE2806820C2 (en) * 1978-02-17 1982-02-25 Messerschmitt-Bölkow-Blohm GmbH, 8000 München Method for the synthetic generation of animated films
JPS5837543A (en) * 1981-08-31 1983-03-04 Meidensha Electric Mfg Co Ltd Analysis of curing agent in epoxy resin
NL8300872A (en) * 1983-03-10 1984-10-01 Philips Nv MULTIPROCESSOR CALCULATOR SYSTEM FOR PROCESSING A COLORED IMAGE OF OBJECT ELEMENTS DEFINED IN A HIERARCHICAL DATA STRUCTURE.
US4600919A (en) * 1982-08-03 1986-07-15 New York Institute Of Technology Three dimensional animation
US4582520A (en) * 1982-09-30 1986-04-15 Owens-Corning Fiberglas Corporation Methods and apparatus for measuring and controlling curing of polymeric materials
US4620287A (en) * 1983-01-20 1986-10-28 Dicomed Corporation Method and apparatus for representation of a curve of uniform width
US4646075A (en) * 1983-11-03 1987-02-24 Robert Bosch Corporation System and method for a data processing pipeline
US4739317A (en) * 1984-03-30 1988-04-19 International Business Machines Corporation Draw graphics capabilities
US4683468A (en) * 1985-03-11 1987-07-28 International Business Machines Corp. Method for manipulation of graphic sub-objects in an interactive draw graphic system
JPS62103540A (en) * 1985-10-30 1987-05-14 Mitsubishi Heavy Ind Ltd Method for measuring curing time of organic adhesive
US4764763A (en) * 1985-12-13 1988-08-16 The Ohio Art Company Electronic sketching device
EP0246340A1 (en) * 1986-05-17 1987-11-25 Andreas Dipl.-Math. Wetjen Method to simulate the movements of human dummies
US4760548A (en) * 1986-06-13 1988-07-26 International Business Machines Corporation Method and apparatus for producing a curve image
JPH0785271B2 (en) * 1986-06-27 1995-09-13 株式会社日立製作所 Shape modeling method
JPS63109581A (en) * 1986-10-27 1988-05-14 Video Toron Kk Animation picture processor
JPH0743774B2 (en) * 1986-12-05 1995-05-15 富士通株式会社 Animation creation processing device
US4897638A (en) * 1987-02-27 1990-01-30 Hitachi, Ltd. Method for generating character patterns with controlled size and thickness
AU2999189A (en) * 1988-02-15 1989-08-17 Information Concepts Pty. Ltd. Electronic drawing tools
SE8801043D0 (en) * 1988-03-22 1988-03-22 Orjan Strandberg GeniMator
EP0342752B1 (en) * 1988-05-20 1997-08-06 Koninklijke Philips Electronics N.V. A computer method and an aparatus for generating a display picture representing a set of object elements including a brush object element
US5025394A (en) * 1988-09-09 1991-06-18 New York Institute Of Technology Method and apparatus for generating animated images
US4952051A (en) * 1988-09-27 1990-08-28 Lovell Douglas C Method and apparatus for producing animated drawings and in-between drawings
CA1329433C (en) * 1988-10-24 1994-05-10 Lemuel L. Davis Computer animation production system
US5233671A (en) * 1989-02-22 1993-08-03 Ricoh Company Ltd. Image coding method for coding characters using a modified Bezier curve
US5155805A (en) * 1989-05-08 1992-10-13 Apple Computer, Inc. Method and apparatus for moving control points in displaying digital typeface on raster output devices
US5214758A (en) * 1989-11-14 1993-05-25 Sony Corporation Animation producing apparatus
US5155813A (en) * 1990-01-08 1992-10-13 Wang Laboratories, Inc. Computer apparatus for brush styled writing
US5416899A (en) * 1992-01-13 1995-05-16 Massachusetts Institute Of Technology Memory based method and apparatus for computer graphics

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE3821322A1 (en) * 1988-06-24 1990-01-04 Rolf Prof Dr Walter Method of controlling a graphic output device
EP0453044A1 (en) * 1990-04-20 1991-10-23 Neurones Cartoon A method and apparatus for storing and animation of data

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0702332A2 (en) * 1994-09-13 1996-03-20 Canon Kabushiki Kaisha Edge to edge blends
EP0702332A3 (en) * 1994-09-13 1996-06-05 Canon Kk Edge to edge blends
AU706423B2 (en) * 1994-09-13 1999-06-17 Canon Kabushiki Kaisha Edge to edge blends
US6339433B1 (en) 1994-09-13 2002-01-15 Canon Kabushiki Kaisha Creating a blend of color and opacity between arbitrary edges

Also Published As

Publication number Publication date
WO1992009965A1 (en) 1992-06-11
AU8932191A (en) 1992-06-25
US5692117A (en) 1997-11-25
JPH06505817A (en) 1994-06-30
EP0559714A1 (en) 1993-09-15
WO1992009966A1 (en) 1992-06-11
AU9015891A (en) 1992-06-25
EP0559708A1 (en) 1993-09-15
WO1992021095A1 (en) 1992-11-26
JPH06503663A (en) 1994-04-21
EP0585298A1 (en) 1994-03-09
US5611036A (en) 1997-03-11

Similar Documents

Publication Publication Date Title
US5754183A (en) Image processing apparatus and method for producing pixel data in dependence upon the shape of a sectional line extending between boundary lines of an object
WO1992021096A1 (en) Image synthesis and processing
EP0950988B1 (en) Three-Dimensional image generating apparatus
US4609917A (en) Three-dimensional display system
US4475104A (en) Three-dimensional display system
US5742294A (en) Method and apparatus for synthesizing images
US6373490B1 (en) Using remembered properties to create and regenerate points along an editable path
US4189744A (en) Apparatus for generating signals representing operator-selected portions of a scene
Fekete et al. TicTacToon: A paperless system for professional 2D animation
US5252953A (en) Computergraphic animation system
JP3862759B2 (en) Computer system and process for defining and producing images using structured objects with variable edge characteristics
US5903270A (en) Method and apparatus for mapping a two-dimensional texture onto a three-dimensional surface
Wallace Merging and transformation of raster images for cartoon animation
EP0990223B1 (en) Method and apparatus for changing a color of an image
US5412402A (en) Electronic graphic systems
US7420574B2 (en) Shape morphing control and manipulation
US5596692A (en) Computer graphics
GB2312120A (en) Producing a transition region surrounding an image
Durand The “TOON” project: requirements for a computerized 2D animation system
GB2258790A (en) Animation
EP0586444A1 (en) Image synthesis and processing
WO1997045782A2 (en) Method and apparatus for mapping a two-dimensional texture onto a three-dimensional surface
WO1997045782A8 (en) Method and apparatus for mapping a two-dimensional texture onto a three-dimensional surface
GB2256118A (en) Image synthesis and processing
JPH04229379A (en) Image editing system

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AT AU BB BG BR CA CH CS DE DK ES FI GB HU JP KP KR LK LU MG MN MW NL NO PL RO RU SD SE US

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): AT BE BF BJ CF CG CH CI CM DE DK ES FR GA GB GN GR IT LU MC ML MR NL SE SN TD TG

DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
WWE Wipo information: entry into national phase

Ref document number: 1992910492

Country of ref document: EP

WWE Wipo information: entry into national phase

Ref document number: 08150100

Country of ref document: US

WWP Wipo information: published in national office

Ref document number: 1992910492

Country of ref document: EP

REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

NENP Non-entry into the national phase

Ref country code: CA

WWR Wipo information: refused in national office

Ref document number: 1992910492

Country of ref document: EP

WWW Wipo information: withdrawn in national office

Ref document number: 1992910492

Country of ref document: EP