WO2001020554A1 - Method and apparatus for rendering images with refractions - Google Patents

Method and apparatus for rendering images with refractions

Info

Publication number
WO2001020554A1
Authority
WO
WIPO (PCT)
Prior art keywords
background image
viewpoint
vertexes
axis
determining
Application number
PCT/JP2000/006136
Other languages
French (fr)
Inventor
Hajime Sugiyama
Original Assignee
Sony Computer Entertainment Inc.
Application filed by Sony Computer Entertainment Inc.
Priority to EP00957038A (EP1214689B1)
Priority to BR0013900-9A (BR0013900A)
Priority to DE60002487T (DE60002487T2)
Priority to CA002384569A (CA2384569A1)
Priority to MXPA02002373A (MXPA02002373A)
Priority to AU68755/00A (AU6875500A)
Publication of WO2001020554A1

Classifications

    • G PHYSICS
        • G06 COMPUTING; CALCULATING OR COUNTING
            • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
                • G06T 15/00 3D [Three Dimensional] image rendering
                    • G06T 15/10 Geometric effects
                    • G06T 15/50 Lighting effects
                    • G06T 15/04 Texture mapping
                • G06T 17/00 Three dimensional [3D] modelling, e.g. data description of 3D objects
                • G06T 2210/00 Indexing scheme for image generation or computer graphics
                    • G06T 2210/62 Semi-transparency

Abstract

Surfaces of an object are rendered in the order from a surface remotest from a viewpoint (104). When a rear surface, e.g., a rear surface (A1), is to be rendered, a range, i.e., a range (120) of a texture to be used, projected by way of perspective projection with vectors which are directed from the viewpoint toward a background image (100) and take into account refractions at vertexes (a, b, c, d) of the surface is determined, and the texture in the determined range is mapped onto the surface. When a front surface, e.g., a front surface (A5), is to be rendered, a range, i.e., a range (130) of a texture to be used, projected by way of perspective projection with vectors which are directed from the viewpoint toward the background image (100) and take into account refractions at vertexes (a, b, g, h) of the surface is determined, and the texture in the determined range is mapped onto the surface.

Description

DESCRIPTION
METHOD OF AND APPARATUS FOR RENDERING IMAGE, RECORDING MEDIUM, AND PROGRAM
Technical Field
The present invention relates to a method of and an apparatus for rendering an image to express light rays passing through an object and refracted thereby, i.e., a phenomenon known as refraction, at a high speed with a three-dimensional image, a recording medium which stores a program and data for performing such image processing, and a program for performing such image processing.
Background Art
Recently, various computer graphics (CG) processing techniques including hidden line processing, hidden surface removal, smooth shading, texture mapping, etc. have been in rapid progress in combination with quickly growing hardware technologies.
According to a general CG processing scheme, a plurality of three-dimensional shapes (objects) are generated by three-dimensional modeling of CAD, and a rendering process is performed by applying colors and shades to the objects, adding optical properties including mirror reflection, diffuse reflection, refraction, transparency, etc. to the objects, adding surface patterns to the objects, and plotting images depending on surroundings such as window and scenery reflections and ambient light.
If light rays passing through an object and refracted thereby, for example, are to be expressed as a three-dimensional image, then it is necessary to reproduce such an optical phenomenon. The vector of a light ray that is radiated from a viewpoint is refracted when the light ray enters the object and also when the light ray leaves the object. In order to express the above phenomenon of refraction with a three-dimensional image, it is customary to employ ray tracing rather than polygons.
According to the ray tracing technique, light rays are traced in a space where an object is placed, and the object is rendered with points of intersection between the light rays and the object. Stated otherwise, the intensity of light rays that arrive at a viewpoint is tracked back from the viewpoint while reproducing reflections and refractions at the surfaces of the object according to the realistic behavior of the light rays.
Specifically, a point of intersection between a light ray from a fixed viewpoint and an object as a displayed pixel is sought, and if there is such a point of intersection, then the light ray as it is reflected or refracted by the object is traced. Information as to the point of intersection is stored as information as to the displayed pixel. The information determined with respect to each pixel represents inherent color characteristics including hue, saturation, and brightness, textures including reflections, refractions, gloss, and luster, or shadows and highlights.
However, the ray tracing technique is disadvantageous in that, since the above information is associated with each pixel, the overall amount of information that is required is large, and the time required to perform the calculations for ray tracing is long.
Disclosure of Invention
It is therefore an object of the present invention to provide a method of and an apparatus for rendering an image, a recording medium, and a program to express light rays passing through an object and refracted thereby at a high speed with a three-dimensional image, for thereby expressing a moving transparent object simply and at a high speed.
According to the present invention, there is provided a method of rendering an image, comprising the steps of rendering surfaces of an object which causes refraction in the order from a surface remotest from a viewpoint, and employing a background image of each of the surfaces as a texture when the surfaces are rendered.
According to the present invention, there is also provided an apparatus for rendering an image, comprising rendering means for rendering surfaces of an object which causes refraction in the order from a surface remotest from a viewpoint, and employing a background image of each of the surfaces as a texture when the surfaces are rendered.
According to the present invention, there is further provided a recording medium storing a program and data, the program comprising the steps of rendering surfaces of an object which causes refraction in the order from a surface remotest from a viewpoint, and employing a background image of each of the surfaces as a texture when the surfaces are rendered.
According to the present invention, there is still further provided a program which can be read and executed by a computer, comprising the steps of rendering surfaces of an object which causes refraction in the order from a surface remotest from a viewpoint, and employing a background image of each of the surfaces as a texture when the surfaces are rendered.
When each surface is to be rendered, since only the background image in each surface is used as a texture image, images can be processed at a high speed. Consequently, light rays passing through an object and refracted thereby can be expressed at a high speed with a three-dimensional image, for thereby expressing a moving transparent object simply and at a high speed.
Preferably, a portion of the background image in a range projected by way of perspective projection with vectors which are directed from the viewpoint toward the background image and take into account refractions at vertexes of the surface is used as the texture when the surface is rendered.
The vectors may be determined based on at least the directions of normals to the surface in a viewpoint coordinate system and the directions of line segments directed from the viewpoint toward the vertexes.
Specifically, the positions of the vertexes as projected onto a uz plane which is made up of a u-axis of the background image and a z-axis of the viewpoint coordinate system are determined, and the coordinates of the vertexes on the u-axis of the background image are determined based on at least the directions of line segments directed from the viewpoint toward the projected positions and the directions of the normals in the uz plane. The positions of the vertexes as projected onto a vz plane which is made up of a v-axis of the background image and the z-axis of the viewpoint coordinate system are determined, and the coordinates of the vertexes on the v-axis of the background image are determined based on at least the directions of line segments directed from the viewpoint toward the projected positions and the directions of the normals in the vz plane.
The background image to be mapped onto the surface can easily be determined, and hence a phenomenon of refraction can be displayed as a three-dimensional image at a high speed. The above and other objects, features, and advantages of the present invention will become more apparent from the following description when taken in conjunction with the accompanying drawings in which a preferred embodiment of the present invention is shown by way of illustrative example.
Brief Description of Drawings
FIG. 1 is a block diagram of a general arrangement of an entertainment apparatus according to the present invention;
FIG. 2 is a diagram of a displayed image of a mountain scene rendered as a background image and a cube of a material such as glass placed in front of the mountain scene; FIG. 3 is a diagram illustrating the manner in which the vector of light rays radiated from a viewpoint is refracted by the cube;
FIG. 4A is a diagram illustrating the range of a texture image for use as a rear surface of a cube;
FIG. 4B is a diagram illustrating the texture image shown in FIG. 4A as applied to the cube and the range of a texture image for use as a front surface of the cube;
FIG. 4C is a diagram illustrating the texture image shown in FIG. 4B as applied to the cube;
FIG. 5 is a diagram illustrating the manner in which vertexes of the rear surface of the cube are projected onto the background image in view of refractions by way of perspective projection; FIG. 6 is a diagram illustrating the manner in which vertexes of the front surface of the cube are projected onto the background image in view of refractions by way of perspective projection;
FIG. 7 is a diagram of a displayed image of a mountain scene rendered as a background image and two cubes of a material such as glass placed in front of the mountain scene; FIG. 8 is a functional block diagram of a rendering means according to the present invention; and
FIGS. 9 and 10 are a flowchart of a processing sequence of the rendering means shown in FIG. 8.
Best Mode for Carrying Out the Invention
An embodiment in which a method of and an apparatus for rendering an image are applied to an entertainment apparatus for performing three-dimensional CG processing, and a recording medium and a program are applied to a recording medium storing a program and data executed by the entertainment apparatus and such a program, will be described below with reference to FIGS. 1 through 10.
As shown in FIG. 1, an entertainment apparatus 10 comprises an MPU 12 for controlling the entertainment apparatus 10, a main memory 14 for storing various programs to be run and various data, a vector operation unit 16 for performing floating-point vector operations required for geometry processing, an image processor 20 for generating image data under the control of the MPU 12 and outputting the generated image data to a display monitor 18, e.g., a CRT, a graphic interface (GIF) 22 for arbitrating transfer paths between the MPU 12, the vector operation unit 16, and the image processor 20, an input/output port 24 for sending data to and receiving data from external devices, a ROM (OSDROM) 26 with an OSD function, which may comprise a flash memory or the like, for controlling the kernel, etc., and a real-time clock 28 having a calendar and clock function.
The main memory 14, the vector operation unit 16, the GIF 22, the OSDROM 26, the real-time clock 28, and the input/output port 24 are connected to the MPU 12 via a bus 30. To the input/output port 24, there are connected an input/output device 32 for inputting data (key entry data, coordinate data, etc.) to the entertainment apparatus 10, and an optical disk drive 36 for playing back an optical disk 34 such as a CD-ROM or the like in which various programs and data (object-related data, texture data, etc.) are stored. As shown in FIG. 1, the image processor 20 comprises a rendering engine 70, a memory interface 72, an image memory 74, and a display controller 76 such as a programmable CRT controller or the like.
The rendering engine 70 serves to render image data in the image memory 74 via the memory interface 72 based on a rendering command supplied from the MPU 12.
A first bus 78 is connected between the memory interface 72 and the rendering engine 70, and a second bus 80 is connected between the memory interface 72 and the image memory 74. Each of the first and second buses 78, 80 has a
128-bit width, for example, for allowing the rendering engine 70 to render image data in the image memory 74 at a high speed.
The rendering engine 70 is capable of rendering image data of 320 x 240 pixels or image data of 640 x 480 pixels according to the NTSC or PAL system in a real-time fashion, i.e., more than ten to several tens of times in 1/60 to 1/30 seconds.
The image memory 74 is of a unified memory structure that is able to designate a texture rendering area and a display rendering area as the same area. The display controller 76 writes texture data read from the optical disk 34 via the optical disk drive 36, or texture data generated in the main memory 14, via the memory interface 72 into the texture rendering area of the image memory 74, and reads image data rendered in the display rendering area of the image memory 74 via the memory interface 72 and outputs the read image data to the display monitor 18 to display an image on its display screen.
A characteristic function of the entertainment apparatus 10 will be described below with reference to FIGS. 2 through 10.
According to the characteristic function, for rendering an object that causes refraction, the surfaces of the object are rendered successively from the one remotest from a viewpoint, and the background image of a surface is used as a texture when each surface is rendered.
A portion of the background image in a range projected by way of perspective projection with vectors which are directed from the viewpoint toward the background image and take into account refractions at vertexes of the surface is employed as the texture, for example, when the surface is rendered. A vector taking into account the refraction at each of the vertexes of a surface can be determined based on at least the direction of a normal to the surface in a viewpoint coordinate system and the direction of a line segment directed from the viewpoint toward each vertex. A specific process for performing the above characteristic function will be described below with reference to FIGS. 2 through 7. FIG. 2 shows a displayed image of a mountain scene rendered as a background image 100 and a cube 102 of a material such as glass placed in front of the mountain scene. Light rays passing through the cube 102 in the displayed image are refracted according to the above characteristic function.
As shown in FIG. 3, the vector of a light ray La radiated from a viewpoint 104 is refracted when the light ray La enters the cube 102 and when the light ray La leaves the cube 102. If it is assumed that the space outside of the cube 102 has a refractive index n1, the cube 102 has a refractive index n2, the light ray La is applied to a surface, closer to the viewpoint 104, of the cube 102 at an incident angle θ1, i.e., an angle between the light ray La and a normal 106 to the surface, and the light ray La leaves the surface of the cube 102 at an exit angle θ2, i.e., an angle between the light ray La and the normal 106 to the surface, then the refraction of the light ray La at the surface is represented by the following equation, known as Snell's law: n1 · sin θ1 = n2 · sin θ2. The relationship represented by the above equation also holds true for the refraction of the light ray La at another surface, remoter from the viewpoint 104, of the cube 102.
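The exit angle θ2 follows directly from this equation. As an illustrative sketch only (the function name and the handling of total internal reflection, a case the patent does not discuss, are assumptions), the computation may be written in C as:

    #include <math.h>

    /* Solve Snell's law n1*sin(theta1) = n2*sin(theta2) for theta2.
     * Returns -1.0 when sin(theta2) would exceed 1, i.e., when total
     * internal reflection occurs and no exit angle exists. */
    double exit_angle(double n1, double n2, double theta1)
    {
        double s = n1 * sin(theta1) / n2;   /* sin(theta2) */
        if (fabs(s) > 1.0)
            return -1.0;                    /* total internal reflection */
        return asin(s);
    }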
A method of rendering an image according to the present invention determines the positions of the surfaces of the cube 102 in the background image based on the above relationship.
Specifically, as shown in FIGS. 4A and 4B, of the six surfaces of the cube 102, a surface A1 remotest from the viewpoint is selected, and the positions in the background image 100 of the vertexes a, b, c, d of the surface A1, more accurately, the position of the surface A1 in the background image 100, are determined based on at least the direction of a normal to the surface A1 in a viewpoint coordinate system and the directions of line segments directed from the viewpoint toward the vertexes a, b, c, d.
The position of the surface A1 in the background image 100 is a position in the background image 100 that is rendered in the image memory 74. Determining the position of the surface A1 in the background image 100 is equivalent to determining the position of the surface A1 in a uv coordinate system having a u-axis as the horizontal axis of the background image 100 and a v-axis as the vertical axis of the background image 100.
The positions of the vertexes a, b, c, d as projected onto a uz plane which is made up of the u-axis of the background image 100 and a z-axis of the viewpoint coordinate system are determined, and the coordinates of the vertexes a, b, c, d on the u-axis of the background image 100 are determined based on at least the directions of line segments directed from the viewpoint 104 toward the projected positions and the direction of the normal to the surface A1 on the uz plane.
For example, as shown in FIG. 5, a line segment 110 from the viewpoint 104 toward a vertex a of the rear surface
A1 is analyzed. The direction, i.e., the exit angle θ2, of the line segment 110 from the vertex a toward the background image 100 is determined based on the angle (incident angle) θ1 between the line segment 110 and the normal 112, the refractive index n2 of the cube 102, and the refractive index n1 of the space. The position of a u coordinate of the vertex a in the background image 100 is determined by projecting the vertex a onto the background image 100 by way of perspective projection according to the vector of a line segment 114 that is determined by the exit angle θ2. The positions of u coordinates of the other vertexes b, c, d of the surface A1 in the background image 100 are similarly determined.
Then, the positions of the vertexes a, b, c, d as projected onto a vz plane which is made up of the v-axis of the background image 100 and the z-axis of the viewpoint coordinate system are determined, and the coordinates of the vertexes a, b, c, d on the v-axis of the background image 100 are determined based on at least the directions of line segments directed from the viewpoint 104 toward the projected positions and the direction of the normal to the surface A1 on the vz plane.
For example, although not shown, a line segment from the viewpoint 104 toward the vertex a is analyzed. The direction, i.e., the exit angle θ2, of the line segment from the vertex a toward the background image 100 is determined based on the angle (incident angle) θ1 between the line segment and the normal, the refractive index n2 of the cube 102, and the refractive index n1 of the space. The position of a v coordinate of the vertex a in the background image 100 is determined by projecting the vertex a onto the background image 100 by way of perspective projection according to the vector of a line segment that is determined by the exit angle θ2. The positions of v coordinates of the other vertexes b, c, d in the background image 100 are similarly determined.
In this manner, the u, v coordinates (u, v) of the vertex a of the rear surface A1 in the background image 100 are determined. Similarly, the coordinates of the other vertexes in the background image 100 are determined. Now, as shown in FIGS. 4A and 4B, a range projected by vectors directed from the viewpoint 104 via the surface A1 toward the background image 100 and taking into account the refractions at the vertexes a, b, c, d of the surface A1, i.e., a range 120 of texture to be used, is determined. The texture of the range 120 is then mapped onto the surface A1, which is rendered in the image memory 74. After the surface A1 has been rendered in the image memory 74, another rear surface A2 is selected, and then processed in the same manner as described above with respect to the surface A1. The surfaces A1 through A6 that make up the cube 102 are successively selected and processed in the order of A1 → A2 → A3 → A4 → A5 → A6.
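The uz-plane computation described above can be condensed into a short C function. This is a sketch under stated assumptions rather than the patent's implementation: the viewpoint 104 is taken as the origin of the viewpoint coordinate system, the background image 100 lies in the plane z = z_bg, the normal is a unit vector oriented toward the exit medium, and all names are illustrative. The vz plane is handled identically, with v coordinates and the vz-plane normal substituted.

    #include <math.h>

    /* u coordinate in the background image of one vertex, computed in
     * the uz plane (cf. FIG. 5).  (vu, vz): the vertex as projected
     * onto the uz plane; (nu, nz): unit normal to the surface in the
     * uz plane, assumed to point toward the exit medium. */
    double vertex_u(double vu, double vz, double nu, double nz,
                    double n1, double n2, double z_bg)
    {
        /* Direction of the line segment from the viewpoint (origin)
         * toward the projected vertex. */
        double len = sqrt(vu * vu + vz * vz);
        double du = vu / len, dz = vz / len;

        /* Signed incident angle theta1 between the normal and the ray. */
        double theta1 = atan2(nu * dz - nz * du, nu * du + nz * dz);

        /* Exit angle theta2 from Snell's law, keeping the sign. */
        double theta2 = asin(n1 * sin(theta1) / n2);

        /* Refracted direction: the normal rotated by theta2. */
        double ru = nu * cos(theta2) - nz * sin(theta2);
        double rz = nu * sin(theta2) + nz * cos(theta2);

        /* Perspective projection: follow the refracted ray from the
         * vertex to the background plane z = z_bg. */
        return vu + (z_bg - vz) / rz * ru;
    }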
After the three rear surfaces A1, A2, A3 have been rendered in the image memory 74, the three front surfaces A4, A5, A6 are selected, and then processed in the same manner as described above. For example, as shown in FIG. 6, a line segment 122 from the viewpoint 104 toward a vertex a of the front surface A5 is analyzed. The direction, i.e., the exit angle θ2, of the line segment 122 from the vertex a toward the background image 100 is determined based on the angle (incident angle) θ1 between the line segment 122 and a normal 124 to the surface A5, the refractive index n2 of the cube 102, and the refractive index n1 of the space. The position of a u coordinate of the vertex a in the background image 100 is determined by projecting the vertex a onto the background image 100 by way of perspective projection according to the vector of a line segment 126 that is determined by the exit angle θ2. The positions of u coordinates of the other vertexes b, g, h of the surface A5 in the background image 100 are similarly determined.
Then, the position of a v coordinate of the vertex a in the background image 100 is determined in the same manner as described above.
In this manner, the u, v coordinates (u, v) of the vertex a of the front surface A5 in the background image 100 are determined. Similarly, the u, v coordinates of the other vertexes b, g, h in the background image 100 are determined. Now, as shown in FIGS. 4B and 4C, a range projected by vectors directed from the viewpoint 104 via the surface A5 toward the background image 100 and taking into account the refractions at the vertexes a, b, g, h of the surface A5, i.e., a range 130 of texture to be used, is determined. The texture of the range 130 is then mapped onto the surface A5, which is rendered in the image memory 74. At this time, images of the three rear surfaces A1, A2, A3 have already been rendered in the background image 100 behind the front surface A5, and the texture of the front surface A5 therefore includes the images of the three rear surfaces A1, A2, A3.
After the surface A5 has been rendered in the image memory 74, another front surface A6 is selected, and then processed in the same manner as described above with respect to the surface A5. In the above example, one cube 102 is placed in the mountain scene. However, as shown in FIG. 7, the method according to the present invention is also applicable to a plurality of (two in FIG. 7) cubes 102A, 102B placed in a mountain scene. The cubes 102A, 102B are processed successively in the order from the cube 102A remoter from the viewpoint 104. An example of software, i.e., a rendering means 200 (see FIG. 8), to perform the above function will be described below with reference to FIGS. 8 through 10.
The rendering means 200 is supplied to the entertainment apparatus 10 from a randomly accessible recording medium such as a CD-ROM or a memory card, or via a network. It is assumed here that the rendering means 200 is read into the entertainment apparatus 10 from the optical disk 34 such as a CD-ROM.
The rendering means 200 is downloaded in advance from the optical disk 34 played back by the entertainment apparatus 10 into the main memory 14 of the entertainment apparatus 10 according to a predetermined process, and executed by the MPU 12.
As shown in FIG. 8, the rendering means 200 comprises an object selecting means 206 for selecting object data 204 of an object in the order from the one remotest from the viewpoint, among a number of objects registered in an object data file 202, a surface selecting means 208 for selecting a surface in the order from the one remotest from the viewpoint, among a plurality of surfaces of the objects indicated by the selected object data 204, a normal direction determining means 210 for determining the directions of normals in the uz and vz coordinate systems in the selected surface, a first projected position calculating means 212 for determining the position of a vertex of the selected surface as projected onto the uz plane, an angle calculating means 214 for determining an incident angle θ1 and an exit angle θ2, and a first coordinate calculating means 216 for determining u coordinates in the background image 100 based on the projected position of the vertex on the uz plane and the exit angles θ2 at the vertex. The rendering means 200 also has a second projected position calculating means 218 for determining the position of a vertex of the selected surface as projected onto the vz plane, a second coordinate calculating means 220 for determining v coordinates in the background image 100 based on the projected position of the vertex on the vz plane and the exit angles θ2 at the vertex, a texture image determining means 222 for determining a texture image to be used from the background image 100 based on the uv coordinates of the vertexes, a texture mapping means 224 for mapping the determined texture image onto the selected surface, and an end determining means 226 for determining whether the processing sequence of the rendering means 200 is completed or not.
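Expressed structurally, the decomposition of FIG. 8 maps naturally onto a table of functions. The C sketch below is purely illustrative; the patent describes these means only behaviorally, and every type and member name here is an assumption:

    typedef struct Object Object;    /* placeholder types; the patent  */
    typedef struct Surface Surface;  /* defines no concrete layout     */
    typedef struct { double u, z; } UZPos;
    typedef struct { double v, z; } VZPos;

    typedef struct RenderingMeans {
        Object  *(*object_selecting)(int m);                           /* 206 */
        Surface *(*surface_selecting)(Object *, int i);                /* 208 */
        void     (*normal_direction_determining)(Surface *);           /* 210 */
        UZPos    (*first_projected_position)(const Surface *, int j);  /* 212 */
        double   (*angle_calculating)(double n1, double n2,
                                      double theta1);                  /* 214 */
        double   (*first_coordinate)(const Surface *, int j);   /* 216: u    */
        VZPos    (*second_projected_position)(const Surface *, int j); /* 218 */
        double   (*second_coordinate)(const Surface *, int j);  /* 220: v    */
        void     (*texture_image_determining)(Surface *);              /* 222 */
        void     (*texture_mapping)(Surface *);                        /* 224 */
        int      (*end_determining)(int index, int count);             /* 226 */
    } RenderingMeans;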
A processing sequence of the rendering means 200 will be described below with reference to FIGS. 9 and 10. In step S1 shown in FIG. 9, the rendering means 200 stores an initial value "1" into an index register m used to retrieve objects, thereby initializing the index register m. In step S2, the object selecting means 206 selects object data 204 of an object (mth object) in the order from the one remotest from the viewpoint, among a number of objects registered in the object data file 202. In step S3, the rendering means 200 stores an initial value "1" into an index register i used to retrieve surfaces of objects, thereby initializing the index register i.
In step S4, the surface selecting means 208 selects a surface (ith surface) in the order from the one remotest from the viewpoint, among a plurality of surfaces of the objects indicated by the selected object data 204.
In step S5, the rendering means 200 determines the number N of vertexes of the ith surface. Thereafter, in step S6, the normal direction determining means 210 determines the directions of normals to the ith surface in the uz and vz coordinate systems.
In step S7, the rendering means 200 stores an initial value "1" into an index register j used to retrieve vertexes, thereby initializing the index register j. In step S8, the first projected position calculating means 212 determines the position (coordinates) of a jth vertex as projected onto the uz plane.
In step S9, the angle calculating means 214 determines an angle (incident angle) θ1 between a line segment interconnecting the viewpoint and the jth vertex and the normal in the uz plane. In step S10, the angle calculating means 214 determines an angle (exit angle) θ2 based on the refractive index of the incident medium, the refractive index of the exit medium, and the incident angle θ1.
In step S11, the first coordinate calculating means 216 projects the jth vertex by way of perspective projection in the direction determined by the position (coordinates) of the jth vertex as projected onto the uz plane and the exit angle θ2, and determines the position (u coordinates) of the jth vertex in the background image 100.
In step S12 shown in FIG. 10, the second projected position calculating means 218 determines the position (coordinates) of the jth vertex as projected onto the vz plane.
In step S13, the angle calculating means 214 determines an angle (incident angle) θ1 between a line segment interconnecting the viewpoint and the jth vertex and the normal in the vz plane. In step S14, the angle calculating means 214 determines an angle (exit angle) θ2 based on the refractive index of the incident medium, the refractive index of the exit medium, and the incident angle θ1.
In step S15, the second coordinate calculating means 220 projects the jth vertex by way of perspective projection in the direction determined by the position (coordinates) of the jth vertex as projected onto the vz plane and the exit angle θ2, and determines the position (v coordinates) of the jth vertex in the background image 100. In step S16, the rendering means 200 increments the value of the index register j by "1". In step S17, the rendering means 200 decides whether the uv coordinates of all the vertexes of the ith surface have been determined or not, based on whether the value of the index register j is greater than the number N of vertexes or not.
If the uv coordinates of all the vertexes of the ith surface have not been determined, then control goes back to step S8 for determining the uv coordinates of a next vertex. If they have been determined, then control proceeds to step S18, in which the texture image determining means 222 determines a portion of the background image 100 in a range surrounded by the uv coordinates of the vertexes of the ith surface as a texture image.
In step S19, the texture mapping means 224 maps the determined texture image onto the ith surface, and renders the ith surface with the mapped texture image in the image memory 74.
In step S20, the rendering means 200 increments the value of the index register i by "1". In step S21, the end determining means 226 decides whether the processing of all the surfaces of the object has been completed, based on whether the value of the index register i is greater than the number M of surfaces of the object.
If the texture mapping has not been completed for all the surfaces, then control goes back to step S4 to process the next surface. If the texture mapping has been completed for all the surfaces, then control proceeds to step S22, in which the rendering means 200 increments the value of the index register m by "1". In step S23, the end determining means 226 decides whether the processing of all the objects has been completed, based on whether the value of the index register m is greater than the number P of objects.
If the processing of all the objects has not been completed, then control goes back to step S2 to process the next object. If the processing of all the objects has been completed, then the processing sequence of the rendering means 200 is brought to an end.
As described above, when the rendering means 200 according to the present embodiment renders each surface of an object at which light rays are refracted, it uses only the portion of the background image 100 within each surface as a texture image, and hence can process images at a high speed. Consequently, light rays passing through an object and refracted thereby can be expressed at a high speed in a three-dimensional image, thereby expressing a moving transparent object simply and quickly.
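Read end to end, the sequence of FIGS. 9 and 10 has the shape of three nested loops. The sketch below only mirrors that control flow; every type and helper is a hypothetical stand-in for the corresponding means, and refractedCoord is left as a stub where the exitAngle and refractedU sketches above would be applied:

```
#include <algorithm>
#include <vector>

// Hypothetical stand-ins for the data handled by the rendering means 200.
struct Vertex  { double x, y, z, u, v; };
struct Surface { std::vector<Vertex> vertexes; };
struct Object  { std::vector<Surface> surfaces; double depth; };

// Stub for steps S8-S11 (uz plane) or S12-S15 (vz plane); a real body
// would apply the exitAngle() and refractedU() sketches shown earlier.
double refractedCoord(const Vertex&, const Surface&, bool /*uzPlane*/)
{
    return 0.0;
}

void renderRefractions(std::vector<Object>& objects)
{
    // Steps S1-S2: take objects in order from the one remotest from the
    // viewpoint, so each surface is textured against the background
    // already rendered behind it.
    std::sort(objects.begin(), objects.end(),
              [](const Object& a, const Object& b) { return a.depth > b.depth; });
    for (Object& obj : objects) {                          // steps S22-S23 loop
        for (Surface& surf : obj.surfaces) {               // steps S3-S4, S20-S21 (far to near)
            for (Vertex& vtx : surf.vertexes) {            // steps S5, S7, S16-S17
                vtx.u = refractedCoord(vtx, surf, true);   // steps S8-S11
                vtx.v = refractedCoord(vtx, surf, false);  // steps S12-S15
            }
            // Steps S18-S19: the portion of the background image bounded
            // by these uv coordinates is cut out and mapped onto the
            // surface in the image memory 74.
        }
    }
}
```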
Although a certain preferred embodiment of the present invention has been shown and described in detail, it should be understood that various changes and modifications may be made therein without departing from the scope of the appended claims.

Claims

1. A method of rendering an image, comprising the steps of: rendering surfaces (A1 - A6) of an object (102) which causes refraction in the order from a surface (A1) remotest from a viewpoint (104); and employing a background image (100) of each of the surfaces (A1 - A6) as a texture when the surfaces (A1 - A6) are rendered.
2. A method according to claim 1, wherein said step of employing a background image (100) comprises the step of: employing, as the texture, a portion of said background image (100) in a range projected by way of perspective projection with vectors which are directed from said viewpoint (104) toward said background image (100) and take into account refractions at vertexes (a, b, c, d) of said surface (A1), when the surface (A1) is rendered.
3. A method according to claim 2, wherein said step of employing a portion of said background image (100) comprises the step of: determining said vectors based on at least the directions of normals to said surface (A1) in a viewpoint coordinate system and the directions of line segments directed from said viewpoint (104) toward the vertexes (a, b, c, d).
4. A method according to claim 3, wherein said step of determining said vectors comprises the steps of: determining the positions of said vertexes (a, b, c, d) as projected onto a uz plane which is made up of a u-axis of said background image (100) and a z-axis of said viewpoint coordinate system; determining the coordinates of said vertexes (a, b, c, d) on said u-axis of said background image (100) based on at least the directions of line segments directed from said viewpoint (104) toward the projected positions and the directions of said normals in said uz plane; determining the positions of said vertexes (a, b, c, d) as projected onto a vz plane which is made up of a v-axis of said background image (100) and the z-axis of said viewpoint coordinate system; and determining the coordinates of said vertexes (a, b, c, d) on said v-axis of said background image (100) based on at least the directions of line segments directed from said viewpoint (104) toward the projected positions and the directions of said normals in said vz plane, thereby to determine the position of said surface (A1) in the background image (100).
5. An apparatus for rendering an image, comprising: rendering means (200) for rendering surfaces (A1 - A6) of an object (102) which causes refraction in the order from a surface (A1) remotest from a viewpoint (104), and employing a background image (100) of each of the surfaces (A1 - A6) as a texture when the surfaces (A1 - A6) are rendered.
6. An apparatus according to claim 5, wherein said rendering means (200) comprises: means for employing, as the texture, a portion of said background image (100) in a range projected by way of perspective projection with vectors which are directed from said viewpoint (104) toward said background image (100) and take into account refractions at vertexes (a, b, c, d) of said surface (A1), when the surface (A1) is rendered.
7. An apparatus according to claim 6, wherein said rendering means (200) comprises: texture coordinate calculating means (212, 216, 218, 220) for determining said vectors based on at least the directions of normals to said surface (A1) in a viewpoint coordinate system and the directions of line segments directed from said viewpoint (104) toward the vertexes (a, b, c, d).
8. An apparatus according to claim 7, wherein said texture coordinate calculating means (212, 216, 218, 220) comprises: first projected position calculating means (212) for determining the positions of said vertexes (a, b, c, d) as projected onto a uz plane which is made up of a u-axis of said background image (100) and a z-axis of said viewpoint coordinate system; first coordinate calculating means (216) for determining the coordinates of said vertexes (a, b, c, d) on said u-axis of said background image (100) based on at least the directions of line segments directed from said viewpoint (104) toward the projected positions and the directions of said normals in said uz plane; second projected position calculating means (218) for determining the positions of said vertexes (a, b, c, d) as projected onto a vz plane which is made up of a v-axis of said background image (100) and the z-axis of said viewpoint coordinate system; and second coordinate calculating means (220) for determining the coordinates of said vertexes (a, b, c, d) on said v-axis of said background image (100) based on at least the directions of line segments directed from said viewpoint (104) toward the projected positions and the directions of said normals in said vz plane.
9. A recording medium storing a program and data, said program comprising the steps of: rendering surfaces (A1 - A6) of an object (102) which causes refraction in the order from a surface (A1) remotest from a viewpoint (104); and employing a background image (100) of each of the surfaces (A1 - A6) as a texture when the surfaces (A1 - A6) are rendered.
10. A recording medium according to claim 9, wherein said step of employing a background image (100) comprises the step of: employing, as the texture, a portion of said background image (100) in a range projected by way of perspective projection with vectors which are directed from said viewpoint (104) toward said background image (100) and take into account refractions at vertexes (a, b, c, d) of said surface (A1), when the surface (A1) is rendered.
11. A recording medium according to claim 10, wherein said step of employing a portion of said background image (100) comprises the step of: determining said vectors based on at least the directions of normals to said surface (A1) in a viewpoint coordinate system and the directions of line segments directed from said viewpoint (104) toward the vertexes (a, b, c, d).
12. A recording medium according to claim 11, wherein said step of determining said vectors comprises the steps of: determining the positions of said vertexes (a, b, c, d) as projected onto a uz plane which is made up of a u-axis of said background image (100) and a z-axis of said viewpoint coordinate system; determining the coordinates of said vertexes (a, b, c, d) on said u-axis of said background image (100) based on at least the directions of line segments directed from said viewpoint (104) toward the projected positions and the directions of said normals in said uz plane; determining the positions of said vertexes (a, b, c, d) as projected onto a vz plane which is made up of a v-axis of said background image (100) and the z-axis of said viewpoint coordinate system; and determining the coordinates of said vertexes (a, b, c, d) on said v-axis of said background image (100) based on at least the directions of line segments directed from said viewpoint (104) toward the projected positions and the directions of said normals in said vz plane.
13. A program which can be read and executed by a computer, comprising the steps of: rendering surfaces of an object which causes refraction in the order from a surface remotest from a viewpoint; and employing a background image of each of the surfaces as a texture when the surfaces are rendered.
PCT/JP2000/006136 1999-09-10 2000-09-08 Method and apparatus for rendering images with refractions WO2001020554A1 (en)

Priority Applications (6)

Application Number Priority Date Filing Date Title
EP00957038A EP1214689B1 (en) 1999-09-10 2000-09-08 Method and apparatus for rendering images with refractions
BR0013900-9A BR0013900A (en) 1999-09-10 2000-09-08 Method and apparatus for converting an image, recording medium storing a program and data, and a program that can be read and executed by a computer
DE60002487T DE60002487T2 (en) 1999-09-10 2000-09-08 METHOD AND DEVICE FOR REPRESENTING IMAGES WITH REFRACTIONS
CA002384569A CA2384569A1 (en) 1999-09-10 2000-09-08 Method and apparatus for rendering images with refractions
MXPA02002373A MXPA02002373A (en) 1999-09-10 2000-09-08 Method and apparatus for rendering images with refractions.
AU68755/00A AU6875500A (en) 1999-09-10 2000-09-08 Method and apparatus for rendering images with refractions

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP25756599 1999-09-10
JP11/257565 1999-09-10

Publications (1)

Publication Number Publication Date
WO2001020554A1 true WO2001020554A1 (en) 2001-03-22

Family

ID=17308046

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2000/006136 WO2001020554A1 (en) 1999-09-10 2000-09-08 Method and apparatus for rendering images with refractions

Country Status (11)

Country Link
US (2) US6784882B1 (en)
EP (1) EP1214689B1 (en)
KR (1) KR100700307B1 (en)
CN (1) CN1204532C (en)
AU (1) AU6875500A (en)
BR (1) BR0013900A (en)
CA (1) CA2384569A1 (en)
DE (1) DE60002487T2 (en)
MX (1) MXPA02002373A (en)
TW (1) TW475155B (en)
WO (1) WO2001020554A1 (en)

Families Citing this family (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3602061B2 (en) * 2001-02-02 2004-12-15 九州日本電気ソフトウェア株式会社 Three-dimensional graphics display device and method
JP3576126B2 (en) * 2001-07-30 2004-10-13 株式会社ナムコ Image generation system, program, and information storage medium
KR100528343B1 (en) * 2003-07-14 2005-11-15 삼성전자주식회사 Method and apparatus for image-based rendering and editing of 3D objects
US6897871B1 (en) * 2003-11-20 2005-05-24 Ati Technologies Inc. Graphics processing architecture employing a unified shader
US7477777B2 (en) * 2005-10-28 2009-01-13 Aepx Animation, Inc. Automatic compositing of 3D objects in a still frame or series of frames
US7305127B2 (en) * 2005-11-09 2007-12-04 Aepx Animation, Inc. Detection and manipulation of shadows in an image or series of images
US20070139408A1 (en) * 2005-12-19 2007-06-21 Nokia Corporation Reflective image objects
US7970237B2 (en) * 2007-08-01 2011-06-28 Adobe Systems Incorporated Spatially-varying convolutions for rendering glossy reflection effects
US7982734B2 (en) * 2007-08-01 2011-07-19 Adobe Systems Incorporated Spatially-varying convolutions for rendering soft shadow effects
US8243072B2 (en) * 2007-09-11 2012-08-14 Philip Kramer Method for rendering an object
GB2528655B (en) 2014-07-24 2020-10-07 Advanced Risc Mach Ltd Graphics Processing Systems
GB2535792B (en) * 2015-02-27 2021-03-31 Advanced Risc Mach Ltd Graphic processing systems
GB2535791B (en) * 2015-02-27 2021-03-31 Advanced Risc Mach Ltd Graphics processing systems
US10614619B2 (en) 2015-02-27 2020-04-07 Arm Limited Graphics processing systems
GB2541928B (en) 2015-09-04 2018-01-31 Advanced Risc Mach Ltd Graphics processing systems
GB2543766B (en) 2015-10-26 2018-02-14 Advanced Risc Mach Ltd Graphics processing systems
JP6780315B2 (en) * 2016-06-22 2020-11-04 カシオ計算機株式会社 Projection device, projection system, projection method and program
US10366527B2 (en) 2016-11-22 2019-07-30 Samsung Electronics Co., Ltd. Three-dimensional (3D) image rendering method and apparatus

Family Cites Families (26)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3392644A (en) * 1965-07-12 1968-07-16 Eastman Kodak Co Flash device
JPH0727581B2 (en) 1988-09-09 1995-03-29 インターナショナル・ビジネス・マシーンズ・コーポレーション Graphic processing device
GB2240017A (en) 1990-01-15 1991-07-17 Philips Electronic Associated New, interpolated texture values are fed back to texture memories
US5369734A (en) * 1990-05-18 1994-11-29 Kabushiki Kaisha Toshiba Method for processing and displaying hidden-line graphic images
GB2259432A (en) * 1991-09-06 1993-03-10 Canon Res Ct Europe Ltd Three dimensional graphics processing
US5359704A (en) * 1991-10-30 1994-10-25 International Business Machines Corporation Method for selecting silhouette and visible edges in wire frame images in a computer graphics display system
JP3366633B2 (en) 1991-11-27 2003-01-14 セイコーエプソン株式会社 Pixel changing system and pixel changing method
US5644689A (en) * 1992-01-13 1997-07-01 Hitachi, Ltd. Arbitrary viewpoint three-dimensional imaging method using compressed voxel data constructed by a directed search of voxel data representing an image of an object and an arbitrary viewpoint
GB2270243B (en) * 1992-08-26 1996-02-28 Namco Ltd Image synthesizing system
JPH0785308A (en) * 1993-07-02 1995-03-31 Sony Corp Picture display method
US6005967A (en) * 1994-02-18 1999-12-21 Matushita Electric Industrial Co., Ltd. Picture synthesizing apparatus and method
US5461712A (en) 1994-04-18 1995-10-24 International Business Machines Corporation Quadrant-based two-dimensional memory manager
JP3442181B2 (en) * 1995-02-17 2003-09-02 株式会社ナムコ 3D game device and image composition method
US5956028A (en) * 1995-09-14 1999-09-21 Fujitsu Ltd. Virtual space communication system, three-dimensional image display method, and apparatus therefor
US5977979A (en) * 1995-10-31 1999-11-02 International Business Machines Corporation Simulated three-dimensional display using bit-mapped information
JP3744039B2 (en) * 1995-11-29 2006-02-08 株式会社日立製作所 Perspective drawing creation support method
JPH09319891A (en) * 1996-06-03 1997-12-12 Sega Enterp Ltd Image processor and its processing method
JP3358169B2 (en) * 1996-08-30 2002-12-16 インターナショナル・ビジネス・マシーンズ・コーポレーション Mirror surface rendering method and apparatus
WO1998022911A1 (en) * 1996-11-21 1998-05-28 Philips Electronics N.V. Method and apparatus for generating a computer graphics image
US6226005B1 (en) * 1997-01-31 2001-05-01 LAFERRIèRE ALAIN M Method and system for determining and/or using illumination maps in rendering images
JPH10334275A (en) * 1997-05-29 1998-12-18 Canon Inc Method and system for virtual reality and storage medium
JP3951362B2 (en) 1997-06-12 2007-08-01 株式会社セガ Image processing apparatus, game apparatus, method thereof, and recording medium
US6091422A (en) * 1998-04-03 2000-07-18 Avid Technology, Inc. System for editing complex visual data providing a continuously updated rendering
JP3646969B2 (en) * 1998-05-25 2005-05-11 富士通株式会社 3D image display device
US6201546B1 (en) * 1998-05-29 2001-03-13 Point Cloud, Inc. Systems and methods for generating three dimensional, textured models
US6417850B1 (en) * 1999-01-27 2002-07-09 Compaq Information Technologies Group, L.P. Depth painting for 3-D rendering applications

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0379225A2 (en) * 1989-01-20 1990-07-25 Daikin Industries, Limited Method and apparatus for displaying a translucent object
EP0447222A2 (en) * 1990-03-16 1991-09-18 Hewlett-Packard Company Gradient calculation for texture mapping
EP0666548A1 (en) * 1994-02-04 1995-08-09 International Business Machines Corporation Graphics system
WO1997034213A2 (en) * 1996-03-14 1997-09-18 I.I. Interactive Innovations Ltd. Computerized graphics systems

Also Published As

Publication number Publication date
KR20020031179A (en) 2002-04-26
DE60002487T2 (en) 2004-03-18
EP1214689A1 (en) 2002-06-19
EP1214689B1 (en) 2003-05-02
BR0013900A (en) 2002-05-07
CN1204532C (en) 2005-06-01
US20050001834A1 (en) 2005-01-06
MXPA02002373A (en) 2002-08-28
KR100700307B1 (en) 2007-03-29
US6972758B2 (en) 2005-12-06
AU6875500A (en) 2001-04-17
US6784882B1 (en) 2004-08-31
CN1373883A (en) 2002-10-09
TW475155B (en) 2002-02-01
CA2384569A1 (en) 2001-03-22
DE60002487D1 (en) 2003-06-05

Similar Documents

Publication Publication Date Title
US6784882B1 (en) Methods and apparatus for rendering an image including portions seen through one or more objects of the image
US6903741B2 (en) Method, computer program product and system for rendering soft shadows in a frame representing a 3D-scene
US6031542A (en) Image processing method and arrangement for the display of reflective objects
US20100238172A1 (en) Cone-culled soft shadows
JP2002236938A (en) Surface shading method using texture map stored on the basis of bidirectional reflection distribution function
Theoharis et al. The magic of the z-buffer: A survey
US7158133B2 (en) System and method for shadow rendering
US7184051B1 (en) Method of and apparatus for rendering an image simulating fluid motion, with recording medium and program therefor
JP2692783B2 (en) System and method for generating a two-dimensional representation of a three-dimensional object
JP3951362B2 (en) Image processing apparatus, game apparatus, method thereof, and recording medium
KR100295709B1 (en) Spotlight characteristic forming method and image processor using the same
JP3517637B2 (en) Image drawing method, image drawing apparatus, and recording medium
AU2017228700A1 (en) System and method of rendering a surface
JPH0546782A (en) Graphic display device
JP2518712B2 (en) Method and apparatus for producing high quality rendering drawing in computer graphic
US6130669A (en) Image processing method for stimulating structure in relief, and device for carrying out the method
JP3612239B2 (en) Image generating apparatus and recording medium
KR20220154780A (en) System and method for real-time ray tracing in 3D environment
CN116758208A (en) Global illumination rendering method and device, storage medium and electronic equipment
AU5248400A (en) Method and apparatus for rendering image
JPH0234069B2 (en)
Thalmann et al. Hidden Surfaces, Reflectance, and Shading
Yerex Rendering of multi-resolution graphics models captured from images
JP2004272856A (en) Glare image generating circuit
JPH04367083A (en) Image generating device

Legal Events

Date Code Title Description
AK Designated states (Kind code of ref document: A1; Designated state(s): AU BR CA CN KR MX NZ RU SG)
AL Designated countries for regional patents (Kind code of ref document: A1; Designated state(s): BE CH DE DK ES FI FR GB IT NL SE)
121 Ep: the epo has been informed by wipo that ep was designated in this application
DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
WWE Wipo information: entry into national phase (Ref document number: 68755/00; Country of ref document: AU)
WWE Wipo information: entry into national phase (Ref document number: PA/a/2002/002373; Country of ref document: MX)
ENP Entry into the national phase (Ref country code: RU; Ref document number: 2002 2002106402; Kind code of ref document: A; Format of ref document f/p: F)
WWE Wipo information: entry into national phase (Ref document number: 2384569; Country of ref document: CA; Ref document number: 008126623; Country of ref document: CN)
WWE Wipo information: entry into national phase (Ref document number: 1020027003209; Country of ref document: KR)
WWE Wipo information: entry into national phase (Ref document number: 2000957038; Country of ref document: EP)
WWE Wipo information: entry into national phase (Ref document number: 518225; Country of ref document: NZ)
WWP Wipo information: published in national office (Ref document number: 1020027003209; Country of ref document: KR)
WWP Wipo information: published in national office (Ref document number: 2000957038; Country of ref document: EP)
WWG Wipo information: grant in national office (Ref document number: 2000957038; Country of ref document: EP)