Publication number: US 5878174 A
Publication type: Grant
Application number: US 08/746,591
Publication date: Mar 2, 1999
Filing date: Nov 12, 1996
Priority date: Nov 12, 1996
Fee status: Lapsed
Inventors: Paul Joseph Stewart, Yifan Chen
Original Assignee: Ford Global Technologies, Inc.
Method for lens distortion correction of photographic images for texture mapping
US 5878174 A
Abstract
A method for correcting lens distortion in a photographic image used in texture mapping a CAD image uses a one-dimensional distortion correction curve. The photographic image is represented by an original texture image plane having a plurality of original points produced by a photographic device having a lens. The method has the steps of marking a plurality of data points in a predefined two-dimensional physical wall space and determining a wall space radial distance for each of the plurality of data points, taking a test photograph of the wall space with the photographic device, converting the test photograph to a digital two-dimensional image space, determining an image space radial distance for each of the plurality of data points in the image space, curve fitting the wall space radial distance and the corresponding image space radial distance pairs for each of the plurality of data points, and mapping each of the plurality of original points to a corresponding corrected point with the distortion correction curve so as to produce a corrected texture image plane having a plurality of corrected points.
Images (8)
Claims (8)
We claim:
1. A method for correcting lens distortion in an original texture image plane having a plurality of original points produced by a photographic device having a lens, the method comprising the steps of:
(a) creating a test image plane having a plurality of test points corresponding to a plurality of grid points on a physical grid;
(b) determining a test set of test radial distances for each of the plurality of test points;
(c) determining a grid set of grid radial distances for each of the plurality of grid points;
(d) fitting a distortion correction curve to a radial pair set formed by corresponding elements in the test set and the grid set; and
(e) mapping each of the plurality of original points to a corresponding corrected point with the distortion correction curve so as to produce a corrected texture image plane having a plurality of corrected points.
2. A method according to claim 1 wherein step (a) further includes the steps of:
(i) creating a rectangular grid on a planar surface;
(ii) marking a grid center point at a center of the grid;
(iii) marking grid corner points at vertices of the grid;
(iv) marking a plurality of grid sample points at predetermined locations on the grid;
(v) taking a photograph of the grid; and
(vi) converting the photograph to a digitized test image plane having a test center point, test corner points, and a plurality of test sample points corresponding to the grid center point, the grid corner points, and the plurality of grid sample points, respectively.
3. A method according to claim 2 wherein in step (vi) the photograph is converted to a digitized test image by scanning the photograph into a computer.
4. A method according to claim 2 wherein step (b) further includes the steps of:
(i) determining a test center of the photograph;
(ii) translating an origin of the test image plane to the test center;
(iii) determining a plurality of test radial distances by finding the distance between the test center and each of the plurality of test sample points; and
(iv) normalizing each of the plurality of test radial distances by a width of the photograph to produce a test set of radial distances.
5. A method according to claim 4 wherein step (c) further includes the steps of:
(i) determining a grid center of the grid;
(ii) translating an origin of the grid to the grid center;
(iii) determining a plurality of grid radial distances by finding the distance between the grid center and each of the plurality of grid sample points; and
(iv) normalizing each of the plurality of grid radial distances by a width of the grid to produce a grid set of radial distances.
6. A method according to claim 1 wherein the curve fitting in step (d) includes fitting a polynomial curve in the form
r' = a_1 r + a_2 r^2 + . . . + a_n r^n
where {a_j} are the coefficients of the polynomial, and r' and r represent a test radial distance and a grid radial distance, respectively.
7. A method according to claim 6 wherein the coefficients {a_j} of the polynomial are determined using a least-squares approximation.
8. A method for correcting lens distortion in an original texture image plane having a plurality of original points produced by a photographic device having a lens, the method comprising the steps of:
(a) marking a plurality of data points in a predefined two-dimensional physical wall space and determining a wall space radial distance for each of the plurality of data points;
(b) taking a test photograph of the wall space with the photographic device;
(c) converting the test photograph to a digital two-dimensional image space;
(d) determining an image space radial distance for each of the plurality of data points in the image space;
(e) fitting a distortion correction curve to the wall space radial distance and corresponding image space radial distance pairs for each of the plurality of data points; and
(f) mapping each of the plurality of original points to a corresponding corrected point with the distortion correction curve so as to produce a corrected texture image plane having a plurality of corrected points.
Description
FIELD OF THE INVENTION

The present invention relates to image distortion correction in general, and more specifically to correction of lens distortion in photographic images used for texture mapping.

BACKGROUND OF THE INVENTION

Texture mapping is the process of transferring a two-dimensional image onto a three-dimensional computer generated surface. For example, photographic pictures of gauges, dials and controls can be texture mapped onto a simple three-dimensional foundation surface to give a sophisticated rendering of an instrument panel without the need to geometrically model these details. Many automotive applications make use of this technique because it provides photo-realistic images at high frame rates in a cost-effective manner. Particular applications include, but are not limited to, concept vehicle design, product testing in a virtual environment and market evaluation studies.

Many commercial CAD systems are capable of creating and rendering texture maps. However, existing texture map techniques are only effective on planar or near planar surfaces. The mapped images tend to warp and distort when applied to free-form surface geometries common in automotive CAD applications. An example of such distortion is seen by comparing FIG. 1, a photograph of a vehicle tail lamp, with FIG. 2, a texture mapped CAD surface of the same tail lamp. The anomalies apparent in FIG. 2 can be corrected by a commercial artist, but this approach is both expensive and non-deterministic.

The warping and distortion seen in FIG. 2 can be traced to the transfer function used by the texture mapping technique, which is similar to those used in most current texture mapping techniques. These techniques use a transfer function which relates the surface points of a three-dimensional object to locations in a two-dimensional image space. The function can be defined in either direction; that is, the object can be mapped to the image space or the image space can be mapped to the object. Typically, mapping from the image space to the object is done geometrically by projecting a two-dimensional algebraic surface, such as a plane or sphere representing the image space, onto the object. Any mismatch between the algebraic surface and the three-dimensional surface results in an image distortion (FIG. 2). Mapping from the object to the image space must be procedural and can, therefore, be non-deterministic. Various procedures include node-relaxation and surface distance and direction methods, which still introduce image warping. Furthermore, to be computationally feasible these transfer functions are of low order, resulting in discontinuities in the mapped image.

Other existing methods can provide better results by splitting individual details into many small components, each relatively flat and with a separate texture map. However, this reduces flexibility and requires a significant amount of time and effort to produce a finished model. In U.S. Pat. No. 5,255,352, the problem of texture mapping two-dimensional texture images onto developable surfaces, that is, surfaces which can be made solely from flat panels, is addressed. This method is not applicable to free-form surfaces, however, and thus not of practical importance for the automotive industry or other industries desiring a more robust procedure. Another procedure utilizes a polygon fragmentation method of distortion correction for computer image generating systems, as disclosed in U.S. Pat. No. 5,319,744. This method tries to solve the distortion problem by pre-distorting the texture image. The method, however, does not take into account the distortion due to the perspective model used to capture the original texture image. Furthermore, the pre-distortion technique introduces several discontinuities into the mapped image.

Thus, a method is needed to efficiently texture map photographs of large sections of vehicle components onto the surface of a three-dimensional CAD object without distortion.

SUMMARY OF THE INVENTION

In response to deficiencies in the related art, the present invention provides a method for correcting lens distortion in a photographic image used in texture mapping a CAD image by using a one-dimensional distortion correction curve. The photographic image is represented by an original texture image plane having a plurality of original points produced by a photographic device having a lens. The method has the steps of creating a test image plane having a plurality of test points corresponding to a plurality of grid points on a physical grid, determining a test set of test radial distances for each of the plurality of test points, determining a grid set of grid radial distances for each of the plurality of grid points, fitting a distortion correction curve to a radial pair set formed by corresponding elements in the test set and the grid set, and mapping each of the plurality of original points to a corresponding corrected point with the distortion correction curve so as to produce a corrected texture image plane having a plurality of corrected points.

An advantage of the present invention is a method which provides a faithful and distortion-free texture-mapping of photographic images onto CAD models.

Another advantage of the present invention is a camera projection method which cancels out distortions that occur at non-perpendicular angles of incidence so that photographs of the physical object can be taken from any angle as long as the area of interest is visible.

A feature of the present invention is a method which uses an empirically determined, one-dimensional distortion correction function.

BRIEF DESCRIPTION OF THE DRAWINGS

These and other objects, advantages, and features of the present invention will be apparent to those skilled in the art upon reading the following description with reference to the accompanying drawings, in which:

FIG. 1 is a photograph of an automotive vehicle rear tail light;

FIG. 2 is a CAD image of the tail light of FIG. 1 showing distortion due to mapping the photograph to the CAD image;

FIG. 3 is a perspective view showing image texture creation according to one embodiment of the present invention;

FIG. 4 is a perspective view showing texture mapping of the texture image created in FIG. 3 to an object;

FIG. 5 is a flow chart showing the texture image creation portion of the method according to the present invention;

FIG. 6 is a flow chart showing the texture mapping portion of the method according to the present invention;

FIG. 7 is a diagram demonstrating radial distortion of an image due to a lens of a photographic device;

FIG. 8 is a flow chart showing a lens distortion compensation method according to the present invention;

FIG. 9 is a grid showing sample data points for use with the lens distortion compensation method of FIG. 8;

FIG. 10 is a flow chart showing a procedure for creating a test image for use in conjunction with the lens distortion compensation method of FIG. 8;

FIG. 11 is a flow chart showing an image space radial distance computation and normalization portion of the lens distortion method;

FIG. 12 is a grid demonstrating the normalization procedures used in conjunction with the lens distortion compensation method of the present invention;

FIG. 13 is a flow chart showing a wall space radial distance normalization portion of the lens distortion method; and

FIG. 14 is a flow chart showing creation of a corrected texture image plane using a correction function determined with the lens distortion method of the present invention.

DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENT

Referring now to the drawings, and in particular to FIGS. 3 and 4 thereof, two portions of the texture mapping method of the present invention are shown graphically. The apparatus shown in FIG. 3 will be described with the texture creation portion of the method shown in the flow chart of FIG. 5, whereas the texture mapping portion of FIG. 4 corresponds to the flow chart of FIG. 6.

Referring now to FIG. 3, a camera 10 is positioned so that several feature points 12 on a physical object 14, such as a cube with a star thereon, are viewable by the camera 10. The feature points 12 are preferably selected as readily identifiable features of the physical object 14, such as the corners of the cube and the points of the star (FIG. 3). A photograph 16 is then taken of the physical object 14 with the photographic device 10. For purposes of this disclosure, a photographic device encompasses any device for creating a two-dimensional image of an object or a scene. Such devices include, for example, a hand-held camera, a television camera, a camcorder, and other similar devices. For purposes of this disclosure, a photograph means a two-dimensional image produced by a photographic device, regardless of the form that the image takes. The photograph, however, must be convertible into a digital image. Conversion of the photograph into a digital image produces an original texture image plane having a plurality of texture image plane points. A two-dimensional coordinate system, or texture coordinate system, is established for the original texture image plane. Each of the texture image plane points has a pair of texture coordinates associated therewith, determined by its location measured along the two axes of the texture coordinate system, generally in pixel (integer) units.

Turning now to FIG. 5, the texture creation portion of the texture mapping method of the present invention will be described. In box 20, the feature points on the physical object are identified. The physical camera is then positioned in box 22 so that the feature points 12 are viewable by the camera. The position of the camera is limited only by the line-of-sight rule, which requires that the entire texture region be free from occlusion. That is, the camera must be positioned to obtain an image that contains the entire region of the physical object for which a texture mapping to a CAD surface is to be performed. It should be noted that the view direction of the camera need not be perpendicular to the physical object surface. When the camera has been properly positioned, a picture is taken of the physical object as represented in box 24. It should be understood that photograph is used interchangeably with picture for purposes of this disclosure. The picture is then converted in box 26 into a digital texture image plane, referred to as an original texture image plane. Such a conversion can be accomplished, for example, by scanning the photograph into a computer. Other ways of converting the picture into a digital texture image plane are also possible; the steps in boxes 24 and 26 of FIG. 5 can be accomplished together with a single photographic device such as a camcorder. Other methods will occur to those skilled in the art, such as a digital camera, in which case the step in box 26 is not necessary since the output from a digital camera is already in digital format.

Continuing with FIG. 5, the original texture image plane is corrected in box 28 for radial distortion due to a lens 18 (FIG. 3) of the camera. This correction of the radial distortion of the photographic device, which is described in further detail below, results in a corrected texture image plane. Next, a physical location and physical orientation of the photographic device with respect to the physical object are determined in boxes 30, 32. In box 30, the coordinates of the feature points identified in box 20 are measured in the corrected texture image plane. The physical location and physical orientation of the photographic device can then be determined in box 32. Those skilled in the art will recognize that there are several procedures for determining the location and orientation of the camera based upon the coordinates of the feature points as captured in the corrected texture image plane.

Turning now to FIGS. 4 and 6, the texture mapping portion of the present invention will be described. In FIG. 4, a synthetic camera 40 is constructed, for example in accordance with known techniques. The synthetic camera 40 is then positioned in the CAD space with respect to the three-dimensional surface 42 according to the physical location and orientation of the photographic device determined in boxes 30, 32 of FIG. 5. Each point of the three-dimensional surface (CAD object) corresponding to one of the surface points viewable by the physical camera, including the feature points 12, is then mapped to a corresponding point in the corrected texture image plane 44. The texture coordinates for each point of the corrected texture image plane are then assigned to the corresponding surface point of the three-dimensional surface (CAD object).

The texture mapping process is shown in FIG. 6. In box 50, the synthetic camera is constructed, and in box 52 the synthetic camera is positioned in the CAD space according to the physical location and physical orientation of the camera. It should be understood that the synthetic camera model must closely match the lens and image plane configuration of the physical camera. Then, for each surface point viewable by the synthetic camera, the steps in boxes 54-60 are accomplished. Beginning in box 54, a surface point on the three-dimensional surface is selected. In box 56, the surface point is then mapped onto a corresponding point in the corrected texture image plane through the synthetic camera. In box 58, the location of the image point in the corrected texture plane is determined, and the texture coordinates for the image point are then assigned to the corresponding surface point in box 60. The method continues until all surface points of the three-dimensional surface have been assigned texture coordinates.
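The mapping in boxes 54-60 can be sketched with a simple pinhole projection. This is a minimal illustration, not the patent's implementation: the function name, the pose parameters, and the use of NumPy are all assumptions, and a real synthetic camera must closely match the physical camera's lens and image plane configuration as noted above.

```python
import numpy as np

def project_to_texture(point_3d, cam_pos, cam_rot, focal, img_w, img_h):
    """Map a 3D surface point to 2D texture coordinates through a
    synthetic pinhole camera placed at the recovered physical camera
    pose. All names and parameters here are illustrative."""
    # Express the point in the camera's coordinate frame.
    p_cam = np.asarray(cam_rot, float) @ (
        np.asarray(point_3d, float) - np.asarray(cam_pos, float))
    if p_cam[2] <= 0.0:
        return None  # behind the camera: not viewable, no texture assigned
    # Perspective divide onto the image plane at distance `focal`.
    u = focal * p_cam[0] / p_cam[2]
    v = focal * p_cam[1] / p_cam[2]
    # Shift so texture coordinates are measured from the image corner.
    return (u + img_w / 2.0, v + img_h / 2.0)

# A point on the optical axis lands at the image center.
uv = project_to_texture([0.0, 0.0, 5.0], [0.0, 0.0, 0.0],
                        np.eye(3), 100.0, 640, 480)
```

Each visible surface point is pushed through such a projection, and the resulting texture coordinates are stored with the surface point for later rendering.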

After all the surface points of the three-dimensional surface have texture coordinates assigned thereto, a texture rendering system, of which there are several known in the art, can be used to display the textured CAD surface.

As described above, texture image creation involves the use of photographic devices, sometimes referred to as image acquisition equipment. Whether the equipment is a typical camera, a digital camera, or a camcorder, the image is always formed through an optical lens or a group of optical lenses. It is known that images formed through optical lenses often exhibit distortions. The severity of the distortion depends on the type of lenses used and, more importantly, the focal length, or equivalently the view angle, of the lens used. For instance, a wide-angle lens often creates more severe distortion than a telescopic lens.

A synthetic camera used in computer graphics, however, is often constructed based on a primitive pin-hole model of image formation. The use of this model typically does not include image distortion, owing to the theoretical nature of the model, as opposed to the distortion which is known to exist in the physical world as described above. Thus, to faithfully map an image obtained of a physical object, i.e., with a physical camera having a lens potentially causing distortion, onto a computer generated CAD surface, it is necessary to first correct the image distortion due to the lens. The following lens distortion compensation method according to the present invention was developed for this purpose.

The lens distortion compensation method is based on the assumption that, due to the symmetry of an optical lens in its angular direction and asymmetry in its radial direction, distortion occurs only in the radial direction of an original texture image plane. This assumption is called the radial-distortion assumption. For instance, given a point P' in the original texture image plane 62, its actual, correct location P should lie somewhere along the radial direction of the point P', i.e., the direction from the center A' of the original texture image plane 62 to the point P', as illustrated in FIG. 7. Based on the radial-distortion assumption, a distorted image can be corrected using a one-dimensional correction function, r' = f(r), where r' denotes the radial distance of any point in the original texture image plane 62 and r the radial distance of the corresponding point in the corrected texture image plane 64.

The lens distortion compensation method is shown in FIG. 8. In box 70, a test image plane is created using the photographic device to photograph a test grid, as is further described below with reference to FIG. 10. In boxes 72, 74, radial distances for all points in the test image plane and the test grid, respectively, are determined and normalized, as further described in FIGS. 11 and 13, respectively. After the set of radial distances {r_i, r'_i} has been determined, a curve fitting method is used in box 76 to produce the one-dimensional polynomial correction function r' = f(r). This function is then used in box 77 to form the corrected texture image plane, as further described in FIG. 14.

To determine the set of data points {r_i, r'_i} for use in the polynomial curve fitting method, a data acquisition method according to the present invention uses a physical rectangular grid 78 (FIG. 9) with known coordinates for its vertices. The grid 78 is posted on a physical wall and is then photographed with the same photographic device used above to create the original texture image plane. For this reason, the grid 78 on the physical wall is sometimes referred to as a wall space and the coordinates measured, the wall coordinates. Preferably, the following data acquisition method is performed before creation of the original texture image plane, and need only be done once for each photographic device. The data acquisition method has two sub-procedures: 1) create a test image plane from data points on the grid 78 (FIGS. 9-10); and 2) find the data set {r_i, r'_i} (FIGS. 11-13). In the first sub-procedure (FIG. 10), the rectangular grid 78 is posted on a wall in box 80 and the camera is positioned so as to view the grid 78 within the camera's viewfinder. A center point P_0 is marked in box 82 on the rectangular grid 78 to be captured by the camera, and four corner points C_0-C_3 of the camera's viewfinder are marked as the corners of the rectangular region 79 in box 84. In box 86, additional data points P_i, i=1 to n, are then marked on the rectangular grid 78. A photograph of the rectangular region 79 is taken with the camera in box 88, and the photograph is digitally converted, for example by scanning it into a computer, to create a test image plane in box 89.

In the second sub-procedure of the data acquisition method, the set of radial distances {r_i, r'_i} is determined from the test image plane and the rectangular region 79 captured in the camera's viewfinder (FIGS. 11-13). Beginning with FIGS. 11 and 12, the geometric center of the test image plane is measured and marked as A' in box 90, and the original origin O' of the test image plane is translated to A' in box 92. Next, the radial distance in the test image plane for each test point P'_i, i=0 to n, is found in box 94 as follows:

r'_i = ((x'_i - x'_a)^2 + (y'_i - y'_a)^2)^0.5

In box 96, each of these radial distances {r'_i} is normalized by the width of the photograph as follows:

r'_i = 4 r'_i / (||C'_2 - C'_0|| + ||C'_1 - C'_3||)

where || · || denotes the distance between the indicated corner points.

The test image plane radial distances of the data set {r_i, r'_i} have thus been determined.
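The radial distance and normalization steps above can be sketched as follows. The function and variable names are illustrative assumptions, and the corners C_0-C_3 are assumed to be ordered around the rectangle so that C_2-C_0 and C_1-C_3 are its diagonals, as in the formula above.

```python
import math

def normalized_test_radii(points, center, corners):
    """Radial distance of each test point from the measured center A',
    scaled by the 4/(||C2 - C0|| + ||C1 - C3||) normalization factor.
    `corners` holds C0..C3 ordered around the rectangle; all names
    here are illustrative."""
    xa, ya = center
    def dist(p, q):
        return math.hypot(p[0] - q[0], p[1] - q[1])
    # Normalization factor built from the two diagonals of the region.
    scale = 4.0 / (dist(corners[2], corners[0]) + dist(corners[1], corners[3]))
    return [math.hypot(x - xa, y - ya) * scale for x, y in points]

# Unit square with center (0.5, 0.5): a corner point normalizes to 1.0.
radii = normalized_test_radii([(1.0, 1.0)], (0.5, 0.5),
                              [(0, 0), (1, 0), (1, 1), (0, 1)])
```

The same computation, applied to the wall coordinates of the grid points and the corners of rectangular region 79, yields the normalized grid radial distances {r_i}.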

Next, with reference to FIG. 13, the location of A' in the wall space (i.e., A) is determined based on its location in the grid 78 in box 100, and the original origin O of the grid 78 is translated to A in box 102. The radial distances of the sample points P_i, i=0 to n, on the grid 78 are then determined in box 104 as follows:

r_i = ((x_i - x_a)^2 + (y_i - y_a)^2)^0.5

where (x_a, y_a) is the location of the center A in the wall coordinates. Each of these radial distances {r_i} is normalized by the width of the rectangular region 79 in box 106 as follows:

r_i = 4 r_i / (||C_2 - C_0|| + ||C_1 - C_3||)

With the set of radial distances {r_i, r'_i} having been determined, the correction function r' = f(r) can be found using various methods known to those skilled in the art to fit a given set of data points. Preferably, a curve fitting method is used which fits a polynomial curve in the following form:

r' = a_1 r + a_2 r^2 + . . . + a_n r^n

where {a_j} are the coefficients of the polynomial, which can be determined using known techniques, for example a least-squares approximation.
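A least-squares fit of this polynomial can be sketched as follows. The constant term is omitted so that f(0) = 0, i.e., the image center is left undistorted, consistent with the form above; the use of NumPy and the degree-3 choice are illustrative assumptions.

```python
import numpy as np

def fit_correction_curve(r, r_prime, degree=3):
    """Least-squares fit of r' = a_1*r + a_2*r^2 + ... + a_n*r^n.
    Degree 3 is an illustrative choice, not specified by the patent."""
    r = np.asarray(r, float)
    # Vandermonde-style matrix without the constant column: [r, r^2, ..., r^n].
    A = np.column_stack([r ** k for k in range((1), degree + 1)])
    coeffs, *_ = np.linalg.lstsq(A, np.asarray(r_prime, float), rcond=None)
    return coeffs  # [a_1, a_2, ..., a_n]

def apply_curve(coeffs, r):
    """Evaluate r' = f(r) with the fitted coefficients."""
    return sum(a * r ** (k + 1) for k, a in enumerate(coeffs))

# Synthetic barrel-like distortion r' = r - 0.1*r^3 is recovered exactly.
rs = np.linspace(0.05, 1.0, 20)
coeffs = fit_correction_curve(rs, rs - 0.1 * rs ** 3, degree=3)
```

Because the model is linear in the coefficients {a_j}, an ordinary linear least-squares solve suffices; no iterative optimization is needed.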

With the correction function developed as described above, a corrected texture image plane can be created based on the original texture image plane, as seen by the process shown in FIG. 14. First, for a point P(x,y) in the corrected texture image plane, a radial distance r is determined as follows in box 110:

r = ((x - x_a)^2 + (y - y_a)^2)^0.5

where A(x_a, y_a) is the geometric center of the corrected texture image plane, and where r is normalized as described above. The one-dimensional correction function, r' = f(r), is then used in box 112 to find a corresponding radial distance, r', in the original texture image plane. The location coordinates of the original point in the original texture image plane are then determined as follows in box 114:

x' = x (r'/r) + x'_a

y' = y (r'/r) + y'_a

where (x'_a, y'_a) is the center of the original texture image plane. Finally, the texture coordinates from the original point are assigned to the corrected point in box 116 for rendering point P in the corrected texture image plane. The process of FIG. 14 is repeated for each point in the corrected texture image plane.
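The per-point correction of FIG. 14 can be sketched as follows. This is a minimal illustration assuming the radial normalization has already been applied and that, for simplicity, the corrected and original planes share a common center; the names are illustrative.

```python
import math

def correct_point(x, y, center, f):
    """For a corrected-plane point P(x, y), find the source location
    P'(x', y') in the original (distorted) texture image plane using
    the one-dimensional correction curve r' = f(r). Assumes both
    planes share `center`; width normalization is omitted for brevity."""
    xa, ya = center
    dx, dy = x - xa, y - ya
    r = math.hypot(dx, dy)
    if r == 0.0:
        return (xa, ya)  # the center maps to itself
    r_prime = f(r)
    # Move along the same radial direction, rescaled by r'/r.
    return (dx * (r_prime / r) + xa, dy * (r_prime / r) + ya)

# With the identity curve f(r) = r, every point maps to itself.
p = correct_point(3.0, 4.0, (0.0, 0.0), lambda r: r)
```

Looking up the original pixel at (x', y') for every corrected pixel (inverse mapping) avoids the holes that a forward mapping from original to corrected pixels would leave.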

Although the preferred embodiment of the present invention has been disclosed, various changes and modifications may be made without departing from the scope of the invention as set forth in the appended claims.

Patent Citations
Cited Patent | Filing date | Publication date | Applicant | Title
US3962588 * | Jun 12, 1975 | Jun 8, 1976 | Matteo Paul L Di | Method for making record of objects
US4298944 * | Jun 22, 1979 | Nov 3, 1981 | Siemens Gammasonics, Inc. | Distortion correction method and apparatus for scintillation cameras
US4586038 * | Dec 12, 1983 | Apr 29, 1986 | General Electric Company | In a computer image generator
US4842411 * | Feb 6, 1986 | Jun 27, 1989 | Vectron, Inc. | Method of automatically measuring the shape of a continuous surface
US4862388 * | Dec 15, 1986 | Aug 29, 1989 | General Electric Company | Dynamic comprehensive distortion correction in a real time imaging system
US5159361 * | Aug 3, 1990 | Oct 27, 1992 | Par Technology Corporation | Method and apparatus for obtaining the topography of an object
US5255352 * | Sep 23, 1992 | Oct 19, 1993 | Computer Design, Inc. | Mapping of two-dimensional surface detail on three-dimensional surfaces
US5319744 * | Apr 3, 1991 | Jun 7, 1994 | General Electric Company | Polygon fragmentation method of distortion correction in computer image generating systems
US5638461 * | Apr 12, 1996 | Jun 10, 1997 | Kollmorgen Instrument Corporation | Stereoscopic electro-optical system for automated inspection and/or alignment of imaging devices on a production assembly line
US5642293 * | Jun 3, 1996 | Jun 24, 1997 | Camsys, Inc. | Method and apparatus for determining surface profile and/or surface strain
US5694533 * | Jul 19, 1996 | Dec 2, 1997 | Sony Corporation | 3-Dimensional model composed against textured midground image and perspective enhancing hemispherically mapped backdrop image for visual realism
US5721585 * | Aug 8, 1996 | Feb 24, 1998 | Keast; Jeffrey D. | Digital video panoramic image capture and display system
US5768443 * | Dec 19, 1995 | Jun 16, 1998 | Cognex Corporation | Method for coordinating multiple fields of view in multi-camera
Non-Patent Citations
1. Foley et al., "Computer Graphics: Principles and Practice," Second Edition, Addison-Wesley Publishing Co., Reading, MA, 1992, pp. 299-301.
2. Heckbert, "Survey of Texture Mapping," IEEE CG&A, 1986, pp. 56-67.
3. Blinn et al., "Texture and Reflection in Computer Generated Images," Communications of the ACM, Oct. 1976, vol. 19, no. 10, pp. 542-547.
4. Haeberli et al., "Texture Mapping as a Fundamental Drawing Primitive," Proc. Fourth Eurographics Workshop on Rendering, Paris, France, Jun. 1993, pp. 1-8.
5. Gagalowicz, "Texture Modelling Applications," The Visual Computer, 1987, pp. 186-200.
6. Bier et al., "Two-Part Texture Mappings," IEEE CG&A, 1986, pp. 40-53.
Referenced by
Citing Patent | Filing date | Publication date | Applicant | Title
US6002525 * | Jul 6, 1998 | Dec 14, 1999 | Intel Corporation | Correcting lens distortion
US6804380 * | May 18, 2000 | Oct 12, 2004 | Leica Geosystems Hds, Inc. | System and method for acquiring tie-point location information on a structure
US6850250 * | Aug 28, 2001 | Feb 1, 2005 | Sony Corporation | Method and apparatus for a declarative representation of distortion correction for add-on graphics in broadcast video
US7173672 | Aug 8, 2002 | Feb 6, 2007 | Sony Corporation | System and method for transitioning between real images and virtual images
US7339609 | Aug 8, 2002 | Mar 4, 2008 | Sony Corporation | System and method for enhancing real-time data feeds
US7750956 | Nov 9, 2005 | Jul 6, 2010 | Nvidia Corporation | Using a graphics processing unit to correct video and audio data
US7756297 * | Oct 31, 2007 | Jul 13, 2010 | Pryor Timothy R | Camera based sensing in handheld, mobile, gaming, or other devices
US7933431 * | Jul 12, 2010 | Apr 26, 2011 | Pryor Timothy R | Camera based sensing in handheld, mobile, gaming, or other devices
US8022965 | May 22, 2006 | Sep 20, 2011 | Sony Corporation | System and method for data assisted chroma-keying
US8023758 | Aug 7, 2007 | Sep 20, 2011 | Qualcomm Incorporated | Surface mesh matching for lens roll-off correction
US8111239 | May 8, 2006 | Feb 7, 2012 | Motion Games, Llc | Man machine interfaces and applications
US8194924 | Mar 18, 2011 | Jun 5, 2012 | Pryor Timothy R | Camera based sensing in handheld, mobile, gaming or other devices
US8306635 | Jan 23, 2009 | Nov 6, 2012 | Motion Games, Llc | Motivation and enhancement of physical and mental exercise, rehabilitation, health and social interaction
US8373718 | Dec 10, 2008 | Feb 12, 2013 | Nvidia Corporation | Method and system for color enhancement with color volume adjustment and variable shift along luminance axis
US8405604 | Oct 31, 2007 | Mar 26, 2013 | Motion Games, Llc | Advanced video gaming methods for education and play using camera based inputs
US8456547 | Dec 31, 2009 | Jun 4, 2013 | Nvidia Corporation | Using a graphics processing unit to correct video and audio data
US8456548 | Dec 31, 2009 | Jun 4, 2013 | Nvidia Corporation | Using a graphics processing unit to correct video and audio data
US8456549 | Dec 31, 2009 | Jun 4, 2013 | Nvidia Corporation | Using a graphics processing unit to correct video and audio data
US8457350 | Sep 2, 2011 | Jun 4, 2013 | Sony Corporation | System and method for data assisted chroma-keying
US8471852 | May 30, 2003 | Jun 25, 2013 | Nvidia Corporation | Method and system for tessellation of subdivision surfaces
US8538562 | Apr 5, 2010 | Sep 17, 2013 | Motion Games, Llc | Camera based interactive exercise
US8553079 | Dec 14, 2012 | Oct 8, 2013 | Timothy R. Pryor | More useful man machine interfaces and applications
US8570634 | Oct 11, 2007 | Oct 29, 2013 | Nvidia Corporation | Image processing of an incoming light field using a spatial light modulator
US8571346 | Oct 26, 2005 | Oct 29, 2013 | Nvidia Corporation | Methods and devices for defective pixel detection
US8588542 | Dec 13, 2005 | Nov 19, 2013 | Nvidia Corporation | Configurable and compact pixel processing apparatus
US8594441 | Sep 12, 2006 | Nov 26, 2013 | Nvidia Corporation | Compressing image-based data using luminance
US8614668 | Oct 6, 2011 | Dec 24, 2013 | Motion Games, Llc | Interactive video based games using objects sensed by TV cameras
US8624893 * | Jul 31, 2009 | Jan 7, 2014 | Adobe Systems Incorporated | System and method for generating 2D texture coordinates for 3D meshed surfaces
US8654198 | Apr 30, 2012 | Feb 18, 2014 | Timothy R. Pryor | Camera based interaction and instruction
US8698908 | Feb 11, 2008 | Apr 15, 2014 | Nvidia Corporation | Efficient method for reducing noise and blur in a composite still image from a rolling shutter camera
US8698918 | Dec 30, 2009 | Apr 15, 2014 | Nvidia Corporation | Automatic white balancing for photography
US8712183 | Apr 2, 2010 | Apr 29, 2014 | Nvidia Corporation | System and method for performing image correction
US8723801 | Mar 26, 2013 | May 13, 2014 | Gesture Technology Partners, Llc | More useful man machine interfaces and applications
US8723969 | Mar 20, 2007 | May 13, 2014 | Nvidia Corporation | Compensating for undesirable camera shakes during video capture
US8724895 | Jul 23, 2007 | May 13, 2014 | Nvidia Corporation | Techniques for reducing color artifacts in digital images
US8736548 | Mar 26, 2013 | May 27, 2014 | Timothy R. Pryor | Interactive video based games using objects sensed by TV cameras
US8736605 | Sep 24, 2013 | May 27, 2014 | Adobe Systems Incorporated | Method and apparatus for constraint-based texture generation
US8737832 | Feb 9, 2007 | May 27, 2014 | Nvidia Corporation | Flicker band automated detection system and method
US8749662 | Apr 1, 2010 | Jun 10, 2014 | Nvidia Corporation | System and method for lens shading image correction
US8750641 | Nov 17, 2011 | Jun 10, 2014 | Postech Academy—Industry Foundation | Apparatus and method for correcting distortion of image
US8760398 | Dec 14, 2012 | Jun 24, 2014 | Timothy R. Pryor | Interactive video based games using objects sensed by TV cameras
US8768160 | Dec 30, 2009 | Jul 1, 2014 | Nvidia Corporation | Flicker band automated detection system and method
US8780128 | Dec 17, 2007 | Jul 15, 2014 | Nvidia Corporation | Contiguously packed data
EP1486916A1 * | Jun 13, 2003 | Dec 15, 2004 | STMicroelectronics S.r.l. | Digital image processing method for adaptive sharpening
WO2004075111A2 * | Jan 23, 2004 | Sep 2, 2004 | Aubauer Roland | Apparatus and method for rectifying an image recorded at a wide angle
Classifications
U.S. Classification: 382/293, 382/254
International Classification: G06T15/04, G06T7/00, G06T5/00
Cooperative Classification: G06T15/04, G06T5/006
European Classification: G06T5/00G, G06T15/04
Legal Events
Date | Code | Event
May 1, 2007 | FP | Expired due to failure to pay maintenance fee
    Effective date: 20070302
Mar 2, 2007 | LAPS | Lapse for failure to pay maintenance fees
Sep 20, 2006 | REMI | Maintenance fee reminder mailed
Dec 19, 2002 | AS | Assignment
    Owner name: CENTRAL MICHIGAN UNIVERSITY, MICHIGAN
    Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:FORD GLOBAL TECHNOLOGIES, INC.;REEL/FRAME:013305/0305
    Effective date: 20021211
    Owner name: CENTRAL MICHIGAN UNIVERSITY 1303 W. CAMPUS DRIVE M
Sep 17, 2002 | REMI | Maintenance fee reminder mailed
Aug 26, 2002 | FPAY | Fee payment
    Year of fee payment: 4
May 2, 1997 | AS | Assignment
    Owner name: FORD GLOBAL TECHNOLOGIES, INC., MICHIGAN
    Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:FORD MOTOR COMPANY;REEL/FRAME:008564/0053
    Effective date: 19970430
Mar 3, 1997 | AS | Assignment
    Owner name: FORD MOTOR COMPANY, MICHIGAN
    Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:STEWART, PAUL JOSEPH;CHEN, YIFAN;REEL/FRAME:008386/0885
    Effective date: 19961101