
Publication number: US RE36207 E
Publication type: Grant
Application number: US 08/662,410
Publication date: May 4, 1999
Filing date: Jul 12, 1996
Priority date: May 13, 1991
Fee status: Paid
Also published as: DE69232663D1, DE69232663T2, EP0539565A1, EP0539565A4, EP0971540A1, EP0971540B1, US5185667, WO1992021208A1
Inventors: Steven D. Zimmermann, H. Lee Martin
Original assignee: Omniview, Inc.
Omniview motionless camera orientation system
US RE36207 E
Abstract
A device for omnidirectional image viewing providing pan-and-tilt orientation, rotation, and magnification within a hemispherical field-of-view that utilizes no moving parts. The imaging device is based on the principle that the image from a fisheye lens, which produces a circular image of an entire hemispherical field-of-view, can be mathematically corrected using high-speed electronic circuitry. More specifically, an incoming fisheye image from any image acquisition source is captured in the memory of the device, a transformation is performed for the viewing region of interest and viewing direction, and a corrected image is output as a video image signal for viewing, recording, or analysis. As a result, this device can accomplish the functions of pan, tilt, rotation, and zoom throughout a hemispherical field-of-view without the need for any mechanical mechanisms. The preferred embodiment of the image transformation device can provide corrected images at real-time rates, compatible with standard video equipment. The device can be used for any application where a conventional pan-and-tilt or orientation mechanism might be considered, including inspection, monitoring, surveillance, and target acquisition.
Claims(11)
I claim:
1. A device for providing perspective corrected views of a selected portion of a hemispherical view in a desired format that utilizes no moving parts, which comprises:
a camera imaging system for receiving optical images and for producing output signals corresponding to said optical images;
fisheye lens means attached to said camera imaging system for producing said optical images, throughout said hemispherical field-of-view, for optical conveyance to said camera imaging system;
image capture means for receiving said output signals from said camera imaging system and for digitizing said output signals from said camera imaging system;
input image memory means for receiving said digitized signals;
image transform processor means for processing said digitized signals in said input image memory means according to selected viewing angles and magnification, and for producing output transform calculation signals according to a combination of said digitized signals, said selected viewing angles and said selected magnification;
output image memory means for receiving said output signals from said image transform processor means;
input means for selecting said viewing angles and magnification;
microprocessor means for receiving said selected viewing angles and magnification from said input means and for converting said selected viewing angles and magnification for input to said image transform processor means to control said processing of said transform processor means; and
output means connected to said output image memory means for recording said perspective corrected view according to said selected viewing angles and magnification.
2. The device of claim 1 wherein said output means includes image display means for providing a perspective corrected image display according to said selected viewing angle and said magnification.
3. The device of claim 1 wherein said input means further provides for input of a selected portion of said hemispherical view to said transform processor means.
4. The device of claim 1 wherein said input means further provides for input of a selected tilting of said viewing angle through 180 degrees.
5. The device of claim 1 wherein said input means further provides for input of a selected rotation of said viewing angle through 360 degrees to achieve said perspective corrected view.
6. The device of claim 1 wherein said input means further provides for input of a selected pan of said viewing angle through 180 degrees.
7. The device of claim 1 wherein said output transform calculation signals of said image transform processor means are produced in real-time at video rates.
8. The device of claim 1 wherein said input means is a user-operated manipulator switch means.
9. The device of claim 1 wherein said image transform processor means is programmed to implement the following two equations:

x=R(uA-vB+mR sin β sin ∂)/√(u²+v²+m²R²)

y=R(uC-vD-mR sin β cos ∂)/√(u²+v²+m²R²)

where:

A=(cos ø cos ∂-sin ø sin ∂ cos β)

B=(sin ø cos ∂+cos ø sin ∂ cos β)

C=(cos ø sin ∂+sin ø cos ∂ cos β)

D=(sin ø sin ∂-cos ø cos ∂ cos β)

and where:

R=radius of the image circle

β=zenith angle

∂=azimuth angle in image plane

ø=object plane rotation angle

m=magnification

u,v=object plane coordinates

x,y=image plane coordinates
10. A device for providing perspective corrected views of a selected portion of a hemispherical view in a desired format that utilizes no moving parts, which comprises:
a camera imaging system for receiving optical images and for producing output signals corresponding to said optical images;
fisheye lens means attached to said camera imaging system for producing said optical images, throughout said hemispherical field-of-view, for optical conveyance to said camera imaging system;
image capture means for receiving said output signals from said camera imaging system and for digitizing said output signals from said camera imaging system;
input image memory means for receiving said digitized signals;
image transform processor means for processing said digitized signals in said input image memory means according to selected viewing angles and magnification, and for producing output signals according to a combination of said digitized signals, said selected viewing angles and said selected magnification, according to the equations:

x=R(uA-vB+mR sin β sin ∂)/√(u²+v²+m²R²)

y=R(uC-vD-mR sin β cos ∂)/√(u²+v²+m²R²)

where:

A=(cos ø cos ∂-sin ø sin ∂ cos β)

B=(sin ø cos ∂+cos ø sin ∂ cos β)

C=(cos ø sin ∂+sin ø cos ∂ cos β)

D=(sin ø sin ∂-cos ø cos ∂ cos β)

and where:

R=radius of the image circle

β=zenith angle

∂=azimuth angle in image plane

ø=object plane rotation angle

m=magnification

u,v=object plane coordinates

x,y=image plane coordinates
output image memory means for receiving said output signals from said image transform processor means;
input means for selecting said viewing angles and magnification;
microprocessor means for receiving said selected viewing angles and magnification from said input means and for converting said selected viewing angles and magnification for input to said image transform processor means to control said processing of said transform processor means; and
output means connected to said output image memory means for recording said perspective corrected views according to said selected viewing angles and magnification.
11. A device for providing perspective corrected views of a selected portion of a hemispherical view in a desired format that utilizes no moving parts, which comprises:
a camera imaging system for receiving optical images and for producing output signals corresponding to said optical images;
fisheye lens means attached to said camera imaging system for producing said optical images, throughout said hemispherical field-of-view, for optical conveyance to said camera imaging system;
image capture means for receiving said output signals from said camera imaging system and for digitizing said output signals from said camera imaging system;
input image memory means for receiving said digitized signals;
image transform processor means for processing said digitized signals in said input image memory means according to selected viewing angles and magnification, and for producing output transform calculation signals in real-time at video rates according to a combination of said digitized signals, said viewing angles and said selected magnification;
user operated input means for selecting said viewing angles and magnification;
microprocessor means for receiving said selected viewing angles and magnification from said user operated input means and for converting said selected viewing angles and magnification for input to said image transform processor means to control said processing of said transform processor means;
output image memory means for receiving said output transform calculation signals in real-time and at video rates from said image transform processor means; and
output means connected to said output image memory means for recording said perspective corrected views according to said selected viewing angles and magnification.
Description

This invention was made with Government support under contract NAS1-18855 awarded by NASA. The Government has certain rights in this invention.

TECHNICAL FIELD

The invention relates to an apparatus, algorithm, and method for transforming a hemispherical field-of-view image into a non-distorted, normal perspective image at any orientation, rotation, and magnification within the field-of-view. The viewing direction, orientation, and magnification are controlled by either computer or remote control means. More particularly, this apparatus is the electronic equivalent of a mechanical pan, tilt, zoom, and rotation camera viewing system with no moving mechanisms.

BACKGROUND ART

Camera viewing systems are utilized in abundance for surveillance, inspection, security, and remote sensing. Remote viewing is critical for robotic manipulation tasks. Close viewing is necessary for detailed manipulation tasks while wide-angle viewing aids positioning of the robotic system to avoid collisions with the work space. The majority of these systems use either a fixed-mount camera with a limited viewing field, or they utilize mechanical pan-and-tilt platforms and mechanized zoom lenses to orient the camera and magnify its image. In the applications where orientation of the camera and magnification of its image are required, the mechanical solution is large and can subtend a significant volume making the viewing system difficult to conceal or use in close quarters. Several cameras are usually necessary to provide wide-angle viewing of the work space.

In order to provide a maximum amount of viewing coverage or subtended angle, mechanical pan/tilt mechanisms usually use motorized drives and gear mechanisms to manipulate the vertical and horizontal orientation. An example of such a device is shown in U.S. Pat. No. 4,728,839 issued to J. B. Coughlan, et al, on Mar. 1, 1988. Collisions with the working environment caused by these mechanical pan/tilt orientation mechanisms can damage both the camera and the work space and impede the remote handling operation. Simultaneously, viewing in said remote environments is extremely important to the performance of inspection and manipulation activities.

Camera viewing systems that use internal optics to provide wide viewing angles have also been developed in order to minimize the size and volume of the camera and the intrusion into the viewing area. These systems rely on the movement of either a mirror or prism to change the tilt-angle of orientation and provide mechanical rotation of the entire camera to change the pitch angle of orientation. Using this means, the size of the camera orientation system can be minimized, but "blind spots" in the center of the view result. Also, these systems typically have no means of magnifying the image and/or producing multiple images from a single camera.

Accordingly, it is an object of the present invention to provide an apparatus that can provide an image of any portion of the viewing space within a hemispherical field-of-view without moving the apparatus.

It is another object of the present invention to provide horizontal orientation (pan) of the viewing direction with no moving mechanisms.

It is another object of the present invention to provide vertical orientation (tilt) of the viewing direction with no moving mechanisms.

It is another object of the present invention to provide rotational orientation (rotation) of the viewing direction with no moving mechanisms.

It is another object of the present invention to provide the ability to magnify or scale the image (zoom in and out) electronically.

It is another object of the present invention to provide electronic control of the image intensity (iris level).

It is another object of the present invention to be able to change the image intensity (iris level) without any mechanisms.

It is another object of the present invention to be able to accomplish said pan, tilt, zoom, rotation, and iris with simple inputs made by a lay person from a joystick, keyboard controller, or computer controlled means.

It is also an object of the present invention to provide accurate control of the absolute viewing direction and orientations using said input devices.

A further object of the present invention is to provide the ability to produce multiple images with different orientations and magnifications simultaneously.

Another object of the present invention is to be able to provide these images at real-time video rates, that is 30 transformed images per second, and to support various display format standards such as the National Television Standards Committee RS-170 display format.

These and other objects of the present invention will become apparent upon consideration of the drawings hereinafter in combination with a complete description thereof.

DISCLOSURE OF THE INVENTION

In accordance with the present invention, there is provided an omnidirectional viewing system that produces the equivalent of pan, tilt, zoom, and rotation within a hemispherical field-of-view with no moving parts. This device includes a means for digitizing an incoming video image signal, transforming a portion of said video image based upon operator commands, and producing one or more output images that are in correct perspective for human viewing. In one preferred embodiment, the incoming image is produced by a fisheye lens which has a hemispherical field-of-view. This hemispherical field-of-view image is captured into an electronic memory buffer. A portion of the captured image containing a region-of-interest is transformed into a perspective correct image by image processing computer means. The image processing computer provides direct mapping of the hemispherical image region-of-interest into a corrected image using an orthogonal set of transformation algorithms. The viewing orientation is designated by a command signal generated by either a human operator or computerized input. The transformed image is deposited in a second electronic memory buffer where it is then manipulated to produce the output image as requested by the command signal.

BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 shows a schematic block diagram of the present invention illustrating the major components thereof.

FIG. 2 is an example sketch of a typical fisheye image used as input by the present invention.

FIG. 3 is an example sketch of the output image after correction for a desired image orientation and magnification within the original image.

FIG. 4 is a schematic diagram of the fundamental geometry that the present invention embodies to accomplish the image transformation.

FIG. 5 is a schematic diagram demonstrating the projection of the object plane and position vector into image plane coordinates.

BEST MODE FOR CARRYING OUT THE INVENTION

In order to minimize the size of the camera orientation system while maintaining the ability to zoom, a camera orientation system that utilizes electronic image transformations rather than mechanisms was developed. While numerous patents on mechanical pan-and-tilt systems have been filed, no approach using strictly electronic transforms and fisheye optics had been successfully implemented prior to this effort. In addition, the electrooptical approach utilized in the present invention allows multiple images to be extracted from the output of a single camera. Motivation for this device came from viewing system requirements in remote handling applications where the operating envelope of the equipment is a significant constraint to task accomplishment.

The principles of the present invention can be understood by reference to FIG. 1. Shown schematically at 1 is the fisheye lens that provides an image of the environment with a 180 degree field-of-view. The fisheye lens is attached to a camera 2 which converts the optical image into an electrical signal. These signals are then digitized electronically 3 and stored in an image buffer 4 within the present invention. An image processing system consisting of an X-MAP and a Y-MAP processor shown as 6 and 7, respectively, performs the two-dimensional transform mapping. The image transform processors are controlled by the microcomputer and control interface 5. The microcomputer control interface provides initialization and transform parameter calculation for the system. The control interface also determines the desired transformation coefficients based on orientation angle, magnification, rotation, and light sensitivity input from an input means such as a joystick controller 12 or computer input means 13. The transformed image is filtered by a 2-dimensional convolution filter 8 and the output of the filtered image is stored in an output image buffer 9. The output image buffer 9 is scanned out by display electronics 10 to a video display device 11 for viewing.
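The data flow just described (digitize, buffer, transform, filter, display) can be sketched in software for illustration. The following Python sketch is purely hypothetical: all names are invented here, and in the patent the X-MAP and Y-MAP transform stages are dedicated real-time hardware, not a per-pixel software loop.

```python
# Illustrative data flow of FIG. 1: fisheye capture -> transform -> display.
# All names here are hypothetical; the patent implements the transform
# stages as dedicated X-MAP / Y-MAP hardware, not as Python code.

def process_frame(fisheye_frame, view_params, transform_xy):
    """Produce one perspective-corrected output frame.

    fisheye_frame : 2-D list of pixels from the digitizer (input buffer)
    view_params   : view parameters (zenith, azimuth, rotation, zoom)
    transform_xy  : function mapping output (u, v) -> input (x, y)
    """
    height = len(fisheye_frame)
    width = len(fisheye_frame[0])
    output = [[0] * width for _ in range(height)]
    for v in range(height):
        for u in range(width):
            # X-MAP / Y-MAP stage: which fisheye pixel feeds this
            # output pixel under the current view parameters?
            x, y = transform_xy(u, v, view_params)
            xi, yi = int(round(x)), int(round(y))
            if 0 <= yi < height and 0 <= xi < width:
                output[v][u] = fisheye_frame[yi][xi]   # nearest-neighbour
    return output
```

A software version would normally precompute the (x, y) lookup once per view, since the mapping changes only when the operator commands a new orientation or magnification.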

A range of lens types can be accommodated to support various fields of view. The lens optics 1 correspond directly with the mathematical coefficients used with the X-MAP and Y-MAP processors 6 and 7 to transform the image. The capability to pan and tilt the output image remains even though a different maximum field of view is provided with a different lens element.

The invention can be realized by proper combination of a number of optical and electronic devices. The fisheye lens 1 is exemplified by any of a series of wide angle lenses from, for example, Nikon, particularly the 8 mm F2.8. Any video source 2 and image capturing device 3 that converts the optical image into electronic memory can serve as the input for the invention, such as a Videk Digital Camera interfaced with Texas Instruments TMS34061 integrated circuits. Input and output image buffers 4 and 9 can be constructed using Texas Instruments TMS44C251 video random access memory chips or their equivalents. The control interface can be accomplished with any of a number of microcontrollers including the Intel 80C196. The X-MAP and Y-MAP transform processors 6 and 7 and image filtering 8 can be accomplished with application specific integrated circuits or other means as will be known to persons skilled in the art. The display driver can also be accomplished with integrated circuits such as the Texas Instruments TMS34061. The output video signal can be of the NTSC RS-170 format, for example, compatible with most commercial television displays in the United States. Remote control 12 and computer control 13 are accomplished via readily available switches and/or computer systems that also will be well known. These components function as a system to select a portion of the input image (fisheye or wide angle) and then mathematically transform the image to provide the proper perspective for output. The keys to the success of the invention include:

(1) the entire input image need not be transformed, only the portion of interest

(2) the required mathematical transform is predictable based on the lens characteristics.

The transformation that occurs between the input memory buffer 4 and the output memory buffer 9, as controlled by the two coordinated transformation circuits 6 and 7, is better understood by looking at FIG. 2 and FIG. 3. The image shown in FIG. 2 is a pen and ink rendering of the image of a grid pattern produced by a fisheye lens. This image has a field-of-view of 180 degrees and shows the contents of the environment throughout an entire hemisphere. Notice that the resulting image in FIG. 2 is significantly distorted relative to human perception. Vertical grid lines in the environment appear in the image plane as 14a, 14b, and 14c. Horizontal grid lines in the environment appear in the image plane as 15a, 15b, and 15c. The image of an object is exemplified by 16. A portion of the image in FIG. 2 has been corrected, magnified, and rotated to produce the image shown in FIG. 3. Item 17 shows the corrected representation of the object in the output display. The results shown in the image in FIG. 3 can be produced from any portion of the image of FIG. 2 using the present invention. Note the corrected perspective as demonstrated by the straightening of the grid pattern displayed in FIG. 3. In the present invention, these transformations can be performed at real-time video rates (30 times per second), compatible with commercial video standards.

The invention as described has the capability to pan and tilt the output image through the entire field of view of the lens element by changing the input means, e.g. the joystick or computer, to the controller. This allows a large area to be scanned for information as can be useful in security and surveillance applications. The image can also be rotated through 360 degrees on its axis changing the perceived vertical of the displayed image. This capability provides the ability to align the vertical image with the gravity vector to maintain a proper perspective in the image display regardless of the pan or tilt angle of the image. The invention also supports modifications in the magnification used to display the output image. This is commensurate with a zoom function that allows a change in the field of view of the output image. This function is extremely useful for inspection operations. The magnitude of zoom provided is a function of the resolution of the input camera, the resolution of the output display, the clarity of the output display, and the amount of picture element (pixel) averaging that is used in a given display. The invention supports all of these functions to provide capabilities associated with traditional mechanical pan (through 180 degrees), tilt (through 180 degrees), rotation (through 360 degrees), and zoom devices. The digital system also supports image intensity scaling that emulates the functionality of a mechanical iris by shifting the intensity of the displayed image based on commands from the user or an external computer.
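The intensity-scaling (electronic iris) function described above amounts to multiplying pixel values by a commanded gain and clamping to the display range. A minimal Python sketch, with invented names, assuming 8-bit pixels:

```python
# Sketch of the "electronic iris": shift the intensity of the displayed
# image by a user-commanded gain, with no moving parts. The function
# name and 8-bit pixel range are illustrative assumptions.

def apply_iris(frame, gain):
    """Scale 8-bit pixel intensities by `gain`, clamping to 0..255."""
    return [[min(255, max(0, int(pixel * gain))) for pixel in row]
            for row in frame]

dimmed = apply_iris([[100, 200], [50, 255]], 0.5)   # close the iris by half
```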

The postulates and equations that follow are based on the present invention utilizing a fisheye lens as the optical element. There are two basic properties and two basic postulates that describe the perfect fisheye lens system. The first property of a fisheye lens is that the lens has a 2π steradian field-of-view and the image it produces is a circle. The second property is that all objects in the field-of-view are in focus, i.e. the perfect fisheye lens has an infinite depth-of-field. The two important postulates of the fisheye lens system (refer to FIGS. 4 and 5) are stated as follows:

Postulate 1: Azimuth angle invariability--For object points that lie in a content plane that is perpendicular to the image plane and passes through the image plane origin, all such points are mapped as image points onto the line of intersection between the image plane and the content plane, i.e. along a radial line. The azimuth angle of the image points is therefore invariant to elevation and object distance changes within the content plane.

Postulate 2: Equidistant Projection Rule--The radial distance, r, from the image plane origin along the azimuth angle containing the projection of the object point is linearly proportional to the zenith angle β, where β is defined as the angle between a perpendicular line through the image plane origin and the line from the image plane origin to the object point. Thus the relationship:

r=kβ                                                  (1)
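As a numeric illustration of the equidistant projection rule, the short Python sketch below assumes a hypothetical image circle radius of 512 pixels and derives the lens constant k from the 180-degree field of view (an object at the horizon, zenith angle π/2, maps to the edge of the image circle):

```python
import math

# Equidistant projection rule: r = k * beta, where beta is the zenith
# angle (radians) and k is a lens constant. For a fisheye whose image
# circle of radius R spans a 180-degree field of view, an object at the
# horizon (beta = pi/2) maps to the circle edge, so k = R / (pi/2).
R = 512.0                     # image circle radius in pixels (assumed)
k = R / (math.pi / 2.0)

def radial_distance(beta):
    """Radius from the image-plane origin for zenith angle beta."""
    return k * beta

# An object directly overhead (beta = 0) maps to the image center;
# one at the horizon maps to the rim of the image circle.
print(radial_distance(0.0))          # 0.0
print(radial_distance(math.pi / 2))  # 512.0 (image circle edge)
```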

Using these properties and postulates as the foundation of the fisheye lens system, the mathematical transformation for obtaining a perspective corrected image can be determined. FIG. 4 shows the coordinate reference frames for the object plane and the image plane. The coordinates u,v describe object points within the object plane. The coordinates x,y,z describe points within the image coordinate frame of reference.

The object plane shown in FIG. 4 is a typical region of interest used to determine the mapping relationship onto the image plane to properly correct the object. The direction of view vector, DOV[x,y,z], determines the zenith and azimuth angles for mapping the object plane, UV, onto the image plane, XY. The object plane is defined to be perpendicular to the vector DOV[x,y,z].

The location of the origin of the object plane in terms of the image plane [x,y,z] in spherical coordinates is given by:

x=D sin β cos ∂

y=D sin β sin ∂

z=D cos β                                             (2)

where D=scalar length from the image plane origin to the object plane origin, β is the zenith angle, and ∂ is the azimuth angle in image plane spherical coordinates. The origin of the object plane is represented as a vector using the components given in equation 2 as:

DOV[x,y,z]=[D sin β cos ∂, D sin β sin ∂, D cos β]                             (3)

DOV[x,y,z] is perpendicular to the object plane and its scalar magnitude D provides the distance to the object plane. By aligning the YZ plane with the direction of action of DOV[x,y,z], the azimuth angle ∂ becomes either 90 or 270 degrees and therefore the x component becomes zero, resulting in the DOV[x,y,z] coordinates:

DOV[x,y,z]=[0, -D sin β, D cos β]                (4)

Referring now to FIG. 5, the object point relative to the UV plane origin in coordinates relative to the origin of the image plane is given by the following:

x=u

y=v cos β

z=v sin β                                             (5)

therefore, the coordinates of a point P(u,v) that lies in the object plane can be represented as a vector P[x,y,z] in image plane coordinates:

P[x,y,z]=[u, v cos β, v sin β]                   (6)

where P[x,y,z] describes the position of the object point in image coordinates relative to the origin of the UV plane. The object vector O[x,y,z] that describes the object point in image coordinates is then given by:

O[x,y,z]=DOV[x,y,z]+P[x,y,z]                               (7)

O[x,y,z]=[u, v cos β-D sin β, v sin β+D cos β]             (8)

Projection onto a hemisphere of radius R attached to the image plane is determined by scaling the object vector O[x,y,z] to produce a surface vector S[x,y,z]:

S[x,y,z]=O[x,y,z]·R/|O[x,y,z]|                             (9)

By substituting for the components of O[x,y,z] from Equation 8, the vector S[x,y,z] describing the image point mapping onto the hemisphere becomes:

S[x,y,z]=[u, v cos β-D sin β, v sin β+D cos β]·R/√(u²+(v cos β-D sin β)²+(v sin β+D cos β)²)                             (10)

The denominator in Equation 10 represents the length or absolute value of the vector O[x,y,z] and can be simplified through algebraic and trigonometric manipulation to give:

|O[x,y,z]|=√(u²+v²+D²)                             (11)
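This simplification can be checked numerically: expanding the y and z components of O[x,y,z] produces cross terms in vD sin β cos β that cancel, leaving u² + v² + D². A small Python sketch (the sample values are arbitrary):

```python
import math

def length_O(u, v, D, beta):
    """Length of O[x,y,z] = [u, v cos b - D sin b, v sin b + D cos b]."""
    x = u
    y = v * math.cos(beta) - D * math.sin(beta)
    z = v * math.sin(beta) + D * math.cos(beta)
    return math.sqrt(x * x + y * y + z * z)

# Arbitrary sample point: the expanded length equals sqrt(u^2 + v^2 + D^2)
# because the cross terms in the y and z components cancel.
u, v, D, beta = 3.0, 4.0, 12.0, 0.7
assert abs(length_O(u, v, D, beta) - math.sqrt(u*u + v*v + D*D)) < 1e-9
```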

From Equation 11, the mapping onto the two-dimensional image plane can be obtained for both x and y as:

x=Ru/√(u²+v²+D²)                                           (12)

y=R(v cos β-D sin β)/√(u²+v²+D²)                           (13)

Additionally, the image plane center to object plane distance D can be represented in terms of the fisheye image circular radius R by the relation:

D=mR                                                       (14)

where m represents the scale factor in radial units R from the image plane origin to the object plane origin. Substituting Equation 14 into Equations 12 and 13 provides a means for obtaining an effective scaling operation or magnification which can be used to provide zoom operation:

x=Ru/√(u²+v²+m²R²)                                         (15)

y=R(v cos β-mR sin β)/√(u²+v²+m²R²)                        (16)

Using the equations for two-dimensional rotation of axes for both the UV object plane and the XY image plane, the last two equations can be further manipulated to provide a more general set of equations that provides for rotation within the image plane and rotation within the object plane:

x=R(uA-vB+mR sin β sin ∂)/√(u²+v²+m²R²)                    (17)

y=R(uC-vD-mR sin β cos ∂)/√(u²+v²+m²R²)                    (18)

where:

A=(cos ø cos ∂-sin ø sin ∂ cos β)

B=(sin ø cos ∂+cos ø sin ∂ cos β)

C=(cos ø sin ∂+sin ø cos ∂ cos β)

D=(sin ø sin ∂-cos ø cos ∂ cos β)                          (19)

and where:

R=radius of the image circle

β=zenith angle

∂=Azimuth angle in image plane

ø=Object plane rotation angle

m=Magnification

u,v=object plane coordinates

x,y=image plane coordinates

The Equations 17 and 18 provide a direct mapping from the UV space to the XY image space and are the fundamental mathematical result that supports the functioning of the present omnidirectional viewing system with no moving parts. By knowing the desired zenith, azimuth, and object plane rotation angles and the magnification, the locations of x and y in the imaging array can be determined. This approach provides a means to transform an image from the input video buffer to the output video buffer exactly. Also, the fisheye image system is completely symmetrical about the zenith; therefore, the vector assignments and resulting signs of various components can be chosen differently depending on the desired orientation of the object plane with respect to the image plane. In addition, these postulates and mathematical equations can be modified for various lens elements as necessary for the desired field-of-view coverage in a given application.
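For illustration, Equations 17 and 18 transcribe directly into code. The following Python sketch evaluates the mapping for a single output pixel; the function and parameter names are invented here, and in the patent this mapping is evaluated by the X-MAP and Y-MAP hardware rather than in software:

```python
import math

def object_to_image(u, v, beta, az, phi, m, R):
    """Map object-plane coordinates (u, v) to fisheye image-plane
    coordinates (x, y) per Equations 17 and 18.

    beta : zenith angle of the view direction
    az   : azimuth angle in the image plane (the document's symbol d)
    phi  : object plane rotation angle (the document's symbol o-slash)
    m    : magnification (D = m * R, Equation 14)
    R    : radius of the fisheye image circle
    """
    # Equation 19: rotation coefficients.
    A = math.cos(phi) * math.cos(az) - math.sin(phi) * math.sin(az) * math.cos(beta)
    B = math.sin(phi) * math.cos(az) + math.cos(phi) * math.sin(az) * math.cos(beta)
    C = math.cos(phi) * math.sin(az) + math.sin(phi) * math.cos(az) * math.cos(beta)
    D = math.sin(phi) * math.sin(az) - math.cos(phi) * math.cos(az) * math.cos(beta)
    denom = math.sqrt(u * u + v * v + m * m * R * R)
    x = R * (u * A - v * B + m * R * math.sin(beta) * math.sin(az)) / denom
    y = R * (u * C - v * D - m * R * math.sin(beta) * math.cos(az)) / denom
    return x, y
```

Since the mapping depends only on the view parameters, a frame-rate implementation would evaluate these two expressions once per output pixel, or precompute them into a lookup table whenever the commanded view changes.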

The input means defines the zenith angle, β, the azimuth angle, ∂, the object plane rotation angle, ø, and the magnification, m. These values are substituted into Equations 19 to determine values for A, B, C, and D for substitution into Equations 17 and 18. The image circle radius, R, is a fixed value that is determined by the camera and lens element relationship. The variables u and v vary throughout the object plane, determining the values for x and y in the image plane coordinates.

From the foregoing, it can be seen that a fisheye lens provides a hemispherical view that is captured by a camera. The image is then transformed into a corrected image at a desired pan, tilt, magnification, rotation, and focus based on the desired view as described by a control input. The image is then output to a television display with the perspective corrected. Accordingly, no mechanical devices are required to attain this extensive analysis and presentation of the view of an environment through 180 degrees of pan, 180 degrees of tilt, 360 degrees of rotation, and various degrees of zoom magnification.

Patent Citations
US4772942 (filed Jan. 7, 1987; published Sep. 20, 1988) Pilkington P.E. Limited, "Display system having wide field of view"
US5023725 (filed Oct. 23, 1989; published Jun. 11, 1991) McCutchen, David, "Method and apparatus for dodecahedral imaging system"
US5067019 (filed Mar. 31, 1989; published Nov. 19, 1991) The United States of America as represented by the Administrator of the National Aeronautics and Space Administration, "Programmable remapper for image processing"
US5068735 (filed Aug. 13, 1990; published Nov. 26, 1991) Fuji Photo Optical Co., Ltd., "System for controlling the aiming direction, focus, zooming, and/or position of a television camera"
EP0011909A1 (filed Aug. 10, 1979; published Jun. 11, 1980) E.I. Du Pont de Nemours and Company, "X-ray intensifying screen based on a tantalate phosphor and process for producing the phosphor"
JPH02127877 (title not available)
WO1982003712A1 (filed Apr. 10, 1981; published Oct. 28, 1982) Gabriel, Steven Allen, "Controller for system for spatially transforming images"
Non-Patent Citations
1. "Declaration of Scott Gilbert in Support of Defendant Infinite Pictures' Memorandum in Opposition to Plaintiff's Motion for Preliminary Injunction", Omniview, Inc. v. Infinite Pictures, Inc., Civ. Action No. 3-96-849.
2. A. Paeth, "Digital Cartography for Computer Graphics", Graphics Gems, 1990, pp. 307-320.
3. Block diagram of Greene/NYIT system (1986), Greene testimony from Transcript, Feb. 2, 1998, pp. 33-37.
4. Color copies of 12 prior art slides (1984-86) shown and described in the Greene trial testimony in Transcript pages identified in captions, Feb. 2, 1998.
5. Color image of the Hawthorne Bridge showing distortion at different magnification (Birdwell Transcript of Feb. 5, 1998, pp. 149-152).
6. Diagrams of the geometry employed in the "Fisheye to Box" and "Poly" software used in the DX 280 system, Greene trial testimony from Transcript, Feb. 2, 1998, pp. 43-50.
7. Drawing from Zimmermann testimony, Jan. 6, 1998.
8. DX 402 showing similarity of transforms performed by TRW TMC2302 and TMC2301 ASICs, TRW LSI Products, Inc., La Jolla, CA, 1988 (Birdwell Transcript, Feb. 5, 1998, pp. 157-160).
9. F. Kenton Musgrave, "A Panoramic Virtual Screen for Ray Tracing", Graphics Gems, 1992, pp. 288-294.
10. F. Pearson II, "Map Projections: Theory and Applications", CRC Press, Inc., 1990, pp. 215-345.
11. Function, Statistics and Trigonometry, Scott, Foresman & Company, 1992, pp. i-x, 143, 709-720.
12. G. David Ripley, "DVI--A Digital Multimedia Technology", Communications of the ACM, Jul. 1989, vol. 32, No. 7, pp. 811-822.
13. G. Wolberg, "Digital Image Warping", IEEE Computer Society Press, 1988.
14. Greene, Ned and Heckbert, Paul, "Creating Raster Omnimax Images from Multiple Perspective Views Using the Elliptical Weighted Average Filter", IEEE Computer Graphics and Applications, Jun. 1986, pp. 21-27.
15. Greene, Ned, "A Method of Modeling Sky for Computer Animation", Proceedings of a computer animation conference, 1984.
16. Greene, Ned, "Environment Mapping and Other Applications of World Projections", IEEE Computer Graphics and Applications, Nov. 1986, pp. 21-29.
17. Heckbert, "Fundamentals of Texture Mapping and Image Warping", Report No. UCB/CSD 89/516, Jun. 1989.
18. Heckbert, "The PMAT and Poly User's Manual", NYIT Document, 1983.
19. Heckbert, Paul, "Fundamentals of Texture Mapping and Image Warping", Masters Thesis, Computer Science Division, University of California, Berkeley, Jun. 1989.
20. Heckbert, Paul, NYIT PMAT and Poly User's Manual, 1983.
21. Intel Corporation, "Action Media 750 Production Tool Reference", 1988, 1991.
22. J. Blinn et al., "Texture and Reflection in Computer Generated Images", Comm. ACM, vol. 19, No. 10, 1976, pp. 542-547.
23. J. D. Foley et al., "Computer Graphics: Principles and Practice", 1990, 1996, pp. 229-381.
24. M. Onoe et al., "Digital Processing of Images Taken by Fish-Eye Lens", IEEE: Proceedings, New York, 1982, vol. 1, pp. 105-108.
25. N. Greene et al., "Creating Raster Omnimax Images from Multiple Perspective Views Using the Elliptical Weighted Average Filter", IEEE Computer Graphics and Applications, Jun. 1986, pp. 21-27.
26. N. Greene, "A Method of Modeling Sky for Computer Animations", Proc. First Int'l. Conf. Engineering and Computer Graphics, Aug. 1984, pp. 297-300.
27. N. Greene, "Environment Mapping and Other Applications of World Projections", IEEE Computer Graphics and Applications, Nov. 1986, pp. 21-29.
28. Plaintiff's exhibits PX 559 and PX 560 showing the failure of perspective correction of a fisheye image of Hawthorne Bridge side rail using '667 patent test program (previously submitted) when actual image radius R˜256 pixels is used and with a reduced radius, R˜230 pixels, to get a somewhat less distorted output image (Birdwell Feb. 5, 1998 Transcript at pp. 149-150, line 1).
29. R. Kingslake, "Optical System Design", Academic Press, 1983, pp. 86-87.
30. S. Morris, "Digital Video Interactive--A New Integrated Format for Multi-Media Information", Microcomputer for Information Management, Dec. 1987, 4(4):249-261.
31. S. Ray, "The Lens in Action", Hastings House, 1976, pp. 114-117.
32. TMC2301 ASIC Data Sheet, TRW LSI Products, Inc., La Jolla, CA, 1988.
33. TMC2302 ASIC Data Sheets, TRW LSI Products, Inc., La Jolla, CA, 1990.
34. Transcript of trial testimony of Dr. Ned Greene, Interactive Pictures Corporation, f/k/a Omniview, Inc., v. Infinite Pictures, Inc. and Bill Tillman, Civil Action No. 3-96-849, U.S. District Court, Eastern District of Tennessee, Feb. 2, 1998.
35. Transcript of trial testimony of Steven D. Zimmermann, Interactive Pictures Corporation, f/k/a Omniview, Inc. v. Infinite Pictures, Inc. and Bill Tillman, Civil Action No. 3-96-849, U.S. District Court, Eastern District of Tennessee, Jan. 6, 1998, pp. 77-142, 152-156.
36. Transcripts of relevant trial testimony of Dr. Douglas Birdwell: Transcript of Jan. 7, 1998, pp. 61-72; Transcript of Jan. 8, 1998, pp. 27-42; Transcript of Feb. 5, 1998, pp. 65-165.
37. Two (2) Japanese prior art articles authored by Dr. Murio Kuno (1980).
38. Two sketches drawn by Dr. Douglas Birdwell, Transcript, Jan. 8, 1998, pp. 27-41.
39. U.S. Geological Survey Professional Paper 1395, "Map Projections--A Working Manual", 1987, pp. viii-ix, 3-10, 33-35, 90-91, 164-168.
40. Upstill, Steve, "Building Strong Images", UNIX Review, Oct. 1988, pp. 63-73.
41. Zimmermann et al., excerpts from Phase I NASA Test Report, Aug. 1988.
Referenced by
Citing Patent | Filing date | Publication date | Applicant | Title
US6181335 | Sep 21, 1998 | Jan 30, 2001 | Discovery Communications, Inc. | Card for a set top terminal
US6219089 * | Jun 24, 1999 | Apr 17, 2001 | Be Here Corporation | Method and apparatus for electronically distributing images from a panoptic camera system
US6222683 | Jul 31, 2000 | Apr 24, 2001 | Be Here Corporation | Panoramic imaging arrangement
US6331869 | Aug 7, 1998 | Dec 18, 2001 | Be Here Corporation | Method and apparatus for electronically distributing motion panoramic images
US6625812 * | Oct 22, 1999 | Sep 23, 2003 | David Hardin Abrams | Method and system for preserving and communicating live views of a remote physical location over a computer network
US6675386 | Sep 4, 1997 | Jan 6, 2004 | Discovery Communications, Inc. | Apparatus for video access and control over computer network, including image correction
US6704434 * | Aug 3, 2000 | Mar 9, 2004 | Suzuki Motor Corporation | Vehicle driving information storage apparatus and vehicle driving information storage method
US6833843 * | Dec 3, 2001 | Dec 21, 2004 | Tempest Microsystems | Panoramic imaging and display system with canonical magnifier
US7058239 | Oct 28, 2002 | Jun 6, 2006 | Eyesee360, Inc. | System and method for panoramic imaging
US7123777 | Sep 26, 2002 | Oct 17, 2006 | Eyesee360, Inc. | System and method for panoramic imaging
US7274381 * | Dec 20, 2004 | Sep 25, 2007 | Tempest Microsystems, Inc. | Panoramic imaging and display system with canonical magnifier
US7304680 * | Oct 3, 2002 | Dec 4, 2007 | Siemens Aktiengesellschaft | Method and device for correcting an image, particularly for occupant protection
US7382399 | May 21, 1999 | Jun 3, 2008 | Sony Corporation | Omniview motionless camera orientation system
US7629995 | Sep 24, 2004 | Dec 8, 2009 | Sony Corporation | System and method for correlating camera views
US7707137 * | Jul 6, 2006 | Apr 27, 2010 | Sun Microsystems, Inc. | Method and apparatus for browsing media content based on user affinity
US7714936 * | Jul 2, 1997 | May 11, 2010 | Sony Corporation | Omniview motionless camera orientation system
US7750936 | Sep 24, 2004 | Jul 6, 2010 | Sony Corporation | Immersive surveillance system interface
US7834907 | Mar 2, 2005 | Nov 16, 2010 | Canon Kabushiki Kaisha | Image-taking apparatus and image processing method
US8134608 * | Oct 27, 2008 | Mar 13, 2012 | Alps Electric Co., Ltd. | Imaging apparatus
US8284258 | Sep 18, 2009 | Oct 9, 2012 | Grandeye, Ltd. | Unusual event detection in wide-angle video (based on moving object trajectories)
US8547423 | Sep 22, 2010 | Oct 1, 2013 | Alex Ning | Imaging system and device
US8670001 | Nov 30, 2006 | Mar 11, 2014 | The Mathworks, Inc. | System and method for converting a fish-eye image into a rectilinear image
US8692881 | Jun 25, 2009 | Apr 8, 2014 | Sony Corporation | System and method for correlating camera views
US8723951 | Nov 23, 2005 | May 13, 2014 | Grandeye, Ltd. | Interactive wide-angle video server
US20120114262 * | Apr 21, 2011 | May 10, 2012 | Chi-Chang Yu | Image correction method and related image correction system thereof
US20130044258 * | Oct 17, 2011 | Feb 21, 2013 | Danfung Dennis | Method for presenting video content on a hand-held electronic device
EP1600890A2 * | Feb 15, 2005 | Nov 30, 2005 | Kabushiki Kaisha Toshiba | Distortion correction of fish-eye image
Classifications
U.S. Classification348/207.99, 348/143, 348/36
International ClassificationA61B19/00, G06T1/00, H04N5/335, H04N7/00, G08B13/196, G06F17/30, H04N5/225, H04N7/18, H04N1/21, G06T3/00, H04N5/262, G08B15/00
Cooperative ClassificationG03B37/06, H04N5/2259, G08B13/19628, H04N5/23238, H04N7/002, G06T3/0062, H04N1/2158, H04N1/217, H04N7/183, H04N5/2628, G06T3/0018, A61B2019/4868, H04N5/335
European ClassificationG06T3/00P, G08B13/196C4W, H04N5/232M, G06T3/00C2, H04N1/21B7, H04N1/21C2, H04N7/00B, H04N7/18D, H04N5/335, H04N5/262T, H04N5/225V
Legal Events
Date | Code | Event | Description
Mar 29, 2007 | AS | Assignment
Owner name: SONY CORPORATION, JAPAN
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:IPIX CORPORATION;REEL/FRAME:019084/0034
Effective date: 20070222
Aug 9, 2004 | FPAY | Fee payment
Year of fee payment: 12
Nov 9, 2001 | AS | Assignment
Owner name: INTERACTIVE PICTURES CORPORATION, TENNESSEE
Free format text: RELEASE;ASSIGNOR:IMAGE INVESTOR PORTFOLIO, A SEPARATE SERIES OF MEMPHIS ANGELS, LLC;REEL/FRAME:012295/0982
Effective date: 20010926
Owner name: INTERMET PICTURES CORPORATION, TENNESSEE
Free format text: RELEASE;ASSIGNOR:IMAGE INVESTOR PORTFOLIO, A SEPARATE SERIES OF MEMPHIS ANGELS, LLC;REEL/FRAME:012295/0986
Effective date: 20010926
Owner name: PW TECHNOLOGY, INC., CALIFORNIA
Free format text: RELEASE;ASSIGNOR:IMAGE INVESTOR PORTFOLIO, A SEPARATE SERIES OF MEMPHIS ANGELS, LLC;REEL/FRAME:012295/0978
Effective date: 20010926
May 23, 2001 | AS | Assignment
Owner name: IMAGE INVESTOR PORTFOLIO, A SEPARATE SERIES OF MEMPHIS ANGELS, LLC
Free format text: INTELLECTUAL PROPERTY SECURITY AGREEMENT;ASSIGNOR:INTERACTIVE PICTURES CORPORATION;REEL/FRAME:011837/0431
Effective date: 20010514
Owner name: IMAGE INVESTOR PORTFOLIO, A SEPARATE SERIES OF MEMPHIS ANGELS, LLC
Free format text: INTELLECTUAL PROPERTY SECURITY AGREEMENT;ASSIGNOR:INTERNET PICTURES CORPORATION;REEL/FRAME:011828/0054
Effective date: 20010514
Owner name: IMAGE INVESTOR PORTFOLIO, A SEPARATE SERIES OF MEMPHIS ANGELS, LLC
Free format text: INTELLECTUAL PROPERTY SECURITY AGREEMENT;ASSIGNOR:PW TECHNOLOGY, INC.;REEL/FRAME:011828/0088
Effective date: 20010514
Aug 24, 2000 | FPAY | Fee payment
Year of fee payment: 8
Aug 24, 2000 | SULP | Surcharge for late payment
Aug 25, 1998 | AS | Assignment
Owner name: INTERACTIVE PICTURES CORPORATION, TENNESSEE
Free format text: CHANGE OF NAME;ASSIGNOR:OMNIVIEW, INC.;REEL/FRAME:009401/0428
Effective date: 19971208
Apr 6, 1998 | AS | Assignment
Owner name: IPIX, TENNESSEE
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:MARTIN, H. LEE;REEL/FRAME:009087/0230
Effective date: 19980403