CA2287453A1 - Optical imaging system and graphic user interface - Google Patents

Publication number: CA2287453A1
Authority: CA (Canada)
Prior art keywords: image, imaging, lens, camera, optical
Legal status: Abandoned (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis)
Application number: CA002287453A
Other languages: French (fr)
Inventor: Zbigniew Rybczynski
Current assignee: Zbig Vision Gesellschaft für neue Bildgestaltung mbH (the listed assignees may be inaccurate)
Original assignee: Individual (application filed by Individual)

Classifications:
- H04N 5/2224: Studio circuitry, devices and equipment related to virtual studio applications
- G02B 13/06: Panoramic objectives ("sky lenses"), including panoramic objectives having reflecting surfaces
- G02B 15/142: Optical objectives with continuously variable equivalent focal length by axial movement of two lens groups relative to the image plane
Abstract

The invention relates to an optical imaging system for imaging an object with continuously adjustable magnification. The system comprises an imaging plane (6), onto which an image field originating from the object is projected, and an imaging optical system (5), preferably of fixed focal length, situated on the object side in front of the imaging plane to produce the image field on the imaging plane. The imaging optical system and/or the imaging plane can be moved in the direction of the optical axis in order to adjust the magnification and/or to focus the image appearing on the imaging plane. A first optical element, which receives optical radiation over a wide angle, is provided on the object side in front of the imaging optical system to select and/or transmit those light beams originating from the object which run through a predetermined focal point or nodal point (N). A second optical element is positioned between the first optical element and the imaging optical system in such a way that the light beams running through the predetermined focal or nodal point produce an image of the object in an intermediate image plane. This image can then be projected by the imaging optical system onto the imaging plane at various sizes, producing an image field of varying magnification in the region of the imaging plane.

Description

OPTICAL IMAGING SYSTEM AND GRAPHIC USER INTERFACE
The invention relates to an optical imaging system for reproducing an object with an infinitely variable magnification (zoom) as defined in the introductory portion of claim 1, as well as to a graphic user interface, as defined in claim 12.
From SCHRÖDER, G.: Technische Optik, 7th edition, Vogel Buchverlag Würzburg, pages 128 ff., an optical imaging system in the form of a zoom lens is known, which enables an object to be reproduced with an infinitely variable magnification.
The previously known zoom lens consists essentially of several lenses, two of the lenses being movable along the optical axis. This makes it possible, on the one hand, to vary the focal length and, with that, the magnification infinitely and, on the other, to adapt the focusing to the changed focal length, so that the plane of the image plane can remain unchanged when the focal length is varied.
It is a problem of the previously known zoom lens that the sine error, which arises during the reproduction, varies with the respective focal length setting of the zoom lens. When images with different focal length settings are superimposed, this variation leads to an image, which gives the impression of being unnatural, because of the different optical imaging errors of the images, which are to be superimposed.
It is therefore an object of the invention to provide an optical imaging system which enables an object to be reproduced with infinitely variable magnification and with sine errors that are independent of the respective magnification. It is furthermore an object of the invention to provide a graphic user interface for the multivalent representation of the image.
This objective is accomplished by the distinguishing features of claims 1 and 12.
The invention includes the technical teachings of reproducing the object initially independently of the respective focal length setting with a constant magnification on a two-dimensional interim image and of then photographing this interim image with the desired magnification, so that the sine distortions, occurring during the reproduction process, are independent of the respective focal length setting.
The concept of object is to be understood generally here and in the following and comprises an individual object, a spatial scene as well as a two-dimensional presentation.
Preferably, those light rays which emanate from the object and pass through a nodal or focal point form the image. For selecting the rays passing through the nodal point from the totality of rays emanating from the object, an optical element is provided pursuant to a theoretical approach and, in a preferred embodiment, may be formed as a spherical lens. Light rays that strike the spherical lens perpendicularly to its spherical surface experience, in accordance with the law of refraction, no or only a slight change in direction and thus proceed centrally through the center point of the spherical lens, whereas the remaining rays are deflected and thus do not contribute to the imaging. The spherical lens thus selects the rays which strike it from a wide-angled region and, at the same time, pass centrally through its center point, so that the center point of the spherical lens is the nodal or focal point of the imaging rays.
The approach, however, is not limited to a spherical lens for selecting the rays passing through the nodal point. For example, a pinhole diaphragm or an appropriate lens system can also be used. The nodal or focal point of the imaging rays coincides here with the center point of the pinhole diaphragm.
When a spherical lens is used, as well as when a pinhole diaphragm is used for selecting the rays passing through a certain nodal point, it is in principle not possible and, for an adequate light intensity, also not desirable to select exclusively those rays which pass exactly through the nodal point. Rather, other rays, which extend in the vicinity of the nodal point, are also captured, contribute to forming the image and are thereby selected. The only thing that matters for the functioning of the optical imaging system is that the optical element produces a constricted bundle of rays in the region of the nodal point.
Furthermore, the optical imaging system preferably has a further optical element, which is downstream from the first optical element and preferably has at least one planoconcave lens or a lens system, containing a corresponding optical system, the concavely shaped side of the planoconcave lens facing the first optical element. For a specific embodiment, the planoconcave lens is disposed in such a manner, that its focal point coincides with the nodal point of the imaging rays, so that, the rays, passing through the nodal point, after being deflected by the planoconcave lens, proceed parallel to the optical axis. Thus, from a diverging bundle of rays, the planoconcave lens generates parallel rays, which extend at right angles to the plane surface of the planoconcave lens.
In accordance with the theoretical approach, the radius of the spherical lens is equal to the radius of curvature of the planoconcave lens, so that, on the side averted from the object, the spherical lens fits into the concave curvature of the planoconcave lens or of a corresponding optical system and forms a unit with the latter. For this approach, it is important that the refractive index of the planoconcave lens or of the corresponding optical system differs from the refractive index of the spherical lens, since otherwise the light rays would not be refracted at the interface between the spherical lens and the planoconcave lens.
In accordance with this approach, the planoconcave lens (or the corresponding optical element, guiding the rays, emanating from the object, to a plane pictorial representation) is provided on its plane side with an interim image plane, preferably in the form of a diffusor layer, which enables an interim image to be generated from the parallel rays. This interim image plane, preferably in the form of a diffusor layer, can be either mounted directly on the plane surface of the planoconcave lens or consist of a separate diffusor plate.
The optical imaging system thus initially selects, from the totality of light rays emanating from the object, essentially those rays which pass through a particular nodal point. These light rays are then directed parallel to one another by an appropriate optical system, symbolized by the planoconcave lens, and generate, on the interim image plane (preferably in the form of a diffusor layer), an interim image which reproduces a wide-angled section of the object scene in orthographic projection.
By means of imaging optics with a fixed focal length, this interim image is then reproduced on a photosensitive imaging surface, which is directed parallel to the interim image plane, preferably in the form of a diffusor layer, and consists, for example, of a light-sensitive film or a CCD sensor element as target. The diaphragm plane is between the interim image plane and the imaging plane, that is, as seen from the object, on the far side of the focus of the first optical element.
For adjusting the focal length and/or for focusing the image appearing on the imaging plane, the imaging optics and/or the imaging plane are disposed so that they can be shifted parallel to the optical axis. By shifting the imaging optics and/or the imaging plane parallel to the optical axis, the magnification and, with that, the section of the image appearing on the imaging plane can be adjusted in an infinitely variable manner. The arrangement functions like a zoom lens, the distortion being independent of the "focal length setting" and therefore not having to be corrected.
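The zoom behavior described here can be illustrated with an idealized thin-lens sketch. This is a simplified model, not the patent's actual lens prescription: a single ideal lens of fixed focal length f images the interim image plane, and shifting the lens changes the object distance a and hence the magnification b/a. All numerical values are assumptions chosen for illustration.

```python
def thin_lens_image_distance(a: float, f: float) -> float:
    """Image distance b from the thin-lens equation 1/f = 1/a + 1/b."""
    return a * f / (a - f)

# Assumed fixed focal length (mm) and three lens positions along the axis.
f = 50.0
for a in (75.0, 100.0, 150.0):
    b = thin_lens_image_distance(a, f)
    m = b / a  # lateral magnification of the interim image
    print(f"lens at a={a:5.1f} mm -> image at b={b:5.1f} mm, magnification {m:.2f}")
```

Note that the total distance a + b varies with the lens position, which is why the imaging plane (or a further lens) must also move to keep the image in focus, as the text describes.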
The focal point of the first optical element is in front of the diaphragm plane.
In the simplest version, the imaging optics consist of a single lens, which can be shifted parallel to the optical axis of the imaging system.
Usually, however, lens imaging systems with several individual lenses are used.
Since the imaging optics of the inventive optical imaging system reproduce not the object itself, which usually is three dimensional, but the two-dimensional interim image appearing on the interim image plane, preferably in the form of a diffusor layer, the sine distortions advantageously are independent of the particular setting of the magnification.
A further advantage of the inventive optical imaging system is that the images obtained in this manner can also be superimposed on images taken at other viewing angles and other focal length settings; the resulting image gives a natural impression, without interfering distortions at the edges caused by the superimposition, since the image geometry is identical for all settings.
In a preferred embodiment of the invention, the imaging optics are disposed in a tube body, which is aligned parallel to the optical axis of the imaging system and can be rotated about its longitudinal axis. On the inside of the tube body there are catches which, when the tube body is rotated, endeavor to rotate one or several lenses of the imaging optics. However, since these lenses are secured against rotation and can be moved only in the axial direction, the lenses carry out an evasive axial movement when the tube body is rotated. It is thus possible, by rotating the tube body, to shift lenses of the imaging optics in the axial direction and, with that, to set the magnification of the optical imaging system.
In an advantageous variation of the invention, automatic focusing is provided, which automatically focuses the imaging optics as a function of the magnification set. In this way, manual focusing can be omitted when the magnification is changed.
One possibility of focusing consists of shifting one or several lenses along the optical axis of the imaging system in such a manner, that the imaging plane lies in the image plane.
On the other hand, for a different possibility, the imaging plane itself is shifted, so that the image plane and the imaging plane coincide.
In both cases, the focusing equipment receives the axial position of the movable parts of the optical imaging system as an input variable and calculates from this the shifting of the imaging plane or of one or several lenses, required for the focusing.
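A minimal sketch of such a focusing calculation, again using an idealized thin lens (an assumed model, not the patent's actual optics): given the axial lens position as the input variable, the required position of the imaging plane is computed so that it coincides with the image plane.

```python
def required_imaging_plane_position(lens_pos: float, f: float) -> float:
    """Position of the imaging plane, measured from the interim image plane,
    at which the image is in focus for a thin lens of focal length f
    located lens_pos away from the interim image plane."""
    a = lens_pos           # object distance (interim image plane -> lens)
    b = a * f / (a - f)    # thin-lens image distance behind the lens
    return lens_pos + b    # the imaging plane must sit here

# Example: the focusing unit reads the lens position and shifts the film plane.
for lens_pos in (80.0, 100.0, 120.0):
    print(lens_pos, round(required_imaging_plane_position(lens_pos, f=50.0), 1))
```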
Other advantageous further developments of the invention are characterized in the dependent claims or, together with the description of examples of the invention, are explained in greater detail below by means of the Figures, of which

Figure 1 shows, in a theoretical approach, an optical imaging system in a perspective representation,
Figure 2 shows the path of the rays in a theoretical model of the optical imaging system of Figure 1 in cross section,
Figure 3 shows the construction of a ray in the optical system of Figures 1 and 2 as an example for an object point,
Figure 4 shows a block representation of the system,
Figure 5 shows the geometry of the linear projection,
Figure 6 shows the geometry of the orthographic projection,
Figure 7 shows the image height h_o as a function of the object height h for a linear projection,
Figure 8 shows the image height h_o as a function of the object height h for an orthographic projection,
Figure 9 shows the image height h'_o as a function of the deviation angle δ for a linear projection,
Figure 10 shows the image height h'_o as a function of the deviation angle δ for an orthographic projection,
Figure 11 shows the viewing angle α'_ob as a function of the deviation angle δ for a linear projection,
Figure 12 shows the viewing angle α'_ob as a function of the deviation angle δ for an orthographic projection,
Figure 13 shows a diagrammatic view of a scene - position of the projection system in an example scene with objects,
Figure 14 shows images photographed in the situation shown in Figure 13:
a) linear projection system with small viewing angle
b) linear projection system with large viewing angle
c) orthographic projection system with small viewing angle
d) orthographic projection system with large viewing angle,
Figure 15 shows a diagrammatic view of a set of orthographic zoom lenses,
Figure 16 shows a method of describing the direction of a ray for the set of orthographic lenses,
Figure 17 shows the image generating process for the set of orthographic lenses,
Figure 18 shows a method for changing the viewing angle of the set of orthographic zoom lenses,
Figure 19 shows the maximum viewing angle γ of the set of orthographic zoom lenses as a function of the distance d between lens L2 and the screen S1, and
Figure 20 shows a method for positioning and aligning the camera (pan angle, tilt angle, roll angle).
The optical imaging system 1, shown in Figure 1, serves to image a spatial scene with little distortion and with infinitely variable magnification.
Because barrel-shaped or pincushion-shaped distortions are largely suppressed, the images photographed by the optical imaging system 1 can be superimposed easily, without, for example, any signs of disturbing distortion in the resulting image.
Preferably, the representation is of the orthographic type.
In Figure 1, the optical imaging system 1 has a spherical lens 2, which forms a first optical element and is connected at its outer surface on one side with a planoconcave lens 3, which forms a second optical element. The radius of curvature of this planoconcave lens 3 is equal to the radius of the spherical lens 2, which thus fits exactly into the concave curvature of the planoconcave lens 3 and forms a unit with the latter. To decrease the overall size, the spherical lens 2 is flattened at the sides, at the top and at the bottom, so that the unit of a spherical lens 2 and a planoconcave lens 3, apart from the spherical surface on the object side, has an almost rectangular shape. The spherical lens enables wide-angle, incident optical radiation to be processed, so that a large image angle can be achieved.
Figure 3 shows that the rays, passing centrally through the spherical lens 2, after being deflected at the interface between the spherical lens 2 and the planoconcave lens 3, proceed parallel to the optical axis.
The light rays, directed in parallel in this manner, then strike the plane surface of the planoconcave lens 3 perpendicularly and generate an interim image on the diffusor layer 4 deposited there.
The light rays, emanating from the object points of the spatial scene that is to be represented, thus pass through the spherical outer surface of the spherical lens 2, enter the optical imaging system, proceed centrally through the center N of the spherical lens 2 and are then aligned in parallel at the interface between the spherical lens 2 and the planoconcave lens 3, so that an image of the spatial scene appears on the diffusor layer 4.

The image appearing on the diffusor layer 4, however, reproduces the spatial scene with a constant magnification and functions merely as an interim image for generating images with infinitely variable magnification. For this purpose, a camera 5, 6 with automatic focusing, which can be shifted along the optical axis of the imaging system, is disposed behind the diffusor layer 4 and thus images the picture appearing on the diffusor layer with infinitely variable magnification onto the camera film 6. In the simplest version shown here, the camera consists of a light-sensitive film 6 and a lens 5, which is disposed between the diffusor layer 4 and the film 6, is aligned with its optical axis at right angles to the plane of the film, and can be shifted along the optical axis in order to adjust the magnification. Since the camera 5, 6 always reproduces the two-dimensional interim image appearing on the diffusor layer 4 and not the spatial scene itself, the sine distortions arising during the reproduction are independent of the magnification factor or the zoom position of the camera 5, 6.
In accordance with the theoretical approach, the path of the rays in the optical imaging system described above is shown in detail in the cross-sectional representation in Figure 2. For the sake of simplification, as well to maintain clarity, only the rays, emanating from individual, equidistantly disposed object points, are shown. These rays pass through the spherical outer surface of the spherical lens 2, enter the optical imaging system and pass centrally through the center N of the latter.
On the opposite side, the rays then strike the interface between the spherical lens 2 and planoconcave lens 3. Since the focal point of the planoconcave lens 3 is in the center N of the spherical lens 2 and the rays, striking the interface, pass centrally through the spherical lens 2 and, with that, through the focal point N of the planoconcave lens 3, the rays are directed in parallel by the planoconcave lens 3 and thus strike at right angles the diffusor layer 4, which is applied on the plane surface of the planoconcave lens 3, so that an image of the spatial scene, which is independent of the particular magnification, appears on the diffusor layer 4.

Behind the diffusor layer 4, a camera 5, 6 with a fixed focal length is disposed pursuant to the invention. It can be shifted along the optical axis of the imaging system and thus makes it possible to reproduce the image, which appears on the diffusor layer 4, with infinitely variable magnification. For this purpose, the camera 5, 6 is shifted along the optical axis and focused in the position with the desired magnification factor and the therefrom resulting image section. In the camera, there is a diaphragm, which is not shown in the drawing. As seen from the object, the diaphragm is thus disposed on the far side of the nodal or focal point.
The mathematical, geometric construction of the ray path is shown by way of example in Figure 3 for the light ray emanating from an object point O (O_x, O_y, O_z). This light ray passes through the center N of the spherical lens at the elevation angle β₀ and the azimuth angle α₀. The elevation angle β₀ is the angle between the incident light ray and the horizontal plane, while the azimuth angle α₀ is the angle between the incident ray and the vertical plane extending parallel to the optical axis.

Azimuth angle α₀ and elevation angle β₀ are calculated from the coordinate values O_x, O_y, O_z of the object point using the formulas

α₀ = arctan(O_x / O_y)    β₀ = arctan(O_z / O_y)

The incident light ray then passes centrally through the spherical lens 2 and, on the opposite side at point S_s, strikes the spherical interface between the lens 2 and the planoconcave lens 3, the focal point of which coincides with the center N of the spherical lens, so that the rays passing centrally through the spherical lens 2 are aligned in parallel by the planoconcave lens 3.

The ray emanating from the object point O is therefore deflected by the planoconcave lens 3, subsequently proceeds parallel to the optical axis and finally strikes the diffusor layer 4 at point SP. The coordinate values SP_x, SP_z of this point SP are calculated from the azimuth angle α and the elevation angle β of the incident ray using the following formulas:

SP_x = -sin α · cos β    SP_z = -cos α · sin β

The image appearing on the diffusor layer 4 has a constant magnification and therefore functions only as an interim image for generating an image with an infinitely variable magnification factor. For this purpose, the image points appearing on the diffusor layer 4 are imaged by the lens 5 onto the film plane 6. The coordinate values P_x, P_z of the image point P in the film plane 6, corresponding to the object point O, are calculated from the following formulas:

P_x = sin α · cos β · f = -SP_x · f    P_z = cos α · sin β · f = -SP_z · f

The factor f is obtained from the radius r of the spherical lens 2 and the width b of the imaging plane 6 in the film plane:

f = b / r

In the following description, a new type of graphic user interface is characterized by means of examples. This interface is intended to enable images of objects of the real world and of the virtual world, as well as of other computer applications, to be represented on a computer terminal. Figure 4 shows a block diagram of the interface.
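Before turning to the interface, the coordinate mapping of Figure 3 can be collected into a short numerical sketch. The function follows the formulas given above (angles from the object coordinates, interim-image point SP, film point P with scale factor f = b/r); the variable names and example values are illustrative, not taken from the patent.

```python
import math

def image_point(Ox: float, Oy: float, Oz: float, r: float, b: float):
    """Map an object point O to the interim-image point SP on the diffusor
    layer and to the image point P on the film plane."""
    alpha = math.atan2(Ox, Oy)   # azimuth:   alpha = arctan(Ox / Oy)
    beta = math.atan2(Oz, Oy)    # elevation: beta  = arctan(Oz / Oy)
    SPx = -math.sin(alpha) * math.cos(beta)   # point on the diffusor layer
    SPz = -math.cos(alpha) * math.sin(beta)
    f = b / r                    # scale factor: film width / sphere radius
    Px, Pz = -SPx * f, -SPz * f  # image point on the film plane
    return (SPx, SPz), (Px, Pz)

# Example (assumed units): object at (1, 1, 0), sphere radius 1, film width 0.5
SP, P = image_point(1.0, 1.0, 0.0, r=1.0, b=0.5)
print(SP, P)
```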
The components of an image, which is to be represented on an interface output platform (typically a computer monitor), are produced by a set of cameras, which is constructed using an orthographic imaging system (or its model). The set of cameras comprises real cameras, which are directed onto objects of the real world, and virtual cameras, which are created with the help of computer expansion cards and are directed onto virtual objects and computer applications. The images, generated by the set of cameras, are fed into a video mixer, which forms a component of the interface. The video mixer generates the finished pictures, which can then be sent to an interface output platform. The camera positions are controlled by an interface system for the camera control.
In order to ensure communication between user and system, the interface is provided with input and output platforms. The input platform makes it possible to manipulate the components of the resultant image and to generate messages to the applications installed in the system. The output platform enables the resultant images to be represented and, with that, the state of the installed applications to be made visible.
The interface enables the movement paths of the cameras, as well as their viewing directions and angles, to be arranged freely. Each camera in the system can be activated independently of or synchronously with other cameras. As a result of the interface activity, any image desired can be represented on the screen.
Two basic distinguishing features, which differentiate the interface, introduced here, from the other present graphic user interfaces, are now to be emphasized. First of all, the resultant image is achieved by the use of orthographic projection which, in contrast to the usual linear projection, perfectly reproduces the manner in which a human eye sees the surroundings (the comparison between linear and orthographic projections is shown in even greater detail in the following).
Secondly, the system enables a final image to be created without limitations.
This image may contain reproductions of objects of the real world, of objects of a virtual world, as well as computer applications. Thanks to the movement control system and the use of the video mixer, all of these reproductions within the resultant image can change and move at will. The movement control system is able to control the positions and directions of the cameras and, what is even more important, to repeat the movements of the individual cameras, as well as of sets of cameras, with the same precision (synchronously, if required). The camera movement control system is also described in greater detail in the following. The video mixer offers modern image-processing methods for freely manipulating the components of the image in the working environment.
With regard to the logical system structure, the interface has the following components:
- a system of computer graphic (CG) lens cameras (for "observing" virtual worlds and applications), based on the model of a set of orthographic zoom lenses;
- a movement control system for CG lens cameras;
- a system of cameras with optical lenses (for observing the real world), equipped with orthographic zoom lens sets;
- a movement control system for cameras with optical lenses;
- a video mixer;
- an input platform;
- an output platform;
- a 3D organization of the computer memory, based on Cartesian space (XYZ). This 3D memory is presented on the screen as an image of a 3D virtual world, in which the graphic interface of each piece of software has a single position in space. This world is represented on the screen by means of an orthographic CG lens, which is an integrated part of the graphics card (hardware);
- a link from this memory (computer Cartesian space) to the GPS (global Cartesian space) system.
A CG lens camera is understood to be a computer expansion card with appropriate software, which is able to construct two-dimensional images from observing a three-dimensional virtual world. The image construction process proceeds according to the principles of functioning of the orthographic zoom lens set.
The virtual 3D world may contain virtual 3D objects, which were modeled using scene-modeling methods in Cartesian space, and computer applications (such as Windows 95, AutoCAD, X Window), which are represented by virtual screens that may be disposed in any way in the virtual space. These screens represent a means of communication between the applications and a user.
A CG lens camera is characterized by three parameters: a viewing angle, a standpoint and a viewing direction. These parameters uniquely define the manner in which the camera perceives the virtual world.
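The three parameters can be represented by a small data structure; the field names and example values below are illustrative, not taken from the patent.

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class CGLensCamera:
    viewing_angle: float                            # field of view, degrees
    standpoint: Tuple[float, float, float]          # position in Cartesian space
    viewing_direction: Tuple[float, float, float]   # pan, tilt, roll (cf. Figure 20)

cam = CGLensCamera(viewing_angle=60.0,
                   standpoint=(0.0, 0.0, 1.7),
                   viewing_direction=(0.0, -10.0, 0.0))
print(cam)
```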
An optical lens camera is understood to be a real camera which, similarly to a CG lens camera, permits two-dimensional images of the real world to be produced. Cameras with optical lenses, used in the interface, are equipped with orthographic zoom lens sets.
As in the above case, an optical lens camera is also defined by a viewing angle, a standpoint and a viewing direction.
The movement of the camera is controlled via a computer expansion card with appropriate software and, in the case of an optical lens camera present in the system, by means of suitable mechanical manipulators. This card makes it possible to calculate and then convert sequences of camera positions desired by the user, in order to move the camera(s) along prescribed paths. In this way, both camera types present in the system, virtual and real, can be positioned at will, so that any movement progression can be composed.
The images, generated by the system cameras, are sent as input to the video mixer. The mixer combines the images supplied and generates resultant images from them, which can be sent to the interface output platform. The capabilities of the mixer largely determine the capabilities of the system as a whole. If the system uses only a single lens camera, the video mixer can be omitted.
The task of an interface input platform is to make communication in the direction from user to system possible. This platform consists of basic input equipment, such as keyboards, mice, joysticks, etc., or of more complicated equipment, such as speech analyzers, which generate messages for controlling the system over the interface. There are two types of such messages:
- interface messages, which modify the appearance of the resultant picture sent to an output platform;
- application messages, which are transmitted for further processing to the application programs installed in the system.

It is the task of an interface output platform to make communication in the direction from system to user possible. Any video device, or set of video devices, that makes the actual state of the interface (which reflects the state of the system as a whole) visible can take over the role of the interface output platform.
In the following section, the distinguishing features of the image are compared, which result from the use of the linear or the orthographic projection. It is assumed that we have at our disposal the linear projection system shown in Figure 5.
We wish to know how the image height h_o depends on the object height h when the object lies on the optical axis y of the system. In the situation under consideration, it follows directly from the intercept theorem (Thales' theorem) that

h_o = (d_o · h) / d    (1)

Figure 7 illustrates this relationship, the system parameters d_o and d being held constant. In an orthographic projection system (shown in Figure 6), the corresponding relationship is given by the formula

h_o = (d_o · h) / √(d² + h²)    (2)

A diagram in which the image height h_o is plotted against the object height h is shown in Figure 8. It can be seen immediately that, in the case of the linear projection, the image height is proportional to the object height, with a coefficient depending on the parameters of the projection system. In the case of the orthographic projection, this is not so. Here, at small object heights (up to a few tens of percent of the projection system parameter d_o), the image height h_o is still approximately proportional to the object height h. In general, the greater the object height, the more slowly the image height increases. Theoretically, the largest possible image height for this type of projection is equal to d_o (the diagram in Figure 8 was calculated with d_o = 0.1).
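The two height relationships can be checked numerically. A small sketch, reading (1) as h_o = d_o·h/d and (2) as h_o = d_o·h/√(d² + h²), with illustrative values d_o = 0.1 and d = 1 (the function names are ours, not the patent's):

```python
import math

def h_linear(h, d, d_o):
    # Linear projection, eq. (1): image height proportional to object height.
    return d_o * h / d

def h_ortho(h, d, d_o):
    # Orthographic projection, eq. (2): image height saturates at d_o.
    return d_o * h / math.sqrt(d * d + h * h)

d, d_o = 1.0, 0.1
for h in (0.1, 1.0, 10.0, 100.0):
    print(h, h_linear(h, d, d_o), h_ortho(h, d, d_o))
```

For small h the two projections nearly agree; as h grows, the orthographic image height approaches but never exceeds d_o.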
The same phenomenon may be noted when the objects observed are moved out of the optical axis y of the orthographic projection system. In this situation, the image height h'_o of objects with a constant height h is of interest. The objects are located at different positions on the plane xOz, perpendicular to the optical axis y of the system. The height h'_o is calculated as a function of the deviation angle δ (the angle at which the object base surface is seen from the standpoint v). For comparison purposes, the linear projection case is considered first.
If the object is moved out of the optical axis of the system in such a manner that the base surface of the object is viewed from the standpoint v at an angle δ, the distance d' between the real object and the projection system increases according to the formula

d' = d / cos δ    (3)

In the same manner, the distance d'_o is determined by the formula

d'_o = d_o / cos δ    (4)
If equations (3) and (4) are inserted into equation (1),

h'_o = (d_o · h) / d    (5)

is obtained. It can be seen readily that the image height h'_o does not depend on the angle of inclination δ. This is shown in Figure 9. Contrary to this, in the case of the orthographic projection system, when an object is moved out of the optical axis of the system, only the object distance d' changes, whereas the image distance d'_o remains constant and equal to d_o. Since the object distance is once again given by equation (3), the image height h'_o can be calculated from the formula

h'_o = (d_o · h · cos δ) / √(d² + h² · cos² δ)    (6)

A representation of the height h'_o as a function of the angle δ is shown in Figure 10. It may be noted that, as the object moves away from the optical axis y of the system, the image height h'_o becomes smaller. This is brought about by the increase in the distance between the object and the projection system. The phenomenon is quite natural. Let us imagine, for example, that we are standing in front of a very large painting on which the symbol "a" is represented several times. It is clear that the symbols directly in front of us on the painting appear much larger than the symbols in the edge region of the painting.
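The off-axis behavior can be illustrated numerically. A sketch, assuming eq. (5) reads h'_o = d_o·h/d and eq. (6) reads h'_o = d_o·h·cos δ/√(d² + h²·cos² δ), with illustrative parameter values:

```python
import math

def h_off_linear(h, d, d_o, delta):
    # Eq. (5): the linear-projection image height is independent of delta,
    # so the parameter is intentionally unused.
    return d_o * h / d

def h_off_ortho(h, d, d_o, delta):
    # Eq. (6): the orthographic image height shrinks as the object
    # leaves the optical axis.
    c = math.cos(delta)
    return d_o * h * c / math.sqrt(d * d + h * h * c * c)

d, d_o, h = 1.0, 0.1, 0.5
on_axis = h_off_ortho(h, d, d_o, 0.0)
off_axis = h_off_ortho(h, d, d_o, math.radians(60))
```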
Until now, we have investigated the image height as a function of the projection and of the parameters of the object observed. In the following, the image angle α'_ob, at which a viewer sees the image, is to be determined as a function of the above-mentioned parameters. It is assumed that the eye of the viewer is on the optical axis y of the projection system, at a distance d_ob from the image plane x_oOy_o.
The object image is then seen at the angle

α'_ob = arctan( h'_o / d'_ob )    (7)

h'_o being defined by (5) for the linear projection and by (6) for the orthographic projection. The distance d'_ob can be calculated from

d'_ob = √( d_ob² + d_o² · tan² δ )    (8)

for the linear projection and from

d'_ob = √( d_ob² + (d² · d_o² · tan² δ · cos² δ) / (d² + h² · cos² δ) )    (9)

for the orthographic projection. If these equations are combined with equation (7), the following final formulas are obtained for the viewing angle:

α'_ob = arctan( (d_o · h) / (d · √(d_ob² + d_o² · tan² δ)) )    (10)

for the linear projection and

α'_ob = arctan( (d_o · h · cos δ) / √( d_ob² · d² + d_ob² · h² · cos² δ + d² · d_o² · tan² δ · cos² δ ) )    (11)

for the orthographic projection. For both cases, the angle α'_ob is plotted as a function of the angle δ in Figures 11 and 12. For producing these diagrams, the projection system and object parameters were selected so that objects placed on the optical axis y of the system are viewed at an angle of 5°. It can easily be noted that the course of the function is the same in the two cases: the viewing angle α'_ob becomes smaller as the distance of the object from the optical axis y of the system increases.
There are, however, important differences between the two cases, which we wish to point out.
In the case of the linear type of projection, the image extends infinitely in width as the angle δ approaches 90°, whereas, in the case of the orthographic type of projection, the image height does not exceed the parameter value d_o. Moreover, it is clear that the real image size is distorted by the optics of the linear projection system.
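The dependence of the viewing angle on the deviation angle can be checked numerically. A sketch, transcribing equations (10) and (11) as we read them (the printed formulas are partly garbled in the source), with illustrative parameter values:

```python
import math

def alpha_linear(h, d, d_o, d_ob, delta):
    # Eq. (10): viewing angle of the object image under linear projection.
    t = math.tan(delta)
    return math.atan(d_o * h / (d * math.sqrt(d_ob**2 + d_o**2 * t * t)))

def alpha_ortho(h, d, d_o, d_ob, delta):
    # Eq. (11): viewing angle of the object image under orthographic projection.
    c, t = math.cos(delta), math.tan(delta)
    denom = math.sqrt(d_ob**2 * d**2 + d_ob**2 * h**2 * c * c
                      + d**2 * d_o**2 * t * t * c * c)
    return math.atan(d_o * h * c / denom)

# Illustrative values: an on-axis object is seen at roughly 5 degrees.
h, d, d_o, d_ob = 0.5, 1.0, 0.1, 0.57
```

In both projections the viewing angle falls off as δ grows, as in Figures 11 and 12.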
We will now attempt to estimate how the distinguishing features of the projection system affect the appearance of the resultant image. Let us imagine the following situation. In front of the projection system, in a plane perpendicular to the optical axis of the system, there is a series of rods of equal length (Figure 13). Figure 14 shows the representations of the scene shown in Figure 13 with different projection system parameters. It can be seen that, in the case of the projection systems with small viewing angles, practically no difference can be noted between the images produced by means of linear or orthographic projection. Clear differences arise when large viewing angles of the projection systems are used. In this case, for a linear projection, we obtain an image, in which all reproductions of the rods appear to be small and of the same height, although the rods, which are directly in front of the projection system, are much closer than the rods at the side. The use of the orthographic projection system prevents such a situation. With an orthographic projection, we obtain an image, in which the rods, closer to the system, are reproduced larger than are the more remote rods.
The image generating process in the orthographic zoom lens set has two phases. In phase 1 (Figure 15), an interim image is produced on a virtual screen S1. This takes place by means of a lens (a lens set) L1 of the "fisheye" type with a viewing angle of 180° according to the orthographic projection rule.
Such an interim image is circular, the reproductions of world objects placed close to the optical axis of the lens set lying in its central part. In phase 2, a resultant image is generated on the screen S2 on the basis of the interim image. For this purpose, an observation lens (lens set) L2, acting according to the linear projection rule and having a given viewing angle (significantly smaller than 180°), is used. The main distinguishing feature of the proposed lens set is that, when the viewing angle of the set is changed, the resultant image is obtained from the constant interim image by selecting and enlarging a portion of it or the whole of it. If the resultant image is distorted geometrically by the lens set, these distortions remain constant during the zooming process. This distinguishing feature of the lens set is particularly important and affects the reliability of the final image supplied by the video mixer.
The viewing direction of the orthographic zoom lens set is defined by its optical axis, and the standpoint position is defined by the position of the orthographic lens. The viewing angle of the set can be changed by changing the distance between the observation lens L2 and the virtual screen S1. It should be noted that the viewing angle of the proposed lens set can theoretically be changed from 180° to 0°.
For describing the direction of rays incident on the orthographic lens set, a pair of angles can be used: an incidence angle α and a rotation angle β (Figure 16). Without loss of generality, this representation enables the analysis to be continued with a two-dimensional model (in the OvY plane), which is obtained by fixing the angle β.
A mathematical description of the process of generating the resultant image follows.
Under the above assumption, we can set β equal to 0. If a given point is seen by the orthographic zoom lens set at a viewing angle α (Figure 17), its image on the virtual screen S1 lies at the height

h_S1 = r · sin α    (12)

in which r represents the radius characterizing the orthographic projection used. This point corresponds to a point on the resultant screen S2 at the height

h_S2 = r · sin α    (13)

The viewing angle of the orthographic zoom lens set is calculated as follows:
Let the lens L2 be characterized by the maximum viewing angle γ_0 (Figure 18). If it is placed at a distance d from the virtual screen S1, a circular section of this screen with the radius

r_1 = d · tan( γ_0 / 2 )    (14)

lies within the viewing range of this lens. Within this range there are reproductions of objects seen through the lens L1 at an angle not greater than

γ = 2 · arcsin( (d · tan( γ_0 / 2 )) / r )    (15)

At the same time, this is the viewing angle of the total zoom lens set. It should be noted that the values γ_0 and r are constants characterizing the lens set, and that the distance d is the only variable affecting the viewing angle of the set. The viewing angle γ of the zoom lens set is shown as a function of the distance d in Figure 19.
If the angle γ_0 for the lens L2 is defined as a horizontal, vertical or diagonal viewing angle, the above-calculated viewing angle of the lens set is likewise defined as horizontal, vertical or diagonal.
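The zoom behavior can be illustrated numerically. A sketch, reading eq. (14) as r_1 = d·tan(γ_0/2) and eq. (15) as γ = 2·arcsin(d·tan(γ_0/2)/r); the parameter values are illustrative assumptions:

```python
import math

def zoom_view_angle(d, gamma0, r):
    # Eq. (15): total viewing angle of the orthographic zoom lens set as a
    # function of the distance d between the observation lens L2 and the
    # virtual screen S1.
    s = d * math.tan(gamma0 / 2) / r  # = r_1 / r, from eq. (14)
    s = min(s, 1.0)                   # the usable range ends where r_1 reaches r
    return 2 * math.asin(s)

r = 1.0                     # radius characterizing the orthographic projection
gamma0 = math.radians(40)   # assumed maximum viewing angle of the lens L2

wide = zoom_view_angle(2.0, gamma0, r)    # larger d: wider viewing angle
narrow = zoom_view_angle(0.2, gamma0, r)  # smaller d: narrower viewing angle
```

As the text notes, d is the only variable: moving L2 away from S1 widens the set's viewing angle toward the theoretical limit of 180°.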

The function of the camera movement control system is described in the following. The region recorded by a camera at a given time is defined by the actual camera viewing angle, the actual camera standpoint and the actual camera alignment. The camera movement control system is used to change the above-mentioned position and alignment of the camera. The system makes it possible to control all three position components of the camera standpoint, which is determined in the Cartesian system by a vector (x, y, z), as well as to control the viewing direction of the camera by changing the three angles of rotation defined by TRP (tilt, roll, pan) (Figure 20). With the help of the camera movement control system, it is possible to change the position and alignment of the optical lens camera as well as of the CG lens camera. The difference between optical (real) and CG (virtual) cameras lies therein that, in the case of the real camera, the configuration of a mechanical device (manipulator) to which the camera is fastened is changed, while in the case of the virtual camera, the respective matrices for the alignment and position of the camera are changed.
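For the virtual camera, the alignment matrix can be built from the three TRP rotations. A minimal sketch; the composition order and axis conventions are assumptions for illustration, since the patent does not fix them:

```python
import math

def rot_x(t):  # tilt about the x axis
    c, s = math.cos(t), math.sin(t)
    return [[1, 0, 0], [0, c, -s], [0, s, c]]

def rot_y(t):  # pan about the y axis
    c, s = math.cos(t), math.sin(t)
    return [[c, 0, s], [0, 1, 0], [-s, 0, c]]

def rot_z(t):  # roll about the z axis
    c, s = math.cos(t), math.sin(t)
    return [[c, -s, 0], [s, c, 0], [0, 0, 1]]

def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def camera_alignment(tilt, roll, pan):
    # One possible TRP composition order (an assumption): pan, then tilt, then roll.
    return matmul(rot_y(pan), matmul(rot_x(tilt), rot_z(roll)))

position = (1.0, 2.0, 0.5)  # camera standpoint vector (x, y, z)
alignment = camera_alignment(0.1, 0.0, math.radians(90))
```

Changing `position` and the TRP angles over time yields a composed movement course, mirroring what the manipulator does for the real camera.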
In its embodiment, the invention is not limited to the preferred examples given above. Rather, a number of variations are conceivable, which make use of all of the solutions presented, even for fundamentally different types of construction.


Claims (12)

1. An optical imaging system for reproducing an object with an infinitely variable magnification, comprising means for producing an orthographic projection of the object in an image plane (4), an imaging plane (6) for representing an image region originating from the object, imaging optics (5) disposed on the object side in front of the imaging plane (6) with preferably a fixed focal length for producing the image region on the imaging plane (6), the imaging optics (5) and/or the imaging plane (6) being disposed so that they can be moved in the direction of the optical axis for adjusting the magnification and/or for focusing the image appearing on the imaging plane (6), and an image region of different magnification being produced in the region of the imaging plane.
2. The optical imaging system of claim 1, characterized in that a film or a CCD sensor element is provided in the imaging plane (6).
3. The optical imaging system of one of the preceding claims, characterized in that an automatic focusing device is provided for focusing the image in the imaging plane (6).
4. A graphical user interface as part of a computer system for producing reproductions of objects of the real world, the virtual world as well as, optionally, further computer applications, the components of an image, which is to be represented on an interface output platform, being generated by a camera set, which is constructed using orthographic imaging systems of claim 1, and the camera set comprising real cameras, which are directed onto objects of the real world, and virtual cameras, which are created with the help of computer expansion cards and relate to virtual objects, and optionally computer applications and the images, generated by the camera set, being supplied to a video mixer and processed and the finished pictures, generated by the video mixer, then being sent to the interface output platform, the camera positions being controlled by a camera movement control system.
5. The graphical user interface of claim 4, characterized in that, in addition to the orthographic imaging systems, their orthographic virtual images are used.
6. The graphical user interface of claim 4, characterized in that the interface output platform is a monitor and the video mixer is a component of the interface.
7. The graphical user interface of claim 4, characterized in that the cameras are CG lens cameras in the form of computer expansion cards with appropriate software as well as optical lens cameras in the form of real cameras and in that the cameras have orthographic imaging systems.
8. The graphical user interface of claim 4, characterized in that the camera movement control system has at least one computer expansion card with appropriate software, in that mechanical manipulators are disposed for operating the optical lens camera and in that, for controlling the virtual CG lens camera, the respective matrixes for the camera alignment and position are changed, as a result of which both types of cameras, virtual and real, used in the system, can be positioned at will and movement courses can be composed at will in this manner.
9. The graphical user interface of claim 4, characterized in that interface input platforms are disposed, which consist of basic input devices, such as keyboards, mice, joysticks, etc. or of more complicated equipment, such as speech analyzers, which generate messages for controlling the system over the interface.
10. The graphical user interface of claim 4, characterized in that the image generating process in the orthographic imaging set has two phases, an interim image being generated in phase 1 on a virtual screen S1 and, on the basis of the interim image, a resultant image being generated on the screen S2 in phase 2.
11. The graphical user interface of claim 10, characterized in that the interim image is generated on the virtual screen S1 by means of a lens or a lens set L1 with a viewing angle of 180° according to the orthographic projection rule and in that such an interim image is circular, reproductions of world objects, which are placed close to the optical axis of the lens set, being in the central part.
12. The graphical user interface of claim 10, characterized in that the resultant image is generated on the screen S2 by means of an observation lens acting according to the linear projection rule or a lens set L2 with a given viewing angle, which is significantly smaller than 180°.
CA002287453A 1997-04-17 1998-04-17 Optical imaging system and graphic user interface Abandoned CA2287453A1 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
DE19716958A DE19716958A1 (en) 1997-04-17 1997-04-17 Optical imaging system
DE19716958.9 1997-04-17
PCT/DE1998/001121 WO1998048309A1 (en) 1997-04-17 1998-04-17 Optical imaging system and graphic user interface

Publications (1)

Publication Number Publication Date
CA2287453A1 true CA2287453A1 (en) 1998-10-29

Country Status (11)

Country Link
US (2) US6327097B1 (en)
EP (1) EP0976003B1 (en)
JP (1) JP2000510610A (en)
KR (1) KR20010006517A (en)
CN (1) CN1255206A (en)
AT (1) ATE212732T1 (en)
AU (1) AU7756598A (en)
CA (1) CA2287453A1 (en)
DE (2) DE19716958A1 (en)
PL (1) PL336469A1 (en)
WO (1) WO1998048309A1 (en)

Legal Events

Date Code Title Description
EEER Examination request
FZDE Discontinued