US20030113012A1 - Method and system for controlling a screen ratio based on a photographing ratio - Google Patents

Method and system for controlling a screen ratio based on a photographing ratio

Info

Publication number
US20030113012A1
US20030113012A1 (application US10/280,246)
Authority
US
United States
Prior art keywords
image
display
camera
ratio
photographing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US10/280,246
Inventor
Byoungyi Yoon
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
GEO-RAE Co Ltd
Original Assignee
GEO-RAE Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from PCT/KR2001/001398 (WO2002015595A1)
Priority claimed from KR10-2001-0067246A (KR100397066B1)
Priority claimed from KR10-2001-0067245A (KR100445799B1)
Application filed by GEO-RAE Co Ltd filed Critical GEO-RAE Co Ltd
Assigned to GEO-RAE CO., LTD. (assignment of assignors interest; see document for details). Assignors: YOON, BYOUNGYI
Publication of US20030113012A1

Classifications

    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B 35/00 Stereoscopic photography
    • G03B 35/18 Stereoscopic photography by simultaneous viewing
    • G03B 35/20 Stereoscopic photography by simultaneous viewing using two or more projectors
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/10 Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N 13/106 Processing image signals
    • H04N 13/128 Adjusting depth or disparity
    • H04N 13/139 Format conversion, e.g. of frame-rate or size
    • H04N 13/194 Transmission of image signals
    • H04N 13/20 Image signal generators
    • H04N 13/204 Image signal generators using stereoscopic image cameras
    • H04N 13/239 Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
    • H04N 13/246 Calibration of cameras
    • H04N 13/296 Synchronisation thereof; Control thereof
    • H04N 13/30 Image reproducers
    • H04N 13/332 Displays for viewing with the aid of special glasses or head-mounted displays [HMD]
    • H04N 13/344 Displays for viewing with the aid of special glasses or head-mounted displays [HMD] with head-mounted left-right displays
    • H04N 13/366 Image reproducers using viewer tracking
    • H04N 13/383 Image reproducers using viewer tracking for tracking with gaze detection, i.e. detecting the lines of sight of the viewer's eyes
    • H04N 13/398 Synchronisation thereof; Control thereof

Definitions

  • the present invention relates to a method and system for generating and/or displaying a more realistic stereoscopic image. Specifically, the present invention relates to a method and system for displaying at least one stereoscopic image in a set of display devices based on received photographing ratios such that each of the screen ratios for the display devices is substantially the same as each of the photographing ratios.
  • a human being recognizes an object by sensing the environment through the eyes. Because the two eyes are spaced apart a predetermined distance from each other, the object is initially sensed as two images, one formed by each of the left and right eyes. The object is recognized by the human brain as the two images are partially overlapped.
  • in the portion where the two different images transmitted from the left and right eyes overlap as they are synthesized in the brain, there is a perception of 3-dimensions.
  • U.S. Pat. No. 4,729,017 discloses “Stereoscopic display method and apparatus therefor.” With a relatively simple construction, the apparatus allows a viewer to view a stereoscopic image via the naked eye.
  • U.S. Pat. No. 5,978,143 discloses “Stereoscopic recording and display system.”
  • the patent discloses that the stereoscopically shown image content is easily controllable by the observer within the scene, which is recorded by the stereo camera.
  • U.S. Pat. No. 6,005,607 discloses “Stereoscopic computer graphics image generating apparatus and stereoscopic TV apparatus.” This apparatus stereoscopically displays two-dimensional images generated from three-dimensional structural information.
  • One aspect of the invention provides a method of displaying an image.
  • the method comprises generating a digital image of a scene by a camera.
  • the method also comprises measuring a photographing ratio (A:B:C) of the camera while the digital image is being generated, wherein parameters A and B are defined as horizontal and vertical lengths of the scene imaged by the camera, respectively, and C is defined as a distance between an object lens of the camera and the scene.
  • the method also comprises transmitting the image and the photographing ratio (A:B:C) to a display device.
  • the method comprises displaying the transmitted image in the display device such that a screen ratio (D:E:F) of the display device is substantially the same as the photographing ratio (A:B:C), wherein D and E are defined as horizontal and vertical lengths of the image displayed in the display device, respectively, and F is defined as a distance between the display device and a viewing point.
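
The matching condition above fixes the displayed dimensions once the viewing distance is known: requiring D:E:F to equal A:B:C gives D = A*F/C and E = B*F/C. A minimal sketch of this relation (the function and variable names are illustrative, not from the patent):

```python
def matched_screen_size(a: float, b: float, c: float, f: float) -> tuple[float, float]:
    """Return displayed-image dimensions (D, E) such that D:E:F equals A:B:C.

    (A, B, C) is the photographing ratio; F is the screen-to-viewer distance.
    """
    scale = f / c  # the common factor relating the two ratios
    return a * scale, b * scale

# Example: a photographing ratio of 2:1.5:1 viewed from F = 0.5 m
# calls for a displayed image of 1.0 m x 0.75 m.
d, e = matched_screen_size(2.0, 1.5, 1.0, 0.5)
```
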
  • Another aspect of the invention provides a method of displaying stereoscopic images.
  • the method comprises producing at least one stereoscopic image of a scene, the stereoscopic image comprising a pair of two-dimensional plane images produced by first and second cameras.
  • the method also comprises measuring a first photographing ratio (A1:B1:C1) of the first camera and a second photographing ratio (A2:B2:C2) of the second camera while the scene is being imaged, respectively, wherein A1 and A2, and B1 and B2 are defined as horizontal and vertical lengths of the scene imaged by each of the first and second cameras, respectively, and C1 and C2 are defined as distances between object lenses of the cameras and the scene, respectively.
  • the method also comprises transmitting the stereoscopic image and the photographing ratios (A1:B1:C1, A2:B2:C2) to first and second display devices, respectively.
  • the method also comprises displaying the transmitted stereoscopic image in the display devices such that each screen ratio (D1:E1:F1, D2:E2:F2) of the display devices is substantially the same as each photographing ratio (A1:B1:C1, A2:B2:C2), wherein D1 and D2, and E1 and E2 are defined as horizontal and vertical lengths of the two-dimensional plane images displayed in each of the display devices, respectively, and F1 and F2 are defined as distances between the display devices and viewing points, respectively.
  • Another aspect of the invention provides a display system. The system comprises a receiver, a signal separator, an image size adjusting portion, and at least one display portion.
  • the receiver receives at least one image of a scene and at least one photographing ratio, the at least one image being produced by at least one camera, wherein the photographing ratio is defined as a ratio (A:B:C) between three parameters A, B and C, wherein parameters A and B are defined as horizontal and vertical lengths of the scene imaged by the camera, respectively, and C is defined as a distance between an object lens of the camera and the scene.
  • the signal separator separates the image and the photographing ratio.
  • the image size adjusting portion adjusts a size of the received image to be displayed based on the photographing ratio and at least one screen ratio, wherein the screen ratio is defined as a ratio (D:E:F) between three parameters D, E and F, wherein parameters D and E are defined as horizontal and vertical lengths of the image displayed in the display device, respectively, and F is defined as a distance between the display device and a viewing point.
  • the at least one display portion displays the adjusted image.
  • Still another aspect of the invention provides a method of displaying images in at least one display device.
  • the method comprises receiving at least one image of a scene and a photographing ratio, the at least one image being produced by at least one camera, wherein the photographing ratio is defined as a ratio (A:B:C) between three parameters A, B and C, wherein parameters A and B are defined as horizontal and vertical lengths of the scene imaged by the camera, respectively, and C is defined as a distance between an object lens of the camera and the scene.
  • the method also comprises determining a screen ratio of the display device, wherein the screen ratio is defined as a ratio (D:E:F) between three parameters D, E and F, wherein parameters D and E are defined as horizontal and vertical lengths of the image displayed in the device, respectively, and F is defined as a distance between the display device and a viewing point.
  • the method also comprises adjusting a size of the received image to be displayed such that the photographing ratio (A:B:C) is substantially the same as the screen ratio (D:E:F).
  • the method comprises displaying the adjusted image in the display device.
  • Still another aspect of the invention provides a method of displaying stereoscopic images.
  • the method comprises producing at least one stereoscopic image of a scene from three-dimensional structural data, the stereoscopic image comprising a pair of two-dimensional plane images projected by first and second projection portions.
  • the method also comprises providing a first photographing ratio (A1:B1:C1) of the first projection portion and a second photographing ratio (A2:B2:C2) of the second projection portion, respectively, wherein A1 and A2, and B1 and B2 are defined as horizontal and vertical lengths of the scene projected by each of the first and second projection portions, respectively, and C1 and C2 are defined as distances between the first and second projection portions and the scene, respectively.
  • the method also comprises transmitting the produced stereoscopic image and the corresponding photographing ratios (A1:B1:C1, A2:B2:C2) to first and second display devices.
  • the method comprises displaying the transmitted stereoscopic image in the display devices such that each screen ratio (D1:E1:F1, D2:E2:F2) of the display devices is substantially the same as each photographing ratio (A1:B1:C1, A2:B2:C2), wherein D1 and D2, and E1 and E2 are defined as horizontal and vertical lengths of the two-dimensional images displayed in each of the display devices, respectively, and F1 and F2 are defined as distances between the display devices and viewing points, respectively.
  • Yet another aspect of the invention provides a method of displaying stereoscopic images.
  • the method comprises producing at least one stereoscopic image of a scene from three-dimensional structural data, the stereoscopic image comprising a pair of two-dimensional plane images projected by first and second projection portions.
  • the method also comprises providing a first photographing ratio (A1:B1:C1) of the first projection portion and a second photographing ratio (A2:B2:C2) of the second projection portion, respectively, wherein A1 and A2, and B1 and B2 are defined as horizontal and vertical lengths of the scene projected by each of the first and second projection portions, respectively, and C1 and C2 are defined as distances between the first and second projection portions and the scene, respectively.
  • the method also comprises displaying the stereoscopic image in a pair of display devices such that each screen ratio (D1:E1:F1, D2:E2:F2) of the display devices is substantially the same as each photographing ratio (A1:B1:C1, A2:B2:C2), wherein D1 and D2, and E1 and E2 are defined as horizontal and vertical lengths of the two-dimensional plane images displayed in each of the display devices, respectively, and F1 and F2 are defined as distances between the display devices and viewing points, respectively.
  • Yet another aspect of the invention provides a system for displaying stereoscopic images.
  • the system comprises first and second projection portions, a computing device and a display portion.
  • the first and second projection portions produce at least one stereoscopic image of a scene from three-dimensional structural data, the stereoscopic image comprising a pair of two-dimensional plane images projected by first and second projection portions.
  • the computing device provides a first photographing ratio (A1:B1:C1) of the first projection portion and a second photographing ratio (A2:B2:C2) of the second projection portion, respectively, wherein A1 and A2, and B1 and B2 are defined as horizontal and vertical lengths of the scene projected by each of the first and second projection portions, respectively, and C1 and C2 are defined as distances between each of the projection portions and the scene, respectively.
  • the display portion displays the stereoscopic image in a pair of display screens such that each screen ratio (D1:E1:F1, D2:E2:F2) of the display screens is substantially the same as each photographing ratio (A1:B1:C1, A2:B2:C2), wherein D1 and D2, and E1 and E2 are defined as horizontal and vertical lengths of the two-dimensional plane images displayed in each of the display screens, respectively, and F1 and F2 are defined as distances between the display screens and viewing points, respectively.
  • FIG. 1A illustrates one typical 3-D image generating and reproducing apparatus.
  • FIG. 1B illustrates another typical 3-D image generating and reproducing apparatus.
  • FIGS. 2A and 2B illustrate a photographing ratio of a camera.
  • FIGS. 3A and 3B illustrate a screen ratio of a display device that displays a photographed image.
  • FIG. 4A illustrates the variation of the distance between an object lens and a film according to the variation of a focal length of a camera.
  • FIG. 4B illustrates the variation of a photographing ratio according to the variation of the focal length of the camera.
  • FIG. 4C shows the relationship between a photographing ratio and the focal length of the camera.
  • FIG. 4D illustrates an exemplary table showing maximum and minimum photographing ratios of a camera.
  • FIG. 5A illustrates a photographing ratio calculation apparatus according to one aspect of the invention.
  • FIG. 5B illustrates a photographing ratio calculation apparatus according to another aspect of the invention.
  • FIG. 6A illustrates an exemplary flowchart for explaining the operation of the photographing ratio calculation apparatus of FIG. 5A.
  • FIG. 6B illustrates an exemplary flowchart for explaining the operation of the photographing ratio calculation apparatus of FIG. 5B.
  • FIG. 7 illustrates a camera comprising the photographing ratio calculation apparatus as shown in FIGS. 5A and 5B.
  • FIG. 8 illustrates a system for displaying stereoscopic images such that a photographing ratio (A:B:C) is substantially the same as a screen ratio (D:E:F).
  • FIG. 9 illustrates an exemplary flowchart for explaining the operation of the image size adjusting portion of FIG. 8.
  • FIG. 10 is a conceptual drawing for explaining the image size adjustment in each of the display devices.
  • FIG. 11 illustrates an exemplary flowchart for explaining the entire operation of the system shown in FIG. 8.
  • FIG. 12 illustrates examples of the display system according to one aspect of the invention.
  • FIG. 13 illustrates a 3D display system including an eye position fixing device according to one aspect of the invention.
  • FIG. 14 illustrates a relationship between the displayed images and a viewer's eyes.
  • FIG. 15 illustrates a 3D image display system according to one aspect of the invention.
  • FIG. 16A illustrates an exemplary flowchart for explaining the operation of the system of FIG. 15.
  • FIG. 17 is a conceptual drawing for explaining the operation of the display device of FIG. 15.
  • FIG. 18 illustrates a 3D image display system according to another aspect of the invention.
  • FIG. 19 illustrates an exemplary flowchart for explaining the operation of the system of FIG. 18.
  • FIG. 20 illustrates an exemplary flowchart for explaining the operation of the system of FIG. 18.
  • FIG. 21A illustrates an eye lens motion detection device.
  • FIG. 21B is a conceptual drawing for explaining the movement of the eye lenses.
  • FIG. 22 is a conceptual drawing for explaining the movement of the center points of the displayed images.
  • FIG. 23 illustrates a camera system for a 3D display system according to one aspect of the invention.
  • FIG. 24 illustrates a display system corresponding to the camera system shown in FIG. 23.
  • FIG. 25 illustrates an exemplary flowchart for explaining the operation of the camera and display systems shown in FIGS. 23 and 24.
  • FIG. 26A is a conceptual drawing that illustrates parameters for a set of stereoscopic cameras.
  • FIG. 26B is a conceptual drawing that illustrates parameters for a viewer's eyes.
  • FIG. 27 is a conceptual drawing that illustrates the movement of a set of stereoscopic cameras.
  • FIG. 28 is a conceptual drawing for explaining the eye lens movement according to the distance between the viewer and an object.
  • FIG. 29 illustrates a 3D display system for controlling a set of stereoscopic cameras according to another aspect of the invention.
  • FIG. 30 illustrates an exemplary block diagram of the camera controllers shown in FIG. 29.
  • FIG. 31 illustrates an exemplary flowchart for explaining the operation of the camera controllers according to one aspect of the invention.
  • FIG. 32A illustrates an exemplary table for controlling horizontal and vertical motors.
  • FIG. 32B illustrates a conceptual drawing that explains motion of the camera.
  • FIG. 33 illustrates an exemplary flowchart for explaining the operation of the system shown in FIG. 29.
  • FIG. 34 illustrates a stereoscopic camera controller system used for a 3D display system according to another aspect of the invention.
  • FIG. 35 illustrates an exemplary table showing the relationship between camera adjusting values and selected cameras.
  • FIG. 36A is a top plan view of the plural sets of stereoscopic cameras.
  • FIG. 36B is a front elevational view of the plural sets of stereoscopic cameras.
  • FIG. 37 illustrates an exemplary flowchart for explaining the operation of the system shown in FIG. 34.
  • FIG. 38 illustrates a 3D display system according to another aspect of the invention.
  • FIG. 39 illustrates one example of a 3D display image.
  • FIGS. 40A-40H illustrate conceptual drawings that explain the relationship between the 3D mouse cursors and eye lens locations.
  • FIG. 41 illustrates an exemplary block diagram of the display devices as shown in FIG. 38.
  • FIG. 42 illustrates an exemplary flowchart for explaining the operation of the display devices of FIG. 41.
  • FIGS. 43A and 43B illustrate conceptual drawings that explain a method for calculating the location of the center points of the eye lens.
  • FIG. 44 is a conceptual drawing for explaining a determination method of the location of the center points of the displayed images.
  • FIG. 45 illustrates a 3D display system according to another aspect of the invention.
  • FIG. 46 illustrates an exemplary block diagram of the display device of FIG. 45.
  • FIG. 47 is a conceptual drawing for explaining the camera control based on the movement of the eye lenses.
  • FIG. 48 illustrates an exemplary flowchart for explaining the operation of the system shown in FIG. 45.
  • FIG. 49 illustrates a 3D display system according to another aspect of the invention.
  • FIG. 50 illustrates an exemplary block diagram of the camera controller of FIG. 49.
  • FIG. 51 illustrates an exemplary flowchart for explaining the camera controller of FIG. 50.
  • FIG. 52 illustrates an exemplary table for explaining the relationship between the space magnification and camera distance.
  • FIG. 53 illustrates an exemplary flowchart for explaining the operation of the entire system shown in FIG. 49.
  • FIG. 54 illustrates a 3D display system according to another aspect of the invention.
  • FIG. 55 illustrates an exemplary table for explaining the relationship between the camera motion and display angle.
  • FIG. 56 illustrates an exemplary flowchart for explaining the entire operation of the system shown in FIG. 54.
  • FIG. 57 illustrates a 3D display system according to another aspect of the invention.
  • FIG. 58 illustrates an exemplary block diagram of the display device of FIG. 57.
  • FIGS. 59A and 59B are conceptual drawings for explaining the adjustment of the displayed image.
  • FIG. 60 illustrates an exemplary flowchart for explaining the operation of the system of FIG. 54.
  • FIG. 61 illustrates an exemplary block diagram for transmitting stereoscopic images and photographing ratios for the images.
  • FIG. 62 illustrates an exemplary block diagram for storing on a persistent memory stereoscopic images and photographing ratios for the images.
  • FIG. 63 illustrates an exemplary format of the data that are stored in the recording medium of FIG. 62.
  • FIG. 64 illustrates an exemplary block diagram of a pair of portable communication devices comprising a pair of digital cameras and a pair of display screens.
  • FIG. 65 illustrates an exemplary block diagram of a portable communication device for displaying stereoscopic images based on a photographing ratio and a screen ratio.
  • FIGS. 66A and 66B illustrate an exemplary block diagram of a portable communication device for controlling the location of the stereoscopic images.
  • FIG. 67 illustrates an exemplary block diagram of a portable communication device for controlling space magnification for stereoscopic images.
  • FIG. 68 illustrates a conceptual drawing for explaining a portable communication device having separate display screens.
  • FIGS. 69A and 69B illustrate an exemplary block diagram for explaining the generation of the stereoscopic images from three-dimensional structural data.
  • FIG. 70 illustrates a 3D display system for conforming the resolution between the stereoscopic cameras and display devices.
  • FIG. 1A illustrates one typical 3-D image generating and reproducing apparatus.
  • the system of FIG. 1A uses two display devices so as to display stereoscopic images.
  • the apparatus includes a set of stereoscopic cameras 110 and 120 , spaced apart a predetermined distance from each other.
  • the cameras 110 and 120 may be spaced apart by the same distance as that between a viewer's two eyes, photographing an object 100 from two different positions.
  • Each camera 110 and 120 provides each photographed image simultaneously or sequentially to the display devices 140 and 150 , respectively.
  • the display devices 140 and 150 are located such that a viewer can watch each image displayed in the devices 140 and 150 through their left and right eyes, respectively.
  • the viewer can recognize a 3-D image by simultaneously or sequentially perceiving and synthesizing the left and right images. That is, when the viewer sees a pair of stereoscopic images with each eye, a single image (object) is perceived having a 3D quality.
  • FIG. 1B illustrates another typical 3-D image generating and reproducing apparatus.
  • the system of FIG. 1B uses one display device so as to display stereoscopic images.
  • the apparatus includes a set of stereoscopic cameras 110 and 120 , spaced apart a predetermined distance from each other for photographing the same object 100 at the two different positions.
  • Each camera 110 and 120 provides each photographed image to a synthesizing device 130 .
  • the synthesizing device 130 receives two images from the left and right cameras 110 and 120 , and sequentially projects the received images onto a display device 160 .
  • the synthesizing device 130 may be located in either a camera site or a display site.
  • the viewer wears special glasses 170 that allow each displayed image to be seen by each eye.
  • the glasses 170 may include a filter or a shutter that allows the viewer to see each image alternately.
  • the display device 160 may comprise an LCD or 3-D glasses such as a head mounted display (HMD).
  • the size of the displayed image is determined. Also, as the distance between the left and right images displayed on the display device has the same ratio as the distance between a viewer's two eyes, the viewer feels a sense of viewing the actual object in 3-dimensions.
  • rather than a fixed object being photographed by a fixed camera, an object may be photographed while the object moves, while the camera moves, or while a magnifying (zoom-in) or reducing (zoom-out) imaging function is performed with respect to the object.
  • in these cases, the distance between the camera and the photographed object, or the size of the photographed object, changes.
  • a viewer may then perceive the image with a sense of distance different from the actual distance from the camera to the object.
  • each viewer has their own unique eye distance, a biometric measured as the distance between the center points of the viewer's eyes. For example, the distance between an adult's eyes is quite different from that of a child's eyes, and the eye distance also varies between viewers of the same age. Meanwhile, in current 3D display systems, the distance between the center points of each stereoscopic image is fixed at the average adult value (i.e., 70 mm), as exemplified in FIGS. 1A and 1B. However, as discussed above, each viewer has their own personal eye distance. This mismatch may cause a headache when the viewer sees stereoscopic images, and may distort the sense of 3-dimensions; in certain instances, the sense of 3-dimensions is not perceived at all.
  • one aspect of the invention is to adjust display images or display devices such that a screen ratio (D:E:F) in the display device is substantially the same as a photographing ratio (A:B:C) in the camera.
  • a stereoscopic image comprises a pair of two-dimensional plane images produced by a pair of stereoscopic cameras.
  • Stereoscopic images comprise a plurality of such pairs of two-dimensional plane images.
  • FIGS. 2A and 2B illustrate a photographing ratio of a camera.
  • the ratio relates to the scope or size of the space that the camera can photograph in a scene, which is proportional to the range seen through a viewfinder of the camera.
  • the photographing ratio includes three parameters (A, B, C).
  • Parameters A and B are defined as horizontal and vertical lengths of the space, respectively, including the object 22 photographed by the camera 20 .
  • Parameter C is defined as the perpendicular distance between the camera 20 and the object 22 .
  • a camera has its own horizontal and vertical ranges that can photograph an object, and the ratio of the horizontal and vertical lengths is typically constant, e.g., 4:3 or 16:9. Thus, once one of the horizontal and vertical lengths is determined, the other length may be automatically determined.
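
For instance, with a fixed 4:3 aspect the vertical length follows directly from the horizontal one; a one-line sketch (names assumed):

```python
def vertical_from_horizontal(a: float, aspect_w: float = 4.0, aspect_h: float = 3.0) -> float:
    """Derive the vertical length B from the horizontal length A for a camera
    with a fixed aspect ratio (4:3 by default)."""
    return a * aspect_h / aspect_w

# Example: a horizontal length of 20 m at 4:3 implies a vertical length of 15 m.
b = vertical_from_horizontal(20.0)
```
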
  • the camera 20 comprises a video camera, a still camera, an analog camera, or a digital camera.
  • the object 22 is located “20 m” away from the camera 20 and is photographed such that the object 22 is included in a single film or an image frame as shown in FIGS. 2A and 2B.
  • in this example, the horizontal length (A) of the photographed space is 20 m.
  • the present photographing ratio while photographing an object may be determined based on the optical property of a camera object lens, e.g., the maximum photographing ratio and minimum photographing ratio.
  • FIGS. 3A and 3B illustrate a screen ratio of a display device that displays a photographed image.
  • the screen ratio relates to a range or scope that a viewer can see through a display device.
  • the screen ratio includes three parameters (D, E, F).
  • Parameters D and E are defined as horizontal and vertical lengths of the image displayed in the display device 24 , respectively.
  • Parameter F is defined as the perpendicular distance between the display device and a viewer's eye 26 . For convenience, only one eye 26 and one display device 24 are illustrated instead of two eyes and a set of display devices in FIGS. 3A and 3B.
  • F may be measured automatically using a distance detection sensor, measured manually, or fixed.
  • parameters D and E are adjusted such that the photographing ratio (A:B:C) equals the screen ratio (D:E:F).
  • the size of the adjusted image in the display device 24 corresponds to that of the image that has been captured by the camera 20 .
  • if the camera photographs an object with a large photographing ratio, the image is displayed using a large screen ratio.
  • FIG. 4A illustrates the variation of the distance between an object lens and a film according to the variation of a focal length of the camera 20 .
  • the term "film" here is not limited to analog image recording media.
  • a CCD or CMOS image sensor may be used to capture an image in a digital context.
  • the camera 20 may have more focal length ranges, but only four focal length ranges are exemplified in FIG. 4A.
  • the distance between a film and an object lens ranges from d1 to d4 according to the focal length of the camera 20 .
  • the focal length may be adjusted by a focus adjusting portion (which will be explained below) of the camera 20 .
  • the distance (d1) is shortest when the focal length is "infinity" (∞).
  • the distance (d4) is longest when the focal length is “0.5 m,” where the camera receives the least amount of light through the object lens. That is, the amount of light coming into the camera 20 varies according to the focal length of the camera 20 .
  • since the location of the object lens is normally fixed, in order to change the distance from d1 to d4, the location of the film moves from Ps to P1 by as much as "d" according to the focal length.
  • the focus adjusting portion of the camera 20 adjusts the location of the film from Ps to P1.
  • the focus adjusting of the camera 20 may be performed manually or automatically.
  • FIG. 4B illustrates the variation of a photographing ratio according to the variation of the focal length of the camera 20 .
  • the photographing ratio (A:B:C) may be expressed as (A/C:B/C).
  • when the camera receives the most light, the value A/C or B/C is largest, shown as "2.0/1" in FIG. 4B.
  • when the camera receives the least light, the value A/C or B/C is smallest, shown as "1.0/1" in FIG. 4B. That is, the more light the camera receives, the larger the photographing ratio. Similarly, the longer the focal length, the greater the photographing ratio.
  • FIG. 4C shows the relationship between a photographing ratio and a focal length of a camera.
  • the focal length of the camera may be determined, e.g., by detecting a current scale location of the focus adjusting portion of the camera.
  • the focus adjusting portion is located in one position of the scales between 0.3 m and infinity while the camera is photographing an object.
  • the photographing ratio varies linearly as shown in FIG. 4C. If the camera has a focus adjusting portion that is automatically adjusted while photographing an object, the photographing ratio may be determined by detecting the current focal length that is automatically adjusted.
  • FIG. 4D illustrates an exemplary table showing maximum and minimum photographing ratios of a camera.
  • the maximum and minimum photographing ratios of the camera are determined by the optical characteristic of the camera.
  • a camera manufacturing company may provide the maximum and minimum photographing ratios in the technical specification of the camera.
  • the table in FIG. 4D is used for determining a photographing ratio when the focus adjusting portion is located at one scale between "0.3 m" and "infinity."
  • FIG. 5A illustrates a photographing ratio calculation apparatus according to one aspect of the invention.
  • the apparatus comprises a focus adjusting portion (FAP) 52 , a FAP location detection portion 54 , a memory 56 , and a photographing ratio calculation portion 58 .
  • the photographing ratio calculation apparatus may be embedded into the camera 20 .
  • the focus adjusting portion 52 adjusts the focus of the object lens of the camera 20 .
  • the focus adjusting portion 52 may perform its function either manually or automatically.
  • the focus adjusting portion 52 may comprise 10 scales between “0.3 m and infinity,” and is located in one of the scales while the camera 20 is photographing an object.
  • the focus adjusting portion 52 may use a known focus adjusting portion that is used in a typical camera.
  • the FAP location detection portion 54 detects the current scale location of the focus adjusting portion 52 among the scales.
  • the FAP location detection portion 54 may comprise a known position detection sensor that detects the scale value in which the focus adjusting portion 52 is located.
  • the FAP location detection portion 54 may comprise a known distance detection sensor that measures the distance between the object lens and film.
  • the memory 56 stores data representing maximum and minimum photographing ratios of the camera 20 .
  • the memory 56 comprises a ROM, a flash memory or a programmable ROM. This may apply to all of the other memories described throughout the specification.
  • the photographing ratio calculation portion 58 calculates a photographing ratio (A:B:C) based on the detected scale location and the maximum and minimum photographing ratios.
  • the photographing ratio calculation portion 58 comprises a digital signal processor (DSP) calculating the ratio (A:B:C) using the following Equations I and II.
  • parameters Amax and Bmax represent the horizontal and vertical length values (A and B) of the maximum photographing ratio, respectively, exemplified as "3" and "2" in FIG. 4D.
  • Parameters Amin and Bmin represent the horizontal and vertical length values (A and B) of the minimum photographing ratio, respectively, shown as "1.5" and "1" in FIG. 4D.
  • Parameters Scur and Stot represent the currently detected scale value and the total scale value, respectively.
  • the photographing ratio calculation portion 58 calculates a photographing ratio (A:B:C) such that the ratio falls between the maximum and minimum photographing ratios and at the same time is proportional to the value of the detected scale location.
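
Equations I and II are not reproduced in this text, so the sketch below assumes the simplest form consistent with the stated behavior: linear interpolation between the minimum and maximum photographing ratios, proportional to the detected scale location. All names and the default values (taken from FIG. 4D) are illustrative:

```python
def photographing_ratio_from_focus_scale(
    s_cur: float, s_tot: float,
    a_max: float = 3.0, b_max: float = 2.0,   # maximum ratio (FIG. 4D)
    a_min: float = 1.5, b_min: float = 1.0,   # minimum ratio (FIG. 4D)
    c: float = 1.0,                           # distance value of the table
) -> tuple[float, float, float]:
    """Interpolate the current photographing ratio (A:B:C) from the focus scale.

    Assumed linear form of Equations I and II: the result lies between the
    minimum and maximum ratios and is proportional to Scur out of Stot.
    """
    fraction = s_cur / s_tot
    a = a_min + (a_max - a_min) * fraction   # Equation I (assumed form)
    b = b_min + (b_max - b_min) * fraction   # Equation II (assumed form)
    return a, b, c
```
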
  • the camera 20 photographs an object ( 602 ).
  • the camera 20 may comprise a single (mono) camera.
  • Alternatively, the camera 20 may comprise a pair of stereoscopic cameras as shown in FIG. 1A. In either case, the operation of the apparatus will be described based on the single camera for convenience.
  • Maximum and minimum photographing ratios are provided from the memory 56 to the photographing ratio calculation portion 58 ( 604 ).
  • the photographing ratio calculation portion 58 may store the maximum and minimum photographing ratios therein. In this situation, the memory 56 may be omitted from the apparatus.
  • the FAP location detection portion 54 detects the current location of the focus adjusting portion 52 while the camera 20 is photographing the object ( 606 ). While the camera is photographing the object, the focal length may be changed. The detected current location of the focus adjusting portion 52 is provided to the photographing ratio calculation portion 58 .
  • the photographing ratio calculation portion 58 calculates a vertical value (B) of a current photographing ratio from Equation II ( 610 ).
  • FIG. 5B illustrates a block diagram of a photographing ratio calculation apparatus according to another aspect of the invention.
  • the apparatus comprises an iris 62 , an iris opening detection portion 64 , a memory 66 and a photographing ratio calculation portion 68 .
  • the photographing ratio calculation apparatus is embedded into the camera 20 .
  • the iris 62 is a device that adjusts an amount of light coming into the camera 20 according to the degree of its opening.
  • when the degree of the opening of the iris 62 is largest, the maximum amount of light shines on the film of the camera 20 .
  • This largest opening corresponds to the longest focal length and the maximum photographing ratio.
  • when the degree of the opening of the iris 62 is smallest, the least amount of light comes into the camera 20 .
  • This smallest opening corresponds to the shortest focal length and the minimum photographing ratio.
  • the iris 62 may be a known iris that is used in a typical camera.
  • the iris opening detection portion 64 detects the degree of the opening of the iris 62 .
  • the degree of the opening of the iris 62 may be quantized to a range of, for example, 1-10. Degree "10" may mean the largest opening of the iris 62 and degree "1" may mean the smallest opening of the iris 62 .
  • the memory 66 stores data representing maximum and minimum photographing ratios of the camera 20 .
  • the photographing ratio calculation portion 68 calculates a photographing ratio (A:B:C) based on the detected degree of the opening and the maximum and minimum photographing ratios.
  • the photographing ratio calculation portion 68 comprises a digital signal processor (DSP) calculating the ratio (A:B:C) using the following Equations III and IV.
  • in Equations III and IV, parameters Amax and Bmax, Amin and Bmin, and "c" are the same as the parameters used in Equations I and II.
  • Parameters Icur and Ilargest represent the detected current degree of the opening and the largest degree of the opening, respectively.
  • referring to FIG. 6B, the operation of the photographing ratio calculation apparatus will be described.
  • the operation of the first two procedures 702 and 704 is the same as that of the corresponding procedures in FIG. 6A.
  • the iris opening detection portion 64 detects the current degree of the opening of the iris 62 while the camera 20 is photographing the object ( 706 ). The detected degree of the opening of the iris 62 is provided to the photographing ratio calculation portion 68 .
  • the photographing ratio calculation portion 68 calculates a vertical value (B) of a current photographing ratio from Equation IV ( 710 ).
  • the vertical value B is obtained as follows.
  • the photographing ratio calculation portion 68 retrieves parameter C from the maximum and minimum ratios that have been used for calculating parameters A and B ( 712 ). Referring to FIG. 4D, the distance value is “1.”
  • the photographing ratio calculation portion 68 provides a current photographing ratio (A:B:C) ( 714 ). In the above example, a current photographing ratio is 1.8:1.2:1.
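
By analogy with Equations I and II, a hedged sketch of the iris-based calculation follows; the exact Equations III and IV are likewise not reproduced here, so the linear form and the sample opening degree are assumptions:

```python
def photographing_ratio_from_iris(
    i_cur: float, i_largest: float,
    a_max: float = 3.0, b_max: float = 2.0,   # maximum ratio (FIG. 4D)
    a_min: float = 1.5, b_min: float = 1.0,   # minimum ratio (FIG. 4D)
    c: float = 1.0,
) -> tuple[float, float, float]:
    """Interpolate (A:B:C) from the detected degree of the iris opening."""
    fraction = i_cur / i_largest
    a = a_min + (a_max - a_min) * fraction   # Equation III (assumed form)
    b = b_min + (b_max - b_min) * fraction   # Equation IV (assumed form)
    return a, b, c

# With the FIG. 4D values, an opening degree of 2 out of 10 (values assumed)
# reproduces the worked example above: (1.8, 1.2, 1.0), i.e. 1.8:1.2:1.
ratio = photographing_ratio_from_iris(2, 10)
```
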
  • FIG. 7 illustrates a camera comprising the photographing ratio calculation apparatus as shown in FIGS. 5A and 5B.
  • the camera 20 comprises an image data processing apparatus 70 , a microcomputer 72 , a photographing ratio calculation apparatus 74 , and a data combiner 76 .
  • the camera 20 may comprise an analog camera or a digital camera.
  • the image data processing apparatus 70 performs a typical image processing of the photographed image according to the control of the microcomputer 72 .
  • the image data processing apparatus 70 may comprise a digitizer that digitizes the photographed analog image into digital values, a memory that stores the digitized data, and a digital signal processor (DSP) that performs an image data processing of the digitized image data (all not shown).
  • the image data processing apparatus 70 provides the processed data to a data combiner 76 .
  • the photographing ratio calculation apparatus 74 comprises the apparatus shown in FIG. 5A or 5B.
  • the photographing ratio calculation apparatus 74 calculates a photographing ratio (A:B:C).
  • the calculated photographing ratio (A:B:C) data are provided from the apparatus 74 to the data combiner 76 .
  • the microcomputer 72 controls the image data processing apparatus 70 , the photographing ratio calculation apparatus 74 , and the data combiner 76 such that the camera 20 outputs the combined data 78 .
  • the microcomputer 72 controls the image data processing apparatus 70 such that the apparatus properly processes the digital image data.
  • the microcomputer 72 controls the photographing ratio calculation apparatus 74 to calculate a photographing ratio for the image being photographed.
  • the microcomputer 72 controls the data combiner 76 to combine the processed data and the photographing ratio data corresponding to the processed data.
  • the microcomputer 72 may provide a synchronization signal to the data combiner 76 so as to synchronize the image data and the ratio data.
  • the microcomputer 72 may detect the change of the scale location or the opening degree, and control the data combiner 76 such that the image data and the corresponding ratio data are properly combined.
  • the microcomputer 72 is programmed to perform the above function using typical microcomputer products available from Intel, IBM, Motorola, etc. The same may apply to the other microcomputers described throughout this specification.
  • the data combiner 76 combines the image data from the image data processing apparatus 70 and the calculated photographing ratio (A:B:C) data according to the control of the microcomputer 72 .
  • the combiner 76 outputs the combined data 78 in which the image data and the ratio data may be synchronized with each other.
  • the combiner 76 comprises a known multiplexer.
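
Conceptually, the combiner pairs each processed image frame with the photographing ratio measured while that frame was captured. A minimal sketch, with the framing format and all names assumed (the text specifies only that a known multiplexer may be used):

```python
from dataclasses import dataclass

@dataclass
class CombinedFrame:
    """One unit of the combined output 78: image data plus its ratio."""
    image: bytes                        # processed image data from apparatus 70
    ratio: tuple[float, float, float]   # (A, B, C) from apparatus 74

def combine(image: bytes, ratio: tuple[float, float, float]) -> CombinedFrame:
    # The microcomputer's synchronization signal guarantees that `ratio`
    # was calculated for `image`; here the pairing is explicit.
    return CombinedFrame(image=image, ratio=ratio)
```
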
  • FIG. 8 illustrates a system for displaying stereoscopic images such that a photographing ratio (A:B:C) is substantially the same as a screen ratio (D:E:F).
  • the system comprises a camera site 80 and a display site 82 .
  • the camera site 80 transmits a photographing ratio (A:B:C) and photographed image to the display site 82 .
  • the display site 82 displays the transmitted image such that a screen ratio (D:E:F) is substantially the same as the photographing ratio (A:B:C).
  • the camera site 80 may comprise a single camera and the display site may comprise a single display device.
  • the camera site may comprise a set of stereoscopic cameras and the display site may comprise a set of display devices as shown in FIG. 8.
  • the embodiment of camera site 80 shown in FIG. 8 comprises a set of stereoscopic cameras 110 and 120 , and transmitters 806 and 808 .
  • the stereoscopic left and right cameras 110 and 120 may be located as shown in FIG. 1A with regard to an object to be photographed.
  • the cameras 110 and 120 comprise the elements described with respect to FIG. 7.
  • Each of the cameras 110 and 120 provides its own combined data 802 and 804 to the transmitters 806 and 808 , respectively.
  • Each transmitter 806 and 808 transmits the combined data 802 and 804 to the display site 82 through a network 84 .
  • the network 84 may comprise a wired or a wireless transmission network.
  • each transmitter 806 and 808 is separate from the cameras 110 and 120 .
  • each transmitter 806 and 808 may be embedded into each camera 110 and 120 .
  • both of the photographing ratios are referred to as "A1:B1:C1" and "A2:B2:C2," respectively.
  • the photographing ratios “A1:B1:C1” and “A2:B2:C2” are substantially the same.
  • the data 802 and 804 may be combined and transmitted to the display site 82 .
  • the photographing ratio may have a standard data format in each of the camera and display sites so that the display site can identify the photographing ratio easily.
  • the display site 82 comprises a set of receivers 820, 832, and a set of display devices 86, 88.
  • Each receiver 820 , 832 receives the combined data transmitted from the camera site 80 and provides each data set to the display devices 86 , 88 , respectively.
  • each of the receivers 820 , 832 is separate from the display devices 86 , 88 .
  • receivers 820 , 832 may be embedded into each display device 86 , 88 .
  • the display devices 86 and 88 comprise data separators 822 and 834 , image size adjusting portions 828 and 840 , and display screens 830 and 842 .
  • the data separators 822 and 834 separate the photographing ratio data ( 824 , 838 ) and the image data ( 826 , 836 ) from the received data.
  • each of the data separators 822 and 834 comprises a typical demultiplexer.
  • the image size adjusting portion 828 adjusts the size of the image to be displayed in the display screen 830 based on the photographing ratio (A1:B1:C1), the screen-viewer distance (F1), and the display screen size values (G1, H1).
  • the screen-viewer distance (F1) represents the distance between the display screen 830 and one of a viewer's eyes, e.g., a left eye, that is directed to the screen 830 .
  • the distance F1 may be fixed. In this situation, a viewer's eyes may be located in an eye fixing structure, which will be described in more detail later.
  • the image size adjusting portion 828 may store the fixed value F1 therein.
  • the screen size values G1 and H1 represent the horizontal and vertical dimensions of the screen 830 , respectively. In one embodiment of the invention, the size values G1 and H1 may be stored in the image size adjusting portion 828 .
  • the image size adjusting portion 840 adjusts the size of the image to be displayed in the display screen 842 based on the photographing ratio (A2:B2:C2), the screen-viewer distance (F2), and the display screen size values (G2, H2).
  • the screen-viewer distance (F2) represents the distance between the display screen 842 and one of a viewer's eyes, e.g., a right eye, that is directed to the screen 842 .
  • the distance F2 may be fixed.
  • the screen-viewer distance (F2) is substantially the same as the screen-viewer distance (F1).
  • the screen size values G2 and H2 represent the horizontal and vertical dimensions of the screen 842 , respectively.
  • the display screen size values G2 and H2 are substantially the same as the display screen size values G1 and H1.
  • the image data 826, the photographing ratio data (A1:B1:C1) and the screen-viewer distance (F1) are provided to the image size adjusting portion 828 (902).
  • a screen ratio (D1:E1:F1) is calculated based on the photographing ratio (A1:B1:C1) and the screen-viewer distance (F1) using the following Equation V (904). Since the value F1 is already provided, the parameters D1 and E1 of the screen ratio are obtained from Equation V.
  • the horizontal and vertical screen size values (G1, H1) of the display screen 830 are provided to the image size adjusting portion 828 ( 906 ).
  • in one embodiment, the screen size values G1 and H1, and the distance value F1, are fixed and stored in the image size adjusting portion 828.
  • alternatively, the screen size values G1 and H1, and the distance value F1, are manually provided to the image size adjusting portion 828.
  • Image magnification (reduction) ratios d and e are calculated from the following Equation VI ( 908 ).
  • the ratios d and e represent horizontal and vertical magnification (reduction) ratios for the display screens 830 and 842 , respectively.
  • it is determined whether the magnification (reduction) ratios (d, e) are greater than "1" (910). If both of the ratios (d, e) are greater than 1, the image data 826 are magnified by "d" and "e," respectively, as shown in FIG. 10A (912). In one embodiment of the invention, the portion of the image exceeding the screen sizes (G1, H1) is cut out as shown in FIG. 10A (914).
  • if both of the ratios "d" and "e" are not greater than 1, it is determined whether the magnification (reduction) ratios (d, e) are less than "1" (916). If both of the ratios d and e are less than 1, the image data 826 are reduced by "d" and "e," respectively, as shown in FIG. 10B (918). In one embodiment of the invention, the blank portion of the screen is filled with a background color, e.g., black, as shown in FIG. 10B (920).
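
Putting the FIG. 9 steps together: Equation V follows from the matching condition (D1 = A1*F1/C1, E1 = B1*F1/C1), while the exact form of Equation VI is not reproduced in the text, so d = D1/G1 and e = E1/H1 are assumptions consistent with the magnify/crop and reduce/fill behavior described above. A sketch:

```python
def adjust_image_size(
    a1: float, b1: float, c1: float,  # photographing ratio (A1:B1:C1)
    f1: float,                        # screen-viewer distance
    g1: float, h1: float,             # horizontal/vertical screen size
) -> tuple[float, float, str]:
    """Return the magnification ratios (d, e) and the action to take."""
    # Equation V: screen ratio (D1:E1:F1) matching the photographing ratio.
    d1 = a1 * f1 / c1
    e1 = b1 * f1 / c1
    # Equation VI (assumed form): magnification relative to the screen size.
    d = d1 / g1
    e = e1 / h1
    if d > 1 and e > 1:
        action = "magnify by (d, e) and cut out the portion exceeding (G1, H1)"
    elif d < 1 and e < 1:
        action = "reduce by (d, e) and fill the blank border with background color"
    else:
        action = "display unchanged (mixed case not elaborated in the text)"
    return d, e, action
```
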
  • Photographing an object is performed using a set of stereoscopic cameras 110 and 120 ( 1120 ), as exemplified in FIG. 1A.
  • Each of the cameras 110 and 120 calculates the photographing ratio (A1:B1:C1) and (A2:B2:C2), respectively (1140), for example, using the methods shown in FIGS. 6A and 6B.
  • the image data and the photographing ratio that are calculated for the image are combined for each of the stereoscopic cameras 110 and 120 ( 1160 ).
  • the combined data are illustrated as reference numerals 802 and 804 in FIG. 8.
  • the combining is performed per frame of the image data.
  • the combining may not be performed and only image data without the photographing ratio may be transmitted to the display site 82 . In that situation, when the photographing ratio is changed, the combining may resume. Alternatively, the photographing ratio is not combined, and rather, is transmitted separately from the image data.
  • Each of the transmitters 806 and 808 transmits the combined data to the display site 82 through the communication network 84 ( 1180 ).
  • Each of the receivers 820 and 832 receives the transmitted data from the camera site 80 ( 1200 ).
  • the photographing ratio and image data are separated from the combined data ( 1220 ).
  • alternatively, if the image data and photographing ratio were not combined in transmission, they are received separately.
  • the combined data may not include a photographing ratio. In that case, the photographing ratio that has been received most recently is used for calculating the screen ratio, and the screen ratio may remain unchanged until a new photographing ratio is received.
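
This receiver-side behavior amounts to caching the most recently received photographing ratio; a small sketch (interface assumed):

```python
class RatioCache:
    """Reuse the last received photographing ratio until a new one arrives."""

    def __init__(self, initial: tuple[float, float, float]):
        self._ratio = initial

    def update(self, ratio: tuple[float, float, float] | None) -> tuple[float, float, float]:
        if ratio is not None:   # the received frame carried a ratio
            self._ratio = ratio
        return self._ratio      # otherwise keep using the previous ratio
```
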
  • the screen ratios (D1:E1:F1) and (D2:E2:F2) for each of the display devices 86 and 88 are calculated using the method described with regard to FIG. 9 ( 1240 ).
  • the stereoscopic images are displayed such that each of the photographing ratios (A1:B1:C1) and (A2:B2:C2) is substantially the same as each of the screen ratios (D1:E1:F1) and (D2:E2:F2) (1260).
  • the image may be magnified or reduced with regard to the screen size of each of the display devices 86 and 88 as discussed with reference to FIGS. 9 and 10.
  • FIG. 12 illustrates examples of the display system according to one embodiment of the invention.
  • FIG. 12A illustrates a head mounted display (HMD) system.
  • the HMD system comprises the pair of the display screens 1200 and 1220 .
  • the electronic display mechanism as exemplified in FIG. 8 is omitted from the illustration of this HMD system.
  • a viewer wears the HMD on his or her head and watches stereoscopic images through each display screen 1200 and 1220 .
  • the screen-viewer's eye distance (F) may be fixed.
  • the distance (F) may be measured with a known distance detection sensor and provided to the HMD system.
  • Another embodiment of the invention includes a 3D display system as shown in FIG. 1B.
  • Another embodiment of the display devices includes a pair of projection devices that project a set of stereoscopic images on the screen.
  • FIG. 12B illustrates a 3D display system according to another embodiment of the invention.
  • the display system comprises a V shaped mirror 1240 , and a set of display devices 1260 and 1280 .
  • the display devices 1260 and 1280 are substantially the same as the display devices 86 and 88 of FIG. 8 except for further comprising an inverting portion (not shown), respectively.
  • the inverting portion inverts the left and right sides of the image to be displayed.
  • the V shaped mirror 1240 reflects the images coming from the display devices 1260 and 1280 to a viewer's eyes. Thus, the viewer watches a reflected image from the V shaped mirror 1240 .
  • the 3D display system comprising the V shaped mirror is disclosed in U.S. application Ser.
  • FIG. 13 illustrates a 3D display system including an eye position fixing device 1300 according to one aspect of the invention.
  • the eye position fixing device 1300 is located in front of the V shaped mirror 1240 at a predetermined distance from the mirror 1240 .
  • the eye position fixing device 1300 is used for fixing the distance between the mirror 1240 and a viewer's eyes.
  • the eye position fixing device 1300 is also used for locating a viewer's eyes such that each of the viewer's eyes is substantially perpendicular to each of the mirror (imaginary) images.
  • a pair of holes 1320 and 1340 defined in the device 1300 are configured to allow the viewer to see each of the center points of the reflected images.
  • each of the holes 1320 and 1340 is big enough to allow the viewer to see a complete half portion (left or right portion) of the V shaped mirror 1240 at a predetermined distance and location as exemplified in FIGS. 13A and 13B.
  • the eye position fixing device 1300 may be used for fixing the location of a viewer's eyes as necessary with regard to the other aspects of the invention as discussed below.
  • FIG. 14A illustrates a relationship between the displayed images and a viewer's eyes.
  • Distance (W d ) represents the distance between the center points ( 1430 , 1440 ) of each of the displayed images ( 1410 , 1420 ).
  • Distance (W a ) represents the distance between the center points ( 1450 , 1460 ) of each of a viewer's eyes.
  • the distance W a varies from person to person. Normally the distance increases as a person grows and it does not change when he or she reaches a certain age. The average distance of an adult may be 70 mm. Some people may have 80 mm distance, other people may have 60 mm distance.
  • Distance (V a ) represents the distance between the center points ( 1470 , 1480 ) of each of a viewer's eye lenses.
• a lens here means the round, transparent tissue behind the pupil of an eye. The lens moves along with the movement of the eye.
  • the distance V a changes according to the distance (F) between an object and the viewer's eyes. The farther the distance (F) is, the greater the value V a becomes.
• when a viewer sees an object farther than, for example, 10,000 m, V a has its maximum value (V amax ), which is substantially the same as the distance W a .
  • FIG. 15 illustrates a 3D image display system according to one aspect of the invention.
• the system may be used with, for example, an HMD system, a display system with the V shaped mirror shown in FIGS. 13A and 13B, or a projection display system.
  • the system shown in FIG. 15 comprises a pair of display devices 1260 and 1280 , and a pair of input devices 1400 and 1500 .
• Each of the input devices 1400 and 1500 provides the distance value W a to each of the display devices 1260 and 1280 .
  • each of the input devices 1400 and 1500 comprises a keyboard, a mouse, a pointing device, or a remote controller.
  • one of the input devices 1400 and 1500 may be omitted and the other input device is used for providing the distance value W a to both of the display devices 1260 and 1280 .
  • the display devices 1260 and 1280 comprise interfaces 1510 and 1550 , microcomputers 1520 and 1560 , display drivers 1530 and 1570 , and display screens 1540 and 1580 , respectively.
• each of the display screens 1540 and 1580 comprises an LCD screen, a CRT screen, or a PDP screen.
  • the interfaces 1510 and 1550 provide the interface between the input devices 1400 and 1500 and the microcomputers 1520 and 1560 , respectively.
  • each of the interfaces 1510 and 1550 comprises a typical input device controller and/or a typical interface module (not shown).
• There may be several methods to measure and provide the distance (W a ).
  • an optometrist may measure the W a value of a viewer with eye examination equipment. In this situation, the viewer may input the value (W a ) via the input devices 1400 , 1500 .
  • an eye lens motion detector may be used in measuring the W a value. In this situation, the W a value may be provided from the detector to either the input devices 1400 , 1500 or the interfaces 1510 , 1550 in FIG. 15.
  • the W a value may be measured using a pair of parallel pipes 200 , 220 , about 1 m in length and about 1 mm in diameter, which are spaced approximately 1 cm apart from a viewer's eyes. Each end of the pipes 200 , 220 is open.
  • the pipe distance (P d ) may be adjusted between about 40 mm and about 120 mm by widening or narrowing the pipes 200 , 220 .
  • the pipes 200 , 220 maintain a parallel alignment while they are widened or narrowed.
  • a ruler 240 may be attached into the pipes 200 , 220 , as shown in FIG. 14C so that the ruler 240 can measure the distance between the pipes 200 , 220 .
  • the ruler 240 indicates the W a value of the viewer.
• red and blue color materials (paper, plastic, or glass) may be placed over the far ends of the pipes 200 , 220 , respectively.
  • the pipe distance (P d ) is the W a value of the viewer where the viewer perceives a purple color from the holes 260 , 280 by the combination of the red and blue colors.
  • Each of the microcomputers 1520 and 1560 determines an amount of movement for the displayed images based on the provided W a value such that the W d value is substantially the same as the W a value.
  • each microcomputer ( 1520 , 1560 ) initializes the distance value W d and determines an amount of movement for the displayed images based on the value W a and the initialized value W d .
  • Each of the display drivers 1530 and 1570 moves the displayed images based on the determined movement amount and displays the moved images on each of the display screens 1540 and 1580 .
  • each microcomputer ( 1520 , 1560 ) may incorporate the function of each of the display drivers 1530 and 1570 . In that situation, the display drivers 1530 and 1570 may be omitted.
  • a set of stereoscopic images are displayed in the pair of display screens 1540 and 1580 ( 1610 ).
  • the stereoscopic images may be provided from the stereoscopic cameras 110 and 120 , respectively, as exemplified in FIG. 1A.
  • the distance (W d ) between the center points of the displayed images is initialized ( 1620 ).
  • the initial value may comprise the eye distance value of the average adult, e.g., “70 mm.”
  • the distance (W a ) between the center points of a viewer's eye lenses is provided ( 1630 ).
• It is determined whether W a equals W d . If W a does not equal W d , it is determined whether W a is greater than W d ( 1650 ). If W a is greater than W d , the distance (W d ) needs to be increased until W d equals W a . In this situation, the left image 1750 displayed in the left screen 1540 is moved to the left side and the right image 1760 displayed in the right screen 1580 is moved to the right side until the two values are substantially the same, as shown in FIG. 17A. Referring to FIG. 17B, movements of the displayed images 1750 and 1760 are conceptually illustrated for the display system with a V shaped mirror.
• Since the V shaped mirror reflects the displayed images, which have been received from the display devices 1260 and 1280 , to a viewer, in order for the viewer to see the adjusted images through the mirror as shown in FIG. 17A, the displayed images 1750 and 1760 need to be moved with regard to the V shaped mirror as shown in FIG. 17B. That is, when the displayed images 1750 and 1760 are moved as shown in FIG. 17B, the viewer who sees the V shaped mirror perceives the image movement as shown in FIG. 17A.
• In an HMD system, the movement direction of the displayed images is the same as the direction shown in FIG. 17A.
• In the projection display system described in connection with FIG. 15, since the projection display system projects images onto a screen that is located across from the projection system, the movement direction of the displayed images is opposite to the direction shown in FIG. 17A.
• If W a is less than W d , the distance W d needs to be reduced until W d equals W a .
  • the left image 1770 displayed in the display device 1260 is moved to the right side and the right image 1780 displayed in the display device 1280 is moved to the left side until the two values are substantially the same as shown in FIGS. 17C and 17D.
  • the same explanation with regard to the movement of the displayed images described in FIGS. 17A and 17B applies to the system of FIGS. 17C and 17D.
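• The movement-amount computation of the microcomputers 1520 and 1560 can be sketched as follows; the helper name and the half-and-half split between the two screens are illustrative assumptions:

```python
# A minimal sketch: each screen shifts its image by half of the
# difference between W_a and the initialized W_d, so that the adjusted
# W_d equals W_a.
def per_screen_shift(w_a, w_d):
    """Positive: left image moves left, right image moves right (FIG. 17A).
    Negative: left image moves right, right image moves left (FIG. 17C)."""
    return (w_a - w_d) / 2.0

# e.g. w_a = 76 mm with the initial w_d = 70 mm gives +3 mm per screen.
# For the V shaped mirror or projection variants, the physical shift
# direction is mirrored as described for FIGS. 17A and 17B.
```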
  • FIG. 18 illustrates a 3D image display system according to another embodiment of the invention.
  • the system comprises an input device 1810 , a microcomputer 1820 , a pair of servo mechanisms 1830 and 1835 , and a pair of display devices 1840 and 1845 .
• the input device 1810 provides a viewer's input, i.e., the distance value W a , to the microcomputer 1820 .
  • the input device 1810 may be a keyboard, a mouse, a pointing device, or a remote controller, for example.
  • An interface is omitted for convenience.
  • the microcomputer 1820 determines an amount of the movement for the display devices 1840 and 1845 based on the provided value W a such that the W d value is substantially the same as the W a value. In one embodiment of the invention, the microcomputer 1820 initializes the distance value (W d ) and determines an amount of the movement for the display devices 1840 and 1845 based on the value W a and the initialized value W d . Each of the servo mechanisms 1830 and 1835 moves the display devices 1840 and 1845 , respectively, based on the determined movement amount.
  • Each of stereoscopic images is displayed in the display devices 1840 and 1845 ( 1850 ).
  • the distance (W d ) between the center points of the displayed images is initialized ( 1855 ). In one embodiment of the invention, the initial value may be “70 mm.”
  • the distance (W a ) between the center points of a viewer's eyes is provided to the microcomputer 1820 ( 1860 ). It is determined whether W a equals W d ( 1870 ). If W a equals W d , no movement of the display devices 1840 and 1845 is made ( 1910 ).
• If it is determined that W a is greater than W d , the servo mechanisms 1830 and 1835 move the display devices 1840 and 1845 , respectively, such that W d is widened to W a as shown in FIGS. 20A and 20B. If it is determined that W a is not greater than W d , the servo mechanisms 1830 and 1835 move the display devices 1840 and 1845 , respectively, such that W d is narrowed to W a as shown in FIGS. 20C and 20D.
  • the distance (V a ) is automatically detected using a known eye lens motion detector.
  • the detector 2100 detects the distance V a between the center points of a viewer's eye lenses.
  • the detector 2100 detects the locations of each of the eye lenses.
• A 2L and A 2R represent the center points of a viewer's eye lenses.
• A 3L and A 3R represent the center points of a viewer's eyes.
  • the A 3L location is fixed, but the A 2L location moves.
  • the detector 2100 detects the current locations of each of the eye lenses.
  • the detector 2100 comprises a known eye lens detecting sensor disclosed, for example, in U.S. Pat. No. 5,526,089.
  • the detected distance and location values are provided to a microcomputer 2120 .
  • the microcomputer 2120 receives the distance value V a and determines an amount of movement for the displayed images or an amount of movement for the display devices similarly as described with regard to FIGS. 15 - 20 . The determined amount is used for controlling either the movement of the displayed images or the movement of the display devices.
  • the microcomputer 2120 determines new locations of the center points of the images based on the location values of the eye lenses.
  • the microcomputer 2120 controls the display drivers ( 1530 , 1570 ) or the servo mechanisms ( 1830 , 1835 ) to move the stereoscopic images from the current center points 2210 and 2230 of the images to, for example, new center points 2220 and 2240 as shown in FIG. 22.
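• A minimal sketch of this re-centering (names assumed for illustration): the offset for each displayed image is the difference between the detected eye-lens location and the image's current center point:

```python
# Sketch of the FIG. 22 behavior: the microcomputer derives, for each
# displayed image, the offset that moves its current center point to the
# newly detected eye-lens location.
def movement_deltas(current_centers, detected_lens_centers):
    """current_centers / detected_lens_centers: [(x, y), (x, y)] for the
    left and right images; returns the (dx, dy) offsets passed to the
    display drivers or servo mechanisms."""
    return [(lx - cx, ly - cy)
            for (cx, cy), (lx, ly) in zip(current_centers, detected_lens_centers)]
```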
  • FIG. 23 illustrates a camera system for a 3D display system according to one aspect of the invention.
  • the camera system is directed to provide photographed image data and camera motion detection data to a display site.
  • the camera system comprises a set of stereoscopic cameras 2200 , 2210 , motion detection devices 2220 , 2230 , combiners 2240 , 2250 , and transmitters 2280 , 2290 .
  • Each of the stereoscopic cameras 2200 , 2210 captures an image and provides the captured image data to each of the combiners 2240 , 2250 .
  • the motion detection devices 2220 and 2230 detect the motion of the cameras 2200 and 2210 , respectively.
  • the motion of the cameras 2200 and 2210 may comprise motions for upper and lower directions, and left and right directions as shown in FIG. 23.
  • Each detection device ( 2220 , 2230 ) provides the detection data to each of the combiners 2240 and 2250 .
  • the devices 2220 and 2230 may provide no detection data or provide information data representing no motion detection to the combiners 2240 and 2250 .
  • each of the motion detection devices 2220 and 2230 comprises a typical motion detection sensor.
  • the motion detection sensor may provide textual or graphical detection data to the combiners 2240 and 2250 .
  • the combiners 2240 and 2250 combine the image data and the motion detection data, and provide the combined data 2260 and 2270 to the transmitters 2280 and 2290 , respectively. If the combiners 2240 and 2250 receive information data representing no motion detection from the motion detection devices 2220 and 2230 , or if the combiners 2240 and 2250 do not receive any motion data, each combiner ( 2240 , 2250 ) provides only the image data to the transmitters 2280 and 2290 without motion detection data. In one embodiment of the invention, each of the combiners 2240 and 2250 comprises a typical multiplexer. Each of the transmitters 2280 and 2290 transmits the combined data 2260 and 2270 to the display site through a communication network (not shown).
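• The patent describes the combiners only as multiplexers; the sketch below assumes a simple tagged frame layout purely for illustration:

```python
import struct

# Assumed framing (not the patent's wire format): a 1-byte flag says
# whether motion data follow; motion data are two signed 16-bit values.
def combine(image_bytes, motion_xy=None):
    if motion_xy is None:                 # no motion detected: image data only
        return b'\x00' + image_bytes
    x, y = motion_xy                      # signed horizontal/vertical motion
    return b'\x01' + struct.pack('>hh', x, y) + image_bytes

def separate(frame):
    """Display-site counterpart used by the data separators 2320/2330."""
    if frame[0] == 0:
        return frame[1:], None
    x, y = struct.unpack('>hh', frame[1:5])
    return frame[5:], (x, y)
```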
  • FIG. 24 illustrates a display system corresponding to the camera system shown in FIG. 23.
  • the display system is directed to provide camera motion to a viewer.
  • the camera system comprises a pair of receivers 2300 and 2310 , data separators 2320 and 2330 , image processors 2340 and 2360 , microcomputers 2350 and 2370 , on screen data (OSD) circuits 2390 and 2410 , combiners 2380 and 2400 , display drivers 2420 and 2430 , and display screens 2440 and 2450 .
  • Each of the receivers 2300 and 2310 receives the combined data transmitted from the camera system, and provides the received data to the data separators 2320 and 2330 , respectively.
  • Each of the data separators 2320 and 2330 separates the image data and the motion detection data from the received data.
  • the image data are provided to the image processors 2340 and 2360 .
  • the motion detection data are provided to the microcomputers 2350 and 2370 .
  • the image processors 2340 and 2360 perform typical image data processing for the image data, and provide the processed data to the combiners 2380 and 2400 , respectively.
  • Each of the microcomputers 2350 and 2370 determines camera motion information from the motion detection data.
  • each microcomputer determines camera motion information for at least four directions, e.g., upper, lower, left, right.
  • the microcomputers 2350 and 2370 provide the determined camera motion information to the OSD circuits 2390 and 2410 , respectively.
  • Each of the OSD circuits 2390 and 2410 produces OSD data representing camera motion based on the determined motion information.
  • the OSD data comprise arrow indications 2442 - 2448 showing the motions of the cameras 2200 and 2210 .
• the arrows 2442 and 2448 mean that the camera has moved in the upper and lower directions, respectively.
• the arrows 2444 and 2446 mean that the camera has moved in the left and right directions, respectively.
  • the combiners 2380 and 2400 combine the processed image data and the OSD data, and provide the combined image to the display drivers 2420 and 2430 .
  • Each of the display drivers 2420 and 2430 displays the combined image in each of the display screens 2440 and 2450 .
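• A sketch of the mapping an OSD circuit could apply (the assignment of signs to directions is an assumption; the arrow numerals are those of FIG. 24):

```python
# Hypothetical mapping from detected camera motion to the arrow
# indications 2442-2448: 2442 upper, 2448 lower, 2444 left, 2446 right.
def osd_arrows(x, y):
    """x, y: signed horizontal/vertical camera motion; returns the arrow
    reference numerals to overlay on the displayed image."""
    arrows = []
    if y > 0: arrows.append(2442)   # camera moved in the upper direction
    if y < 0: arrows.append(2448)   # lower
    if x < 0: arrows.append(2444)   # left
    if x > 0: arrows.append(2446)   # right
    return arrows
```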
  • Each of the stereoscopic cameras 2200 and 2210 images an object ( 2460 ).
  • the pair of the motion detection devices 2220 and 2230 detect the motions of the cameras 2200 and 2210 , respectively ( 2470 ).
  • the photographed image data and the motion detection data are combined in each of the combiners 2240 and 2250 ( 2480 ).
  • the combined data 2260 and 2270 are transmitted to the display site through a communication network ( 2490 ).
  • Other embodiments may not have the combining and separation of data as shown in the diagrams.
  • the transmitted data from the camera system are provided to the data separators 2320 and 2330 via the receivers 2300 and 2310 ( 2500 ).
  • the image data and the motion detection data are separated in the data separators 2320 and 2330 ( 2510 ).
  • the image data are provided to the image processors 2340 and 2360 , and each of the processors 2340 and 2360 processes the image data ( 2520 ).
  • the motion detection data are provided to the microcomputers 2350 and 2370 , and each of the microcomputers 2350 and 2370 determines motion information from the motion detection data ( 2520 ).
  • OSD data corresponding to motion information are generated based on the determined motion information in the OSD circuits 2390 and 2410 ( 2530 ).
  • the processed image data and the OSD data are combined together in the combiners 2380 and 2400 ( 2540 ).
  • the combined data are displayed in the display screens 2440 and 2450 ( 2550 ).
• When the OSD data are displayed on the display screens 2440 and 2450 , this means that at least one of the cameras 2200 and 2210 has moved.
  • the image also moves in the direction in which the cameras 2200 and 2210 have moved. This is for guiding a viewer's eye lenses to track the motion of the cameras 2200 and 2210 .
  • the arrows 2442 - 2448 are displayed right before the image is moved by the movement of the cameras so that a viewer can expect the movement of the images in advance.
  • the display system may allow the viewer to know the movement of the cameras 2200 and 2210 by providing a voice message that represents the movement of the cameras.
  • the voice message may be “the stereoscopic cameras have moved in the upper direction” or “the cameras have moved in the right direction.”
  • the OSD circuits 2390 and 2410 may be omitted.
  • both of the OSD data and voice message representing the movement of the cameras may be provided to the viewer.
• the camera and display systems shown in FIGS. 23 and 24 comprise the functions in which the image is displayed such that the photographing ratio (A:B:C) equals the screen ratio (D:E:F) as discussed with regard to FIGS. 7 - 11 .
• the systems may comprise the function that displays stereoscopic images such that the distance between the center points of the stereoscopic images is substantially the same as the distance between the center points of a viewer's eyes as discussed with regard to FIGS. 15 - 22 .
  • Another aspect of the invention provides a 3D display system that controls the movement of the cameras according to a viewer's eye lens movement.
  • FIG. 26A is a conceptual drawing that illustrates parameters for stereoscopic cameras.
  • Each of the cameras 30 and 32 comprises object lenses 34 and 36 , respectively.
  • the camera parameters comprise C 2L , C 2R , C 3L , C 3R , S CL , S CR , V c and W c .
  • C 2L and C 2R represent the center points of the object lenses 34 and 36 , respectively.
  • C 3L and C 3R represent rotation axes of the cameras 30 and 32 , respectively.
  • S CL represents the line connecting C 2L and C 3L .
  • S CR represents the line connecting C 2R and C 3R .
  • V c represents the distance between C 2L and C 2R .
  • W c represents the distance between C 3L and C 3R .
  • the rotation axes C 3L and C 3R do not move and are the axes around which the cameras 30 and 32 rotate.
• the rotation axes C 3L and C 3R allow each of the cameras 30 and 32 to rotate, like a car windshield wiper, as shown in FIGS. 27 B- 27 E.
  • FIG. 27A illustrates a default position of the cameras 30 and 32 .
  • FIGS. 27 B- 27 D illustrate the horizontal movements of the cameras 30 and 32 .
  • FIG. 27E illustrates the vertical movements of the cameras 30 and 32 . In one embodiment of the invention, while they are moving and after they move as shown in FIGS. 27 B- 27 E, each of the cameras 30 and 32 is substantially parallel to each other.
• FIG. 27F is a front view of one of the stereoscopic cameras and exemplifies the movements of the camera in eight directions.
  • the diagonal movements 46 a - 46 d may be performed by the combination of the horizontal and vertical movements.
• the movement " 46 a " is made by moving the camera in the left and upper directions.
  • FIG. 26B is a conceptual drawing that illustrates parameters for a viewer's eyes.
  • Each of the eyes 38 and 40 comprises eye lenses 42 and 44 , respectively.
• Each of the eye lenses is located substantially on the outside surface of the eye. This means that the distance between the center point of each eye and its eye lens is substantially the same as the radius of the eye.
  • the eye lens moves along with the rotation of the eye.
  • the eye parameters comprise A 2L , A 2R , A 3L , A 3R , S AL , S AR , V a and W a .
• A 2L and A 2R represent the center points of the eye lenses 42 and 44 , respectively.
  • Each of the eye lenses 42 and 44 performs substantially the same function as the object lenses 34 and 36 of the stereoscopic cameras 30 and 32 in terms of receiving an image.
  • the eye parameters A 2L and A 2R may correspond to the camera parameters C 2L and C 2R .
• A 3L and A 3R represent rotation axes of the eyes 38 and 40 , respectively.
  • the rotation axes A 3L and A 3R are the axes around which the eyes 38 and 40 rotate.
  • the rotation axes A 3L and A 3R allow the eyes 38 and 40 to rotate as shown in FIGS. 28 B- 28 D.
• Just as the rotation axes C 3L and C 3R of the stereoscopic cameras 30 and 32 do not move while the cameras 30 and 32 are rotating, the rotation axes A 3L and A 3R of a viewer's eyes 38 and 40 do not move while the eyes 38 and 40 are rotating.
  • the eye parameters A 3L and A 3R may correspond to the camera parameters C 3L and C 3R .
  • S AL represents the line connecting A 2L and A 3L .
  • S AR represents the line connecting A 2R and A 3R .
  • the eye parameters S AL and S AR may correspond to the camera parameters S CL and S CR , respectively.
  • V a represents the distance between A 2L and A 2R .
  • W a represents the distance between A 3L and A 3R .
  • the eye parameters V a and W a may correspond to the camera parameters V c and W c , respectively.
• FIG. 28A illustrates an example of the eye configuration in which a viewer sees an object at least "10,000 m" distant from him or her. This example corresponds to the camera configuration in which the focal length of the cameras is infinity. As discussed before, when a viewer sees an object farther than, for example, "10,000 m," the distance (V a ) between the center points A 2L and A 2R of the eye lenses 42 and 44 is substantially the same as the distance (W a ) between the center points A 3L and A 3R of the eyes 38 and 40 .
  • FIG. 28D exemplifies the movements of the eyes in eight directions.
  • FIG. 29 illustrates a 3D display system for controlling a set of stereoscopic cameras according to another aspect of the invention.
  • the system comprises a camera site and a display site.
  • the display site is directed to transmit eye lens motion data to the camera site.
  • the camera site is directed to control the set of stereoscopic cameras 30 and 32 based on the eye lens motion data.
  • the display site comprises an eye lens motion detecting device 3000 , a transmitter 3010 , a pair of display devices 2980 and 2990 , a pair of receivers 2960 and 2970 , and a V shaped mirror 2985 .
  • the display site receives the images and displays through the display devices 2980 and 2990 .
  • a viewer sees stereoscopic images through the V shaped mirror that reflects the displayed image to the viewer.
  • the viewer's eye lenses may move in directions, e.g., latitudinal (upper or lower) and longitudinal (clockwise or counterclockwise) directions.
  • another display device such as a HMD, or a projection display device as discussed above, may be used.
  • the eye lens motion detecting device 3000 detects motions of each of a viewer's eye lenses while a viewer is watching 3D images through the V shaped mirror.
  • the motions may comprise current locations of the eye lenses.
  • the detecting device 3000 is substantially the same as the device 2100 shown in FIG. 21A.
  • the detecting device 3000 may convert the movements of the eye lenses to data that a microcomputer 2940 of the camera site can recognize, and provide the converted data to the transmitter 3010 .
  • the detection data may comprise a pair of (x,y) values for each of the eye lenses.
  • the transmitter 3010 transmits the eye lens motion data to the camera site through a communication network 3015 .
  • the detection data may comprise identification data that identify each of the left and right eye lenses in the camera site.
  • the display site may comprise a pair of transmitters each transmitting left and right eye lens motion data to the camera site.
  • data modification such as encoding and/or modulation adapted for transmitting may be performed before transmitting the motion data.
  • the camera site comprises a set of stereoscopic cameras 30 and 32 , a receiver 2950 , a microcomputer 2940 , a pair of camera controllers 2910 and 2920 , the pair of transmitters 2900 and 2930 .
  • the receiver 2950 receives the eye lens motion data from the display site, and provides the data to the microcomputer 2940 .
  • the microcomputer 2940 determines each of the eye lens motion data from the received data, and provides the left and right eye lens motion data to the camera controllers 2910 and 2920 , respectively.
  • the camera site may comprise a pair of receivers each of which receives left and right eye lens motion data from the display site, respectively. In that situation, each receiver provides each eye lens detection data to corresponding camera controllers 2910 and 2920 , respectively, and the microcomputer 2940 may be omitted.
  • the camera controllers 2910 and 2920 control each of the cameras 30 and 32 based on the received eye lens motion data. That is, the camera controllers 2910 and 2920 control movement of each of the cameras 30 and 32 in substantially the same directions as each of the eye lenses 42 and 44 moves.
  • the camera controllers 2910 and 2920 comprise servo controllers 3140 and 3190 , horizontal motors 3120 and 3160 , and vertical motors 3130 and 3180 , respectively.
  • Each of the servo controllers 3140 and 3190 controls the horizontal and vertical motors ( 3120 , 3160 , 3130 , 3180 ) based on the received eye lens motion data.
  • Each of the horizontal motors 3120 and 3160 respectively moves the cameras 30 and 32 in the horizontal directions.
  • Each of the vertical motors 3130 and 3180 respectively moves the cameras 30 and 32 in the vertical directions.
  • FIG. 31 illustrates a flow chart showing the operation of the camera controllers 2910 and 2920 according to one aspect of the invention.
  • FIG. 32A illustrates a table for controlling horizontal and vertical motors.
  • FIG. 32B illustrates a conceptual drawing that explains motion of the camera. Referring to FIGS. 31 and 32, the operation of the camera controllers 2910 and 2920 will be described. Since the operation of the camera controllers 2910 and 2920 are substantially the same, only the operation of the camera controller 2910 will be described.
  • the initialization may comprise setting the relationship between the adjusting values and the actual movement amount of the camera 30 as shown in FIG. 32.
  • the eye lens motion data are provided to the servo controller 3140 ( 3210 ).
  • the eye lens motion data comprise (x,y) coordinate values, where x and y represent the horizontal and vertical motions of each of the eye lenses, respectively.
• the servo controller 3140 determines camera adjusting values (X, Y) based on the provided eye lens motion data. It is determined whether X equals "0" ( 3230 ). If X is "0," the servo controller 3140 does not move the horizontal motor 3120 ( 3290 ). If X is not "0," it is determined whether X is greater than "0" ( 3240 ). If X is greater than "0," the servo controller 3140 operates the horizontal motor 3120 to move the camera 30 in the right direction ( 3270 ). As exemplified in FIG. 32A, if the value X is, for example, "1," the movement amount is "2°," and the direction is clockwise (θ 3 direction). If the value X is, for example, "2," the movement is "4°" in a clockwise direction.
• If X is not greater than "0," meaning that X is less than "0," the servo controller 3140 operates the horizontal motor 3120 so as to move the camera 30 in a counterclockwise (θ 1 ) direction ( 3260 ). If the value X is, for example, "−1," the movement amount is "2°" and the direction is counterclockwise. If the value X is, for example, "−3," the movement is "6°" in a counterclockwise (θ 1 ) direction.
• Similarly, it is determined whether Y equals "0" ( 3300 ). If Y is "0," the servo controller 3140 does not move the vertical motor 3130 ( 3290 ). If Y is not "0," it is determined whether Y is greater than "0" ( 3310 ). If Y is greater than "0," the servo controller 3140 operates the vertical motor 3130 to move the camera 30 in the +latitudinal (upper: θ 2 ) direction ( 3320 ). If the value Y is, for example, "2," the movement is "4°" in the upper direction.
• If Y is not greater than "0," the servo controller 3140 operates the vertical motor 3130 so as to move the camera 30 in the lower direction ( 3330 ). If the value Y is, for example, "−3," the movement amount is "6°" and the direction is the −latitudinal (lower: θ 4 ) direction.
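• The flow above reduces to a small mapping; in the sketch below (helper names assumed) each unit of an adjusting value corresponds to 2° of rotation, per the table of FIG. 32A:

```python
# Positive horizontal is clockwise (theta-3), negative is counterclockwise
# (theta-1); positive vertical is upper (theta-2), negative is lower
# (theta-4); zero leaves the corresponding motor unmoved.
DEGREES_PER_UNIT = 2.0   # FIG. 32A: adjusting value 1 -> 2 deg, 2 -> 4 deg, ...

def motor_rotations(x, y):
    """Returns signed (horizontal, vertical) rotations in degrees for the
    horizontal motor 3120 and the vertical motor 3130."""
    return x * DEGREES_PER_UNIT, y * DEGREES_PER_UNIT

# e.g. adjusting values (X, Y) = (-3, 2) -> (-6.0, 4.0): 6 degrees
# counterclockwise and 4 degrees upward, matching the examples above.
```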
  • the eye lens motion detection device 3000 is provided to the display site of the system ( 3020 ). A viewer's eye lens motion is detected by the eye lens motion detection device 3000 while the viewer is watching stereoscopic images ( 3030 ). The eye lens motion data are transmitted to the camera site through the transmitter 3010 and the communication network 3015 ( 3040 ). As discussed above, either one transmitter or a pair of transmitters may be used.
  • the receiver 2950 of the camera site receives the eye lens motion data from the display site ( 3050 ).
  • the camera adjusting values are determined based on the eye lens motion data ( 3060 ).
• the stereoscopic cameras 30 and 32 are controlled by the determined camera adjusting values ( 3070 ). In this way, the stereoscopic cameras 30 and 32 are controlled such that the cameras keep track of the eye lens motion. In terms of the viewer, he or she notices that as soon as his or her eye lenses move in a certain direction, the stereoscopic images also move in the direction in which the eye lenses have moved.
  • FIG. 34 illustrates a stereoscopic camera controller system used for a 3D display system according to another aspect of the invention. For convenience, the display site is not shown.
• This aspect of the invention selects, from among plural sets of stereoscopic cameras, a pair of stereoscopic cameras corresponding to the movement amount of the eye lenses, instead of controlling the movement of a single pair of stereoscopic cameras.
  • the system comprises a microcomputer 3430 , a memory 3440 , camera selectors 3420 and 3425 , and plural sets of stereoscopic cameras 30 a and 32 a, 30 b and 32 b, and 30 c and 32 c.
  • the memory 3440 stores a table as shown in FIG. 35.
• the table shows the relationship between camera adjusting values and selected cameras.
  • the camera adjusting value “(0,0)” corresponds to, for example, a set of cameras C 33 as shown in FIGS. 35 and 36B.
  • the camera adjusting value “(1,0)” corresponds to a set of cameras C 34 as shown in FIGS. 35 and 36B.
  • the camera adjusting value “(2,2)” corresponds to the C 15 camera set as shown in the Figures.
• when the eye lenses move, another set of stereoscopic cameras, such as the C 34 camera set or the C 32 camera set, is selected from the sets of cameras.
  • FIG. 36A is a top view of the plural sets of stereoscopic cameras.
• the contour line that is made by connecting all of the object lenses of the plural sets of stereoscopic cameras is similar to the contour line of the portion of a viewer's eyes that is exposed to the outside.
  • the microcomputer 3430 determines camera adjusting values based on the received eye lens motion data.
  • the microcomputer 3430 also determines first and second camera selection signals based on the table stored in the memory 3440 .
  • the first selection signal is determined based on the movement of a viewer's left eye lens, and used for controlling the camera selector 3420 .
  • the second selection signal is determined based on the movement of a viewer's right eye lens, and used for controlling the camera selector 3425 .
  • the microcomputer 3430 provides each of the selection signals to the camera selectors 3420 and 3425 , respectively.
  • the camera selectors 3420 and 3425 select the respective camera based on the selection signal.
• a base set of cameras (e.g., C 33 ) is selected by default.
  • the selected set of cameras image the object and transmit the image to the display site through the transmitters 2900 and 2930 .
  • all of the cameras are turned on and a first set of cameras are connected to the transmitters 2900 and 2930 , respectively.
• each of the camera selectors 3420 and 3425 comprises a switch that performs switching between the plural sets of stereoscopic cameras 30 a and 32 a, 30 b and 32 b, and 30 c and 32 c and the transmitters 2900 and 2930 , respectively.
  • Eye lens motion data are received from the display site ( 3720 ).
  • Camera adjusting values are determined based on the received eye lens motion data ( 3730 ).
  • the camera adjusting values are exemplified in the table of FIG. 35.
  • Camera selection signals are determined based on the determined camera adjusting values ( 3740 ), for example, using the relationship of the table of FIG. 35. It is determined whether a new set of cameras have been selected ( 3750 ). If no new set of cameras are selected, the image output from the base cameras is transmitted to the display site ( 3780 ).
• If a new set of cameras (e.g., C 35 ) is selected, the base cameras (C 33 ) are disconnected from the transmitters 2900 and 2930 , and the new cameras (C 35 ) are connected to the transmitters 2900 and 2930 ( 3760 ).
  • the selected cameras (C 35 ) image the object ( 3770 ), and the image output from the selected cameras is transmitted to the display site ( 3790 ).
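• A sketch of the table lookup (only the three entries quoted above are filled in; the complete table is in FIG. 35):

```python
# Hypothetical excerpt of the FIG. 35 table: adjusting values -> camera set.
SELECTION_TABLE = {(0, 0): 'C33', (1, 0): 'C34', (2, 2): 'C15'}

def select_camera_set(adjusting_values, current_set='C33'):
    """Returns (selected_set, changed); 'changed' tells the camera
    selectors 3420/3425 whether to reconnect the transmitters."""
    selected = SELECTION_TABLE.get(adjusting_values, current_set)
    return selected, selected != current_set
```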
• the camera control may be used in remote control technology such as remote surgery; remote control of a vehicle, an airplane, or a fighter aircraft; or remote control of construction, investigation, or automatic assembly equipment.
  • FIG. 38 illustrates a 3D display system according to another aspect of the invention.
  • the 3D display system is directed to guide a viewer's eye lens motion using a three-dimensional input device.
  • the system is also directed to adjust displayed images using the 3D input device such that the longitudinal and latitudinal locations of the center points of a viewer's eye lenses are substantially the same as those of the center points of the displayed images.
• the 3D input device comprises a 3D mouse (which will be described later).
  • the system comprises a set of stereoscopic cameras 30 and 32 , a pair of transmitters 2900 and 2930 , a set of display devices 3900 and 3910 , a 3D mouse 3920 , and an input device 3990 .
  • the stereoscopic cameras 30 and 32 , a pair of transmitters 2900 and 2930 , and a pair of receivers 2960 and 2970 are the same as those shown in FIG. 29.
• the display devices 3900 and 3910 display the stereoscopic images that have been transmitted from the camera site. Also, the devices 3900 and 3910 display the pair of 3D mouse cursors that guide a viewer's eye lens movement.
  • the input of the 3D mouse is provided to both the display devices 3900 and 3910 as shown in FIG. 38.
  • the pair of 3D mouse cursors are displayed and moved by the movement of the 3D mouse 3920 .
  • the shape of the 3D mouse cursor comprises a square, an arrow, a cross, a square with a cross therein as shown in FIGS. 40 A- 40 H, a reticle, or a crosshair.
• for convenience, a pair of cross square mouse cursors 400 and 420 as shown in FIG. 40 will be used.
  • the distance (M d ) between the 3D mouse cursors 400 and 420 is adjusted.
  • the size of the 3D mouse cursors may be adjusted.
  • the viewer adjusts the distance value, for example, by turning a scroll button of the 3D mouse.
• the viewer can set a distance value from a larger value to a smaller one (10,000 m -> 100 m -> 5 m -> 1 m -> 0.5 m -> 5 cm).
• the viewer may set a distance value from a smaller value to a larger one (5 cm -> 0.5 m -> 1 m -> 5 m -> 100 m -> 10,000 m).
• the distance value of 10,000 m will often be referred to as an infinity value, or infinity.
  • FIG. 39 illustrates one example of a 3D display image.
• the image comprises a mountain image portion 3810 , a tree image portion 3820 , a house image portion 3830 and a person image portion 3840 . It is assumed that the mountain image 3810 , the tree image 3820 , the house image 3830 , and the person image 3840 are photographed at distances of "about 10,000 m," "about 100 m," "about 5 m," and "about 1 m," respectively, from the set of stereoscopic cameras 30 and 32 .
• when the distance value is set to infinity, the mouse cursor distance M d has the M d0 value, which is the same as the W a (V amax ) value, as shown in FIG. 40A.
  • V a has the maximum value (V amax ).
  • the viewer's sight lines L s1 and L s2 are substantially parallel to each other as shown in FIGS. 40A and 40B.
• when the distance value is set to, for example, 100 m, M d has the M d1 value, which is less than M d0 , as shown in FIGS. 40C and 40D.
  • the viewer's sight lines L s1 and L s2 are not parallel any more.
• when the two sight lines are extended, they converge at an imaginary point "M" as shown in FIG. 40D, where the point "O" represents the middle point between the center points of the eyes.
• in this situation, when the viewer sees the tree image 3820 , the viewer feels a sense of distance as if he or she sees an object that is "d 1 (100 m)" distant.
• the distance between M and O is not a physical length but an imaginary length. However, since the viewer feels a sense of the distance, as far as the viewer's eye lens distance or directions are concerned, the distance between M and O can be regarded as the actual distance between the viewer's eyes and an actual object. That is, when the viewer sees the two mouse cursors 400 and 420 that are spaced as much as M d1 , he or she perceives a single (three-dimensional) mouse cursor that is located at the M point, at a 100 m distance.
• when the distance value is set to, for example, 5 m, M d has the M d2 value, which is less than M d1 , as shown in FIGS. 40 E and 40 F. Also, when the two sight lines are extended in the screen, they converge at an imaginary point "M" as shown in FIG. 40F. Similarly, in this situation when the viewer sees the house image 3830 , the viewer feels a sense of distance as if he or she sees an object that is "d 2 (5 m)" away. Thus, when the viewer sees the two mouse cursors 400 and 420 that are spaced as much as M d2 , he or she perceives a single (three-dimensional) mouse cursor that is located at the M point, at a 5 m distance.
  • the mouse cursors 400 and 420 overlap with each other as shown in FIG. 40G. That is, when the distance value is the same as the actual distance between the point “O” and the center points of the screen as shown in FIG. 40G, the mouse cursors overlap with each other.
  • the M d value is determined according to the distance value that is set by the viewer.
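• The patent gives the M d values only through FIGS. 40 A- 40 H; a similar-triangles sketch consistent with those figures (formula assumed, not quoted from the patent) is:

```python
# The sight lines from eyes spaced W_a apart must converge at the set
# distance z, crossing the screen plane at distance d from the viewer.
def cursor_separation(w_a, d, z):
    """M_d on the screen; z -> infinity gives M_d = W_a (FIG. 40A) and
    z == d gives M_d = 0, i.e. overlapping cursors (FIG. 40G)."""
    return w_a * (1.0 - d / z)

# e.g. w_a = 70 mm, d = 0.5 m, z = 100 m -> M_d ~ 69.65 mm.
```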
  • FIG. 41 illustrates an exemplary block diagram of the display devices as shown in FIG. 38. Since each of the display devices 3900 and 3910 performs substantially the same functions, only one display device 3900 is illustrated in FIG. 41.
• the display device 3900 comprises a display screen 3930 , a display driver 3940 , a microcomputer 3950 , a memory 3960 and interfaces 3970 and 3980 .
  • the display device 3900 adjusts the distance (M d ) between a pair of 3D mouse cursors 400 and 420 according to the distance value set as shown in FIGS. 40 A- 40 H.
  • the display device 3900 moves the center points of the displayed images based on the 3D mouse cursor movement.
  • the display device 3900 moves the displayed images such that the longitudinal and latitudinal locations of the center points of a viewer's eye lenses are substantially the same as those of the center points of the displayed images.
  • the 3D mouse 3920 detects its movement amount.
  • the detected movement amount is provided to the microcomputer 3950 via the interface 3970 .
  • the distance value that the viewer sets is provided to the microcomputer 3950 via the 3D mouse 3920 and the interface 3970 .
  • the interface 3970 comprises a mouse controller.
  • the distance value may be provided to the microcomputer 3950 via the input device 3990 and the interface 3980 .
• the input device 3990 provides to the microcomputer 3950 , via the interface 3980 , properties of the 3D mouse such as the minimum detection amount (A m ), the movement sensitivity (B m ), and the mouse cursor size (C m ); the viewer-screen distance (d); and the viewer's eye data such as W a , S AL , and S AR .
  • the minimum detection amount represents the least amount of movement which the 3D mouse can detect. That is, when the 3D mouse moves only more than the minimum detection amount, the movement of the 3D mouse can be detected. In one embodiment of the invention, the minimum detection amount is set when the 3D mouse is manufactured.
  • the movement sensitivity represents how sensitive the mouse cursors move based on the movement of the 3D mouse.
• the scroll button of the 3D mouse has different movement sensitivity, i.e., being either more sensitive or less sensitive, according to the distance value. For example, if the distance value is greater than 1,000 m, a "1 mm turn" of the scroll button may increase or decrease the distance by 2,000 m. If the distance value is between 100 m and 1,000 m, a "1 mm turn" of the scroll button may increase or decrease the distance by 100 m. Similarly, if the distance value is less than 1 m, a "1 mm turn" of the scroll button may increase or decrease the distance by 10 cm.
  • the mouse cursor size may also be adjusted.
  • the distance (d) represents the distance between the middle point of the viewer's eyes and the screen as exemplified in FIG. 43A.
  • the screen comprises a V shaped mirror, a HMD screen, a projection screen, and a display device screen as shown in FIG. 1B.
  • the input device 3990 provides display device properties to the microcomputer 3950 through the interface 3980 .
  • the display device properties comprise the display device resolution and screen size of the display device 3900 .
  • the resolution represents the number of horizontal and vertical pixels of the device 3900 .
  • the size may comprise horizontal and vertical lengths of the display device 3900 . With the resolution and screen size of the display device 3900 , the length of one pixel can be obtained as, for example, “1 mm” per 10 pixels.
  • the input device 3990 comprises a keyboard, a remote controller, and a pointing input device, etc.
  • the interface 3980 comprises the input device controller.
  • the properties of the 3D mouse are stored in the memory 3960 .
  • the viewer's eye data are detected using a detection device for eye lens movement or provided to the display device 3900 by the viewer.
  • the microcomputer 3950 determines the mouse cursor distance (M d ) based on the distance value set by the viewer.
  • a table (not shown) showing the relationship between the distance value and the M d value as shown in FIGS. 40 A- 40 H according to a viewer's eye data may be stored in the memory 3960 .
  • the microcomputer 3950 determines the cursor distance (M d ) by referring to the table, and provides the determined distance value to the display driver 3940 .
  • the display driver 3940 displays the pair of the mouse cursors 400 and 420 based on the determined M d value in the display screen 3930 .
  • the microcomputer 3950 also determines new locations of the mouse cursors 400 and 420 , and calculates a movement amount for the center points of the display images based on the locations of the mouse cursors 400 and 420 .
• the memory 3960 may also store data that may be needed to calculate the movement amount for the center points of the display images.
  • 3D mouse properties are set in each of the display devices 3900 and 3910 ( 4200 ).
  • the 3D mouse properties comprise a minimum detection amount (A m ), a movement sensitivity (B m ), and the mouse cursor size (C m ).
  • the 3D mouse properties may be provided by the viewer or stored in the memory 3960 .
  • Display device properties are provided to the display devices 3900 and 3910 ( 4205 ).
  • the display device properties may be stored in the memory 3960 .
  • the viewer's eye data are provided to the display devices 3900 and 3910 ( 4210 ). As discussed above, the viewer's eye data may be automatically detected by a detection device or provided to the display devices 3900 and 3910 by the viewer. In one embodiment of the invention, the viewer's eye data comprise the distance (W a ) between the center points of the eyes, and the S A (S AL and S AR ) value which is the distance between the eye lens center point (A 2 ) and the eye center point (A 3 ).
  • a viewer-screen distance (d) is provided to each of the display devices 3900 and 3910 via, for example, the input device 3990 ( 4220 ).
  • the mouse cursor location and distance value are initialized ( 4230 ).
  • the initialization is performed in an infinity distance value.
• left and right mouse cursors are located at (−W a /2, 0, 0) and (W a /2, 0, 0), respectively, where the origin of the coordinate system is the O (0, 0, 0) point as shown in FIG. 43A.
• the locations of the center points of each displayed image are (−W a /2, 0, 0) and (W a /2, 0, 0), respectively.
  • 3D image and 3D mouse cursors are displayed in each of the display devices 3900 and 3910 ( 4240 ).
  • 3D mouse cursors 400 and 420 are displayed on each of the 3D images. Since the mouse cursor location has been initialized, the adjusted mouse cursors 400 and 420 are displayed on the images.
• It is determined whether the initialized distance value has been changed to another value ( 4250 ).
• if the viewer wants to set a distance value different from the initialized distance value, he or she may provide the new distance value to the display devices 3900 and 3910 .
• the 3D mouse cursor distance (M d ) is adjusted and the 3D mouse cursor location is reinitialized based on the changed distance value ( 4260 ). For example, if the initial location is (0, 0, 10,000 m) and another distance value (e.g., 100 m) as shown in FIG. 40C is provided, the mouse cursor distance (M d ) is changed from M d0 to M d1 . However, the x and y values of the point M do not change, even though the z value of the M point is changed from 10,000 m to 100 m.
  • a new location of the 3D mouse cursors 400 and 420 is determined ( 4280 ).
  • the new location of the mouse cursors is determined as follows. First, the number of pixels on which the mouse cursors have moved in the x-direction is determined. For example, left direction movement may have “ ⁇ x” value and right direction movement may have “+x” value. The same applies to “y” direction, i.e., “ ⁇ y” value for lower direction movement and “+y” value for upper direction movement. The “z” direction movement is determined by the distance value.
  • the locations of the center points of the display images to be adjusted are calculated based on the new location of the 3D mouse cursors 400 and 420 ( 4290 ).
  • the locations of the center points of the display images are obtained from the location values of each of the eye lenses, respectively.
  • the location values of the eye lenses are obtained using Equations VII and VIII as described below. Referring to FIG. 43, a method of obtaining the locations of the eye lenses will be described.
• First, the value for Z L is obtained from Equation VII.
  • M N (I N , J N , K N ) represents the location of the center point of the two mouse cursors M L (I L , J L , K L ) and M R (I R , J R , K R ). Since each of the mouse cursor locations M L and M R is obtained in 4280 , the center point location M N is obtained. That is, I N and J N are obtained by averaging (I L , I R ) and (J L , J R ). K N is determined by the current distance value.
  • Z L is the distance between the left eye center point (A 3L ) and M N .
• Next, the center point locations [(x1, y1, z1); (x2, y2, z2)] for each eye lens are obtained from Equation VIII.
• A 2L (x1, y1, z1) is the center point location of the left eye lens, and A 2R (x2, y2, z2) is the center point location of the right eye lens, as shown in FIG. 43A.
  • FIG. 43B illustrates a three-dimensional view of a viewer's eye. Referring to FIG. 43B, it can be seen how eye lens center point (A 2L ) is moving along the surface of the eye.
  • a digital signal processor may be used for calculating the locations of the eye lenses.
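• Equations VII and VIII appear only as figures in the patent; the sketch below is an assumed reconstruction from the surrounding definitions (Z L is the distance from the eye rotation center to M N , and the lens center lies on the eye sphere along the line toward M N ), so the published equations may differ in form:

```python
import math

def eye_lens_center(eye_center, s, m_n):
    """Assumed reconstruction: eye_center is A_3L or A_3R, e.g.
    (-W_a/2, 0, 0); s is S_AL or S_AR; m_n is (I_N, J_N, K_N).
    Returns (x, y, z) of A_2L or A_2R."""
    dx = [m - e for m, e in zip(m_n, eye_center)]
    z_dist = math.sqrt(sum(c * c for c in dx))   # Equation VII: Z_L (or Z_R)
    # Equation VIII: step a distance s from the eye center toward M_N.
    return tuple(e + s * c / z_dist for e, c in zip(eye_center, dx))

# e.g. with W_a = 70 mm and an assumed eye radius of about 12 mm:
# eye_lens_center((-0.035, 0.0, 0.0), 0.012, (0.0, 0.0, 100.0))
```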
  • Each of the center points of the displayed images is moved to the locations (x1, y1) and (x2, y2), respectively as shown in FIG. 44 ( 4300 ).
  • the blank area of the screen after moving may be filled with a background color, e.g., black, as shown in FIG. 44.
• M N1 is a peak point of a mountain 42 and M N2 is a point of a house 44 . It is assumed that the location values of M N1 and M N2 are determined to be (−0.02 m, 0.04 m, 100 m) and (0.01 m, 0 m, 10 m), respectively, by the above calculation method. These determined location values may be stored in the memory 3960 , and the distance between the two locations M N1 and M N2 is calculated as follows.
  • the microcomputer 3950 is programmed to calculate the distance between two locations, or may comprise a distance measure mode.
• when a viewer designates a first location (A: the middle point of the two mouse cursors 400 and 420 ), the location is determined and stored in the memory 3960 .
  • the location value may be displayed in the display screen 3930 or may be provided to a viewer via voice signal. This applies to a second location (B). In this way, the values of the first and second locations (A, B) are determined and the distance between the locations (A, B) is calculated.
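• A worked version of the distance-measure mode, using the example location values given above (in metres):

```python
import math

def location_distance(a, b):
    """Straight-line distance between two designated locations A and B."""
    return math.dist(a, b)

# The example values of M_N1 and M_N2 from above:
# location_distance((-0.02, 0.04, 100.0), (0.01, 0.0, 10.0))
# = sqrt(0.03**2 + 0.04**2 + 90.0**2) ~ 90.000014 m
```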
  • FIG. 45 illustrates a 3D display system according to another aspect of the invention.
  • the system is directed to control the movement of stereoscopic cameras based on the movement of a viewer's eye lenses.
  • the system comprises a camera site and a display site.
  • the display site comprises a pair of transmitters/receivers 4530 and 4540 , a set of display devices 4510 and 4520 , and an input device 3990 and a 3D mouse 3920 .
  • the input device 3990 and 3D mouse 3920 are substantially the same as those of the system shown in FIG. 38.
  • the display device 4510 comprises interfaces 3970 and 3980 , a microcomputer 4820 , a memory 4830 , and an interface 4810 .
  • the interfaces 3970 and 3980 are substantially the same as those of the display device shown in FIG. 41.
  • the microcomputer 4820 determines the current location values of the mouse cursors, and calculates the location values of the center points of a viewer's eye lenses.
  • the memory 4830 may also store data that may be needed to calculate the movement amount for the center points of the display images.
  • the interface 4810 may modify the location values adapted for transmission, and provide the modified data to the transmitter 4530 .
  • the transmitter 4530 transmits the modified location data to the camera site.
  • the camera site comprises a set of stereoscopic cameras 30 and 32 , a pair of transmitters 4570 and 4600 , a pair of servo mechanisms 4580 and 4590 , and a pair of receivers 4550 and 4560 .
  • Each of the receivers 4550 and 4560 receives the location values transmitted from the display site, and provides the data to the pair of the servo mechanisms, 4580 and 4590 , respectively.
  • the servo mechanisms 4580 and 4590 control the cameras 30 and 32 based on the received location data, respectively.
  • the servo mechanisms 4580 and 4590 control the cameras 30 and 32 such that the longitudinal and latitudinal values of the center points of the object lenses (C 2L , C 2R ; FIGS. 26 and 27) of the cameras 30 and 32 are substantially the same as those of the center points of the viewer's eye lenses as shown in FIGS. 47A and 47C.
  • 3D mouse properties and display device properties are set in each of the display devices 4510 and 4520 ( 4610 ).
  • the 3D mouse properties and display device properties are substantially the same as those explained with regard to FIG. 42.
  • the viewer's eye data and viewer-screen distance (d) are provided to each of the display devices 4510 and 4520 ( 4620 ). Again, the viewer's eye data and viewer-screen distance (d) are substantially the same as those explained with regard to FIG. 42.
  • 3D mouse cursor location and distance value are initialized ( 4630 ).
  • the 3D mouse cursor location is initialized to the center points of each of the display device screens, and the distance value is initialized to the infinity distance value.
  • the 3D image that is received from the camera site, and 3D mouse cursors ( 400 , 420 ) are displayed on the display devices 4510 and 4520 ( 4640 ).
  • the 3D mouse cursor may be displayed on the 3D image. In this situation, the portion of the image under the 3D mouse cursors ( 400 , 420 ) may not be seen by a viewer.
  • the location value data are transmitted to the camera site through each of the transmitter/receivers 4530 and 4540 ( 4680 ). As discussed above, the location values are calculated so long as the mouse cursor is moving. Thus, the location values may comprise a series of data. In one embodiment of the invention, the location values are serially transmitted to the camera site so that the cameras 30 and 32 are controlled based on the received order of the location values. In another embodiment of the invention, the sequence of the generated location values may be obtained and transmitted to the camera site so that the cameras 30 and 32 are controlled according to the sequence. In one embodiment of the invention, the location value data are digital data and may be properly modulated for transmission.
  • the location value data are received in each of the receivers 4550 and 4560 ( 4690 ).
  • one transmitter may be used instead of the two transmitters 4530 and 4540 .
  • one receiver may be used instead of the receivers 4550 and 4560 .
  • Camera adjusting values are determined based on the location values and the stereoscopic cameras 30 and 32 are controlled based on the camera adjusting values ( 4700 ).
  • Each of the servo controllers 4580 and 4590 controls the respective camera 30 and 32 such that each of the center points of the cameras object lenses keeps track of the movement of the center points of each eye lens ( 4710 ).
• new location values A 2L1 and A 2R1 corresponding to the new location of the 3D mouse cursors are calculated using Equation VIII as discussed above.
  • Each of the servo controllers 4580 and 4590 controls the cameras 30 and 32 such that the center points of each of the camera object lenses are located in C 2L1 and C 2R1 as shown in FIG. 47A.
  • the servo controllers 4580 and 4590 may set the location values of the center points of the camera object lenses so as to conform to the location values of the center points of the eye lenses.
  • the servo controllers 4580 and 4590 comprise a horizontal motor and a vertical motor that move each camera to the horizontal direction (x-direction) and the vertical direction (y-direction), respectively.
  • only one servo controller may be used for controlling movements of both of the cameras 30 and 32 instead of the pair of the servo controllers 4580 and 4590 .
• While each of the servo controllers 4580 and 4590 is controlling the stereoscopic cameras 30 and 32 , the cameras 30 and 32 are photographing an object.
  • the photographed image is transmitted to the display site and displayed in each of the display devices 4510 and 4520 ( 4720 , 4730 ).
• the camera control may be used in remote control technology such as remote surgery; remote control of a vehicle, an airplane, or a fighter aircraft; or remote control of construction, investigation, or automatic assembly equipment.
  • FIG. 49 illustrates a 3D display system according to another aspect of the invention.
  • the 3D display system is directed to adjust space magnification for a stereoscopic image based on the space magnification adjusting data provided by a viewer.
  • the system comprises a camera site and a display site.
  • the display site comprises an input device 4910 , a set of display devices 4920 and 4930 , a transmitter 4950 , and a pair of receivers 4940 and 4960 .
  • the input device 4910 provides a viewer's eye distance value (W a ) as shown in FIG. 43A and space magnification adjusting data to at least one of the display devices 4920 and 4930 .
• the space magnification means the size of the space that a viewer perceives from the displayed images. For example, if the space magnification is "1," a viewer perceives the same size of space in the display site as that of the real space that was photographed in the camera site. Also, if the space magnification is "10," a viewer perceives a space in the display site that is ten times larger than the real space that was imaged by the cameras.
  • the space magnification adjusting data represent data regarding the space magnification that a viewer wants to adjust.
  • the space magnification adjusting data may specify, for example, a space magnification of "0.1," "1," "10," or "100."
  • the adjustment of the space magnification is performed by an adjustment of the distance between the cameras 30 and 32 , and will be described in more detail later.
  • At least one of the display devices 4920 and 4930 displays the space magnification adjusting data that are provided through the input device 4910 .
  • the at least one of the display devices 4920 and 4930 provides the space magnification adjusting data and eye distance value (W a ) to the transmitter 4950 .
  • the transmitter 4950 transmits the magnification adjusting data and the value W a to the camera site.
  • the space magnification adjusting data and the value W a may be provided directly from the input device 4910 to the transmitter 4950 without passing through the display devices 4920 and 4930 .
  • the receiver 4970 receives the space magnification adjusting data and W a from the transmitter 4950 , and provides the data to the camera controller 4990 .
  • the camera controller 4990 controls the camera distance based on the space magnification adjusting data and the value W a .
  • the camera controller 4990 comprises a servo controller 4985 and a horizontal motor 4975 as shown in FIG. 50. Referring to FIGS. 50 - 52 , the operation of the camera controller 4990 will be explained.
  • the servo controller 4985 initializes camera distance (C 1 ), for example, such that C 1 is the same as W a ( 5100 ).
  • the space magnification relates to the camera distance (C 1 ) and the eye distance value (W a ).
  • when the camera distance (C 1 ) is the same as the eye distance (W a ), the space magnification is "1," which means that a viewer sees the same size of the object that is photographed by the cameras 30 and 32 .
  • when C 1 is greater than W a , the space magnification is less than "1," which means that a viewer perceives a smaller space than the space that is imaged by the cameras 30 and 32 .
  • when C 1 is less than W a , the space magnification is greater than "1," which means that a viewer perceives a larger sized object than is imaged by the cameras 30 and 32 .
  • the space magnification adjusting data are provided to the servo controller 4985 ( 5110 ). It is determined whether the adjusting data are "1" ( 5120 ). If the adjusting data are "1," no adjustment of the camera distance is made ( 5160 ). If the adjusting data are not "1," it is determined whether the adjusting data are greater than "1." If the adjusting data are greater than "1," the servo controller 4985 operates the motor 4975 so as to narrow C 1 until the requested space magnification is obtained ( 5150 ). Referring to FIG. 52, a table showing the relationship between the space magnification and the camera distance (C 1 ) is illustrated, where W a is 80 mm. Thus, when C 1 is 80 mm, the space magnification is "1." In this situation, if the requested space magnification is "10," the camera distance is adjusted to "8 mm" as shown in FIG. 52.
  • If the adjusting data are less than "1," the servo controller 4985 operates the motor 4975 so as to widen C 1 until the requested space magnification is obtained ( 5140 ). As exemplified in FIG. 52, if the requested space magnification is "0.1," the camera distance is adjusted to "800 mm." A relationship consistent with these values is sketched below.
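  • The FIG. 52 values (W a of 80 mm giving camera distances of 800 mm, 80 mm, and 8 mm for space magnifications of "0.1," "1," and "10") are consistent with C 1 = W a / SM. The following minimal Python sketch assumes that relationship; the function name is illustrative only.
```python
# A minimal sketch of the camera-distance adjustment implied by FIG. 52:
# C1 = Wa / SM, so SM > 1 narrows the camera distance and SM < 1 widens it.
def target_camera_distance(wa_mm: float, sm: float) -> float:
    """Camera distance C1 that yields the requested space magnification."""
    if sm <= 0:
        raise ValueError("space magnification must be positive")
    return wa_mm / sm

for sm in (0.1, 1, 10, 100):
    print(sm, target_camera_distance(80.0, sm))  # 800.0, 80.0, 8.0, 0.8
```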
  • Eye distance (W a ) and space magnification adjusting data (SM) are provided to the at least one of the display devices 4920 and 4930 , or to the transmitter 4950 directly from the input device 4910 ( 5020 ).
  • the eye distance (W a ) and space magnification adjusting data (SM) are transmitted to the camera site ( 5030 ).
  • the camera site receives the W a and SM values and adjusts the camera distance (C 1 ) based on the W a and SM values ( 5040 ).
  • the stereoscopic cameras 30 and 32 image the object with adjusted space magnification ( 5050 ).
  • the image is transmitted to the display site through the transmitters 4980 and 5000 ( 5060 ).
  • Each of the display devices 4920 and 4930 receives and displays the image ( 5070 ).
  • the camera control may be used in remote control technology, such as remote surgery; remote control of a vehicle, an airplane or other aircraft, or a fighter; or remote control of construction, investigation, or automatic assembly equipment.
  • FIG. 54 illustrates a 3D display system according to another aspect of the invention.
  • the system is directed to adjust the location of the display devices based on the relative location of the stereoscopic cameras with regard to an object 5400 .
  • the system comprises a camera site and a display site.
  • the camera site comprises a set of stereoscopic cameras 30 and 32 , a pair of direction detection devices 5410 and 5420 , and transmitters 5430 and 5440 .
  • the cameras 30 and 32 may not be parallel to each other as shown in FIG. 54.
  • the direction detection devices 5410 and 5420 detect directions of the stereoscopic cameras 30 and 32 with respect to the object 5400 to be photographed, respectively.
  • the devices 5410 and 5420 detect the tilt angle with respect to an initial location where the two cameras are parallel to each other.
  • the cameras 30 and 32 may be tilted, for example, 10 degrees in a counterclockwise direction as shown in FIG. 54, or in a clockwise direction from the initial location.
  • the detection devices 5410 and 5420 detect the tilt angle of the cameras 30 and 32 , respectively.
  • each of the direction detection devices 5410 and 5420 comprises a typical direction sensor.
  • Each of the transmitters 5430 and 5440 transmits the detected direction data of the cameras 30 and 32 to the display site. If it is detected that only the camera 32 is tilted as shown in FIG. 57, the detection device 5410 may not detect a tilting, and thus only the transmitter 5440 may transmit the detected data to the display site. The same applies to a situation where only the camera 30 is tilted.
  • the display site comprises a pair of receivers 5450 and 5460 , a pair of display device controllers 5470 and 5500 , and a set of display devices 5480 and 5490 .
  • Each of the receivers 5450 and 5460 receives the detected tilting data of the cameras 30 and 32 , and provides the data to each of the display device controllers 5470 and 5500 .
  • the display device controllers 5470 and 5500 determine display adjusting values based on the received camera tilting data.
  • the display adjusting values represent movement amounts to be adjusted for the display devices 5480 and 5490 .
  • the display device controllers 5470 and 5500 determine display adjusting values based on a table as shown in FIG. 55.
  • the display device controller 5500 tilts the corresponding display device 5490 by 10 degrees in a clockwise direction as shown in FIG. 54.
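  • The following is a minimal Python sketch of such a lookup, assuming the FIG. 55 table reduces to an equal-and-opposite tilt; the sign convention (counterclockwise positive) is an assumption made for illustration.
```python
# A minimal sketch: map a detected camera tilt (degrees from the parallel
# state) to a compensating display device tilt in the opposite direction.
def display_adjusting_angle(camera_tilt_deg: float) -> float:
    """Display tilt (degrees) compensating a camera tilt from parallel."""
    return -camera_tilt_deg

print(display_adjusting_angle(10.0))  # -10.0: tilt display 10 deg clockwise
```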
  • the camera location with respect to the object 5400 is substantially the same as an eye lens location of the viewer with regard to the screen.
  • the screen may comprise a V-shaped mirror, an HMD screen, a projection screen, or a display screen 160 shown in FIG. 1B.
  • the set of stereoscopic cameras 30 and 32 image an object ( 5510 ).
  • Each of the direction detection devices 5410 and 5420 detects a camera direction with respect to the object ( 5520 ). That is, the degree of tilting of each camera 30 and 32 from, for example, a parallel state is detected.
  • the photographed image data (PID) and direction detection data (DDD) are transmitted to the display site ( 5530 ).
  • the PID and DDD are received in the display site, and the DDD are retrieved from the received data ( 5540 , 5550 ).
  • the retrieving may be performed using a typical signal separator.
  • At least one of the display device controllers 5470 and 5500 determines the display device adjusting values based on the retrieved DDD ( 5560 ). The at least one of the display device controllers 5470 and 5500 adjusts the display angle with respect to the viewer's eye lenses by moving a corresponding display device ( 5570 ). The display devices 5480 and 5490 display the received stereoscopic images ( 5580 ).
  • FIG. 57 illustrates a 3D display system according to another aspect of the invention.
  • the system is directed to adjust the displayed image based on the relative location of the stereoscopic cameras 30 and 32 with regard to the object 5400 .
  • the system shown in FIG. 57 is substantially the same as the one of FIG. 54 except for the display devices 5710 and 5720 .
  • the display devices 5710 and 5720 adjust the location of the displayed images based on the received camera direction detection data.
  • Referring to FIG. 58, an exemplary block diagram of the display device 5720 is illustrated. Though not shown, the display device 5710 is substantially the same as the display device 5720 .
  • the display device 5720 comprises a microcomputer 5910 , a memory 5920 , a display driver 5930 , and a display screen 5940 .
  • the memory 5920 stores a table (not shown) showing the relationship between the camera tilting angle and the adjustment amount of the displayed images.
  • the microcomputer 5910 determines display image adjusting values based on the received camera direction data and the table of the memory 5920 .
  • the display driver 5930 adjusts the display angle of the display image based on the determined adjusting values, and displays the image in the display screen 5940 .
  • Referring to FIGS. 59A and 59B, adjustment of the displayed image is illustrated. In one embodiment of the invention, this may be performed by enlarging or reducing the image portion of the left or right side of the displayed image. For example, the enlarging or reducing amount is determined according to the tilting angle of the camera. In this embodiment of the invention, enlargement or reduction may be performed by known image reduction or magnification software.
  • the image of FIG. 59A may correspond to the tilting of the display device in a clockwise direction.
  • the image of FIG. 59B may correspond to the tilting of the display device in a counterclockwise direction.
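  • The following is a minimal Python sketch of one way to enlarge or reduce one side of an image, assuming a nearest-neighbor resampler and a per-column vertical scale factor interpolated across the width; this is an illustrative choice, not the patent's prescribed algorithm.
```python
# A minimal sketch: scale the left side of an image (a list of equal-length
# rows) by left_factor and the right side by right_factor, interpolating
# the vertical scale linearly across the columns.
def side_scale(image, left_factor, right_factor):
    h, w = len(image), len(image[0])
    out = [[0] * w for _ in range(h)]
    for x in range(w):
        f = left_factor + (right_factor - left_factor) * x / (w - 1)
        for y in range(h):
            src = min(h - 1, max(0, int((y - h / 2) / f + h / 2)))
            out[y][x] = image[src][x]
    return out

img = [[x + 10 * y for x in range(4)] for y in range(4)]
# Example: reduce the left side to 80% height, keep the right side as-is.
print(side_scale(img, 0.8, 1.0))
```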
  • Referring to FIG. 60, the operation of the system of FIG. 57 will be explained. As seen in FIG. 60, procedures 5810 - 5850 are the same as those shown in FIG. 56. Display image adjusting values are determined based on the retrieved camera direction detection data (DDD) ( 5860 ). The image to be displayed is adjusted as shown in FIG. 59 based on the determined adjusting values ( 5870 ). The adjusted image is displayed ( 5880 ).
  • FIG. 61 illustrates a 3D display system according to another aspect of the invention.
  • stereoscopic images and photographing ratios are transmitted via a network such as the Internet, or stored on a persistent memory, such as optical or magnetic disks.
  • the combined data 620 of stereoscopic images 624 and at least one photographing ratio (A:B:C) 622 for the images 624 are shown.
  • the stereoscopic images 624 may comprise stereoscopic broadcasting images, stereoscopic advertisement images, stereoscopic movie images, stereoscopic product images for Internet shopping, or any other kind of stereoscopic images.
  • the photographing ratio 622 may be fixed for the entire set of stereoscopic images 624 . A method of combining the stereoscopic images 624 and the photographing ratio 622 has been described above in connection with FIG. 7.
  • stereoscopic images 624 are produced from a pair of stereoscopic cameras (not shown) and combined with the photographing ratio 622 .
  • the stereoscopic (broadcasting, advertisement, or movie, etc.) images 624 and the photographing ratio 622 may be transmitted from an Internet server, or a computing device of a broadcasting company.
  • the Internet server may be operated by an Internet broadcasting company, an Internet movie company, an Internet advertising company or an Internet shopping mall company.
  • the photographing ratio is not combined with the stereoscopic images, but rather is transmitted separately from them. However, for convenience, the explanation below is mainly directed to the combined method.
  • the combined data 620 are transmitted to a computing device 627 at a display site via a network 625 .
  • the network 625 may comprise the Internet, a cable, a PSTN, or a wireless network.
  • Referring to FIG. 63, an exemplary data format of the combined data 620 is illustrated.
  • the left images and right images of the stereoscopic images 624 are embedded into the combined data 620 such that the images 624 are retrieved sequentially in a set of display devices 626 and 628 .
  • left image 1 and right image 1 , left image 2 and right image 2 are located in sequence in the data format such that the images can be retrieved in that sequence.
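  • The following is a minimal Python sketch of such a combined format, assuming one photographing ratio header followed by length-prefixed left/right image payloads in display order; the byte-level framing is hypothetical.
```python
# A minimal sketch of the FIG. 63 layout: photographing ratio (A:B:C)
# first, then left image 1, right image 1, left image 2, right image 2...
import struct

def combine(ratio, pairs):
    """ratio: (A, B, C); pairs: list of (left_bytes, right_bytes)."""
    out = [struct.pack(">3f", *ratio)]
    for left, right in pairs:
        for img in (left, right):
            out.append(struct.pack(">I", len(img)) + img)
    return b"".join(out)

def split(blob):
    ratio = struct.unpack_from(">3f", blob, 0)
    images, off = [], 12
    while off < len(blob):
        (n,) = struct.unpack_from(">I", blob, off)
        images.append(blob[off + 4 : off + 4 + n])
        off += 4 + n
    return ratio, images[0::2], images[1::2]  # ratio, lefts, rights

data = combine((4.0, 3.0, 10.0), [(b"L1", b"R1"), (b"L2", b"R2")])
print(split(data))
```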
  • the computing device 627 receives the combined data 620 and retrieves the stereoscopic images 624 and photographing ratio 622 from the received data.
  • the images 624 and photographing ratio 622 are separately received as they are not combined in transmission.
  • the computing device 627 also provides the left and right images to the display devices 626 and 628 , respectively.
  • the data format may be constituted, for example, by a predetermined order or by data tagging, such that the computing device 627 can identify the left and right images of the stereoscopic images 624 when the device 627 retrieves the images 624 .
  • the computing device 627 may comprise any kind of computing device that can download the images 624 and the ratio 622 , either in a combined format or separately, via the network 625 .
  • a pair of computing devices, each retrieving and providing the left and right images to the display devices 626 and 628 , respectively, may be provided in the display site.
  • the display devices 626 and 628 may comprise the elements of the display devices 86 and 88 disclosed in FIG. 8. In one embodiment of the invention, each of the display devices 626 and 628 may comprise CRT, LCD, HMD, PDP devices, or projection type display devices.
  • the combined data which are stored in a recording medium 630 such as optical or magnetic disks may be provided to the display devices 634 and 636 via a medium retrieval device 632 at the display site.
  • the optical disks may comprise a compact disk (CD) or a digital versatile disk (DVD).
  • the magnetic disk may comprise a hard disk.
  • the recording medium 630 is inserted into the medium retrieval device 632 that retrieves the stereoscopic images 624 and photographing ratio 622 .
  • the medium retrieval device 632 may comprise a CD-ROM drive, a DVD-ROM drive, or a hard disk drive (HDD), and a host computer for the drive.
  • the medium retrieval device 632 may be embedded in a computing device (not shown).
  • the medium retrieval device 632 retrieves and provides the stereoscopic images 624 and photographing ratio 622 to the display devices 634 and 636 , respectively.
  • the exemplified data format shown in FIG. 63 may apply to the data stored in the recording medium 630 .
  • the photographing ratio 622 is the same for the entire set of stereoscopic images. In this embodiment, the photographing ratio 622 is provided once to each of the display devices 634 and 636 , and the same photographing ratio is used throughout the stereoscopic images.
  • the data format recorded in the medium 630 is constituted such that the medium retrieval device 632 can identify the left and right images of the stereoscopic images 624 .
  • the operation of the display devices 634 and 636 is substantially the same as that of the devices 626 and 628 as discussed with regard to FIG. 61.
  • Portable Communication Device Comprising a Pair of Digital Cameras that Produce Stereoscopic Images and a Pair of Display Screens
  • FIG. 64 illustrates an information communication system according to another aspect of the invention.
  • the system comprises a pair of portable communication devices 65 and 67 .
  • the device 65 comprises a pair of digital cameras 640 , 642 , a pair of display screens 644 , 646 , a distance input portion 648 , an eye interval input portion 650 , and a space magnification input portion 652 .
  • the device 65 comprises a receiver and a transmitter, or a transceiver (all not shown).
  • the pair of digital cameras 640 and 642 produce stereoscopic images of a scene or an object and photographing ratios thereof.
  • each of the cameras 640 and 642 comprises substantially the same elements of the camera 20 shown in FIG. 7.
  • the device 65 transmits the produced stereoscopic images and photographing ratios to the device 67 .
  • the pair of display screens 644 and 646 display stereoscopic images received from the device 67 .
  • the distance input portion 648 is provided with the distance values (similar to screen-viewer distances F1 and F2 in FIG. 8) between a viewer's eyes and each of the screens 644 and 646 .
  • the eye interval input portion 650 receives the distance values (exemplified as W a in FIG. 14A) between the center points of a viewer's eyes.
  • the space magnification input portion 652 is provided with adjusting data for space magnification, and provides the adjusting data to the device 65 .
  • each of the distance input portion 648 , the eye interval input portion 650 , and the space magnification input portion 652 comprises key pads that can input numerals 0-9.
  • all of the input portions are embodied as one input device.
  • the device 67 comprises a pair of digital cameras 664 , 666 , a pair of display screens 654 , 656 , a distance input portion 658 , an eye interval input portion 660 , and a space magnification input portion 662 .
  • the device 67 also comprises a receiver and a transmitter, or a transceiver (all not shown).
  • the pair of digital cameras 664 and 666 produce stereoscopic images of a scene or an object and photographing ratios thereof.
  • each of the cameras 664 and 666 comprises substantially the same elements of the camera 20 shown in FIG. 7.
  • the device 67 transmits the produced stereoscopic images and photographing ratios to the device 65 .
  • the pair of display screens 654 and 656 display stereoscopic images received from the device 65 .
  • the distance input portion 658 , the eye interval input portion 660 , and the space magnification input portion 662 are substantially the same as those of the device 65 .
  • the system shown in FIG. 64 may comprise at least one base station (not shown) communicating with the devices 65 and 67 .
  • each of the devices 65 and 67 comprises a cellular phone, an IMT (international mobile telecommunication)-2000 device, a personal digital assistant (PDA), a hand-held PC, or another type of portable telecommunication device.
  • the space magnification adjusting data and photographing ratios have a standard data format so that the devices 65 and 67 can identify the data easily.
  • FIG. 65 illustrates a pair of information communication devices 65 and 67 according to one aspect of the invention.
  • Each of the devices 65 and 67 displays stereoscopic images received from the other device such that the photographing ratio of one device is substantially the same as the screen ratio of the other device.
  • the device 65 comprises a camera portion 700 , a display portion 720 , and a data processor 740 , e.g., a microcomputer.
  • the camera portion 700 produces and transmits stereoscopic images and photographing ratios thereof to the device 67 .
  • the communication between the devices 65 and 67 may be performed via at least one base station (not shown).
  • the camera portion 700 comprises the pair of digital cameras 640 , 642 , and a transmitter 710 .
  • Each of the digital cameras 640 and 642 produces stereoscopic images and photographing ratios thereof, and combines the images and ratios (combined data 702 and 704 ).
  • the photographing ratios provided in the combined data 702 and 704 are the same.
  • Each of the digital cameras 640 and 642 may comprise the elements of the camera 20 shown in FIG. 7.
  • the transmitter 710 transmits the combined data 702 , 704 to the device 67 .
  • the photographing ratios are not combined with the stereoscopic images, but rather are transmitted separately from them.
  • the transmitter 710 may comprise two transmitting portions that transmit the combined data 702 and 704 , respectively.
  • the device 67 receives and displays the stereoscopic images transmitted from the device 65 such that the received photographing ratio is substantially the same as the screen ratio of the device 67 .
  • the display portion 720 receives combined data 714 and 716 of stereoscopic images and photographing ratios thereof from the device 67 , and displays the stereoscopic images such that the received photographing ratio is substantially the same as the screen ratio of the device 65 .
  • the display portion 720 comprises a pair of display devices 706 , 708 , and a receiver 712 .
  • the receiver 712 receives the combined data 714 and 716 that the device 67 transmitted, and provides the combined data 714 , 716 to the display devices 706 , 708 , respectively.
  • the receiver 712 may comprise two receiving portions that receive the combined data 714 and 716 , respectively.
  • the images and photographing ratios are separately received as they are not combined in transmission.
  • Each of the display devices 706 and 708 separates the images and the photographing ratios from the combined data provided by the receiver 712 .
  • the devices 706 and 708 also display the stereoscopic images such that the photographing ratios are substantially the same as the screen ratios of the display devices 706 and 708 , respectively.
  • Each of the display devices 706 and 708 may comprise substantially the same elements of the display device 86 or 88 shown in FIG. 8.
  • the display devices 706 and 708 are connected to the distance input portion 648 shown in FIG. 64 so that the screen-viewer distance for the devices 706 and 708 can be provided to the device 65 .
  • the screen ratios for the devices 706 and 708 are substantially the same. The detailed operation of the display devices 706 and 708 has been explained in connection with FIGS. 8 - 11 .
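  • The following is a minimal Python sketch of the screen-ratio matching described above, under the assumption that it reduces to sizing the displayed image so that D:E:F equals A:B:C for the measured screen-viewer distance F, i.e., D = A*F/C and E = B*F/C; units are assumed consistent.
```python
# A minimal sketch: size the displayed image so that the screen ratio
# (D:E:F) is substantially the same as the photographing ratio (A:B:C).
def displayed_size(a, b, c, f):
    """Return (D, E) for a photographing ratio A:B:C and viewer distance F."""
    return a * f / c, b * f / c

# Example: a 400 x 300 mm scene imaged from 1000 mm, viewer at 500 mm.
print(displayed_size(400.0, 300.0, 1000.0, 500.0))  # (200.0, 150.0)
```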
  • the microcomputer 740 controls the operation of the camera portion 700 and display portion 720 , and data communication with the device 67 .
  • the microcomputer 740 is programmed to control the camera portion 700 such that the digital cameras 640 and 642 produce stereoscopic images and photographing ratios thereof, and that the transmitter 710 transmits the images and ratios to the device 67 when the communication link is established between the devices 65 and 67 .
  • the microcomputer 740 is programmed to control the power of the camera portion 700 and the display portion 720 independently. In this embodiment, even when the cameras 640 and 642 are turned off, the display devices 706 and 708 may display the stereoscopic images received from the device 67 .
  • the cameras 640 and 642 may produce stereoscopic images and photographing ratios thereof, and transmit the images and ratios to the device 67 .
  • the device 65 may comprise an element that performs a voice signal communication with the device 67 .
  • the device 65 may include a volatile memory such as a RAM and/or a non-volatile memory such as a flash memory or a programmable ROM that store data for the communication.
  • the device 65 may comprise a power supply portion such as a battery.
  • the device 65 may include a transceiver that incorporates the transmitter 710 and receiver 712 .
  • the transmitter 710 and receiver 712 may be omitted.
  • the device 67 may be configured to comprise substantially the same elements and perform substantially the same functions as those of the device 65 shown in FIG. 65. Thus, the detailed explanation of embodiments thereof will be omitted.
  • FIG. 66A illustrates an information communication device 65 according to another aspect of the invention.
  • the information communication device 65 controls the display location of the stereoscopic images based on the distance (W a ) between the center points of a viewer's eyes.
  • the device 65 moves the stereoscopic images displayed in the display screens 644 and 646 such that the distance (W d ) between the center points of the displayed stereoscopic images is substantially the same as the W a distance.
  • the device 65 comprises an eye interval input portion 650 , a data processor 722 , e.g., a microcomputer, a pair of display drivers 724 , 726 , and a pair of display screens 644 , 646 .
  • the eye interval input portion 650 and the pair of display screens 644 and 646 are substantially the same as those of FIG. 64.
  • the microcomputer 722 controls the display drivers 724 and 726 based on the received W a distance such that the W d distance is substantially the same as the W a distance. Specifically, the display drivers 724 and 726 move the stereoscopic images displayed in the display screens 644 and 646 until W d is substantially the same as W a .
  • the detailed explanation with regard to the movement of the stereoscopic images has been provided in connection with FIGS. 15 - 17 .
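  • The following is a minimal Python sketch of one way to compute the symmetric shifts, assuming the two image centers lie on a common horizontal axis described by their x-coordinates; the geometry is an illustrative assumption.
```python
# A minimal sketch: shift the two displayed images so the distance Wd
# between their center points equals the viewer's eye distance Wa.
def image_offsets(left_center_x, right_center_x, wa):
    """Return (left_dx, right_dx) moving the centers apart or together."""
    wd = right_center_x - left_center_x
    half_err = (wa - wd) / 2.0
    return -half_err, +half_err

# Example: centers currently 60 mm apart; the viewer's Wa is 65 mm.
print(image_offsets(0.0, 60.0, 65.0))  # (-2.5, 2.5)
```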
  • the device 65 moves the display screens 644 and 646 such that the distance (W d ) between the center points of the stereoscopic images is substantially the same as the W a distance.
  • the device 65 comprises the eye interval input portion 650 , a microcomputer 732 , a pair of servo mechanisms 734 , 736 , and the pair of display screens 644 , 646 .
  • the microcomputer 732 controls the servo mechanisms 734 and 736 based on the received W a distance such that the W d distance is substantially the same as the W a distance. Specifically, the servo mechanisms 734 and 736 move the display screens 644 and 646 until W d is substantially the same as W a .
  • the detailed explanation with regard to the movement of the display screens has been provided with regard to FIGS. 18 - 20 .
  • the device 67 may comprise substantially the same elements and perform substantially the same functions as those of the device 65 shown in FIGS. 66A and 66B. Thus, the detailed explanation of embodiments thereof will be omitted.
  • FIG. 67 illustrates an information communication device 65 according to another aspect of the invention.
  • the information communication device 65 adjusts space magnification based on adjusting data for space magnification.
  • the device 65 comprises a camera portion 760 , a display portion 780 , and a microcomputer 750 .
  • the camera portion 760 comprises a pair of digital cameras 640 , 642 , a camera controller 742 , and a transceiver 744 .
  • the transceiver 744 receives adjusting data for space magnification from the device 67 , and provides the adjusting data (C) to the camera controller 742 .
  • Space magnification embodiments have been explained in detail with respect to FIGS. 49 - 53 .
  • the adjusting data for space magnifications are exemplified in FIG. 52.
  • the camera controller 742 controls the distance (interval) between the digital cameras 640 and 642 based on the provided adjusting data (C).
  • the camera controller 742 comprises a motor that adjusts the camera distance, and a servo controller that controls the motor (both not shown).
  • the operation of the camera controller 742 is substantially the same as that of the controller 4990 described in connection with FIGS. 50 - 52 .
  • the digital cameras 640 and 642 produce stereoscopic images at the adjusted interval, and transmit the stereoscopic images to the device 67 through the transceiver 744 .
  • the device 67 receives and displays the adjusted stereoscopic images.
  • each of the devices 65 and 67 may display, in at least one of its display screens, the current space magnification, such as "1," "0.5," or "10," so that a viewer can know the current space magnification.
  • the devices 65 and 67 may provide a user with an audio signal representing the current space magnification.
  • space magnification adjusting data (A) may be provided to the camera controller 742 , for example, through the space magnification input portion 652 shown in FIG. 64. This embodiment may be useful in a situation where a user of the device 65 wants to provide stereoscopic images in adjusted space magnification to a user of the device 67 .
  • the operation of the camera controller 742 is substantially the same as in a situation where the adjusting data (C) is received from the device 67 .
  • the display portion 780 comprises a pair of display screens 644 , 646 , and a transceiver 746 .
  • Space magnification (SM) adjusting data (B) are provided to the transceiver 746 from a user of the device 65 .
  • the SM adjusting data (B) are used to adjust the interval between the cameras 664 and 666 of the device 67 (FIG. 64).
  • the SM adjusting data (B) may also be provided to at least one of the display screens 644 and 646 so that the SM adjusting data (B) are displayed in the at least one of the display screens 644 and 646 . This is to inform a user of the device 65 of the current space magnification.
  • the transceiver 746 transmits the SM adjusting data (B) to the device 67 .
  • the device 67 receives the SM adjusting data (B) and adjusts the interval between the cameras 664 and 666 of the device 67 based on the adjusting data (B). Also, the device 67 transmits stereoscopic images produced in adjusted space magnification to the device 65 .
  • the transceiver 746 receives left and right images from the device 67 and provides the images to the display screens 644 and 646 , respectively.
  • the display screens 644 and 646 display the stereoscopic images.
  • each of the devices 65 and 67 of FIG. 67 may further comprise the functions of the devices 65 and 67 described in connection with FIGS. 65 and 66.
  • the microcomputer 750 controls the operation of the camera portion 760 and display portion 780 , and data communication with the device 67 .
  • the microcomputer 750 is programmed to control the camera portion 760 and display portion 780 such that, after the communication link between the devices 65 and 67 is established, the SM adjusting data (B, C) are transmitted to, or received from, the other device.
  • the microcomputer 750 is programmed to control the camera portion 760 such that the camera controller 742 adjusts the interval between the digital cameras 640 and 642 based on the SM adjusting data (A) even when the communication link between the devices 65 and 67 is not established.
  • the device 65 may include a volatile memory such as a RAM and/or a non-volatile memory such as a flash memory or a programmable ROM that store data for the communication.
  • the device 65 may comprise an element that performs a voice signal transmission.
  • embodiments of the device 67 comprise substantially the same elements and perform the same functions as those of the device 65 shown in FIG. 67. Thus, a detailed explanation of these embodiments will be omitted.
  • the communication device 65 comprises a goggle-shaped display device 649 as shown in FIG. 68.
  • the goggle-shaped display device comprises a set of display screens 645 and 647 .
  • the display device 649 may be connected to the device 65 through a communication jack 643 .
  • the display device 649 may have a wireless connection to the device 65 .
  • each of the devices 65 and 67 may comprise a head mount display (HMD) device that includes a set of display screens.
  • FIG. 69 illustrates a 3D display system according to another aspect of the invention.
  • stereoscopic images are produced from three-dimensional structural data.
  • the three-dimensional structural data may comprise 3D game data or 3D animation data.
  • the three-dimensional structural data comprise pixel values (e.g., RGB pixel values) ranging from, for example, (0000, 0000, 0000) to (9999, 9999, 9999) in the locations from (000, 000, 000) to (999, 999, 999) in a 3D coordinate system (x, y, z).
  • Table 1 exemplifies data #1-data #N of the 3D structural data.
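  • The following is a minimal Python sketch of one possible in-memory encoding of such data, assuming a sparse mapping from (x, y, z) locations to RGB pixel values; the specific entries are illustrative, not taken from Table 1.
```python
# A minimal sketch: 3D structural data as RGB pixel values stored at
# (x, y, z) locations of a bounded 3D coordinate system.
structural_data = {
    (0, 0, 0): (0, 0, 0),              # e.g., a black voxel at the origin
    (120, 45, 300): (255, 128, 0),     # an orange voxel
    (999, 999, 999): (255, 255, 255),  # a white voxel at the far corner
}

def pixel_at(location):
    """Return the RGB value at a 3D location, or None if empty."""
    return structural_data.get(location)

print(pixel_at((120, 45, 300)))  # (255, 128, 0)
```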
  • stereoscopic images are produced from three-dimensional structural data 752 in a remote server.
  • the three-dimensional structural data 752 are projected into a pair of two dimensional planes using known projection portions 754 and 756 , which are also frequently referred to as imaginary cameras or view points in stereoscopic image display technology.
  • the projection portions may comprise known software that performs the projection function.
  • These projected images are stereoscopic images, each comprising a pair of two-dimensional plane images that are transmitted to a display site. In the display site, the stereoscopic images are displayed in a pair of display devices.
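  • The following is a minimal Python sketch of two such projection portions, assuming simple pinhole projection from two horizontally offset viewpoints; the focal length and camera interval are illustrative values.
```python
# A minimal sketch: project 3D points onto a pair of 2D image planes from
# two "imaginary camera" positions offset along the x-axis.
def project(points, camera_x, focal=1.0):
    """Pinhole-project 3D points (x, y, z), z > 0, from (camera_x, 0, 0)."""
    return [(focal * (x - camera_x) / z, focal * y / z) for x, y, z in points]

points = [(0.0, 0.0, 5.0), (1.0, 0.5, 4.0)]
interval = 0.065  # imaginary camera interval, roughly an eye distance in m
left_image = project(points, -interval / 2)
right_image = project(points, +interval / 2)
print(left_image, right_image)
```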
  • stereoscopic images are produced from three-dimensional structural data in a display site.
  • the three-dimensional structural data may be transmitted or downloaded from a remote server to the display site.
  • the projection portions 772 and 774 are located in a computing device 770 .
  • the projection portions 772 and 774 may comprise a software module and be downloaded with the structural data from the remote server to the computing device 770 of the display site.
  • the projected images, i.e., the produced stereoscopic images, are displayed through a pair of display devices 776 and 778 .
  • the 3D structural data are stored on a recording medium such as optical disks or magnetic disks, and the medium is inserted into and retrieved by the computing device 770 as discussed with regard to FIG. 62.
  • a software module for the projection portions 772 and 774 may be included in the medium.
  • a method of producing stereoscopic images from the three-dimensional structural data is, for example, disclosed in U.S. Pat. No. 6,005,607, issued Dec. 21, 1999, which is incorporated by reference herein.
  • the photographing ratios of the imaginary cameras may be calculated by calculating horizontal and vertical lengths of a photographed object or scene and the distance between the cameras and the object (scene), using the location of the cameras and object in the projected coordinate system.
  • control of the motions of the imaginary cameras may be performed by computer software that identifies the location of the imaginary cameras and controls the movement of the cameras.
  • control of the space magnification may be performed by adjusting the interval between the imaginary cameras using the identified location of the imaginary cameras in the projected coordinate system.
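  • The following is a minimal Python sketch of these calculations, assuming the scene's extent is taken from an axis-aligned bounding box in the projected coordinate system and that the imaginary camera interval follows the same C 1 = W a / SM relationship used for the physical cameras; all names are illustrative.
```python
# A minimal sketch: compute a photographing ratio (A, B, C) for an
# imaginary camera, and adjust the camera interval for a requested
# space magnification.
def photographing_ratio(bbox_min, bbox_max, camera_pos, object_center):
    a = bbox_max[0] - bbox_min[0]  # horizontal length of the scene
    b = bbox_max[1] - bbox_min[1]  # vertical length of the scene
    # distance between the imaginary camera and the object center
    c = sum((o - p) ** 2 for p, o in zip(camera_pos, object_center)) ** 0.5
    return a, b, c

def adjusted_interval(base_interval, space_magnification):
    return base_interval / space_magnification  # mirrors FIG. 52

print(photographing_ratio((0, 0, 0), (4, 3, 2), (2, 1.5, -10), (2, 1.5, 1)))
print(adjusted_interval(0.065, 10))  # narrow the interval for SM = 10
```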
  • FIG. 70 illustrates a 3D display system according to another aspect of the invention.
  • This aspect of the invention is directed to display stereoscopic images such that the resolution of each display device is substantially the same as that of each stereoscopic camera.
  • the locations of the pixels that are photographed in each camera are defined with regard to a camera frame, e.g., 640×480, and are mapped to corresponding locations on a display screen, e.g., 1280×960.
  • the resolution of the display device is double that of the camera.
  • one pixel of the left top corner photographed in the camera is converted to four pixels of the display screen in the same location as shown in FIG. 70.
  • one pixel of the right bottom corner photographed in the camera is converted to four pixels of the display screen in the same location as shown in FIG. 70.
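  • The following is a minimal Python sketch of this conversion, assuming the display resolution is exactly double the camera resolution so that each camera pixel maps to a 2×2 block (nearest-neighbor upscaling).
```python
# A minimal sketch: replicate each camera pixel into four display pixels
# at the same relative location (e.g., 640x480 -> 1280x960).
def upscale_2x(frame):
    """frame: list of rows of pixel values. Returns the 2x-scaled frame."""
    out = []
    for row in frame:
        doubled = [p for p in row for _ in (0, 1)]  # duplicate each column
        out.append(doubled)
        out.append(list(doubled))                   # duplicate each row
    return out

frame = [["TL", "TR"], ["BL", "BR"]]
for row in upscale_2x(frame):
    print(row)  # each corner pixel becomes a 2x2 block of display pixels
```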
  • This aspect of the invention may be applied to all of the 3D display systems described in this application.

Abstract

This invention relates to a method and system of displaying an image. The method comprises generating a digital image of a scene by a camera. The method also comprises measuring a photographing ratio (A:B:C) of the camera while the digital image is being generated. The parameters A and B are horizontal and vertical lengths of the imaged scene, respectively, and C is a distance between an object lens of the camera and the scene. The method also comprises transmitting the image and the ratio (A:B:C) to a display device. The method comprises displaying the transmitted image such that a screen ratio (D:E:F) of the display device is substantially the same as the photographing ratio, wherein D and E are defined as horizontal and vertical lengths of the image displayed in the display device, respectively, and F is defined as a distance between the display device and a viewing point.

Description

    RELATED APPLICATIONS
  • This application is a continuation application, and claims the benefit under 35 U.S.C. §§120 and 365 of PCT application No. PCT/KR01/01398 filed on Aug. 17, 2001 and published on Feb. 21, 2002, in English, which is hereby incorporated by reference herein. This application is related to, and hereby incorporates by reference, the following patent applications: [0001]
  • U.S. Patent Application entitled "METHOD AND SYSTEM FOR CALCULATING A PHOTOGRAPHING RATIO OF A CAMERA", filed on even date herewith and having application Ser. No.______ (Attorney Docket No. GRANP2.001C1); [0002]
  • U.S. Patent Application entitled “METHOD AND SYSTEM FOR CONTROLLING THE DISPLAY LOCATION OF STEREOSCOPIC IMAGES”, filed on even date herewith and having application Ser. No.______ (Attorney Docket No. GRANP2.001C1); [0003]
  • U.S. Patent Application entitled “METHOD AND SYSTEM FOR PROVIDING THE MOTION INFORMATION OF STEREOSCOPIC CAMERAS”, filed on even date herewith and having application Ser. No.______ (Attorney Docket No. GRANP2.001C4); [0004]
  • U.S. Patent Application entitled “METHOD AND SYSTEM FOR CONTROLLING THE MOTION OF STEREOSCOPIC CAMERAS BASED ON A VIEWER'S EYE MOTION”, filed on even date herewith and having application Ser. No.______ (Attorney Docket No. GRANP2.001C5); [0005]
  • U.S. Patent Application entitled “METHOD AND SYSTEM OF STEREOSCOPIC IMAGE DISPLAY FOR GUIDING A VIEWER'S EYE MOTION USING A THREE-DIMENSIONAL MOUSE”, filed on even date herewith and having application Ser. No.______ (Attorney Docket No. GRANP2.001C6); [0006]
  • U.S. Patent Application entitled “METHOD AND SYSTEM FOR CONTROLLING THE MOTION OF STEREOSCOPIC CAMERAS USING A THREE-DIMENSIONAL MOUSE”, filed on even date herewith and having application Ser. No. ______ (Attorney Docket No. GRANP2.001C7); [0007]
  • U.S. patent application entitled "METHOD AND SYSTEM FOR CONTROLLING SPACE MAGNIFICATION FOR STEREOSCOPIC IMAGES", filed on even date herewith and having application Ser. No. ______ (Attorney Docket No. GRANP2.001C8); [0008]
  • U.S. patent application entitled “METHOD AND SYSTEM FOR ADJUSTING DISPLAY ANGLES OF A STEREOSCOPIC IMAGE BASED ON A CAMERA LOCATION”, filed on even date herewith and having application Ser. No. ______ (Attorney Docket No. GRANP2.001C9); [0009]
  • U.S. patent application entitled “METHOD AND SYSTEM FOR TRANSMITTING OR STORING STEREOSCOPIC IMAGES AND PHOTOGRAPHING RATIOS FOR THE IMAGES”, filed on even date herewith and having application Ser. No. ______ (Attorney Docket No. GRANP2.001C10); AND [0010]
  • U.S. patent application entitled “PORTABLE COMMUNICATION DEVICE FOR STEREOSCOPIC IMAGE DISPLAY AND TRANSMISSION”, filed on even date herewith and having application Ser. No. ______ (Attorney Docket No. GRANP2.001C11)[0011]
  • BACKGROUND OF THE INVENTION
  • 1. Field of the Invention [0012]
  • The present invention relates to a method and system for generating and/or displaying a more realistic stereoscopic image. Specifically, the present invention relates to a method and system for displaying at least one stereoscopic image in a set of display devices based on received photographing ratios such that each of the screen ratios for the display devices is substantially the same as each of the photographing ratios. [0013]
  • 2. Description of the Related Technology [0014]
  • In general, a human being recognizes an object by sensing the environment through the eyes. Also, as the two eyes are spaced apart a predetermined distance from each other, the object perceived by the two eyes is initially sensed as two images, each image being formed by the left or the right eye. The object is recognized by the human brain as the two images are partially overlapped. In the overlapping portion, the two different images transmitted from the left and right eyes are synthesized in the brain, producing a perception of three dimensions. [0015]
  • By using the above principle, various conventional 3-D image generating and reproducing systems using cameras and displays have been developed. [0016]
  • As one example of the systems, U.S. Pat. No. 4,729,017 discloses "Stereoscopic display method and apparatus therefor." With a relatively simple construction, the apparatus allows a viewer to view a stereoscopic image with the naked eye. [0017]
  • As another example of the systems, U.S. Pat. No. 5,978,143 discloses “Stereoscopic recording and display system.” The patent discloses that the stereoscopically shown image content is easily controllable by the observer within the scene, which is recorded by the stereo camera. [0018]
  • As another example of the systems, U.S. Pat. No. 6,005,607 discloses “Stereoscopic computer graphics image generating apparatus and stereoscopic TV apparatus.” This apparatus stereoscopically displays two-dimensional images generated from three-dimensional structural information. [0019]
  • SUMMARY OF CERTAIN INVENTIVE ASPECTS OF THE INVENTION
  • One aspect of the invention provides a method of displaying an image. The method comprises generating a digital image of a scene by a camera. The method also comprises measuring a photographing ratio (A:B:C) of the camera while the digital image is being generated, wherein parameters A and B are defined as horizontal and vertical lengths of the scene imaged by the camera, respectively, and C is defined as a distance between an object lens of the camera and the scene. The method also comprises transmitting the image and the photographing ratio (A:B:C) to a display device. In addition, the method comprises displaying the transmitted image in the display device such that a screen ratio (D:E:F) of the display device is substantially the same as the photographing ratio (A:B:C), wherein D and E are defined as horizontal and vertical lengths of the image displayed in the display device, respectively, and F is defined as a distance between the display device and a viewing point. [0020]
  • Another aspect of the invention provides a method of displaying stereoscopic images. The method comprises producing at least one stereoscopic image of a scene, the stereoscopic image comprising a pair of two-dimensional plane images produced by first and second cameras. The method also comprises measuring a first photographing ratio (A1:B1:C1) of the first camera and a second photographing ratio (A2:B2:C2) of the second camera while the scene is being imaged, respectively, wherein A1 and A2, and B1 and B2 are defined as horizontal and vertical lengths of the scene imaged by each of the first and second cameras, respectively, and C1 and C2 are defined as distances between object lenses of the cameras and the scene, respectively. The method also comprises transmitting the stereoscopic image and the photographing ratios (A1:B1:C1, A2:B2:C2) to first and second display devices, respectively. In addition, the method also comprises displaying the transmitted stereoscopic image in the display devices such that each screen ratio (D1:E1:F1, D2:E2:F2) of the display devices is substantially the same as each photographing ratio (A1:B1:C1, A2:B2:C2), wherein D1 and D2, and E1 and E2 are defined as horizontal and vertical lengths of the two-dimensional plane images displayed in each of the display devices, respectively, and F1 and F2 are defined as distances between the display devices and viewing points, respectively. [0021]
  • Another aspect of the invention provides a system for displaying at least one image in at least one display device. The system comprises a receiver, a signal separator, an image size adjusting portion, and at least one display portion. The receiver receives at least one image of a scene and at least one photographing ratio, the at least one image being produced by at least one camera, wherein the photographing ratio is defined as a ratio (A:B:C) between three parameters A, B and C, wherein parameters A and B are defined as horizontal and vertical lengths of the scene imaged by the camera, respectively, and C is defined as a distance between an object lens of the camera and the scene. The signal separator separates the image and the photographing ratio. The image size adjusting portion adjusts a size of the received image to be displayed based on the photographing ratio and at least one screen ratio, wherein the screen ratio is defined as a ratio (D:E:F) between three parameters D, E and F, wherein parameters D and E are defined as horizontal and vertical lengths of the image displayed in the display device, respectively, and F is defined as a distance between the display device and a viewing point. The at least one display portion displays the adjusted image. [0022]
  • Still another aspect of the invention provides a method of displaying images in at least one display device. The method comprises receiving at least one image of a scene and a photographing ratio, the at least one image being produced by at least one camera, wherein the photographing ratio is defined as a ratio (A:B:C) between three parameters A, B and C, wherein parameters A and B are defined as horizontal and vertical lengths of the scene imaged by the camera, respectively, and C is defined as a distance between an object lens of the camera and the scene. The method also comprises determining a screen ratio of the display device, wherein the screen ratio is defined as a ratio (D:E:F) between three parameters D, E and F, wherein parameters D and E are defined as horizontal and vertical lengths of the image displayed in the device, respectively, and F is defined as a distance between the display device and a viewing point. The method also comprises adjusting a size of the received image to be displayed such that the photographing ratio (A:B:C) is substantially the same as the screen ratio (D:E:F). The method comprises displaying the adjusted image in the display device. [0023]
  • Still another aspect of the invention provides a method of displaying stereoscopic images. The method comprises producing at least one stereoscopic image of a scene from three-dimensional structural data, the stereoscopic image comprising a pair of two-dimensional plane images projected by first and second projection portions. The method also comprises providing a first photographing ratio (A1:B1:C1) of the first projection portion and a second photographing ratio (A2:B2:C2) of the second projection portion, respectively, wherein A1 and A2, and B1 and B2 are defined as horizontal and vertical lengths of the scene projected by each of the first and second projection portions, respectively, and C1 and C2 are defined as distances between the first and second projection portions and the scene, respectively. The method also comprises transmitting the produced stereoscopic image and the corresponding photographing ratios (A1:B1:C1, A2:B2:C2) to first and second display devices. The method comprises displaying the transmitted stereoscopic image in the display devices such that each screen ratio (D1:E1:F1, D2:E2:F2) of the display devices is substantially the same as each photographing ratio (A1:B1:C1, A2:B2:C2), wherein D1 and D2, and E1 and E2 are defined as horizontal and vertical lengths of the two-dimensional images displayed in each of the display devices, respectively, and F1 and F2 are defined as distances between the display devices and viewing points, respectively. [0024]
  • Yet another aspect of the invention provides a method of displaying stereoscopic images. The method comprises producing at least one stereoscopic image of a scene from three-dimensional structural data, the stereoscopic image comprising a pair of two-dimensional plane images projected by first and second projection portions. The method also comprises providing a first photographing ratio (A1:B1:C1) of the first projection portion and a second photographing ratio (A2:B2:C2) of the second projection portion, respectively, wherein A1 and A2, and B1 and B2 are defined as horizontal and vertical lengths of the scene projected by each of the first and second projection portions, respectively, and C1 and C2 are defined as distances between the first and second projection portions and the scene, respectively. The method also comprises displaying the stereoscopic image in a pair of display devices such that each screen ratio (D1:E1:F1, D2:E2:F2) of the display devices is substantially the same as each photographing ratio (A1:B1:C1, A2:B2:C2), wherein D1 and D2, and E1 and E2 are defined as horizontal and vertical lengths of the two-dimensional plane images displayed in each of the display devices, respectively, and F1 and F2 are defined as distances between the display devices and viewing points, respectively. [0025]
  • Yet another aspect of the invention provides a system for displaying stereoscopic images. The system comprises first and second projection portions, a computing device and a display portion. The first and second projection portions produce at least one stereoscopic image of a scene from three-dimensional structural data, the stereoscopic image comprising a pair of two-dimensional plane images projected by first and second projection portions. The computing device provides a first photographing ratio (A1:B1:C1) of the first projection portion and a second photographing ratio (A2:B2:C2) of the second projection portion, respectively, wherein A1 and A2, and B1 and B2 are defined as horizontal and vertical lengths of the scene projected by each of the first and second projection portions, respectively, and C1 and C2 are defined as distances between each of the projection portions and the scene, respectively. The display portion displays the stereoscopic image in a pair of display screens such that each screen ratio (D1:E1:F1, D2:E2:F2) of the display screens is substantially the same as each photographing ratio (A1:B1:C1, A2:B2:C2), wherein D1 and D2, and E1 and E2 are defined as horizontal and vertical lengths of the two-dimensional plane images displayed in each of the display screens, respectively, and F1 and F2 are defined as distances between the display screens and viewing points, respectively.[0026]
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1A illustrates one typical 3-D image generating and reproducing apparatus. [0027]
  • FIG. 1B illustrates another typical 3-D image generating and reproducing apparatus. [0028]
  • FIGS. 2A and 2B illustrate a photographing ratio of a camera. [0029]
  • FIGS. 3A and 3B illustrate a screen ratio of a display device that displays a photographed image. [0030]
  • FIG. 4A illustrates the variation of the distance between an object lens and a film according to the variation of a focal length of a camera. [0031]
  • FIG. 4B illustrates the variation of a photographing ratio according to the variation of the focal length of the camera. [0032]
  • FIG. 4C shows the relationship between a photographing ratio and the focal length of the camera. [0033]
  • FIG. 4D illustrates an exemplary table showing maximum and minimum photographing ratios of a camera. [0034]
  • FIG. 5A illustrates a photographing ratio calculation apparatus according to one aspect of the invention. [0035]
  • FIG. 5B illustrates a photographing ratio calculation apparatus according to another aspect of the invention. [0036]
  • FIG. 6A illustrates an exemplary flowchart for explaining the operation of the photographing ratio calculation apparatus of FIG. 5A. [0037]
  • FIG. 6B illustrates an exemplary flowchart for explaining the operation of the photographing ratio calculation apparatus of FIG. 5B. [0038]
  • FIG. 7 illustrates a camera comprising the photographing ratio calculation apparatus as shown in FIGS. 5A and 5B. [0039]
  • FIG. 8 illustrates a system for displaying stereoscopic images such that a photographing ratio (A:B:C) is substantially the same as a screen ratio (D:E:F). [0040]
  • FIG. 9 illustrates an exemplary flowchart for explaining the operation of the image size adjusting portion of FIG. 8. [0041]
  • FIG. 10 is a conceptual drawing for explaining the image size adjustment in each of the display devices. [0042]
  • FIG. 11 illustrates an exemplary flowchart for explaining the entire operation of the system shown in FIG. 8. [0043]
  • FIG. 12 illustrates examples of the display system according to one aspect of the invention. [0044]
  • FIG. 13 illustrates a 3D display system including an eye position fixing device according to one aspect of the invention. [0045]
  • FIG. 14 illustrates a relationship between the displayed images and a viewer's eyes. [0046]
  • FIG. 15 illustrates a 3D image display system according to one aspect of the invention. [0047]
  • FIG. 16A illustrates an exemplary flowchart for explaining the operation of the system of FIG. 15. [0048]
  • FIG. 17 is a conceptual drawing for explaining the operation of the display device of FIG. 15. [0049]
  • FIG. 18 illustrates a 3D image display system according to another aspect of the invention. [0050]
  • FIG. 19 illustrates an exemplary flowchart for explaining the operation of the system of FIG. 18. [0051]
  • FIG. 20 illustrates an exemplary flowchart for explaining the operation of the system of FIG. 18. [0052]
  • FIG. 21A illustrates an eye lens motion detection device. [0053]
  • FIG. 21B is a conceptual drawing for explaining the movement of the eye lenses. [0054]
  • FIG. 22 is a conceptual drawing for explaining the movement of the center points of the displayed images. [0055]
  • FIG. 23 illustrates a camera system for a 3D display system according to one aspect of the invention. [0056]
  • FIG. 24 illustrates a display system corresponding to the camera system shown in FIG. 23. [0057]
  • FIG. 25 illustrates an exemplary flowchart for explaining the operation of the camera and display systems shown in FIGS. 23 and 24. [0058]
  • FIG. 26A is a conceptual drawing that illustrates parameters for a set of stereoscopic cameras. [0059]
  • FIG. 26B is a conceptual drawing that illustrates parameters for a viewer's eyes. [0060]
  • FIG. 27 is a conceptual drawing that illustrates the movement of a set of stereoscopic cameras. [0061]
  • FIG. 28 is a conceptual drawing for explaining the eye lens movement according to the distance between the viewer and an object.
  • FIG. 29 illustrates a 3D display system for controlling a set of stereoscopic cameras according to another aspect of the invention. [0062]
  • FIG. 30 illustrates an exemplary block diagram of the camera controllers shown in FIG. 29. [0063]
  • FIG. 31 illustrates an exemplary flowchart for explaining the operation of the camera controllers according to one aspect of the invention. [0064]
  • FIG. 32A illustrates an exemplary table for controlling horizontal and vertical motors. [0065]
  • FIG. 32B illustrates a conceptual drawing that explains motion of the camera. [0066]
  • FIG. 33 illustrates an exemplary flowchart for explaining the operation of the system shown in FIG. 29. [0067]
  • FIG. 34 illustrates a stereoscopic camera controller system used for a 3D display system according to another aspect of the invention. [0068]
  • FIG. 35 illustrates an exemplary table showing the relationship between camera adjusting values and selected cameras. [0069]
  • FIG. 36A is a top plan view of the plural sets of stereoscopic cameras. [0070]
  • FIG. 36B is a front elevational view of the plural sets of stereoscopic cameras. [0071]
  • FIG. 37 illustrates an exemplary flowchart for explaining the operation of the system shown in FIG. 34. [0072]
  • FIG. 38 illustrates a 3D display system according to another aspect of the invention. [0073]
  • FIG. 39 illustrates one example of a 3D display image. [0074]
  • FIGS. 40A-40H illustrate conceptual drawings that explain the relationship between the 3D mouse cursors and eye lens locations. [0075]
  • FIG. 41 illustrates an exemplary block diagram of the display devices as shown in FIG. 38. [0076]
  • FIG. 42 illustrates an exemplary flowchart for explaining the operation of the display devices of FIG. 41. [0077]
  • FIGS. 43A and 43B illustrate conceptual drawings that explain a method for calculating the location of the center points of the eye lens. [0078]
  • FIG. 44 is a conceptual drawing for explaining a determination method of the location of the center points of the displayed images. [0079]
  • FIG. 45 illustrates a 3D display system according to another aspect of the invention. [0080]
  • FIG. 46 illustrates an exemplary block diagram of the display device of FIG. 45. [0081]
  • FIG. 47 is a conceptual drawing for explaining the camera control based on the movement of the eye lenses. [0082]
  • FIG. 48 illustrates an exemplary flowchart for explaining the operation of the system shown in FIG. 45. [0083]
  • FIG. 49 illustrates a 3D display system according to another aspect of the invention. [0084]
  • FIG. 50 illustrates an exemplary block diagram of the camera controller of FIG. 49. [0085]
  • FIG. 51 illustrates an exemplary flowchart for explaining the camera controller of FIG. 50. [0086]
  • FIG. 52 illustrates an exemplary table for explaining the relationship between the space magnification and camera distance. [0087]
  • FIG. 53 illustrates an exemplary flowchart for explaining the operation of the entire system shown in FIG. 49. [0088]
  • FIG. 54 illustrates a 3D display system according to another aspect of the invention. [0089]
  • FIG. 55 illustrates an exemplary table for explaining the relationship between the camera motion and display angle. [0090]
  • FIG. 56 illustrates an exemplary flowchart for explaining the entire operation of the system shown in FIG. 54. [0091]
  • FIG. 57 illustrates a 3D display system according to another aspect of the invention. [0092]
  • FIG. 58 illustrates an exemplary block diagram of the display device of FIG. 57. [0093]
  • FIGS. 59A and 59B are conceptual drawings for explaining the adjustment of the displayed image. [0094]
  • FIG. 60 illustrates an exemplary flowchart for explaining the operation of the system of FIG. 54. [0095]
  • FIG. 61 illustrates an exemplary block diagram for transmitting stereoscopic images and photographing ratios for the images. [0096]
  • FIG. 62 illustrates an exemplary block diagram for storing on a persistent memory stereoscopic images and photographing ratios for the images. [0097]
  • FIG. 63 illustrates an exemplary format of the data that are stored in the recording medium of FIG. 62. [0098]
  • FIG. 64 illustrates an exemplary block diagram of a pair of portable communication devices comprising a pair of digital cameras and a pair of display screens. [0099]
  • FIG. 65 illustrates an exemplary block diagram of a portable communication device for displaying stereoscopic images based on a photographing ratio and a screen ratio. [0100]
  • FIGS. 66A and 66B illustrate an exemplary block diagram of a portable communication device for controlling the location of the stereoscopic images. [0101]
  • FIG. 67 illustrates an exemplary block diagram of a portable communication device for controlling space magnification for stereoscopic images. [0102]
  • FIG. 68 illustrates a conceptual drawing for explaining a portable communication device having separate display screens. [0103]
  • FIGS. 69A and 69B illustrate an exemplary block diagram for explaining the generation of the stereoscopic images from three-dimensional structural data. [0104]
  • FIG. 70 illustrates a 3D display system for conforming the resolution between the stereoscopic cameras and display devices. [0105]
  • DETAILED DESCRIPTION OF CERTAIN EMBODIMENTS OF THE INVENTION
  • FIG. 1A illustrates one typical 3-D image generating and reproducing apparatus. The system of FIG. 1A uses two display devices to display stereoscopic images. The apparatus includes a set of stereoscopic cameras 110 and 120, spaced apart a predetermined distance from each other. The cameras 110 and 120 may be spaced apart by the same distance as that between a viewer's two eyes, so as to photograph an object 100 from two different positions. Each of the cameras 110 and 120 provides its photographed image simultaneously or sequentially to the display devices 140 and 150, respectively. The display devices 140 and 150 are located such that a viewer can watch the image displayed on each device 140 and 150 through his or her left and right eyes, respectively. The viewer can recognize a 3-D image by simultaneously or sequentially perceiving and synthesizing the left and right images. That is, when the viewer sees a pair of stereoscopic images with each eye, a single image (object) is perceived as having a 3D quality. [0106]
  • FIG. 1B illustrates another typical 3-D image generating and reproducing apparatus. The system of FIG. 1B uses one display device to display stereoscopic images. The apparatus includes a set of stereoscopic cameras 110 and 120, spaced apart a predetermined distance from each other for photographing the same object 100 from two different positions. Each camera 110 and 120 provides its photographed image to a synthesizing device 130. The synthesizing device 130 receives the two images from the left and right cameras 110 and 120, and sequentially irradiates the received images onto a display device 160. The synthesizing device 130 may be located at either the camera site or the display site. The viewer wears special glasses 170 that allow each displayed image to be seen by the corresponding eye. The glasses 170 may include a filter or a shutter that allows the viewer to see each image alternately. The display device 160 may comprise an LCD or 3-D glasses such as a head mounted display (HMD). Thus, the viewer can recognize a 3-D image by sequentially perceiving the left and right images through each eye. [0107]
  • Here, the size of the displayed image is determined by the distance between the two cameras and the object to be photographed, and by the size of the photographed object. Also, when the distance between the left and right images displayed on the display device has the same ratio as the distance between a viewer's two eyes, the viewer has the sense of viewing the actual object in three dimensions. [0108]
  • In the above technology, an object is not always photographed by a fixed camera in a fixed state: the object may move, the camera may move, or a magnifying (zoom-in) or reducing (zoom-out) imaging function may be performed with respect to the object. In those situations, the distance between the camera and the photographed object, or the size of the photographed object, changes. Thus, a viewer may perceive the image with a sense of distance different from the actual distance between the camera and the object. [0109]
  • Also, even when the distance between the object and the stereoscopic cameras is fixed during photographing, each viewer has his or her own unique eye distance, a biometric measured as the distance between the center points of the viewer's eyes. For example, the distance between an adult's eyes is quite different from that of a child's eyes. The eye distance also varies between viewers of the same age. Meanwhile, in current 3D display systems, the distance between the center points of each stereoscopic image is fixed at the value of the average adult (i.e., 70 mm), as exemplified in FIGS. 1A and 1B. However, as discussed above, each viewer has his or her own personal eye distance. This may cause a headache when the viewer watches stereoscopic images, as well as a distorted sense of three dimensions. In certain instances, the sense of three dimensions is not perceived at all. [0110]
  • In order to display a realistic 3D image, one aspect of the invention is to adjust display images or display devices such that a screen ratio (D:E:F) in the display device is substantially the same as a photographing ratio (A:B:C) in the camera. Hereinafter, the terms 3D image and stereoscopic image will be used to convey the same meaning. Also, a stereoscopic image comprises a pair of two-dimensional plane images produced by a pair of stereoscopic cameras, and stereoscopic images comprise a plurality of such image pairs. [0111]
  • Photographing Ratio (A:B:C) and Screen Ratio (D:E:F) [0112]
  • FIGS. 2A and 2B illustrate a photographing ratio of a camera. The ratio relates to the scope or size of the space that the camera can photograph in a scene, which is proportional to the range seen through the viewfinder of the camera. The photographing ratio includes three parameters (A, B, C). Parameters A and B are defined as the horizontal and vertical lengths, respectively, of the space including the object 22 photographed by the camera 20. Parameter C is defined as the perpendicular distance between the camera 20 and the object 22. Generally, a camera has its own horizontal and vertical ranges within which it can photograph an object, and the ratio of the horizontal and vertical lengths is typically constant, e.g., 4:3 or 16:9. Thus, once one of the horizontal and vertical lengths is determined, the other length may be automatically determined. In one embodiment of the invention, the camera 20 comprises a video camera, a still camera, an analog camera, or a digital camera. [0113]
  • For the purpose of explanation, assume that the object 22 is located 10 m away from the camera 20 and is photographed such that the object 22 is included in a single film or image frame as shown in FIGS. 2A and 2B. If the horizontal length (A) is 20 m, the vertical length (B) would be 15 m for a 4:3 camera ratio. Since the distance between the camera 20 and the object 22 is 10 m, the photographing ratio is 20:15:10 = 2:1.5:1. In one embodiment of the invention, the current photographing ratio while photographing an object may be determined based on the optical properties of the camera object lens, e.g., the maximum photographing ratio and minimum photographing ratio. [0114]
  • FIGS. 3A and 3B illustrate a screen ratio of a display device that displays a photographed image. The screen ratio relates to the range or scope that a viewer can see through a display device. The screen ratio includes three parameters (D, E, F). Parameters D and E are defined as the horizontal and vertical lengths, respectively, of the image displayed on the display device 24. Parameter F is defined as the perpendicular distance between the display device and a viewer's eye 26. For convenience, only one eye 26 and one display device 24 are illustrated in FIGS. 3A and 3B instead of two eyes and a set of display devices. F may be measured automatically using a distance detection sensor, measured manually, or fixed. In one embodiment of the invention, parameters D and E are adjusted such that the photographing ratio (A:B:C) equals the screen ratio (D:E:F). Thus the size of the adjusted image on the display device 24 corresponds to that of the image captured by the camera 20. This means that a viewer watches the displayed image at the same ratio at which the camera 20 photographed the object. Thus, always maintaining the relationship "A:B:C = D:E:F" provides a more realistic 3D image to the viewer. Accordingly, in one embodiment of the invention, if the camera photographs an object with a large photographing ratio, the image is displayed using a large screen ratio. [0115]
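  • As a numerical sketch of this relationship (the values are hypothetical, not taken from the drawings): if the camera photographs with ratio A:B:C = 2:1.5:1 and the viewer sits F = 0.5 m from the screen, the displayed image must be 1.0 m wide and 0.75 m tall for the two ratios to match. The short Python fragment below illustrates the arithmetic; the specification formalizes the same computation later as Equation V.

    # Solve D and E so that A:B:C = D:E:F (hypothetical example values).
    A, B, C = 2.0, 1.5, 1.0   # photographing ratio of the camera
    F = 0.5                   # viewer-to-screen distance, in meters
    D = A * F / C             # required image width:  1.0 m
    E = B * F / C             # required image height: 0.75 m
    print(D, E)               # -> 1.0 0.75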
  • FIG. 4A illustrates the variation of the distance between an object lens and a film according to the variation of the focal length of the camera 20. (Note that although the term "film" is used in this specification, the term is not limited to analog image recording media. For instance, a CCD device or a CMOS image sensor may be used to capture an image in a digital context.) The camera 20 may have more focal length ranges, but only four focal length ranges are exemplified in FIG. 4A. [0116]
  • As shown in FIG. 4A, the distance between the film and the object lens ranges from d1 to d4 according to the focal length of the camera 20. The focal length may be adjusted by a focus adjusting portion (which will be explained below) of the camera 20. The distance (d1) is shortest when the focal length is "infinity" (∞). When the camera 20 is set to have an infinity focal length, the camera 20 receives the greatest amount of light through the object lens. The distance (d4) is longest when the focal length is "0.5 m," where the camera receives the least amount of light through the object lens. That is, the amount of light coming into the camera 20 varies according to the focal length of the camera 20. [0117]
  • Since the location of the object lens is normally fixed, in order to change the distance from d1 to d4, the location of the film moves between Ps and P1 by as much as "d" according to the focal length. The focus adjusting portion of the camera 20 adjusts the location of the film from Ps to P1. The focus adjustment of the camera 20 may be performed manually or automatically. [0118]
  • FIG. 4B illustrates the variation of the photographing ratio according to the variation of the focal length of the camera 20. The photographing ratio (A:B:C) may be expressed as (A/C:B/C). When the camera is set to have an infinity focal length, the value A/C or B/C is largest, which is shown as "2.0/1" in FIG. 4B. In contrast, when the camera 20 is set to have, e.g., a "0.5 m" focal length, the value A/C or B/C is smallest, which is shown as "1.0/1" in FIG. 4B. That is, the more light the camera receives, the larger the photographing ratio. Similarly, the longer the focal length, the greater the photographing ratio. [0119]
  • FIG. 4C shows the relationship between a photographing ratio and a focal length of a camera. The focal length of the camera may be determined, e.g., by detecting a current scale location of the focus adjusting portion of the camera. As shown in FIG. 4C, when the camera has a focal length range of “0.3 m to infinity,” the focus adjusting portion is located in one position of the scales between 0.3 m and infinity while the camera is photographing an object. In this situation, the photographing ratio varies linearly as shown in FIG. 4C. If the camera has a focus adjusting portion that is automatically adjusted while photographing an object, the photographing ratio may be determined by detecting the current focal length that is automatically adjusted. [0120]
  • FIG. 4D illustrates an exemplary table showing maximum and minimum photographing ratios of a camera. As described above, a camera has the maximum photographing ratio (A:B:C = 3:2:1) when the focal length is the longest, i.e., a distance of infinity as shown in FIG. 4D. In addition, the camera has the minimum photographing ratio (A:B:C = 1.5:1:1) when the focal length is the shortest, i.e., "0.3 m" as shown in FIG. 4D. The maximum and minimum photographing ratios of the camera are determined by the optical characteristics of the camera. In one embodiment, a camera manufacturing company may provide the maximum and minimum photographing ratios in the technical specification of the camera. The table in FIG. 4D is used for determining a photographing ratio when the focus adjusting portion is located at one of the scales between "0.3 m" and infinity. [0121]
  • Method and System for Calculating a Photographing Ratio of a Camera [0122]
  • FIG. 5A illustrates a photographing ratio calculation apparatus according to one aspect of the invention. The apparatus comprises a focus adjusting portion (FAP) [0123] 52, a FAP location detection portion 54, a memory 56, and a photographing ratio calculation portion 58. In one embodiment, the photographing ratio calculation apparatus may be embedded into the camera 20.
  • The [0124] focus adjusting portion 52 adjusts the focus of the object lens of the camera 20. The focus adjusting portion 52 may perform its function either manually or automatically. In one embodiment of the invention, the focus adjusting portion 52 may comprise 10 scales between “0.3 m and infinity,” and is located in one of the scales while the camera 20 is photographing an object. In one embodiment of the invention, the focus adjusting portion 52 may use a known focus adjusting portion that is used in a typical camera.
  • The FAP [0125] location detection portion 54 detects the current scale location of the focus adjusting portion 52 among the scales. In one embodiment of the invention, the FAP location detection portion 54 may comprise a known position detection sensor that detects the scale value in which the focus adjusting portion 52 is located. In another embodiment of the invention, since the variation of the scale location is proportional to the distance between the object lens and film as shown in FIG. 4A, the FAP location detection portion 54 may comprise a known distance detection sensor that measures the distance between the object lens and film.
  • The memory 56 stores data representing the maximum and minimum photographing ratios of the camera 20. In one embodiment of the invention, the memory 56 comprises a ROM, a flash memory, or a programmable ROM. This may apply to all of the other memories described throughout the specification. [0126]
  • The photographing [0127] ratio calculation portion 58 calculates a photographing ratio (A:B:C) based on the detected scale location and the maximum and minimum photographing ratios. In one embodiment of the invention, the photographing ratio calculation portion 58 comprises a digital signal processor (DSP) calculating the ratio (A:B:C) using the following Equations I and II.
  • Equation I: [0128] A = ((Amax - Amin)/c) × (Scur/Stot) + Amin/c
  • Equation II: [0129] B = ((Bmax - Bmin)/c) × (Scur/Stot) + Bmin/c
  • In Equations I and II, parameters Amax and Bmax represent the horizontal and vertical length values (A and B) of the maximum photographing ratio, respectively, exemplified as "3" and "2" in FIG. 4D. Parameters Amin and Bmin represent the horizontal and vertical length values (A and B) of the minimum photographing ratio, respectively, shown as "1.5" and "1" in FIG. 4D. Parameters Scur and Stot represent the currently detected scale value and the total scale value, respectively. Parameter "c" represents the distance value of the maximum or minimum photographing ratio. Since the photographing ratio (A:B:C) represents the relative proportion between the three parameters A, B and C, the parameters may be simplified as shown in FIG. 4D. For example, the photographing ratio A:B:C = 300:200:100 is the same as A:B:C = 3:2:1. In one embodiment of the invention, the parameter "c" has the value "1" as shown in FIG. 4D. [0130]
  • In another embodiment of the invention, the photographing [0131] ratio calculation portion 58 calculates a photographing ratio (A:B:C) such that the ratio falls between the maximum and minimum photographing ratios and at the same time is proportional to the value of the detected scale location. Thus, as long as the ratio falls between the maximum and minimum photographing ratios and is proportional to the value of the detected scale location, any other equation may be used for calculating the photographing ratio.
  • Referring to FIG. 6A, the operation of the photographing ratio calculation apparatus of FIG. 5A will be explained. The camera 20 photographs an object (602). In one embodiment of the invention, the camera 20 comprises a single (mono) camera. In another embodiment of the invention, the camera 20 comprises a pair of stereoscopic cameras as shown in FIG. 1A. In either case, the operation of the apparatus will be described based on the single camera for convenience. [0132]
  • Maximum and minimum photographing ratios are provided from the [0133] memory 56 to the photographing ratio calculation portion 58 (604). In one embodiment of the invention, the photographing ratio calculation portion 58 may store the maximum and minimum photographing ratios therein. In this situation, the memory 56 may be omitted from the apparatus.
  • The FAP [0134] location detection portion 54 detects the current location of the focus adjusting portion 52 while the camera 20 is photographing the object (606). While the camera is photographing the object, the focal length may be changed. The detected current location of the focus adjusting portion 52 is provided to the photographing ratio calculation portion 58.
  • The photographing ratio calculation portion 58 calculates the horizontal value (A) of the current photographing ratio from Equation I (608). It is assumed that the detected current location value is "5" among the total scale values of "10." Using Equation I and the table of FIG. 4D, the horizontal value A is obtained as follows: A = ((3 - 1.5)/1) × (5/10) + 1.5/1 = 2.25. [0135]
  • The photographing ratio calculation portion 58 calculates the vertical value (B) of the current photographing ratio from Equation II (610). In the above example, using Equation II and the table of FIG. 4D, the vertical value B is obtained as follows: B = ((2 - 1)/1) × (5/10) + 1/1 = 1.5. [0136]
  • The photographing [0137] ratio calculation portion 58 retrieves parameter C from the maximum and minimum ratios that have been used for calculating parameters A and B (612). Referring to the table of FIG. 4D, the distance value (C) is “1.” The photographing ratio calculation portion 58 provides a current photographing ratio (A:B:C) (614). In the above example, the current photographing ratio=2.25:1.5:1.
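  • The calculation of Equations I and II can be condensed into a short sketch. The following Python fragment is a minimal, hypothetical implementation (the function and parameter names are illustrative and do not appear in the specification); it reproduces the worked example above, yielding the photographing ratio 2.25:1.5:1.

    def photographing_ratio(a_max, b_max, a_min, b_min, c, s_cur, s_tot):
        """Interpolate the current photographing ratio (A:B:C) between the
        camera's minimum and maximum ratios from the focus-scale position."""
        a = ((a_max - a_min) / c) * (s_cur / s_tot) + a_min / c  # Equation I
        b = ((b_max - b_min) / c) * (s_cur / s_tot) + b_min / c  # Equation II
        return a, b, c

    # Worked example: max ratio 3:2:1, min ratio 1.5:1:1 (FIG. 4D),
    # focus adjusting portion detected at scale 5 of 10.
    print(photographing_ratio(3, 2, 1.5, 1, 1, 5, 10))  # -> (2.25, 1.5, 1)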
  • FIG. 5B illustrates a block diagram of a photographing ratio calculation apparatus according to another aspect of the invention. The apparatus comprises an [0138] iris 62, an iris opening detection portion 64, a memory 66 and a photographing ratio calculation portion 68. In one embodiment of the invention, the photographing ratio calculation apparatus is embedded into the camera 20.
  • The [0139] iris 62 is a device that adjusts an amount of light coming into the camera 20 according to the degree of its opening. When the degree of the opening of the iris 62 is largest, the maximum amount of light shines on the film of the camera 20. This largest opening corresponds to the longest focal length and the maximum photographing ratio. In contrast, when the degree of the opening of the iris 62 is smallest, the least amount of light comes into the camera 20. This smallest opening corresponds to the shortest focal length and the minimum photographing ratio. In one embodiment of the invention, the iris 62 may be a known iris that is used in a typical camera.
  • The iris opening detection portion 64 detects the degree of the opening of the iris 62. The degree of the opening of the iris 62 may be quantized to a range of, for example, 1-10. Degree "10" may mean the largest opening of the iris 62 and degree "1" may mean the smallest opening of the iris 62. The memory 66 stores data representing the maximum and minimum photographing ratios of the camera 20. [0140]
  • The photographing [0141] ratio calculation portion 68 calculates a photographing ratio (A:B:C) based on the detected degree of the opening and the maximum and minimum photographing ratios. In one embodiment of the invention, the photographing ratio calculation portion 68 comprises a digital signal processor (DSP) calculating the ratio (A:B:C) using the following Equations III and IV.
  • Equation III: [0142] A = ((Amax - Amin)/c) × (Icur/Ilargest) + Amin/c
  • Equation IV: [0143] B = ((Bmax - Bmin)/c) × (Icur/Ilargest) + Bmin/c
  • In Equations III and IV, parameters Amax, Bmax, Amin, Bmin, and "c" are the same as the parameters used in Equations I and II. Parameters Icur and Ilargest represent the detected current degree of the opening and the largest degree of the opening, respectively. [0144]
  • Referring to FIG. 6B, the operation of the photographing ratio calculation apparatus will be described. The operation with regard to the first two procedures 702 and 704 is the same as that of the corresponding procedures in FIG. 6A. [0145]
  • The iris [0146] opening detection portion 64 detects the current degree of the opening of the iris 62 while the camera 20 is photographing the object (706). The detected degree of the opening of the iris 62 is provided to the photographing ratio calculation portion 68.
  • The photographing ratio calculation portion 68 calculates the horizontal value (A) of the current photographing ratio from Equation III (708). It is assumed that the detected current opening degree is 2 among the total degree values of 10. Using Equation III and FIG. 4D, the horizontal value A is obtained as follows: A = ((3 - 1.5)/1) × (2/10) + 1.5/1 = 1.8. [0147]
  • The photographing ratio calculation portion 68 calculates the vertical value (B) of the current photographing ratio from Equation IV (710). In the above example, using Equation IV and FIG. 4D, the vertical value B is obtained as follows: B = ((2 - 1)/1) × (2/10) + 1/1 = 1.2. [0148]
  • The photographing [0149] ratio calculation portion 68 retrieves parameter C from the maximum and minimum ratios that have been used for calculating parameters A and B (712). Referring to FIG. 4D, the distance value is “1.” The photographing ratio calculation portion 68 provides a current photographing ratio (A:B:C) (714). In the above example, a current photographing ratio is 1.8:1.2:1.
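  • Equations III and IV have the same form as Equations I and II, with the iris-opening fraction replacing the focus-scale fraction, so the hypothetical photographing_ratio sketch given earlier applies unchanged; only the meaning of the fraction differs.

    # Iris-based variant (Equations III and IV): the fraction is the detected
    # opening degree over the largest degree, here 2 of 10.
    print(photographing_ratio(3, 2, 1.5, 1, 1, 2, 10))  # -> approx. (1.8, 1.2, 1)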
  • FIG. 7 illustrates a camera comprising the photographing ratio calculation apparatus as shown in FIGS. 5A and 5B. The [0150] camera 20 comprises an image data processing apparatus 70, a microcomputer 72, a photographing ratio calculation apparatus 74, and a data combiner 76.
  • In one embodiment of the invention, the camera 20 comprises an analog camera or a digital camera. When the camera 20 photographs an object, the image data processing apparatus 70 performs typical image processing of the photographed image according to the control of the microcomputer 72. In one embodiment of the invention, the image data processing apparatus 70 may comprise a digitizer that digitizes the photographed analog image into digital values, a memory that stores the digitized data, and a digital signal processor (DSP) that performs image data processing of the digitized image data (all not shown). The image data processing apparatus 70 provides the processed data to the data combiner 76. [0151]
  • In one embodiment, the photographing [0152] ratio calculation apparatus 74 comprises the apparatus shown in FIGS. 5A or 5B. The photographing ratio calculation apparatus 74 calculates a photographing ratio (A:B:C). The calculated photographing ratio (A:B:C) data are provided from the apparatus 74 to the data combiner 76.
  • The [0153] microcomputer 72 controls the image data processing apparatus 70, the photographing ratio calculation apparatus 74, and the data combiner 76 such that the camera 20 outputs the combined data 78. In one embodiment of the invention, the microcomputer 72 controls the image data processing apparatus 70 such that the apparatus properly processes the digital image data. In this embodiment of the invention, the microcomputer 72 controls the photographing ratio calculation apparatus 74 to calculate a photographing ratio for the image being photographed. In this embodiment of the invention, the microcomputer 72 controls the data combiner 76 to combine the processed data and the photographing ratio data corresponding to the processed data. In one embodiment of the invention, the microcomputer 72 may provide a synchronization signal to the data combiner 76 so as to synchronize the image data and the ratio data. As discussed above, as long as the current scale location of the focus adjusting portion or the opening degree of the iris is not changed, the photographing ratio is not changed. The microcomputer 72 may detect the change of the scale location or the opening degree, and control the data combiner 76 such that the image data and the corresponding ratio data are properly combined.
  • In one embodiment of the invention, the microcomputer 72 is programmed to perform the above functions using typical microcomputer products available from companies such as Intel, IBM, and Motorola. Such products may also be used for the other microcomputers described throughout this specification. [0154]
  • The [0155] data combiner 76 combines the image data from the image data processing apparatus 70 and the calculated photographing ratio (A:B:C) data according to the control of the microcomputer 72. The combiner 76 outputs the combined data 78 in which the image data and the ratio data may be synchronized with each other. In one embodiment of the invention, the combiner 76 comprises a known multiplexer.
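  • As a concrete illustration of the combiner's role, the sketch below multiplexes each image frame with its photographing ratio using a simple record layout. The framing format and the names are assumptions made for illustration only; the specification requires merely that the image data and the ratio data be combined and synchronized, e.g., by a known multiplexer.

    import struct

    # Hypothetical per-frame record: the three ratio values (A, B, C) as
    # big-endian floats, a payload length, then the image bytes.
    def combine(frame_bytes, ratio):
        a, b, c = ratio
        header = struct.pack(">fffI", a, b, c, len(frame_bytes))
        return header + frame_bytes

    # The counterpart of the data separator at the display site.
    def separate(record):
        a, b, c, n = struct.unpack(">fffI", record[:16])
        return (a, b, c), record[16:16 + n]

    combined = combine(b"\x00" * 1024, (2.25, 1.5, 1.0))
    ratio, frame = separate(combined)  # -> (2.25, 1.5, 1.0) and 1024 bytes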
  • Method and System for Controlling a Screen Ratio Based on a Photographing Ratio [0156]
  • FIG. 8 illustrates a system for displaying stereoscopic images such that a photographing ratio (A:B:C) is substantially the same as a screen ratio (D:E:F). The system comprises a [0157] camera site 80 and a display site 82. The camera site 80 transmits a photographing ratio (A:B:C) and photographed image to the display site 82. The display site 82 displays the transmitted image such that a screen ratio (D:E:F) is substantially the same as the photographing ratio (A:B:C). In one embodiment of the invention, the camera site 80 may comprise a single camera and the display site may comprise a single display device. In another embodiment of the invention, the camera site may comprise a set of stereoscopic cameras and the display site may comprise a set of display devices as shown in FIG. 8.
  • The embodiment of the camera site 80 shown in FIG. 8 comprises a set of stereoscopic cameras 110 and 120, and transmitters 806 and 808. The stereoscopic left and right cameras 110 and 120 may be located as shown in FIG. 1A with regard to an object to be photographed. The cameras 110 and 120 comprise the elements described with respect to FIG. 7. Each of the cameras 110 and 120 provides its own combined data 802 and 804 to the transmitters 806 and 808, respectively. Each transmitter 806 and 808 transmits the combined data 802 and 804 to the display site 82 through a network 84. The network 84 may comprise a wired or a wireless transmission network. In one embodiment of the invention, each transmitter 806 and 808 is separate from the cameras 110 and 120. In another embodiment of the invention, each transmitter 806 and 808 may be embedded into each camera 110 and 120. For convenience, the two photographing ratios are referred to as "A1:B1:C1" and "A2:B2:C2," respectively. [0158]
  • In one embodiment of the invention, the photographing ratios “A1:B1:C1” and “A2:B2:C2” are substantially the same. In one embodiment of the invention, the [0159] data 802 and 804 may be combined and transmitted to the display site 82. In one embodiment of the invention, the photographing ratio may have a standard data format in each of the camera and display sites so that the display site can identify the photographing ratio easily.
  • The display site 82 comprises a set of receivers 820 and 832, and a set of display devices 86 and 88. Each receiver 820, 832 receives the combined data transmitted from the camera site 80 and provides each data set to the display devices 86 and 88, respectively. In one embodiment of the invention, each of the receivers 820 and 832 is separate from the display devices 86 and 88. In another embodiment of the invention, the receivers 820 and 832 may be embedded into each display device 86 and 88. [0160]
  • The [0161] display devices 86 and 88 comprise data separators 822 and 834, image size adjusting portions 828 and 840, and display screens 830 and 842. The data separators 822 and 834 separate the photographing ratio data (824, 838) and the image data (826, 836) from the received data. In one embodiment of the invention, each of the data separators 822 and 834 comprises a typical demultiplexer.
  • The image size adjusting portion 828 adjusts the size of the image to be displayed on the display screen 830 based on the photographing ratio (A1:B1:C1), the screen-viewer distance (F1), and the display screen size values (G1, H1). The screen-viewer distance (F1) represents the distance between the display screen 830 and the one of the viewer's eyes, e.g., the left eye, that is directed to the screen 830. In one embodiment of the invention, the distance F1 may be fixed. In this situation, the viewer's eyes may be located in an eye fixing structure, which will be described in more detail later. Also, the image size adjusting portion 828 may store the fixed value F1 therein. The screen size values G1 and H1 represent the horizontal and vertical dimensions of the screen 830, respectively. In one embodiment of the invention, the size values G1 and H1 may be stored in the image size adjusting portion 828. [0162]
  • The image [0163] size adjusting portion 840 adjusts the size of the image to be displayed in the display screen 842 based on the photographing ratio (A2:B2:C2), and screen-viewer distance (F2) and display screen size values (G2, H2). The screen-viewer distance (F2) represents the distance between the display screen 842 and one of a viewer's eyes, e.g., a right eye, that is directed to the screen 842. In one embodiment of the invention, the distance F2 may be fixed. In one embodiment of the invention, the screen-viewer distance (F2) is substantially the same as the screen-viewer distance (F1). The screen size values G2 and H2 represent the horizontal and vertical dimensions of the screen 842, respectively. In one embodiment of the invention, the display screen size values G2 and H2 are substantially the same as the display screen size values G1 and H1.
  • The operation of the image [0164] size adjusting portions 828 and 840 will be described in more detail by referring to FIGS. 9 and 10. Since the operations of the two image size adjusting portions 828 and 840 are substantially the same, for convenience, only the operation with regard to the image size adjusting portion 828 will be explained.
  • The image data 826, the photographing ratio data (A1:B1:C1), and the screen-viewer distance (F1) are provided to the image size adjusting portion 828 (902). A screen ratio (D1:E1:F1) is calculated based on the photographing ratio (A1:B1:C1) and the screen-viewer distance (F1) using the following Equation V (904). Since the value F1 is already provided, the parameters D1 and E1 of the screen ratio are obtained from Equation V. [0165]
  • Equation V: [0166] A1:B1:C1 = D1:E1:F1, so that D1 = (A1 × F1)/C1 and E1 = (B1 × F1)/C1
  • The horizontal and vertical screen size values (G1, H1) of the [0167] display screen 830 are provided to the image size adjusting portion 828 (906). In one embodiment of the invention, the screen size values G1 and H1, and the distance value F1 are fixed and stored in the image size adjusting portion 828. In another embodiment of the invention, the screen size values G1 and H1, and the distance value F1 are manually provided to the image size adjusting portion 828.
  • Image magnification (reduction) ratios d and e are calculated from the following Equation VI (908). The ratios d and e represent the horizontal and vertical magnification (reduction) ratios, respectively, for the display screens 830 and 842. [0168]
  • Equation VI: [0169] d = D1/G1, e = E1/H1
  • This is to perform magnification or reduction of the provided [0170] image 826 with regard to the screen sizes (G1, H1). If the calculated value “D1” is greater than the horizontal screen size value (G1), the provided image needs to be magnified as much as “d.” If the calculated value “D1” is less than the horizontal screen size value (G1), the provided image needs to be reduced as much as “d.” The same applies to the calculated value “E1.” This magnification or reduction enables a viewer to recognize the image at the same ratio that the camera 110 photographed the object. The combination of the display devices 86 and 88 provides a viewer with a more realistic three-dimensional image.
  • It is determined whether the magnification (reduction) ratios (d, e) are greater than “1” ([0171] 910). If both of the ratios (d, e) are greater than 1, the image data 826 are magnified as much as “d” and “e,” respectively, as shown in FIG. 10A (912). In one embodiment of the invention, the portion of the image greater than the screen sizes (G1, H1) is cut out as shown in FIG. 10A (914).
  • If both of the ratios “d” and “e” are not greater than 1, it is determined whether the magnification (reduction) ratios (d, e) are less than “1” ([0172] 916). If both of the ratios d and e are less than 1, the image data 826 are reduced as much as “d” and “e,” respectively, as shown in FIG. 10B (918). In one embodiment of the invention, the blank portion of the screen is filled with background color, e.g., black color, as shown in FIG. 10B (920).
  • If both of the ratios d and e are equal to 1, no adjustment of the image size is made ([0173] 922). In this situation, since the magnification (reduction) ratio is 1, no magnification or reduction of the image is made as shown in FIG. 10C.
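  • The screen-ratio adjustment of FIGS. 9 and 10 can be summarized in a short sketch. The Python fragment below is a hedged illustration (the names are hypothetical): it computes D1 and E1 from Equation V, the ratios d and e from Equation VI, and reports whether the image should be magnified and cropped, reduced and bordered with background color, or left unchanged.

    def adjust_image(photo_ratio, f1, g1, h1):
        """Return (d, e, action) for a display screen of size g1 x h1 viewed
        from distance f1, given the photographing ratio (A1:B1:C1)."""
        a1, b1, c1 = photo_ratio
        d1 = a1 * f1 / c1        # Equation V: required image width
        e1 = b1 * f1 / c1        # Equation V: required image height
        d, e = d1 / g1, e1 / h1  # Equation VI
        if d > 1 and e > 1:
            action = "magnify; cut out the portion outside the screen"
        elif d < 1 and e < 1:
            action = "reduce; fill the blank portion with background color"
        else:
            action = "no adjustment"  # d = e = 1 (mixed cases not specified)
        return d, e, action

    # Example: ratio 2:1.5:1, viewer 0.5 m from a 0.8 m x 0.6 m screen.
    print(adjust_image((2, 1.5, 1), 0.5, 0.8, 0.6))  # d ≈ e ≈ 1.25 -> magnify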
  • Now referring to FIG. 11, the entire operation of the system shown in FIG. 8 will be described. Photographing an object is performed using the set of stereoscopic cameras 110 and 120 (1120), as exemplified in FIG. 1A. Each of the cameras 110 and 120 calculates its photographing ratio, (A1:B1:C1) and (A2:B2:C2), respectively (1140), for example, using the method shown in FIG. 6A or FIG. 6B. [0174]
  • The image data and the photographing ratio that is calculated for the image are combined for each of the stereoscopic cameras 110 and 120 (1160). The combined data are illustrated as reference numerals 802 and 804 in FIG. 8. In one embodiment of the invention, the combining is performed per frame of the image data. In one embodiment of the invention, as long as the photographing ratio remains unchanged, the combining may be suspended and only image data, without the photographing ratio, may be transmitted to the display site 82. In that situation, when the photographing ratio changes, the combining may resume. Alternatively, the photographing ratio is not combined with the image data, but rather is transmitted separately. Each of the transmitters 806 and 808 transmits the combined data to the display site 82 through the communication network 84 (1180). [0175]
  • Each of the receivers 820 and 832 receives the transmitted data from the camera site 80 (1200). The photographing ratio and image data are separated from the combined data (1220). As an alternative to 1200 and 1220, the image data and the photographing ratio are received separately when they are not combined for transmission. In one embodiment of the invention, the combined data may not include a photographing ratio. In that circumstance, the photographing ratio that has been received most recently is used for calculating the screen ratio. In one embodiment of the invention, the screen ratio may remain unchanged until a new photographing ratio is received. [0176]
  • The screen ratios (D1:E1:F1) and (D2:E2:F2) for each of the display devices 86 and 88 are calculated using the method described with regard to FIG. 9 (1240). The stereoscopic images are displayed such that each of the photographing ratios (A1:B1:C1) and (A2:B2:C2) is substantially the same as each of the screen ratios (D1:E1:F1) and (D2:E2:F2) (1260). In this situation, the image may be magnified or reduced with regard to the screen size of each of the display devices 86 and 88 as discussed with reference to FIGS. 9 and 10. [0177]
  • Method and System for Controlling the Display Location of a Stereoscopic Image [0178]
  • FIG. 12 illustrates examples of the display system according to one embodiment of the invention. FIG. 12A illustrates a head mount display (HMD) system. The HMD system comprises a pair of display screens 1200 and 1220. For convenience, the electronic display mechanism as exemplified in FIG. 8 is omitted in this HMD system. A viewer wears the HMD on his or her head and watches stereoscopic images through each display screen 1200 and 1220. Thus, in one embodiment of the invention, the screen-to-eye distance (F) may be fixed. In another embodiment of the invention, the distance (F) may be measured with a known distance detection sensor and provided to the HMD system. Another embodiment of the invention includes a 3D display system as shown in FIG. 1B. Another embodiment of the display devices includes a pair of projection devices that project a set of stereoscopic images onto a screen. [0179]
  • FIG. 12B illustrates a 3D display system according to another embodiment of the invention. The display system comprises a V shaped mirror 1240 and a set of display devices 1260 and 1280. In one embodiment of the invention, the display devices 1260 and 1280 are substantially the same as the display devices 86 and 88 of FIG. 8, except that each further comprises an inverting portion (not shown). The inverting portion inverts the left and right sides of the image to be displayed. The V shaped mirror 1240 reflects the images coming from the display devices 1260 and 1280 to a viewer's eyes. Thus, the viewer watches a reflected image from the V shaped mirror 1240. The 3D display system comprising the V shaped mirror is disclosed in U.S. application Ser. No. 10/067,628, which was filed on Feb. 4, 2002, by the same inventor as this application and is incorporated by reference herein. For convenience, hereinafter, the description of inventive aspects will be made mainly based on the display system shown in FIG. 12B; however, the invention is applicable to other display systems such as the one shown in FIG. 12A. [0180]
  • FIG. 13 illustrates a 3D display system including an eye position fixing device 1300 according to one aspect of the invention. Referring to FIGS. 13A and 13B, the eye position fixing device 1300 is located in front of the V shaped mirror 1240 at a predetermined distance from the mirror 1240. The eye position fixing device 1300 is used for fixing the distance between the mirror 1240 and a viewer's eyes. The eye position fixing device 1300 is also used for locating a viewer's eyes such that each of the viewer's eyes is substantially perpendicular to each of the mirror (imaginary) images. A pair of holes 1320 and 1340 defined in the device 1300 are configured to allow the viewer to see each of the center points of the reflected images. In one embodiment of the invention, the size of each of the holes 1320 and 1340 is large enough to allow the viewer to see a complete half portion (left or right portion) of the V shaped mirror 1240 at a predetermined distance and location, as exemplified in FIGS. 13A and 13B. In one embodiment of the invention, the eye position fixing device 1300 may be used for fixing the location of a viewer's eyes as necessary with regard to the other aspects of the invention discussed below. [0181]
  • FIG. 14A illustrates the relationship between the displayed images and a viewer's eyes. Distance (Wd) represents the distance between the center points (1430, 1440) of the displayed images (1410, 1420). Distance (Wa) represents the distance between the center points (1450, 1460) of the viewer's eyes. The distance Wa varies from person to person. Normally the distance increases as a person grows and stops changing when he or she reaches a certain age. The average distance for an adult may be 70 mm; some people may have an 80 mm distance, while others may have a 60 mm distance. Distance (Va) represents the distance between the center points (1470, 1480) of the viewer's eye lenses. Here, a lens means the round transparent body behind the pupil of an eye; the lens moves along with the movement of the eye. The distance Va changes according to the distance (F) between an object and the viewer's eyes: the farther the distance (F), the greater the value Va becomes. Referring to FIG. 14B, when a viewer sees an object farther away than, for example, 10,000 m, Va has its maximum value (Vamax), which is substantially the same as the distance Wa. [0182]
  • Traditional 3D display systems display images without considering the value Wa. This means that the distance value (Wd) is the same for all viewers, regardless of the fact that they have different Wa values. These traditional systems cause several undesirable problems, such as headaches or dizziness for the viewer and a deteriorated sense of three dimensions. In order to produce a more realistic three-dimensional image and to reduce headaches or dizziness of a viewer, the distance Wd needs to be determined by considering the distance Wa. The consideration of the Wa value may provide a viewer with better and more realistic three-dimensional images. In one embodiment of the invention, the distance Wd is adjusted such that it is substantially the same as Wa. [0183]
  • FIG. 15 illustrates a 3D image display system according to one aspect of the invention. Once again, the system may be used with, for example, an HMD system, a display system with the V shaped mirror shown in FIGS. 13A and 13B, or a projection display system. [0184]
  • The system shown in FIG. 15 comprises a pair of display devices 1260 and 1280, and a pair of input devices 1400 and 1500. Each of the input devices 1400 and 1500 provides the distance value Wa to each of the display devices 1260 and 1280. In one embodiment of the invention, each of the input devices 1400 and 1500 comprises a keyboard, a mouse, a pointing device, or a remote controller. In one embodiment of the invention, one of the input devices 1400 and 1500 may be omitted, and the other input device is used for providing the distance value Wa to both of the display devices 1260 and 1280. [0185]
  • The display devices 1260 and 1280 comprise interfaces 1510 and 1550, microcomputers 1520 and 1560, display drivers 1530 and 1570, and display screens 1540 and 1580, respectively. In one embodiment of the invention, each of the display screens 1540 and 1580 comprises an LCD screen, a CRT screen, or a PDP screen. The interfaces 1510 and 1550 provide the interface between the input devices 1400 and 1500 and the microcomputers 1520 and 1560, respectively. In one embodiment of the invention, each of the interfaces 1510 and 1550 comprises a typical input device controller and/or a typical interface module (not shown). [0186]
  • There may be several methods to measure and provide the distance (W[0187] a). As one example, an optometrist may measure the Wa value of a viewer with eye examination equipment. In this situation, the viewer may input the value (Wa) via the input devices 1400, 1500. As another example, an eye lens motion detector may be used in measuring the Wa value. In this situation, the Wa value may be provided from the detector to either the input devices 1400, 1500 or the interfaces 1510, 1550 in FIG. 15.
  • As another example, as shown in FIG. 14C, the Wa value may be measured using a pair of parallel pipes 200, 220, about 1 m in length and about 1 mm in diameter, which are spaced approximately 1 cm from a viewer's eyes. Each end of the pipes 200, 220 is open. The pipe distance (Pd) may be adjusted between about 40 mm and about 120 mm by widening or narrowing the pipes 200, 220. The pipes 200, 220 maintain a parallel alignment while they are widened or narrowed. A ruler 240 may be attached to the pipes 200, 220, as shown in FIG. 14C, so that the ruler 240 can measure the distance between the pipes 200, 220. When the viewer sees the holes 260, 280 completely through the pipes 200, 220, respectively, the ruler 240 indicates the Wa value of the viewer. In another embodiment, red and blue color materials (paper, plastic, or glass) may cover the holes 260, 280, respectively. In this situation, the pipe distance (Pd) is the Wa value of the viewer when the viewer perceives a purple color from the holes 260, 280 by the combination of the red and blue colors. [0188]
  • Each of the [0189] microcomputers 1520 and 1560 determines an amount of movement for the displayed images based on the provided Wa value such that the Wd value is substantially the same as the Wa value. In one embodiment of the invention, each microcomputer (1520, 1560) initializes the distance value Wd and determines an amount of movement for the displayed images based on the value Wa and the initialized value Wd. Each of the display drivers 1530 and 1570 moves the displayed images based on the determined movement amount and displays the moved images on each of the display screens 1540 and 1580. In one embodiment of the invention, each microcomputer (1520, 1560) may incorporate the function of each of the display drivers 1530 and 1570. In that situation, the display drivers 1530 and 1570 may be omitted.
  • Referring to FIG. 16, the operation of the system of FIG. 15 will be described. A set of stereoscopic images is displayed on the pair of display screens 1540 and 1580 (1610). The stereoscopic images may be provided from the stereoscopic cameras 110 and 120, respectively, as exemplified in FIG. 1A. The distance (Wd) between the center points of the displayed images is initialized (1620). In one embodiment of the invention, the initial value may comprise the eye distance value of the average adult, e.g., "70 mm." The distance (Wa) between the center points of a viewer's eye lenses is provided (1630). [0190]
  • It is then determined whether Wa equals Wd (1640). If Wa equals Wd, no movement of the displayed images is made (1680). In this situation, since the distance (Wa) between the center points of the viewer's eyes is the same as the distance (Wd) between the center points of the displayed images, no adjustment of the displayed images is made. [0191]
  • If Wa does not equal Wd, it is determined whether Wa is greater than Wd (1650). If Wa is greater than Wd, the distance (Wd) needs to be increased until Wd equals Wa. In this situation, the left image 1750 displayed on the left screen 1540 is moved to the left side, and the right image 1760 displayed on the right screen 1580 is moved to the right side, until the two values are substantially the same, as shown in FIG. 17A. Referring to FIG. 17B, the movements of the displayed images 1750 and 1760 are conceptually illustrated for the display system with a V shaped mirror. Since the V shaped mirror reflects the displayed images, which have been received from the display devices 1260 and 1280, to a viewer, in order for the viewer to see the adjusted images through the mirror as shown in FIG. 17A, the displayed images 1750 and 1760 need to be moved with regard to the V shaped mirror as shown in FIG. 17B. That is, when the displayed images 1750 and 1760 are moved as shown in FIG. 17B, the viewer who sees the V shaped mirror perceives the image movement as shown in FIG. 17A. [0192]
  • With regard to the HMD system shown in FIG. 12A, the movement direction of the displayed images is the same as the direction of those shown in FIG. 17A. With regard to the projection display system described in connection with FIG. 15, since the projection display system projects images onto a screen located across from the projection system, the movement direction of the displayed images is opposite to the direction of those shown in FIG. 17A. [0193]
  • If it is determined that Wa is not greater than Wd, the distance Wd needs to be reduced until Wd equals Wa. Thus, the left image 1770 displayed in the display device 1260 is moved to the right side and the right image 1780 displayed in the display device 1280 is moved to the left side until the two values are substantially the same, as shown in FIGS. 17C and 17D. The same explanation with regard to the movement of the displayed images described in FIGS. 17A and 17B applies to the system of FIGS. 17C and 17D. [0194]
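  • The adjustment of FIGS. 16 and 17 amounts to shifting each displayed image horizontally by half of the difference between Wa and Wd. The sketch below is a hypothetical illustration of that rule for the geometry of FIG. 17A (for the V shaped mirror or the projection system, the signs are mirrored as described above); the function name and sign convention are assumptions.

    def image_shifts(w_a, w_d):
        """Return horizontal shifts in mm for (left image, right image) that
        make the distance between the image center points equal to w_a.
        Negative means move left; positive means move right (FIG. 17A)."""
        half = (w_a - w_d) / 2.0
        return -half, +half  # zero shift when Wa == Wd

    print(image_shifts(80, 70))  # (-5.0, 5.0): widen Wd from 70 mm to 80 mm
    print(image_shifts(60, 70))  # (5.0, -5.0): narrow Wd from 70 mm to 60 mm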
  • FIG. 18 illustrates a 3D image display system according to another embodiment of the invention. The system comprises an input device 1810, a microcomputer 1820, a pair of servo mechanisms 1830 and 1835, and a pair of display devices 1840 and 1845. The input device 1810 provides a viewer's input, i.e., the distance value Wa, to the microcomputer 1820. In one embodiment of the invention, the input device 1810 may be a keyboard, a mouse, a pointing device, or a remote controller, for example. An interface is omitted for convenience. [0195]
  • The [0196] microcomputer 1820 determines an amount of the movement for the display devices 1840 and 1845 based on the provided value Wa such that the Wd value is substantially the same as the Wa value. In one embodiment of the invention, the microcomputer 1820 initializes the distance value (Wd) and determines an amount of the movement for the display devices 1840 and 1845 based on the value Wa and the initialized value Wd. Each of the servo mechanisms 1830 and 1835 moves the display devices 1840 and 1845, respectively, based on the determined movement amount.
  • Referring to FIG. 19, the operation of the system of FIG. 18 will be described. Each of the stereoscopic images is displayed on the display devices 1840 and 1845 (1850). The distance (Wd) between the center points of the displayed images is initialized (1855). In one embodiment of the invention, the initial value may be "70 mm." The distance (Wa) between the center points of a viewer's eyes is provided to the microcomputer 1820 (1860). It is determined whether Wa equals Wd (1870). If Wa equals Wd, no movement of the display devices 1840 and 1845 is made (1910). If it is determined that Wa is greater than Wd (1880), the servo mechanisms 1830 and 1835 move the display devices 1840 and 1845, respectively, such that Wd is widened to Wa, as shown in FIGS. 20A and 20B. If it is determined that Wa is not greater than Wd, the servo mechanisms 1830 and 1835 move the display devices 1840 and 1845, respectively, such that Wd is narrowed to Wa, as shown in FIGS. 20C and 20D. [0197]
  • In another embodiment of the invention, the distance (Va) is automatically detected using a known eye lens motion detector. This embodiment of the invention will be described referring to FIG. 21A. The detector 2100 detects the distance Va between the center points of a viewer's eye lenses. In addition, the detector 2100 detects the location of each of the eye lenses. In FIGS. 21A and 21B, A2L and A2R represent the center points of the viewer's eye lenses, and A3L and A3R represent the center points of the viewer's eyes. As seen in FIGS. 21A and 21B, the A3L location is fixed, but the A2L location moves. The detector 2100 detects the current location of each of the eye lenses. In one embodiment of the invention, the detector 2100 comprises a known eye lens detecting sensor disclosed, for example, in U.S. Pat. No. 5,526,089. [0198]
  • The detected distance and location values are provided to a [0199] microcomputer 2120. The microcomputer 2120 receives the distance value Va and determines an amount of movement for the displayed images or an amount of movement for the display devices similarly as described with regard to FIGS. 15-20. The determined amount is used for controlling either the movement of the displayed images or the movement of the display devices. In addition, the microcomputer 2120 determines new locations of the center points of the images based on the location values of the eye lenses. In this way, the microcomputer 2120 controls the display drivers (1530, 1570) or the servo mechanisms (1830, 1835) to move the stereoscopic images from the current center points 2210 and 2230 of the images to, for example, new center points 2220 and 2240 as shown in FIG. 22.
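  • The exact mapping between the detected lens locations and the new image center points is not spelled out in this passage; as a hedged sketch, one might simply translate each image's center point by the detected offset of the corresponding eye lens, as below (the names and the one-to-one translation are assumptions).

    def new_center_points(current_centers, lens_offsets):
        """Shift each displayed image's center point by the detected 2D
        offset of the corresponding eye lens (coordinates in mm)."""
        return [(cx + dx, cy + dy)
                for (cx, cy), (dx, dy) in zip(current_centers, lens_offsets)]

    # Example: both lenses detected 3 mm to the right and 1 mm up.
    print(new_center_points([(0, 0), (70, 0)], [(3, 1), (3, 1)]))
    # -> [(3, 1), (73, 1)]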
  • Method and System for Providing the Motion Information of Stereoscopic Cameras [0200]
  • FIG. 23 illustrates a camera system for a 3D display system according to one aspect of the invention. The camera system is directed to provide photographed image data and camera motion detection data to a display site. The camera system comprises a set of [0201] stereoscopic cameras 2200, 2210, motion detection devices 2220, 2230, combiners 2240, 2250, and transmitters 2280, 2290. Each of the stereoscopic cameras 2200, 2210 captures an image and provides the captured image data to each of the combiners 2240, 2250.
  • The motion detection devices 2220 and 2230 detect the motion of the cameras 2200 and 2210, respectively. The motion of the cameras 2200 and 2210 may comprise motion in the upward and downward directions, and in the left and right directions, as shown in FIG. 23. Each detection device (2220, 2230) provides its detection data to each of the combiners 2240 and 2250. In one embodiment of the invention, if the detection devices 2220 and 2230 do not detect any motion of the cameras 2200 and 2210, the devices 2220 and 2230 may provide no detection data, or may provide information data representing that no motion was detected, to the combiners 2240 and 2250. In one embodiment of the invention, each of the motion detection devices 2220 and 2230 comprises a typical motion detection sensor. The motion detection sensor may provide textual or graphical detection data to the combiners 2240 and 2250. [0202]
  • The [0203] combiners 2240 and 2250 combine the image data and the motion detection data, and provide the combined data 2260 and 2270 to the transmitters 2280 and 2290, respectively. If the combiners 2240 and 2250 receive information data representing no motion detection from the motion detection devices 2220 and 2230, or if the combiners 2240 and 2250 do not receive any motion data, each combiner (2240, 2250) provides only the image data to the transmitters 2280 and 2290 without motion detection data. In one embodiment of the invention, each of the combiners 2240 and 2250 comprises a typical multiplexer. Each of the transmitters 2280 and 2290 transmits the combined data 2260 and 2270 to the display site through a communication network (not shown).
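  • The combining step can be illustrated with a short sketch. The tagged-frame layout below is an assumption for illustration; the description requires only that the image data and motion detection data be multiplexed, and that image data be sent alone when there is no motion.

```python
from typing import Optional

def combine(image_data: bytes, motion_data: Optional[bytes]) -> bytes:
    """Multiplex image data with optional motion detection data (one combiner)."""
    if not motion_data:
        # No motion detected: forward the image data alone.
        return b"IMG" + len(image_data).to_bytes(4, "big") + image_data
    # Motion detected: append the motion detection data after the image data.
    return (b"MIX"
            + len(image_data).to_bytes(4, "big") + image_data
            + len(motion_data).to_bytes(2, "big") + motion_data)
```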
  • FIG. 24 illustrates a display system corresponding to the camera system shown in FIG. 23. The display system is directed to provide camera motion to a viewer. The display system comprises a pair of [0204] receivers 2300 and 2310, data separators 2320 and 2330, image processors 2340 and 2360, microcomputers 2350 and 2370, on screen data (OSD) circuits 2390 and 2410, combiners 2380 and 2400, display drivers 2420 and 2430, and display screens 2440 and 2450.
  • Each of the [0205] receivers 2300 and 2310 receives the combined data transmitted from the camera system, and provides the received data to the data separators 2320 and 2330, respectively. Each of the data separators 2320 and 2330 separates the image data and the motion detection data from the received data. The image data are provided to the image processors 2340 and 2360. The motion detection data are provided to the microcomputers 2350 and 2370. The image processors 2340 and 2360 perform typical image data processing for the image data, and provide the processed data to the combiners 2380 and 2400, respectively.
  • Each of the [0206] microcomputers 2350 and 2370 determines camera motion information from the motion detection data. In one embodiment of the invention, each microcomputer (2350, 2370) determines camera motion information for at least four directions, e.g., upper, lower, left, and right. The microcomputers 2350 and 2370 provide the determined camera motion information to the OSD circuits 2390 and 2410, respectively. Each of the OSD circuits 2390 and 2410 produces OSD data representing camera motion based on the determined motion information. In one embodiment of the invention, the OSD data comprise arrow indications 2442-2448 showing the motions of the cameras 2200 and 2210. The arrows 2442 and 2448 mean that the camera has moved in the upper and lower directions, respectively. The arrows 2444 and 2446 mean that the camera has moved in the left and right directions, respectively.
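  • As a sketch of this step, the mapping from the determined motion information to the arrow indications 2442-2448 might look as follows; the (dx, dy) input format and sign conventions are assumptions, since the description states only that at least four directions are resolved.

```python
ARROWS = {"up": 2442, "left": 2444, "right": 2446, "down": 2448}

def motion_to_arrows(dx: int, dy: int) -> list[int]:
    """Return the OSD arrow indications for a detected camera motion."""
    arrows = []
    if dy > 0:
        arrows.append(ARROWS["up"])      # camera moved in the upper direction
    elif dy < 0:
        arrows.append(ARROWS["down"])    # camera moved in the lower direction
    if dx < 0:
        arrows.append(ARROWS["left"])
    elif dx > 0:
        arrows.append(ARROWS["right"])
    return arrows  # an empty list means no motion, so no OSD overlay
```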
  • The [0207] combiners 2380 and 2400 combine the processed image data and the OSD data, and provide the combined image to the display drivers 2420 and 2430. Each of the display drivers 2420 and 2430 displays the combined image in each of the display screens 2440 and 2450.
  • Referring to FIG. 25, the operation of the camera and display systems shown in FIGS. 23 and 24 will be described. Each of the [0208] stereoscopic cameras 2200 and 2210 images an object (2460). The pair of motion detection devices 2220 and 2230 detect the motions of the cameras 2200 and 2210, respectively (2470). The photographed image data and the motion detection data are combined in each of the combiners 2240 and 2250 (2480). The combined data 2260 and 2270 are transmitted to the display site through a communication network (2490). Other embodiments may omit the combining and separation of data shown in the diagrams.
  • The transmitted data from the camera system are provided to the [0209] data separators 2320 and 2330 via the receivers 2300 and 2310 (2500). The image data and the motion detection data are separated in the data separators 2320 and 2330 (2510). The image data are provided to the image processors 2340 and 2360, and each of the processors 2340 and 2360 processes the image data (2520). The motion detection data are provided to the microcomputers 2350 and 2370, and each of the microcomputers 2350 and 2370 determines motion information from the motion detection data (2520).
  • OSD data corresponding to motion information are generated based on the determined motion information in the [0210] OSD circuits 2390 and 2410 (2530). The processed image data and the OSD data are combined together in the combiners 2380 and 2400 (2540). The combined data are displayed in the display screens 2440 and 2450 (2550). When the OSD data are displayed on the display screens 2440 and 2450, this means that at least one of the cameras 2200 and 2210 has moved. Thus, the image also moves in the direction in which the cameras 2200 and 2210 have moved. This is for guiding a viewer's eye lenses to track the motion of the cameras 2200 and 2210. In one embodiment of the invention, the arrows 2442-2448 are displayed right before the image is moved by the movement of the cameras so that a viewer can expect the movement of the images in advance.
  • In another embodiment of the invention, the display system may allow the viewer to know the movement of the [0211] cameras 2200 and 2210 by providing a voice message that represents the movement of the cameras. By way of example, the voice message may be “the stereoscopic cameras have moved in the upper direction” or “the cameras have moved in the right direction.” In this embodiment of the invention, the OSD circuits 2390 and 2410 may be omitted. In another embodiment of the invention, both of the OSD data and voice message representing the movement of the cameras may be provided to the viewer.
  • In one embodiment of the invention, the camera and display systems shown in FIGS. 23 and 24 comprise the functions in which the image is displayed such that the photographing ratio (A:B:C) equals the screen ratio (A:B:C) as discussed with regard to FIGS. [0212] 7-11. In another embodiment of the invention, the systems may comprise the function that displays stereoscopic images such that the distance between the center points of the stereoscopic images is substantially the same as the distance between the center points of a viewer's eyes as discussed with regard to FIGS. 15-22.
  • Another aspect of the invention provides a 3D display system that controls the movement of the cameras according to a viewer's eye lens movement. Before describing the aspect of the invention, the relationship between a viewer's eyes and a set of stereoscopic cameras will be described by referring to FIGS. [0213] 26-28.
  • FIG. 26A is a conceptual drawing that illustrates parameters for stereoscopic cameras. Each of the [0214] cameras 30 and 32 comprises object lenses 34 and 36, respectively. The camera parameters comprise C2L, C2R, C3L, C3R, SCL, SCR, Vc and Wc. C2L and C2R represent the center points of the object lenses 34 and 36, respectively. C3L and C3R represent rotation axes of the cameras 30 and 32, respectively. SCL represents the line connecting C2L and C3L. SCR represents the line connecting C2R and C3R. Vc represents the distance between C2L and C2R. Wc represents the distance between C3L and C3R.
  • The rotation axes C3L and C3R [0215] do not move and are the axes around which the cameras 30 and 32 rotate. The rotation axes C3L and C3R allow the cameras 30 and 32 to rotate, each behaving like a car windshield wiper, as shown in FIGS. 27B-27E. FIG. 27A illustrates a default position of the cameras 30 and 32. FIGS. 27B-27D illustrate the horizontal movements of the cameras 30 and 32. FIG. 27E illustrates the vertical movements of the cameras 30 and 32. In one embodiment of the invention, the cameras 30 and 32 remain substantially parallel to each other while they are moving and after they move as shown in FIGS. 27B-27E. FIG. 27F is a front view of one of the stereoscopic cameras and exemplifies the movements of the camera in eight directions. The diagonal movements 46a-46d may be performed by a combination of the horizontal and vertical movements. For example, the movement "46a" is made by moving the camera in the left and upper directions.
  • FIG. 26B is a conceptual drawing that illustrates parameters for a viewer's eyes. Each of the [0216] eyes 38 and 40 comprises eye lenses 42 and 44, respectively. Each of the eye lenses is located substantially on the outside surface of the eye. This means that the distance between the center point of each eye and its eye lens is substantially the same as the radius of the eye. The eye lens moves along with the rotation of the eye. The eye parameters comprise A2L, A2R, A3L, A3R, SAL, SAR, Va and Wa. A2L and A2R represent the center points of the eye lenses 42 and 44, respectively. Each of the eye lenses 42 and 44 performs substantially the same function as the object lenses 34 and 36 of the stereoscopic cameras 30 and 32 in terms of receiving an image. Thus, the eye parameters A2L and A2R may correspond to the camera parameters C2L and C2R.
  • A3L and A3R [0217] represent rotation axes of the eyes 38 and 40, respectively. The rotation axes A3L and A3R are the axes around which the eyes 38 and 40 rotate. The rotation axes A3L and A3R allow the eyes 38 and 40 to rotate as shown in FIGS. 28B-28D. Just as the rotation axes C3L and C3R of the stereoscopic cameras 30 and 32 do not move while the cameras 30 and 32 are rotating, so the rotation axes A3L and A3R of a viewer's eyes 38 and 40 do not move while the eyes 38 and 40 are rotating. Thus, the eye parameters A3L and A3R may correspond to the camera parameters C3L and C3R.
  • SAL [0218] represents the line connecting A2L and A3L. SAR represents the line connecting A2R and A3R. As shown in FIGS. 26A and 26B, the eye parameters SAL and SAR may correspond to the camera parameters SCL and SCR, respectively. Va represents the distance between A2L and A2R. Wa represents the distance between A3L and A3R. Similarly, the eye parameters Va and Wa may correspond to the camera parameters Vc and Wc, respectively.
  • Referring to FIGS. [0219] 28A-28C, it can be seen that when the directions of the eyes 38 and 40 change, only the directions of SAL and SAR change while the rotation axes A3L and A3R are fixed. This means that Wa is constant while the lines SAL and SAR change. Thus, in order to control the movements of the cameras 30 and 32 based on the movements of the eyes 38 and 40, the directions of the camera lines SCL and SCR need to be controlled based on those of the eye lines SAL and SAR while the distance Wc is kept constant.
  • FIG. 28A illustrates an example of the eye configuration in which a viewer sees an object at least "10,000 m" distant from him or her. This example corresponds to the camera configuration in which the focal length of the cameras is infinity. As discussed before, when a viewer sees an object farther than, for example, "10,000 m," the distance (Va) [0220] between the center points A2L and A2R of the eye lenses 42 and 44 is substantially the same as the distance (Wa) between the center points A3L and A3R of the eyes 38 and 40.
  • When a viewer sees an object that is located in front of him or her and is closer than, for example, "10 m," the viewer's left eye rotates in a clockwise direction and the right eye rotates in a counterclockwise direction as shown in FIG. 28B. Consequently, the distance Va [0221] becomes shorter than the distance Wa. If a viewer sees an object that is located slightly to the right in front of him or her, each of the eyes rotates in a clockwise direction as shown in FIG. 28C. In this situation, the distance Va may be less than the distance Wa. FIG. 28D exemplifies the movements of the eyes in eight directions.
  • Method and System for Controlling the Motion of Stereoscopic Cameras Based on a Viewer's Eye Lens Motion [0222]
  • FIG. 29 illustrates a 3D display system for controlling a set of stereoscopic cameras according to another aspect of the invention. The system comprises a camera site and a display site. The display site is directed to transmit eye lens motion data to the camera site. The camera site is directed to control the set of [0223] stereoscopic cameras 30 and 32 based on the eye lens motion data.
  • The display site comprises an eye lens [0224] motion detecting device 3000, a transmitter 3010, a pair of display devices 2980 and 2990, a pair of receivers 2960 and 2970, and a V shaped mirror 2985. When the camera site transmits stereoscopic images through a pair of transmitters 2900 and 2930 to the display site, the display site receives the images and displays them through the display devices 2980 and 2990. A viewer sees stereoscopic images through the V shaped mirror, which reflects the displayed images to the viewer. While the viewer is watching the images, the viewer's eye lenses may move in various directions, e.g., latitudinal (upper or lower) and longitudinal (clockwise or counterclockwise) directions. Once again, another display device, such as an HMD or a projection display device as discussed above, may be used.
  • The eye lens [0225] motion detecting device 3000 detects motions of each of a viewer's eye lenses while a viewer is watching 3D images through the V shaped mirror. The motions may comprise current locations of the eye lenses. The detecting device 3000 is substantially the same as the device 2100 shown in FIG. 21A. The detecting device 3000 may convert the movements of the eye lenses to data that a microcomputer 2940 of the camera site can recognize, and provide the converted data to the transmitter 3010. In one embodiment of the invention, the detection data may comprise a pair of (x,y) values for each of the eye lenses.
  • The [0226] transmitter 3010 transmits the eye lens motion data to the camera site through a communication network 3015. The detection data may comprise identification data that identify each of the left and right eye lenses in the camera site. In one embodiment of the invention, the display site may comprise a pair of transmitters each transmitting left and right eye lens motion data to the camera site. In one embodiment of the invention, before transmitting the motion data, data modification such as encoding and/or modulation adapted for transmitting may be performed.
  • The camera site comprises a set of [0227] stereoscopic cameras 30 and 32, a receiver 2950, a microcomputer 2940, a pair of camera controllers 2910 and 2920, and the pair of transmitters 2900 and 2930. The receiver 2950 receives the eye lens motion data from the display site, and provides the data to the microcomputer 2940. The microcomputer 2940 determines each of the eye lens motion data from the received data, and provides the left and right eye lens motion data to the camera controllers 2910 and 2920, respectively. In one embodiment of the invention, the camera site may comprise a pair of receivers, each of which receives the left or right eye lens motion data from the display site. In that situation, each receiver provides the corresponding eye lens detection data to the camera controller 2910 or 2920, and the microcomputer 2940 may be omitted.
  • The [0228] camera controllers 2910 and 2920 control each of the cameras 30 and 32 based on the received eye lens motion data. That is, the camera controllers 2910 and 2920 move each of the cameras 30 and 32 in substantially the same directions as each of the eye lenses 42 and 44 moves. Referring to FIG. 30, the camera controllers 2910 and 2920 comprise servo controllers 3140 and 3190, horizontal motors 3120 and 3160, and vertical motors 3130 and 3180, respectively. Each of the servo controllers 3140 and 3190 controls the horizontal and vertical motors (3120, 3160, 3130, 3180) based on the received eye lens motion data. The horizontal motors 3120 and 3160 move the cameras 30 and 32, respectively, in the horizontal directions. The vertical motors 3130 and 3180 move the cameras 30 and 32, respectively, in the vertical directions.
  • FIG. 31 illustrates a flow chart showing the operation of the [0229] camera controllers 2910 and 2920 according to one aspect of the invention. FIG. 32A illustrates a table for controlling the horizontal and vertical motors. FIG. 32B illustrates a conceptual drawing that explains the motion of the camera. Referring to FIGS. 31 and 32, the operation of the camera controllers 2910 and 2920 will be described. Since the operations of the camera controllers 2910 and 2920 are substantially the same, only the operation of the camera controller 2910 will be described. The servo controller 3140 initializes the camera adjusting values (3200). In one embodiment of the invention, the initialization of the camera adjusting values may comprise setting a default value, for example, "(x,y)=(0,0)," which means no movement. These values correspond to the eye lens motion data detected in a situation where a viewer looks straight ahead without moving his or her eye lenses. In one embodiment of the invention, the initialization may comprise setting the relationship between the adjusting values and the actual movement amount of the camera 30 as shown in FIG. 32.
  • The eye lens motion data are provided to the servo controller [0230] 3140 (3210). In one embodiment of the invention, the eye lens motion data comprise (x,y) coordinate values, where x and y represent the horizontal and vertical motions of each of the eye lenses, respectively.
  • The [0231] servo controller 3140 determines camera adjusting values (X, Y) based on the provided eye lens motion data. It is determined whether X equals “0” (3230). If X is “0,” the servo controller 3140 does not move the horizontal motor 3120 (3290). If X is not “0,” it is determined whether X is greater than “0” (3240). If X is greater than “0,” the servo controller 3140 operates the horizontal motor 3120 to move the camera 30 in the right direction (3270). As exemplified in FIG. 32A, if the value X is, for example, “1,” the movement amount is “2°,” and the direction is clockwise (θ3 direction). If the value X is, for example, “2,” the movement is “4°” in a clockwise direction.
  • If X is not greater than "0," meaning that X is less than "0," the [0232] servo controller 3140 operates the horizontal motor 3120 so as to move the camera 30 in a counterclockwise (θ1) direction (3260). Referring to FIG. 32, if the value X is, for example, "−1," the movement amount is "2°," and the direction is counterclockwise. If the value X is, for example, "−3," the movement is "6°" in a counterclockwise (θ1) direction.
  • Similarly, it is determined whether Y equals "0" ([0233] 3300). If Y is "0," the servo controller 3140 does not move the vertical motor 3130 (3290). If Y is not "0," it is determined whether Y is greater than "0" (3310). If Y is greater than "0," the servo controller 3140 operates the vertical motor 3130 to move the camera 30 in the +latitudinal (upper: θ2) direction (3320). If the value Y is, for example, "2," the movement is "4°" in the upper direction.
  • If Y is not greater than "0," the [0234] servo controller 3140 operates the vertical motor 3130 so as to move the camera 30 in the lower direction (3330). If the value Y is, for example, "−3," the movement amount is "6°," and the direction is −latitudinal (lower: θ4).
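  • The flow of FIG. 31 with the FIG. 32A scale (2° of rotation per unit of adjusting value) can be sketched as follows; the Motor class is a hypothetical stand-in for the horizontal motor 3120 and the vertical motor 3130.

```python
DEGREES_PER_UNIT = 2.0  # FIG. 32A: adjusting value 1 -> 2 degrees, 2 -> 4 degrees, ...

class Motor:
    """Hypothetical motor interface."""
    def rotate(self, degrees: float, direction: str) -> None: ...

def adjust_camera(horizontal: Motor, vertical: Motor, x: int, y: int) -> None:
    """Rotate the camera according to the adjusting values (X, Y)."""
    if x > 0:
        horizontal.rotate(x * DEGREES_PER_UNIT, "clockwise")          # theta-3
    elif x < 0:
        horizontal.rotate(-x * DEGREES_PER_UNIT, "counterclockwise")  # theta-1
    # x == 0: the horizontal motor is not moved

    if y > 0:
        vertical.rotate(y * DEGREES_PER_UNIT, "upper")   # +latitudinal, theta-2
    elif y < 0:
        vertical.rotate(-y * DEGREES_PER_UNIT, "lower")  # -latitudinal, theta-4
    # y == 0: the vertical motor is not moved
```

  • With this scale, (X, Y) = (2, −3) yields a 4° clockwise rotation and a 6° lower rotation, matching the examples given above.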
  • Now, the entire operation of the system shown in FIG. 29 will be described with reference to FIG. 33. The eye lens [0235] motion detection device 3000 is provided to the display site of the system (3020). A viewer's eye lens motion is detected by the eye lens motion detection device 3000 while the viewer is watching stereoscopic images (3030). The eye lens motion data are transmitted to the camera site through the transmitter 3010 and the communication network 3015 (3040). As discussed above, either one transmitter or a pair of transmitters may be used.
  • The [0236] receiver 2950 of the camera site receives the eye lens motion data from the display site (3050). The camera adjusting values are determined based on the eye lens motion data (3060). The stereoscopic cameras 30 and 32 are controlled by the determined camera adjusting values (3070). In this way, the stereoscopic cameras 30 and 32 are controlled such that the cameras keep track of the eye lens motion. From the viewer's point of view, as soon as his or her eye lenses move in a certain direction, the stereoscopic images also move in the direction in which the eye lenses have moved.
  • FIG. 34 illustrates a stereoscopic camera controller system used for a 3D display system according to another aspect of the invention. For convenience, the display site is not shown. This aspect of the invention selects, from among plural sets of stereoscopic cameras, the pair of stereoscopic cameras corresponding to the movement amount of the eye lenses, instead of controlling the movement of a single pair of stereoscopic cameras. [0237]
  • The system comprises a [0238] microcomputer 3430, a memory 3440, camera selectors 3420 and 3425, and plural sets of stereoscopic cameras 30a and 32a, 30b and 32b, and 30c and 32c. The memory 3440 stores a table as shown in FIG. 35. The table shows the relationship between camera adjusting values and selected cameras. The camera adjusting value "(0,0)" corresponds to, for example, the camera set C33 as shown in FIGS. 35 and 36B. The camera adjusting value "(1,0)" corresponds to the camera set C34 as shown in FIGS. 35 and 36B. The camera adjusting value "(2,2)" corresponds to the camera set C15 as shown in the Figures. In one embodiment of the invention, another set of stereoscopic cameras may be formed from cameras of different sets, such as one camera of the C34 set and one camera of the C32 set.
  • FIG. 36A is a top view of the plural sets of stereoscopic cameras. In one embodiment of the invention, the contour line made by connecting all of the object lenses of the plural sets of stereoscopic cameras is similar to the contour line of the portion of a viewer's eyes that is exposed to the outside. [0239]
  • The [0240] microcomputer 3430 determines camera adjusting values based on the received eye lens motion data. The microcomputer 3430 also determines first and second camera selection signals based on the table stored in the memory 3440. The first selection signal is determined based on the movement of a viewer's left eye lens, and used for controlling the camera selector 3420. The second selection signal is determined based on the movement of a viewer's right eye lens, and used for controlling the camera selector 3425. The microcomputer 3430 provides each of the selection signals to the camera selectors 3420 and 3425, respectively.
  • The [0241] camera selectors 3420 and 3425 select the respective cameras based on the selection signals. In one embodiment of the invention, a base set of cameras (e.g., C33) shown in FIG. 36B images an object and transmits the image to the display site through the transmitters 2900 and 2930, respectively. In this embodiment of the invention, if the camera selectors 3420 and 3425 select another set of cameras, the selected set of cameras images the object and transmits the image to the display site through the transmitters 2900 and 2930. In one embodiment of the invention, all of the cameras are turned on and a first set of cameras is connected to the transmitters 2900 and 2930, respectively. In this embodiment of the invention, when a second set of cameras is selected, the first set of cameras is disconnected from the transmitters 2900 and 2930, and the second set of cameras is connected to the transmitters 2900 and 2930, respectively. In another embodiment of the invention, only a selected set of cameras is turned on and the non-selected sets of cameras remain turned off. In one embodiment of the invention, each of the camera selectors 3420 and 3425 comprises a switch that performs switching between the plural sets of stereoscopic cameras 30a and 32a, 30b and 32b, and 30c and 32c and the transmitters 2900 and 2930, respectively.
  • Referring to FIG. 37, the operation of the system shown in FIG. 34 will be described. A base set of cameras (e.g., C33) [0242] of FIG. 36 images an object (3710). Eye lens motion data are received from the display site (3720). Camera adjusting values are determined based on the received eye lens motion data (3730). The camera adjusting values are exemplified in the table of FIG. 35. Camera selection signals are determined based on the determined camera adjusting values (3740), for example, using the relationship of the table of FIG. 35. It is determined whether a new set of cameras has been selected (3750). If no new set of cameras is selected, the image output from the base cameras is transmitted to the display site (3780). If a new set of cameras (e.g., C35) is selected, the base cameras (C33) are disconnected from the transmitters 2900 and 2930, and the new cameras (C35) are connected to the transmitters 2900 and 2930 (3760). The selected cameras (C35) image the object (3770), and the image output from the selected cameras is transmitted to the display site (3790).
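  • A sketch of the FIG. 35 lookup follows; only the three pairings named above are included, and the fallback to the base set C33 is an assumption.

```python
# Relationship between camera adjusting values and selected camera sets (FIG. 35).
SELECTION_TABLE = {
    (0, 0): "C33",  # base camera set
    (1, 0): "C34",
    (2, 2): "C15",
}

def select_camera_set(adjusting_values: tuple[int, int]) -> str:
    """Map camera adjusting values to a camera-set selection signal."""
    return SELECTION_TABLE.get(adjusting_values, "C33")  # assumed fallback
```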
  • Regarding the embodiments described with regard to FIGS. [0243] 29-37, the camera control may be used in remote control technology such as remote surgery; remote control of a vehicle, an airplane, or a fighter aircraft; or remote control of construction, investigation, or automatic assembly equipment.
  • Method and System of Stereoscopic Image Display for Guiding a Viewer's Eye Lens Motion Using a Three-Dimensional Mouse [0244]
  • FIG. 38 illustrates a 3D display system according to another aspect of the invention. The 3D display system is directed to guide a viewer's eye lens motion using a three-dimensional input device. The system is also directed to adjust displayed images using the 3D input device such that the longitudinal and latitudinal locations of the center points of a viewer's eye lenses are substantially the same as those of the center points of the displayed images. In one embodiment of the invention, the 3D input device comprises a 3D mouse (described later). [0245]
  • The system comprises a set of [0246] stereoscopic cameras 30 and 32, a pair of transmitters 2900 and 2930, a set of display devices 3900 and 3910, a 3D mouse 3920, and an input device 3990. The stereoscopic cameras 30 and 32, the pair of transmitters 2900 and 2930, and the pair of receivers 2960 and 2970 are the same as those shown in FIG. 29. The display devices 3900 and 3910 display the stereoscopic image that has been transmitted from the camera site. The devices 3900 and 3910 also display the pair of 3D mouse cursors that guide a viewer's eye lens movement.
  • In one embodiment of the invention, the input of the 3D mouse is provided to both the [0247] display devices 3900 and 3910 as shown in FIG. 38. In this embodiment of the invention, the pair of 3D mouse cursors are displayed and moved by the movement of the 3D mouse 3920.
  • In one embodiment of the invention, the shape of the 3D mouse cursor comprises a square, an arrow, a cross, a square with a cross therein as shown in FIGS. [0248] 40A-40H, a reticle, or a crosshair. In one embodiment of the invention, a pair of cross-square mouse cursors 400 and 420 as shown in FIG. 40 will be used for convenience. In one embodiment of the invention, when a viewer adjusts a distance value (described in more detail with reference to FIGS. 39 and 40) for the displayed images, the distance (Md) between the 3D mouse cursors 400 and 420 is adjusted. Also, in this embodiment of the invention, the size of the 3D mouse cursors may be adjusted. In this embodiment of the invention, the viewer adjusts the distance value, for example, by turning a scroll button of the 3D mouse. For example, by turning the scroll button backward (toward the user), the viewer can step the distance value from a larger value to a smaller one (10,000 m -> 100 m -> 5 m -> 1 m -> 0.5 m -> 5 cm). By turning the scroll button forward (the opposite direction), the viewer may step the distance value from a smaller value to a larger one (5 cm -> 0.5 m -> 1 m -> 5 m -> 100 m -> 10,000 m). Hereinafter, the distance value 10,000 m will often be referred to as an infinity value or infinity.
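  • The scroll-button stepping described above can be sketched as an index walk over the preset distance values; holding the presets in an ordered list is an assumption, since only the sequence itself is given.

```python
# Preset distance values, smallest to largest (5 cm ... "infinity").
DISTANCE_VALUES_M = [0.05, 0.5, 1.0, 5.0, 100.0, 10_000.0]

def scroll_distance(index: int, forward: bool) -> int:
    """Step to the next larger (forward) or next smaller (backward) distance value."""
    if forward:
        return min(index + 1, len(DISTANCE_VALUES_M) - 1)
    return max(index - 1, 0)
```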
  • FIG. 39 illustrates one example of a 3D display image. The image comprises a [0249] mountain image portion 3810, a tree image portion 3820, a house image portion 3830 and a person image portion 3840. It is assumed that the mountain image 3810, the tree image 3820, the house image 3830, and the person image 3840 are photographed at distances of "about 10,000 m," "about 100 m," "about 5 m," and "about 1 m," respectively, from the set of stereoscopic cameras 30 and 32.
  • When a viewer wants to see the [0250] mountain image 3810 shown in FIG. 39, he or she may set the distance value to a value greater than "10,000 m." In this situation, the mouse cursor distance Md has the value Md0, which is the same as the Wa (Vamax) value as shown in FIG. 40A. As discussed above, when the viewer sees an infinitely distant object, Va has the maximum value (Vamax). Also, the viewer's sight lines Ls1 and Ls2, each of which is an extension of SAL or SAR (each connecting A2 and A3), are substantially parallel to each other as shown in FIGS. 40A and 40B. This means that if the viewer sees the displayed images with their eye lenses spaced as much as Wa as shown in FIGS. 40A and 40B, the viewer feels a sense of distance as if seeing an object that is "d0 (10,000 m)" distant. This is because a human being's eyes are spaced about 60-80 mm apart from each other, and a sense of three dimensions is produced in the brain by synthesizing the images from each eye. Thus, when the viewer sees the two mouse cursors that are spaced as much as Md=Wa, they perceive a single (three-dimensional) mouse cursor that is located between the two mouse cursors (400, 420) at an infinite distance.
  • When the viewer sets the distance value (d1) [0251] to, for example, "100 m," and sees the tree image 3820, Md has the value Md1, which is less than Md0 as shown in FIGS. 40C and 40D. Also, the viewer's sight lines Ls1 and Ls2 are no longer parallel. Thus, when the two sight lines are extended, they converge at an imaginary point "M" as shown in FIG. 40D; the point "O" represents the middle point between the center points of the eyes. Similarly, if the viewer sees the displayed images with their eye lenses spaced as much as Md1 as shown in FIGS. 40C and 40D, the viewer feels a sense of distance as if seeing an object that is "d1 (100 m)" distant. The distance between M and O is not a physical length but an imaginary one. However, since the viewer feels a sense of the distance, as far as the viewer's eye lens distance or directions are concerned, the distance between M and O can be regarded as the actual distance between the viewer's eyes and an actual object. That is, when the viewer sees the two mouse cursors 400 and 420 that are spaced as much as Md1, they perceive a single (three-dimensional) mouse cursor that is located at the point M, at a 100 m distance.
  • When the viewer sets a smaller distance value (d2) [0252] to, for example, "5 m" and sees the house image 3830, Md has the value Md2, which is less than Md1 as shown in FIGS. 40E and 40F. Also, when the two sight lines are extended in the screen, they converge at an imaginary point "M" as shown in FIG. 40F. Similarly, in this situation when the viewer sees the house image 3830, the viewer feels a sense of distance as if he or she sees an object that is "d2 (5 m)" away. Thus, when the viewer sees the two mouse cursors 400 and 420 that are spaced as much as Md2, they perceive a single (three-dimensional) mouse cursor that is located at the point M, at a 5 m distance.
  • When the viewer sets a distance value (d3) [0253] to the distance between the viewer and the screen, exemplified as "50 cm," the mouse cursors 400 and 420 overlap each other as shown in FIG. 40G. That is, when the distance value is the same as the actual distance between the point "O" and the center point of the screen as shown in FIG. 40G, the mouse cursors overlap each other.
  • As seen in FIGS. [0254] 40A-40G, even though a pair of 3D mouse cursors 400 and 420 is displayed in each of the display devices 3900 and 3910, the viewer perceives one three-dimensional mouse cursor with a sense of distance.
  • When the viewer sets the distance value to a value (d4) [0255] less than "d3," the viewer's sight lines converge in front of the screen and cross each other as shown in FIG. 40H. In this situation, the viewer may see two mouse cursors 400 and 420 because the sight lines converge in front of the screen.
  • As shown in FIGS. [0256] 40A-40H, the Md value is determined according to the distance value that is set by the viewer.
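  • One plausible geometric model for this relationship, offered here as an editorial reconstruction rather than a formula stated in the description, follows from similar triangles: sight lines from eyes spaced Wa apart converge at the set distance d and cross the screen plane at the viewer-screen distance.

```python
def cursor_separation_mm(wa_mm: float, d_m: float, d_screen_m: float) -> float:
    """Md = Wa * (d - d_screen) / d; negative Md means crossed cursors (FIG. 40H)."""
    return wa_mm * (d_m - d_screen_m) / d_m

print(cursor_separation_mm(70.0, 10_000.0, 0.5))  # ~= Wa: infinity case (FIG. 40A)
print(cursor_separation_mm(70.0, 0.5, 0.5))       # 0: cursors overlap (FIG. 40G)
print(cursor_separation_mm(70.0, 0.25, 0.5))      # < 0: cursors cross (FIG. 40H)
```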
  • FIG. 41 illustrates an exemplary block diagram of the display devices as shown in FIG. 38. Since each of the [0257] display devices 3900 and 3910 performs substantially the same functions, only one display device 3900 is illustrated in FIG. 41.
  • The [0258] display device 3900 comprises a display screen 3930, a display driver 3940, a microcomputer 3950, a memory 3960 and interfaces 3970 and 3980. The display device 3900 adjusts the distance (Md) between the pair of 3D mouse cursors 400 and 420 according to the distance value that is set, as shown in FIGS. 40A-40H. The display device 3900 moves the center points of the displayed images based on the 3D mouse cursor movement. In one embodiment of the invention, the display device 3900 moves the displayed images such that the longitudinal and latitudinal locations of the center points of a viewer's eye lenses are substantially the same as those of the center points of the displayed images.
  • The [0259] 3D mouse 3920 detects its movement amount. The detected movement amount is provided to the microcomputer 3950 via the interface 3970. The distance value that the viewer sets is provided to the microcomputer 3950 via the 3D mouse 3920 and the interface 3970. In one embodiment of the invention, the interface 3970 comprises a mouse controller. In another embodiment of the invention, the distance value may be provided to the microcomputer 3950 via the input device 3990 and the interface 3980.
  • The [0260] input device 3990 provides properties of the 3D mouse, such as the minimum detection amount (Am), the movement sensitivity (Bm), and the mouse cursor size (Cm), as well as the viewer-screen distance (d) and the viewer's eye data such as Wa, SAL, and SAR, to the microcomputer 3950 via the interface 3980. The minimum detection amount represents the least amount of movement that the 3D mouse can detect. That is, the movement of the 3D mouse can be detected only when the 3D mouse moves more than the minimum detection amount. In one embodiment of the invention, the minimum detection amount is set when the 3D mouse is manufactured. The movement sensitivity represents how sensitively the mouse cursors move based on the movement of the 3D mouse. This means that the scroll button of the 3D mouse has a different movement sensitivity, i.e., is either more sensitive or less sensitive, according to the distance value. For example, if the distance value is greater than 1,000 m, a "1 mm turn" of the scroll button may increase or decrease the distance by 2,000 m. If the distance value is between 100 m and 1,000 m, a "1 mm turn" of the scroll button may increase or decrease the distance by 100 m. Similarly, if the distance value is less than 1 m, a "1 mm turn" of the scroll button may increase or decrease the distance by 10 cm.
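  • The distance-dependent sensitivity can be sketched as a step function; the three bands and step sizes are the ones given above, while the step for the unspecified 1 m-100 m band is an assumption.

```python
def scroll_step_m(distance_m: float) -> float:
    """Distance change produced by a "1 mm turn" of the scroll button."""
    if distance_m > 1_000.0:
        return 2_000.0      # coarse steps far away
    if distance_m >= 100.0:
        return 100.0        # 100 m steps between 100 m and 1,000 m
    if distance_m < 1.0:
        return 0.1          # 10 cm steps close up
    return 1.0              # assumed step for the unspecified 1 m-100 m band
```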
  • In one embodiment of the invention, the mouse cursor size may also be adjusted. The distance (d) represents the distance between the middle point of the viewer's eyes and the screen as exemplified in FIG. 43A. In one embodiment of the invention, the screen comprises a V shaped mirror, an HMD screen, a projection screen, or a display device screen as shown in FIG. 1B. [0261]
  • Also, the [0262] input device 3990 provides display device properties to the microcomputer 3950 through the interface 3980. In one aspect of the invention, the display device properties comprise the display device resolution and screen size of the display device 3900. The resolution represents the number of horizontal and vertical pixels of the device 3900. For example, if the resolution of the display device 3900 is 640×480, the number of the horizontal pixels is 640, and the number of the vertical pixels is 480. The size may comprise horizontal and vertical lengths of the display device 3900. With the resolution and screen size of the display device 3900, the length of one pixel can be obtained as, for example, “1 mm” per 10 pixels.
  • In one embodiment of the invention, the [0263] input device 3990 comprises a keyboard, a remote controller, a pointing input device, or the like. In one embodiment of the invention, the interface 3980 comprises the input device controller. In one embodiment of the invention, the properties of the 3D mouse are stored in the memory 3960. In one embodiment of the invention, the viewer's eye data are detected using a detection device for eye lens movement or are provided to the display device 3900 by the viewer.
  • The [0264] microcomputer 3950 determines the mouse cursor distance (Md) based on the distance value set by the viewer. A table (not shown) showing the relationship between the distance value and the Md value as shown in FIGS. 40A-40H according to a viewer's eye data may be stored in the memory 3960. The microcomputer 3950 determines the cursor distance (Md) by referring to the table, and provides the determined distance value to the display driver 3940. The display driver 3940 displays the pair of mouse cursors 400 and 420 based on the determined Md value in the display screen 3930. The microcomputer 3950 also determines new locations of the mouse cursors 400 and 420, and calculates a movement amount for the center points of the display images based on the locations of the mouse cursors 400 and 420. The memory 3960 may also store data that may be needed to calculate the movement amount for the center points of the display images.
  • Referring to FIG. 42, the operation of the [0265] display devices 3900 and 3910 will be described. 3D mouse properties are set in each of the display devices 3900 and 3910 (4200). As discussed above, the 3D mouse properties comprise a minimum detection amount (Am), a movement sensitivity (Bm), and the mouse cursor size (Cm). Also, the 3D mouse properties may be provided by the viewer or stored in the memory 3960.
  • Display device properties are provided to the [0266] display devices 3900 and 3910 (4205). In one embodiment of the invention, the display device properties may be stored in the memory 3960.
  • The viewer's eye data are provided to the [0267] display devices 3900 and 3910 (4210). As discussed above, the viewer's eye data may be automatically detected by a detection device or provided to the display devices 3900 and 3910 by the viewer. In one embodiment of the invention, the viewer's eye data comprise the distance (Wa) between the center points of the eyes, and the SA (SAL and SAR) value which is the distance between the eye lens center point (A2) and the eye center point (A3).
  • A viewer-screen distance (d) is provided to each of the [0268] display devices 3900 and 3910 via, for example, the input device 3990 (4220).
  • The mouse cursor location and distance value are initialized ([0269] 4230). In one embodiment of the invention, the initialization is performed in an infinity distance value. In this situation, left and right mouse cursors are located at (−Wa/2, 0, 0) and (Wa/2, 0, 0), respectively, where the origin of the coordinate system is O (0, 0, 0) point as shown in FIG. 43A. Also, the locations of the center points of each displayed image are (−Wa/2, 0, 0) and (Wa/2, 0, 0), respectively.
  • The 3D image and 3D mouse cursors are displayed in each of the [0270] display devices 3900 and 3910 (4240). In one embodiment of the invention, the 3D mouse cursors 400 and 420 are displayed on each of the 3D images. Since the mouse cursor location has been initialized, the adjusted mouse cursors 400 and 420 are displayed on the images.
  • It is determined whether the initialized distance value has been changed to another value ([0271] 4250). If the viewer wants to set a distance value different from the initialized one, he or she may provide that distance value to the display devices 3900 and 3910.
  • If the initialized distance value has been changed, the 3D mouse cursor distance (Md) [0272] is adjusted and the 3D mouse cursor location is reinitialized based on the changed distance value (4260). For example, if the initial location is (0, 0, 10,000 m) and another distance value (e.g., 100 m) as shown in FIG. 40C is provided, the mouse cursor distance (Md) is changed from Md0 to Md1. However, the x and y values of the point M do not change, even though the z value of the point M changes from 10,000 m to 100 m.
  • If the initialized distance value has not been changed, it is determined whether 3D mouse movement has been detected ([0273] 4270).
  • If the 3D mouse movement has been detected, a new location of the [0274] 3D mouse cursors 400 and 420 is determined (4280). In one embodiment of the invention, the new location of the mouse cursors is determined as follows. First, the number of pixels by which the mouse cursors have moved in the x-direction is determined. For example, left-direction movement may have a "−x" value and right-direction movement may have a "+x" value. The same applies to the "y" direction, i.e., a "−y" value for lower-direction movement and a "+y" value for upper-direction movement. The "z" direction movement is determined by the distance value.
  • The locations of the center points of the display images to be adjusted are calculated based on the new location of the [0275] 3D mouse cursors 400 and 420 (4290). In one embodiment of the invention, the locations of the center points of the display images are obtained from the location values of each of the eye lenses, respectively. In this embodiment of the invention, the location values of the eye lenses are obtained using Equations VII and VIII as described below. Referring to FIG. 43, a method of obtaining the locations of the eye lenses will be described.
  • First, the value for ZL [0276] is obtained from Equation VII.
  • Equation VII: [0277]

$$Z_L = \sqrt{\left[I_N - \left(-\tfrac{W_a}{2}\right)\right]^2 + \left[J_N - 0\right]^2 + \left[K_N - 0\right]^2} = \sqrt{\left[I_N + \tfrac{W_a}{2}\right]^2 + J_N^2 + K_N^2}$$
  • In FIG. 43A, MN (IN, JN, KN) [0278] represents the location of the center point of the two mouse cursors ML (IL, JL, KL) and MR (IR, JR, KR). Since each of the mouse cursor locations ML and MR is obtained in 4280, the center point location MN can be obtained. That is, IN and JN are obtained by averaging (IL, IR) and (JL, JR). KN is determined by the current distance value. ZL is the distance between the left eye center point (A3L) and MN. Second, the center point locations [(x1, y1, z1); (x2, y2, z2)] for each eye lens are obtained from Equation VIII. A2L (x1, y1, z1) is the center point location of the left eye lens, and A2R (x2, y2, z2) is the center point location of the right eye lens, as shown in FIG. 43A. FIG. 43B illustrates a three-dimensional view of a viewer's eye. Referring to FIG. 43B, it can be seen how the eye lens center point (A2L) moves along the surface of the eye.
  • Equation VIII: [0279]

$$x_1 = -\tfrac{W_a}{2} + \frac{\left(I_N + \tfrac{W_a}{2}\right) \times S}{Z_L}, \qquad y_1 = 0 + \frac{J_N \times S}{Z_L}, \qquad z_1 = 0 + \frac{K_N \times S}{Z_L}$$

$$x_2 = \tfrac{W_a}{2} - \frac{\left(I_N + \tfrac{W_a}{2}\right) \times S}{Z_L}, \qquad y_2 = 0 + \frac{J_N \times S}{Z_L}, \qquad z_2 = 0 + \frac{K_N \times S}{Z_L}$$
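  • A direct transcription of Equations VII and VIII follows, assuming S is the common eye radius (S = SAL = SAR) and that all coordinates are expressed in the FIG. 43A frame with origin at the point O.

```python
import math

def eye_lens_centers(wa: float, s: float,
                     mn: tuple[float, float, float]):
    """Return A2L (x1, y1, z1) and A2R (x2, y2, z2) for a cursor midpoint MN."""
    i_n, j_n, k_n = mn
    # Equation VII: distance from the left eye center point A3L to MN.
    z_l = math.sqrt((i_n + wa / 2) ** 2 + j_n ** 2 + k_n ** 2)
    # Equation VIII: project a point at radius S along the line toward MN.
    x1 = -wa / 2 + (i_n + wa / 2) * s / z_l
    y1 = j_n * s / z_l
    z1 = k_n * s / z_l
    x2 = wa / 2 - (i_n + wa / 2) * s / z_l
    y2 = y1  # per Equation VIII, y2 and z2 equal y1 and z1
    z2 = z1
    return (x1, y1, z1), (x2, y2, z2)
```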
  • In one embodiment of the invention, a digital signal processor may be used for calculating the locations of the eye lenses. [0280]
  • Each of the center points of the displayed images is moved to the locations (x1, y1) and (x2, y2), respectively as shown in FIG. 44 ([0281] 4300). In one embodiment of the invention, the blank area of the screen after moving may be filled with a background color, e.g., black, as shown in FIG. 44.
  • It is determined whether the 3D mouse movement has been completed ([0282] 4310). If the 3D mouse movement has not been completed, procedures 4280-4300 are performed until the movement is completed. This ensures that the displayed images are moved so long as the viewer is moving the mouse cursor.
  • By using the above calculation method, the distance between two locations can be measured. Referring to FIG. 43C, MN1 [0283] is a peak point of a mountain 42 and MN2 is a point of a house 44. It is assumed that the location values of MN1 and MN2 are determined to be (−0.02 m, 0.04 m, 100 m) and (0.01 m, 0 m, 10 m), respectively, by the above calculation method. These determined location values may be stored in the memory 3960, and the distance between the two locations MN1 and MN2 is calculated as follows.
  • $$Z_L = \sqrt{[-0.02 - 0.01]^2 + [0.04 - 0]^2 + [100 - 10]^2} \approx 90$$
  • In this embodiment, the [0284] microcomputer 3950 is programmed to calculate the distance between two locations, or may comprise a distance measure mode. In this situation, when a viewer designates a first location (A: the middle point of the two mouse cursors 400 and 420), the location is determined and stored in the memory 3960. In one embodiment, the location value may be displayed in the display screen 3930 or may be provided to a viewer via a voice signal. The same applies to a second location (B). In this way, the values of the first and second locations (A, B) are determined and the distance between the locations (A, B) is calculated.
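  • In code, the distance-measure mode reduces to a Euclidean distance between the two stored locations, reproducing the MN1/MN2 example above.

```python
import math

def point_distance_m(p1: tuple[float, float, float],
                     p2: tuple[float, float, float]) -> float:
    """Euclidean distance between two designated locations, in meters."""
    return math.dist(p1, p2)

mn1 = (-0.02, 0.04, 100.0)  # peak point of the mountain 42
mn2 = (0.01, 0.0, 10.0)     # point on the house 44
print(round(point_distance_m(mn1, mn2)))  # 90
```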
  • Method and System for Controlling the Motion of Stereoscopic Cameras Using a Three-Dimensional Mouse [0285]
  • FIG. 45 illustrates a 3D display system according to another aspect of the invention. The system is directed to control the movement of stereoscopic cameras based on the movement of a viewer's eye lenses. [0286]
  • The system comprises a camera site and a display site. The display site comprises a pair of transmitters/[0287] receivers 4530 and 4540, a set of display devices 4510 and 4520, an input device 3990, and a 3D mouse 3920.
  • The [0288] input device 3990 and 3D mouse 3920 are substantially the same as those of the system shown in FIG. 38. Referring to FIG. 46, the display device 4510 comprises interfaces 3970 and 3980, a microcomputer 4820, a memory 4830, and an interface 4810. The interfaces 3970 and 3980 are substantially the same as those of the display device shown in FIG. 41. The microcomputer 4820 determines the current location values of the mouse cursors, and calculates the location values of the center points of a viewer's eye lenses. The memory 4830 may also store data that may be needed to calculate the movement amount for the center points of the display images.
  • The [0289] interface 4810 may modify the location values adapted for transmission, and provide the modified data to the transmitter 4530. The transmitter 4530 transmits the modified location data to the camera site.
  • Referring to FIG. 45, the camera site comprises a set of [0290] stereoscopic cameras 30 and 32, a pair of transmitters 4570 and 4600, a pair of servo mechanisms 4580 and 4590, and a pair of receivers 4550 and 4560. Each of the receivers 4550 and 4560 receives the location values transmitted from the display site, and provides the data to the pair of the servo mechanisms, 4580 and 4590, respectively.
  • The [0291] servo mechanisms 4580 and 4590 control the cameras 30 and 32 based on the received location data, respectively. In one embodiment of the invention, the servo mechanisms 4580 and 4590 control the cameras 30 and 32 such that the longitudinal and latitudinal values of the center points of the object lenses (C2L, C2R; FIGS. 26 and 27) of the cameras 30 and 32 are substantially the same as those of the center points of the viewer's eye lenses as shown in FIGS. 47A and 47C.
  • Referring to FIG. 48, the operation of the system shown in FIG. 45 will be described. 3D mouse properties and display device properties are set in each of the [0292] display devices 4510 and 4520 (4610). The 3D mouse properties and display device properties are substantially the same as those explained with regard to FIG. 42. The viewer's eye data and viewer-screen distance (d) are provided to each of the display devices 4510 and 4520 (4620). Again, the viewer's eye data and viewer-screen distance (d) are substantially the same as those explained with regard to FIG. 42. The 3D mouse cursor location and the distance value are initialized (4630). In one embodiment of the invention, the 3D mouse cursor location is initialized to the center points of each of the display device screens, and the distance value is initialized to the infinity distance value. The 3D image received from the camera site and the 3D mouse cursors (400, 420) are displayed on the display devices 4510 and 4520 (4640). In one embodiment of the invention, the 3D mouse cursors may be displayed on the 3D image. In this situation, the portion of the image under the 3D mouse cursors (400, 420) may not be seen by a viewer.
  • It is determined whether 3D mouse movement is detected ([0293] 4650). If movement is detected, the new location of the 3D mouse cursors is determined (4660). The location values of the center points of the viewer's eye lenses are calculated based on the new location of the mouse cursors, respectively (4670). The new location and movement of the mouse cursors (400, 420) are illustrated in FIG. 47B. The specific methods for performing the procedures 4650-4670 have been described with regard to FIGS. 42-44.
  • The location value data are transmitted to the camera site through each of the transmitter/[0294] receivers 4530 and 4540 (4680). As discussed above, the location values are calculated so long as the mouse cursor is moving. Thus, the location values may comprise a series of data. In one embodiment of the invention, the location values are serially transmitted to the camera site so that the cameras 30 and 32 are controlled based on the received order of the location values. In another embodiment of the invention, the sequence of the generated location values may be obtained and transmitted to the camera site so that the cameras 30 and 32 are controlled according to the sequence. In one embodiment of the invention, the location value data are digital data and may be properly modulated for transmission.
  • The location value data are received in each of the [0295] receivers 4550 and 4560 (4690). In one embodiment of the invention, one transmitter may be used instead of the two transmitters 4530 and 4540. In that situation, one receiver may be used instead of the receivers 4550 and 4560.
  • Camera adjusting values are determined based on the location values, and the [0296] stereoscopic cameras 30 and 32 are controlled based on the camera adjusting values (4700). Each of the servo controllers 4580 and 4590 controls the respective camera 30 or 32 such that each of the center points of the cameras' object lenses keeps track of the movement of the center point of each eye lens (4710). As shown in FIG. 47C, the new location values A2L1 and A2R1 corresponding to the new location of the 3D mouse cursors are calculated using Equation VIII as discussed above. Each of the servo controllers 4580 and 4590 controls the cameras 30 and 32 such that the center points of each of the camera object lenses are located at C2L1 and C2R1 as shown in FIG. 47A. To do this, the servo controllers 4580 and 4590 may set the location values of the center points of the camera object lenses so as to conform to the location values of the center points of the eye lenses. In one embodiment of the invention, the servo controllers 4580 and 4590 comprise a horizontal motor and a vertical motor that move each camera in the horizontal direction (x-direction) and the vertical direction (y-direction), respectively. In one embodiment of the invention, only one servo controller may be used for controlling the movements of both of the cameras 30 and 32 instead of the pair of servo controllers 4580 and 4590.
  • While each of the [0297] servo controllers 4580 and 4590 is controlling the stereoscopic cameras 30 and 32, the cameras 30 and 32 are photographing an object. The photographed image is transmitted to the display site and displayed in each of the display devices 4510 and 4520 (4720, 4730).
  • Regarding the embodiments described with regard to FIGS. [0298] 45-48, the camera control may be used in remote control technology such as remote surgery; remote control of a vehicle, an airplane, or a fighter aircraft; or remote control of construction, investigation, or automatic assembly equipment.
  • Method and System for Controlling Space Magnification for Stereoscopic Images [0299]
  • FIG. 49 illustrates a 3D display system according to another aspect of the invention. The 3D display system is directed to adjust space magnification for a stereoscopic image based on the space magnification adjusting data provided by a viewer. [0300]
  • The system comprises a camera site and a display site. The display site comprises an [0301] input device 4910, a set of display devices 4920 and 4930, a transmitter 4950, and a pair of receivers 4940 and 4960.
  • The [0302] input device 4910 provides a viewer's eye distance value (Wa) as shown in FIG. 43A and space magnification adjusting data to at least one of the display devices 4920 and 4930. The space magnification means the size of the space that a viewer perceives from the displayed images. For example, if the space magnification is "1," a viewer perceives the space in the display site as the same size as the real space that was photographed in the camera site. If the space magnification is "10," a viewer perceives the space in the display site as ten times larger than the real space that was imaged by the camera. If the space magnification is "0.1," a viewer perceives the space in the display site as one tenth the size of the real space that was imaged by the camera. The space magnification adjusting data represent data regarding the space magnification that a viewer wants to set. In one embodiment of the invention, the space magnification adjusting data may comprise a space magnification of "0.1," "1," "10," or "100." The adjustment of the space magnification is performed by adjusting the distance between the cameras 30 and 32, and will be described in more detail later.
  • At least one of the [0303] display devices 4920 and 4930 displays the space magnification adjusting data that are provided through the input device 4910. The at least one of the display devices 4920 and 4930 provides the space magnification adjusting data and eye distance value (Wa) to the transmitter 4950. The transmitter 4950 transmits the magnification adjusting data and the value Wa to the camera site. In one embodiment of the invention, the space magnification adjusting data and the value Wa may be provided directly from the input device 4910 to the transmitter 4950 without passing through the display devices 4920 and 4930.
  • The [0304] receiver 4970 receives the space magnification adjusting data and Wa from the transmitter 4950, and provides the data to the camera controller 4990. The camera controller 4990 controls the camera distance based on the space magnification adjusting data and the value Wa. The camera controller 4990 comprises a servo controller 4985 and a horizontal motor 4975 as shown in FIG. 50. Referring to FIGS. 50-52, the operation of the camera controller 4990 will be explained.
  • The [0305] servo controller 4985 initializes the camera distance (C1), for example, such that C1 is the same as Wa (5100). The space magnification relates the camera distance (C1) to the eye distance value (Wa). When C1 is the same as Wa, the space magnification is "1," which means that a viewer sees the object at the same size as it was photographed by the cameras 30 and 32. When C1 is greater than Wa, the space magnification is less than "1," which means that a viewer perceives a smaller space than the space that is imaged by the cameras 30 and 32. When C1 is less than Wa, the space magnification is greater than "1," which means that a viewer perceives a larger space than the space that is imaged by the cameras 30 and 32.
  • The space magnification adjusting data (SM) are provided to the servo controller [0306] 4985 (5110). It is determined whether the adjusting data are “1” (5120). If the adjusting data are “1,” no adjustment of the camera distance is made (5160). If the adjusting data are not “1,” it is determined whether the adjusting data are greater than “1.” If the adjusting data are greater than “1,” the servo controller 4985 operates the motor 4975 so as to narrow C1 until the requested space magnification is obtained (5150). Referring to FIG. 52, a table showing the relationship between the space magnification and the camera distance (C1) is illustrated, where Wa is 80 mm. Thus, when C1 is 80 mm, the space magnification is “1.” In this situation, if the requested space magnification is “10,” the camera distance is adjusted to “8 mm” as shown in FIG. 52.
  • If the adjusting data are less than “1,” the [0307] servo controller 4985 operates the motor 4975 so as to widen C1 until the requested space magnification is obtained (5140). As exemplified in FIG. 52, if the requested space magnification is “0.1,” the camera distance is adjusted to “800 mm.”
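  • The FIG. 52 values (with Wa of 80 mm: 80 mm at a magnification of “1,” 8 mm at “10,” 800 mm at “0.1”) are consistent with the camera distance varying inversely with the requested magnification. The following is a minimal sketch of the servo-control decision, assuming the inverse relation C1 = Wa/SM inferred from that table (the description above gives the table, not a formula); the function name is illustrative:

    def adjust_camera_distance(wa_mm: float, sm: float) -> float:
        """Camera distance C1 (mm) for a requested space magnification SM,
        mirroring the decision flow of FIG. 51.

        The inverse relation C1 = Wa / SM is an inference from the FIG. 52
        table (Wa = 80 mm: SM 1 -> 80 mm, SM 10 -> 8 mm, SM 0.1 -> 800 mm).
        """
        if sm <= 0:
            raise ValueError("space magnification must be positive")
        c1 = wa_mm              # initialization (5100): C1 = Wa, i.e. SM = 1
        if sm == 1:
            return c1           # no adjustment (5160)
        if sm > 1:
            return wa_mm / sm   # servo narrows C1 (5150)
        return wa_mm / sm       # servo widens C1 (5140)

    # Matching FIG. 52 (Wa = 80 mm):
    assert adjust_camera_distance(80, 10) == 8.0
    assert adjust_camera_distance(80, 0.1) == 800.0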
  • Referring to FIG. 53, the operation of the entire system shown in FIG. 49 will be described. Stereoscopic images are displayed through the [0308] display devices 4920 and 4930 (5010). Eye distance (Wa) and space magnification adjusting data (SM) are provided to the at least one of the display devices 4920 and 4930, or to the transmitter 4950 directly from the input device 4910 (5020). The eye distance (Wa) and space magnification adjusting data (SM) are transmitted to the camera site (5030). The camera site receives the Wa and SM values and adjusts the camera distance (C1) based on the Wa and SM values (5040). The stereoscopic cameras 30 and 32 image the object with adjusted space magnification (5050). The image is transmitted to the display site through the transmitters 4980 and 5000 (5060). Each of the display devices 4920 and 4930 receives and displays the image (5070).
  • Regarding the embodiments described with regard to FIGS. [0309] 49-53, the camera control may be used in remote control technology such as remote surgery; remote control of a vehicle, an airplane, or a fighter aircraft; or remote control of construction, investigation, or automatic assembly equipment.
  • Method and System for Adjusting Display Angles of Stereoscopic Image Based on a Camera Location [0310]
  • FIG. 54 illustrates a 3D display system according to another aspect of the invention. The system is directed to adjust the location of the display devices based on the relative location of the stereoscopic cameras with regard to an [0311] object 5400.
  • The system comprises a camera site and a display site. The camera site comprises a set of [0312] stereoscopic cameras 30 and 32, a pair of direction detection devices 5410 and 5420, and a pair of transmitters 5430 and 5440. In this embodiment of the invention, the cameras 30 and 32 may not be parallel to each other as shown in FIG. 54. The direction detection devices 5410 and 5420 detect the directions of the stereoscopic cameras 30 and 32 with respect to the object 5400 to be photographed, respectively. In one embodiment of the invention, the devices 5410 and 5420 detect the tilt angle with respect to an initial location where the two cameras are parallel to each other. In some situations, the cameras 30 and 32 may be tilted, for example, 10 degrees in a counterclockwise direction as shown in FIG. 54, or in a clockwise direction from the initial location. The detection devices 5410 and 5420 detect the tilted angles of the cameras 30 and 32, respectively. In one embodiment of the invention, each of the direction detection devices 5410 and 5420 comprises a typical direction sensor.
  • Each of the [0313] transmitters 5430 and 5440 transmits the detected direction data of the cameras 30 and 32 to the display site. If it is detected that only the camera 32 is tilted as shown in FIG. 57, the detection device 5410 may not detect a tilting, and thus only the transmitter 5440 may transmit the detected data to the display site. The same applies to a situation where only the camera 30 is tilted.
  • The display site comprises a pair of [0314] receivers 5450 and 5460, a pair of display device controllers 5470 and 5500, and a set of display devices 5480 and 5490. Each of the receivers 5450 and 5460 receives the detected tilting data of the cameras 30 and 32, and provides the data to each of the display device controllers 5470 and 5500. The display device controllers 5470 and 5500 determine display adjusting values based on the received camera tilting data. The display adjusting values represent the movement amounts to be applied to the display devices 5480 and 5490. In one embodiment of the invention, the display device controllers 5470 and 5500 determine the display adjusting values based on a table as shown in FIG. 55. In this embodiment of the invention, if the camera 32 is tilted 10 degrees in a counterclockwise direction as shown in FIG. 54, the display device controller 5500 tilts the corresponding display device 5490 by 10 degrees in a clockwise direction as shown in FIG. 54. In this way, the camera location with respect to the object 5400 is substantially the same as the eye lens location of the viewer with regard to the screen. As discussed above, the screen may comprise a V-shaped mirror, an HMD screen, a projection screen, or the display screen 160 shown in FIG. 1B.
  • Referring to FIG. 56, the entire operation of the system shown in FIG. 54 will be explained. The set of [0315] stereoscopic cameras 30 and 32 image an object (5510). Each of the direction detection devices 5410 and 5420 detects a camera direction with respect to the object (5520). That is, the degree of tilting of each of the cameras 30 and 32 from, for example, a parallel state is detected. The photographed image data (PID) and direction detection data (DDD) are transmitted to the display site (5530). The PID and DDD are received in the display site, and the DDD are retrieved from the received data (5540, 5550). In one embodiment of the invention, the retrieving may be performed using a typical signal separator.
  • At least one of the [0316] display device controllers 5470 and 5500 determines the display device adjusting values based on the retrieved DDD (5560). The at least one of the display device controllers 5470 and 5500 adjusts the display angle with respect to the viewer's eye lenses by moving a corresponding display device (5570). The display devices 5480 and 5490 display the received stereoscopic images (5580).
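  • A minimal sketch of this display-angle compensation, assuming the FIG. 55 table simply pairs each detected camera tilt with an equal and opposite display tilt, as in the 10-degree example above (the function name is illustrative):

    def display_adjusting_value(camera_tilt_deg: float) -> float:
        """Display adjusting value for a detected camera tilt (DDD).

        Assumes the FIG. 55 table maps each camera tilt to an equal,
        opposite display tilt: a camera tilted 10 degrees counterclockwise
        yields a display tilted 10 degrees clockwise, keeping the
        camera-to-object geometry substantially the same as the
        eye-to-screen geometry.
        """
        return -camera_tilt_deg

    # Camera 32 tilted +10 degrees (counterclockwise) -> display 5490
    # is tilted -10 degrees (clockwise).
    assert display_adjusting_value(10.0) == -10.0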
  • FIG. 57 illustrates a 3D display system according to another aspect of the invention. The system is directed to adjust the displayed image based on the relative location of the [0317] stereoscopic cameras 30 and 32 with regard to the object 5400.
  • The system shown in FIG. 57 is substantially the same as the one of FIG. 54 except for the [0318] display devices 5710 and 5720. The display devices 5710 and 5720 adjust the location of the displayed images based on the received camera direction detection data. Referring to FIG. 58, an exemplary block diagram of the display device 5720 is illustrated. Though not shown, the display device 5710 is substantially the same as the display device 5720. The display device 5720 comprises a microcomputer 5910, a memory 5920, a display driver 5930, and a display screen 5940. The memory 5920 stores a table (not shown) showing the relationship between the camera tilting angle and the adjustment amount of the displayed images. The microcomputer 5910 determines display image adjusting values based on the received camera direction data and the table in the memory 5920. The display driver 5930 adjusts the display angle of the displayed image based on the determined adjusting values, and displays the image on the display screen 5940.
  • Referring to FIGS. 59A and 59B, adjustment of the displayed image is illustrated. In one embodiment of the invention, this may be performed by enlarging or reducing the image portion of the left or right side of the displayed image. For example, the enlarging or reducing amount is determined according to the tilting angle of the camera. In this embodiment of the invention, the enlargement or reduction may be performed by known image reduction or magnification software. The image of FIG. 59A may correspond to a tilting of the display device in a clockwise direction. Similarly, the image of FIG. 59B may correspond to a tilting of the display device in a counterclockwise direction. [0319]
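  • The description leaves the enlargement or reduction to known software; one plausible realization is a keystone-style warp in which each pixel column is rescaled vertically by a factor that varies from one side of the frame to the other in proportion to the tilt angle. A sketch under that assumption (the angle-to-scale mapping is illustrative, not from the description):

    import numpy as np

    def tilt_adjust_image(img: np.ndarray, tilt_deg: float,
                          max_deg: float = 30.0, strength: float = 0.3) -> np.ndarray:
        """Shrink one side of the image in proportion to the camera tilt,
        in the spirit of FIGS. 59A/59B. Positive angles shrink the left
        edge; negative angles shrink the right edge."""
        h, w = img.shape[:2]
        out = np.zeros_like(img)
        frac = np.clip(tilt_deg / max_deg, -1.0, 1.0) * strength
        for x in range(w):
            t = x / (w - 1)                    # 0 at left edge, 1 at right
            scale = 1.0 - frac * (1.0 - t) if frac >= 0 else 1.0 + frac * t
            new_h = max(1, int(round(h * scale)))
            ys = (np.arange(new_h) * (h / new_h)).astype(int)  # nearest neighbor
            top = (h - new_h) // 2             # center the rescaled column
            out[top:top + new_h, x] = img[ys, x]
        return out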
  • Referring to FIG. 60, the operation of the system of FIG. 57 will be explained. As seen in FIG. 60, procedures [0320] 5810-5850 are the same as those shown in FIG. 56. Display image adjusting values are determined based on the retrieved camera direction detection data (DDD) (5860). The image to be displayed is adjusted as shown in FIG. 59 based on the determined adjusting values (5870). The adjusted image is displayed (5880).
  • Method and System for Transmitting, or Storing on a Persistent Memory, Stereoscopic Images and Photographing Ratios [0321]
  • FIG. 61 illustrates a 3D display system according to another aspect of the invention. In this aspect of the invention, stereoscopic images and photographing ratios are transmitted via a network such as the Internet, or stored on a persistent memory, such as optical or magnetic disks. [0322]
  • Referring to FIG. 61, the combined [0323] data 620 of stereoscopic images 624 and at least one photographing ratio (A:B:C) 622 for the images 624 are shown. The stereoscopic images 624 may comprise stereoscopic broadcasting images, stereoscopic advertisement images, stereoscopic movie images, stereoscopic product images for Internet shopping, or any other kind of stereoscopic images. In one embodiment of the invention, the photographing ratio 622 may be fixed for the entire set of stereoscopic images 624. A method of combining the stereoscopic images 624 and the photographing ratio 622 has been described above in connection with FIG. 7.
  • In one embodiment, [0324] the stereoscopic images 624 are produced by a pair of stereoscopic cameras (not shown) and combined with the photographing ratio 622. In one embodiment of the invention, the stereoscopic (broadcasting, advertisement, movie, etc.) images 624 and the photographing ratio 622 may be transmitted from an Internet server or a computing device of a broadcasting company. The Internet server may be operated by an Internet broadcasting company, an Internet movie company, an Internet advertising company, or an Internet shopping mall company. In another embodiment, the photographing ratio is not combined but rather is transmitted separately from the stereoscopic images. However, for convenience, the explanation below will be mainly directed to the combined method.
  • The combined [0325] data 620 are transmitted to a computing device 627 at a display site via a network 625. In one embodiment of the invention, the network 625 may comprise the Internet, a cable, a PSTN, or a wireless network. Referring to FIG. 63, an exemplary data format of the combined data 620 is illustrated. The left images and right images of the stereoscopic images 624 are embedded into the combined data 620 such that the images 624 are retrieved sequentially in a set of display devices 626 and 628. For example, left image 1 and right image 1, left image 2 and right image 2, are located in sequence in the data format such that the images can be retrieved in that sequence. In one embodiment, the computing device 627 receives the combined data 620 and retrieves the stereoscopic images 624 and photographing ratio 622 from the received data. In another embodiment, the images 624 and photographing ratio 622 are separately received as they are not combined in transmission.
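  • A minimal sketch of serializing and parsing such a stream, assuming a fixed ratio header followed by length-prefixed left/right frames (FIG. 63 shows the ordering but the description does not fix a byte layout, so the layout below is an assumption):

    import struct

    def pack_combined_data(ratio, image_pairs):
        """Serialize combined data 620: photographing ratio (A:B:C), then
        left image 1, right image 1, left image 2, right image 2, ...
        The byte layout (three doubles, then length-prefixed frames) is
        an assumption for illustration."""
        a, b, c = ratio
        out = [struct.pack("<3d", a, b, c)]
        for left, right in image_pairs:
            for frame in (left, right):
                out.append(struct.pack("<I", len(frame)))
                out.append(frame)
        return b"".join(out)

    def unpack_combined_data(data):
        """Retrieve the photographing ratio and the sequential image pairs."""
        ratio = struct.unpack_from("<3d", data, 0)
        offset, pairs = struct.calcsize("<3d"), []
        while offset < len(data):
            frames = []
            for _ in range(2):                 # one left, one right frame
                (n,) = struct.unpack_from("<I", data, offset)
                offset += 4
                frames.append(data[offset:offset + n])
                offset += n
            pairs.append(tuple(frames))
        return ratio, pairs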
  • The [0326] computing device 627 also provides the left and right images to the display devices 626 and 628, respectively. In one embodiment of the invention, the data format may be constituted such that the computing device 627 can identify the left and right images of the stereoscopic images 624 when the device 627 retrieves the images 624, for example, by a predetermined order or by data tagging. In one embodiment of the invention, the computing device 627 may comprise any kind of computing device that can download the images 624 and ratio 622, either in a combined format or separately, via the network 625. In one embodiment, a pair of computing devices, each retrieving and providing the left and right images to the display devices 626 and 628, respectively, may be provided in the display site.
  • The [0327] display devices 626 and 628 display the received stereoscopic images such that the screen ratios (D1:E1:F1, D2:E2:F2) of each of the display devices 626 and 628 are substantially the same as the photographing ratio (A:B:C). In one embodiment of the invention, the screen ratios (D1:E1:F1, D2:E2:F2) are the same (D1:E1:F1=D2:E2:F2=D:E:F). The display devices 626 and 628 may comprise the elements of the display devices 86 and 88 disclosed in FIG. 8. In one embodiment of the invention, each of the display devices 626 and 628 may comprise a CRT, LCD, HMD, or PDP device, or a projection-type display device.
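  • The displayed image size that makes the screen ratio match the photographing ratio follows directly from the proportion D:E:F = A:B:C, i.e., D = A × F/C and E = B × F/C (this relation appears as Equation V in claim 9 below). A short sketch, assuming the screen-viewer distance F is known:

    def displayed_image_size(a, b, c, f):
        """Display lengths D, E making D:E:F substantially the same as
        the photographing ratio A:B:C (the Equation V relation)."""
        if c <= 0 or f <= 0:
            raise ValueError("distances C and F must be positive")
        return a * f / c, b * f / c

    # A scene 1.6 m wide and 1.2 m tall photographed from 4 m, viewed
    # from 0.5 m, is displayed 0.2 m wide and 0.15 m tall.
    d, e = displayed_image_size(1.6, 1.2, 4.0, 0.5)
    assert (d, e) == (0.2, 0.15)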
  • In another embodiment of the invention, as shown in FIG. 62, the combined data which are stored in a [0328] recording medium 630 such as optical or magnetic disks may be provided to the display devices 634 and 636 via a medium retrieval device 632 at the display site. In one embodiment, the optical disks may comprise a compact disk (CD) or a digital versatile disk (DVD). Also, the magnetic disk may comprise a hard disk.
  • The [0329] recording medium 630 is inserted into the medium retrieval device 632, which retrieves the stereoscopic images 624 and the photographing ratio 622. In one embodiment of the invention, the medium retrieval device 632 may comprise a CD-ROM drive, a DVD-ROM drive, or a hard disk drive (HDD), and a host computer for the drive. The medium retrieval device 632 may be embedded in a computing device (not shown).
  • The [0330] medium retrieval device 632 retrieves and provides the stereoscopic images 624 and the photographing ratio 622 to the display devices 634 and 636, respectively. The exemplified data format shown in FIG. 63 may apply to the data stored in the recording medium 630. In one embodiment of the invention, the photographing ratio 622 is the same for the entire set of stereoscopic images. In this embodiment, the photographing ratio 622 is provided once to each of the display devices 634 and 636, and the same photographing ratio is used throughout the stereoscopic images.
  • In one embodiment of the invention, the data format recorded in the medium [0331] 630 is constituted such that the medium retrieval device 632 can identify the left and right images of the stereoscopic images 624. The operation of the display devices 634 and 636 is substantially the same as that of the devices 626 and 628 as discussed with regard to FIG. 61.
  • Portable Communication Device Comprising a Pair of Digital Cameras that Produce Stereoscopic Images and a Pair of Display Screens [0332]
  • FIG. 64 illustrates an information communication system according to another aspect of the invention. The system comprises a pair of [0333] portable communication devices 65 and 67. The device 65 comprises a pair of digital cameras 640, 642, a pair of display screens 644, 646, a distance input portion 648, an eye interval input portion 650, and a space magnification input portion 652. The device 65 also comprises a receiver and a transmitter, or a transceiver (all not shown).
  • The pair of [0334] digital cameras 640 and 642 produce stereoscopic images of a scene or an object and photographing ratios thereof. In one embodiment of the invention, each of the cameras 640 and 642 comprises substantially the same elements of the camera 20 shown in FIG. 7. The device 65 transmits the produced stereoscopic images and photographing ratios to the device 67. The pair of display screens 644 and 646 display stereoscopic images received from the device 67.
  • The [0335] distance input portion 648 is provided with the distance values (similar to the screen-viewer distances F1 and F2 in FIG. 8) between a viewer's eyes and each of the screens 644 and 646. The eye interval input portion 650 receives the distance value (exemplified as Wa in FIG. 14A) between the center points of a viewer's eyes. The space magnification input portion 652 is provided with adjusting data for space magnification, and provides the adjusting data to the device 65. In one embodiment of the invention, each of the distance input portion 648, the eye interval input portion 650, and the space magnification input portion 652 comprises a key pad that can input the numerals 0-9. In another embodiment, all of the input portions are embodied as one input device.
  • The [0336] device 67 comprises a pair of digital cameras 664, 666, a pair of display screens 654, 656, a distance input portion 658, an eye interval input portion 660, and a space magnification input portion 662. The device 67 also comprises a receiver and a transmitter, or a transceiver (all not shown).
  • The pair of [0337] digital cameras 664 and 666 produce stereoscopic images of a scene or an object and photographing ratios thereof. In one embodiment of the invention, each of the cameras 664 and 666 comprises substantially the same elements of the camera 20 shown in FIG. 7. The device 67 transmits the produced stereoscopic images and photographing ratios to the device 65. The pair of display screens 654 and 656 display stereoscopic images received from the device 65.
  • The [0338] distance input portion 658, the eye interval input portion 660, and the space magnification input portion 662 are substantially the same as those of the device 65.
  • The system shown in FIG. 64 may comprise at least one base station (not shown) communicating with the [0339] devices 65 and 67. In one embodiment of the invention, each of the devices 65 and 67 comprises a cellular phone, an IMT-2000 (International Mobile Telecommunication-2000) device, a personal digital assistant (PDA), a hand-held PC, or another type of portable telecommunication device.
  • In one embodiment of the invention, the space magnification adjusting data and photographing ratios have a standard data format so that the [0340] devices 65 and 67 can identify the data easily.
  • The Devices Displaying Stereoscopic Images are Implemented such that the Photographing Ratio is Substantially the Same as the Screen Ratio [0341]
  • FIG. 65 illustrates a pair of [0342] information communication devices 65 and 67 according to one aspect of the invention. Each of the devices 65 and 67 displays stereoscopic images received from the other device such that the photographing ratio of one device is substantially the same as the screen ratio of the other device. The device 65 comprises a camera portion 700, a display portion 720, and a data processor 740, e.g., a microcomputer.
  • The [0343] camera portion 700 produces and transmits stereoscopic images and photographing ratios thereof to the device 67. As discussed above, the communication between the devices 65 and 67 may be performed via at least one base station (not shown). The camera portion 700 comprises the pair of digital cameras 640, 642, and a transmitter 710. Each of the digital cameras 640 and 642 produces stereoscopic images and photographing ratios thereof, and combines the images and ratios (combined data 702 and 704). In one embodiment of the invention, the photographing ratios provided in the combined data 702 and 704 are the same. Each of the digital cameras 640 and 642 may comprise the elements of the camera 20 shown in FIG. 7.
  • The production of the stereoscopic images and the calculation of the photographing ratios, and the combining of the images and ratios have been explained in detail with regard to FIGS. [0344] 5-11. The transmitter 710 transmits the combined data 702, 704 to the device 67. In another embodiment, the photographing ratios are not combined, and rather, are transmitted separately from the stereoscopic images.
  • In one embodiment of the invention, the [0345] transmitter 710 may comprise two transmitting portions that transmit the combined data 702 and 704, respectively. The device 67 receives and displays the stereoscopic images transmitted from the device 65 such that the received photographing ratio is substantially the same as the screen ratio of the device 67.
  • The [0346] display portion 720 receives combined data 714 and 716 of stereoscopic images and photographing ratios thereof from the device 67, and displays the stereoscopic images such that the received photographing ratio is substantially the same as the screen ratio of the device 65.
  • The [0347] display portion 720 comprises a pair of display devices 706, 708, and a receiver 712. The receiver 712 receives the combined data 714 and 716 that the device 67 transmitted, and provides the combined data 714, 716 to the display devices 706, 708, respectively. In one embodiment of the invention, the receiver 712 may comprise two receiving portions that receive the combined data 714 and 716, respectively. In another embodiment, the images and photographing ratios are separately received as they are not combined in transmission.
  • Each of the [0348] display devices 706 and 708 separates the provided images and ratios from the receiver 712. The devices 706 and 708 also display the stereoscopic images such that the photographing ratios are substantially the same as the screen ratios of the display devices 706 and 708, respectively. Each of the display devices 706 and 708 may comprise substantially the same elements of the display device 86 or 88 shown in FIG. 8. In one embodiment, the display devices 706 and 708 are connected to the distance input portion 648 shown in FIG. 64 so that the screen-viewer distance for the devices 706 and 708 can be provided to the device 65. In one embodiment of the invention, the screen ratios for the devices 706 and 708 are substantially the same. The detailed operation of the display devices 706 and 708 has been explained in connection with FIGS. 8-11.
  • The [0349] microcomputer 740 controls the operation of the camera portion 700 and display portion 720, and data communication with the device 67. In one embodiment of the invention, the microcomputer 740 is programmed to control the camera portion 700 such that the digital cameras 640 and 642 produce stereoscopic images and photographing ratios thereof, and that the transmitter 710 transmits the images and ratios to the device 67 when the communication link is established between the devices 65 and 67. In another embodiment of the invention, the microcomputer 740 is programmed to control the power of the camera portion 700 and the display portion 720 independently. In this embodiment, even when the cameras 640 and 642 are turned off, the display devices 706 and 708 may display the stereoscopic images received from the device 67. Also, when the display devices 706 and 708 are turned off, the cameras 640 and 642 may produce stereoscopic images and photographing ratios thereof, and transmit the images and ratios to the device 67. In this embodiment, the device 65 may comprise an element that performs a voice signal communication with the device 67.
  • The [0350] device 65 may include a volatile memory such as a RAM and/or a non-volatile memory such as a flash memory or a programmable ROM that store data for the communication. The device 65 may comprise a power supply portion such as a battery.
  • In another embodiment of the invention, the [0351] device 65 may include a transceiver that incorporates the transmitter 710 and the receiver 712. In this situation, the separate transmitter 710 and receiver 712 may be omitted.
  • Though not specifically shown, the [0352] device 67 may be configured to comprise substantially the same elements and perform substantially the same functions as those of the device 65 shown in FIG. 65. Thus, the detailed explanation of embodiments thereof will be omitted.
  • The Devices Controlling the Display Location of the Stereoscopic Images
  • FIG. 66A illustrates an [0353] information communication device 65 according to another aspect of the invention. In this aspect of the invention, the information communication device 65 controls the display location of the stereoscopic images based on the distance (Wa) between the center points of a viewer's eyes.
  • In one embodiment of the invention, the [0354] device 65 moves the stereoscopic images displayed in the display screens 644 and 646 such that the distance (Wd) between the center points of the displayed stereoscopic images is substantially the same as the Wa distance. The device 65 comprises an eye interval input portion 650, a data processor 722, e.g., a microcomputer, a pair of display drivers 724, 726, and a pair of display screens 644, 646. The eye interval input portion 650 and the pair of display screens 644 and 646 are substantially the same as those of FIG. 64.
  • The [0355] microcomputer 722 controls the display drivers 724 and 726 based on the received Wa distance such that the Wd distance is substantially the same as the Wa distance. Specifically, the display drivers 724 and 726 move the stereoscopic images displayed in the display screens 644 and 646 until Wd is substantially the same as Wa. The detailed explanation with regard to the movement of the stereoscopic images has been provided in connection with FIGS. 15-17.
  • In another embodiment of the invention, as shown in FIG. 66B, the [0356] device 65 moves the display screens 644 and 646 such that the distance (Wd) between the center points of the stereoscopic images is substantially the same as the Wa distance. In this embodiment, the device 65 comprises the eye interval input portion 650, a microcomputer 732, a pair of servo mechanisms 734, 736, and the pair of display screens 644, 646.
  • The [0357] microcomputer 732 controls the servo mechanisms 734 and 736 based on the received Wa distance such that the Wd distance is substantially the same as the Wa distance. Specifically, the servo mechanisms 734 and 736 move the display screens 644 and 646 until Wd is substantially the same as Wa. The detailed explanation with regard to the movement of the display screens has been provided with regard to FIGS. 18-20.
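  • Both variants, moving the displayed images (FIG. 66A) or moving the screens themselves (FIG. 66B), reduce to closing the gap between Wd and Wa. A sketch of the per-side correction, assuming the movement is split equally between the left and right sides (the description only requires that Wd become substantially the same as Wa):

    def per_side_shift(wa_mm: float, wd_mm: float) -> float:
        """Horizontal shift for each displayed image (FIG. 66A) or each
        display screen (FIG. 66B) so that the distance Wd between the
        image center points matches the eye distance Wa. Positive values
        move the left image left and the right image right (widening Wd);
        negative values narrow Wd."""
        return (wa_mm - wd_mm) / 2.0

    # Eyes 65 mm apart, image centers currently 71 mm apart:
    # move each image 3 mm inward.
    assert per_side_shift(65.0, 71.0) == -3.0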
  • Though not specifically shown, the [0358] device 67 may comprise substantially the same elements and perform substantially the same functions as those of the device 65 shown in FIGS. 66A and 66B. Thus, the detailed explanation of embodiments thereof will be omitted.
  • The Devices Adjusting Space Magnification of Stereoscopic Images
  • FIG. 67 illustrates an [0359] information communication device 65 according to another aspect of the invention. In this aspect of the invention, the information communication device 65 adjusts space magnification based on adjusting data for space magnification. The device 65 comprises a camera portion 760, a display portion 780, and a microcomputer 750.
  • The [0360] camera portion 760 comprises a pair of digital cameras 640, 642, a camera controller 742, and a transceiver 744. The transceiver 744 receives adjusting data for space magnification from the device 67, and provides the adjusting data (C) to the camera controller 742. Space magnification embodiments have been explained in detail with respect to FIGS. 49-53. The adjusting data for space magnifications are exemplified in FIG. 52.
  • The [0361] camera controller 742 controls the distance (interval) between the digital cameras 640 and 642 based on the provided adjusting data (C). In one embodiment of the invention, the camera controller 742 comprises a motor that adjusts the camera distance, and a servo controller that controls the motor (both not shown). The operation of the camera controller 742 is substantially the same as that of the controller 4990 described in connection with FIGS. 50-52. The digital cameras 640 and 642 produce stereoscopic images at the adjusted interval, and transmit the stereoscopic images to the device 67 through the transceiver 744. The device 67 receives and displays the adjusted stereoscopic images. In this way, the device 67 can adjust the space magnification for a scene imaged by the cameras 640 and 642 of the device 65. In one embodiment of the invention, each of the devices 65 and 67 may display, in at least one of its display screens, the current space magnification, such as “1,” “0.5,” or “10,” etc., so that a viewer can know the current space magnification. In another embodiment of the invention, the devices 65 and 67 may provide a user with an audio signal representing the current space magnification.
  • In another embodiment, space magnification adjusting data (A) may be provided to the [0362] camera controller 742, for example, through the space magnification input portion 652 shown in FIG. 64. This embodiment may be useful in a situation where a user of the device 65 wants to provide stereoscopic images in adjusted space magnification to a user of the device 67. In one embodiment, the operation of the camera controller 742 is substantially the same as in a situation where the adjusting data (C) is received from the device 67.
  • The [0363] display portion 780 comprises a pair of display screens 644, 646, and a transceiver 746. Space magnification (SM) adjusting data (B) are provided to the transceiver 746 from a user of the device 65. The SM adjusting data (B) are used to adjust the interval between the cameras 664 and 666 of the device 67 (FIG. 64). The SM adjusting data (B) may also be provided to at least one of the display screens 644 and 646 so that the SM adjusting data (B) are displayed in the at least one of the display screens 644 and 646. This is to inform a user of the device 65 of current space magnification. The transceiver 746 transmits the SM adjusting data (B) to the device 67.
  • The [0364] device 67 receives the SM adjusting data (B) and adjusts the interval between the cameras 664 and 666 of the device 67 based on the adjusting data (B). Also, the device 67 transmits stereoscopic images produced in adjusted space magnification to the device 65. The transceiver 746 receives left and right images from the device 67 and provides the images to the display screens 644 and 646, respectively. The display screens 644 and 646 display the stereoscopic images. In one embodiment, each of the devices 65 and 67 of FIG. 67 may further comprise the functions of the devices 65 and 67 described in connection with FIGS. 65 and 66.
  • The [0365] microcomputer 750 controls the operation of the camera portion 760 and the display portion 780, and the data communication with the device 67. In one embodiment of the invention, the microcomputer 750 is programmed to control the camera portion 760 and the display portion 780 such that after the communication link between the devices 65 and 67 is established, the SM adjusting data (B, C) are exchanged between the devices. In another embodiment of the invention, the microcomputer 750 is programmed to control the camera portion 760 such that the camera controller 742 adjusts the interval between the digital cameras 640 and 642 based on the SM adjusting data (A) even when the communication link between the devices 65 and 67 is not established.
  • The [0366] device 65 may include a volatile memory such as a RAM and/or a non-volatile memory such as a flash memory or a programmable ROM that store data for the communication. The device 65 may comprise an element that performs a voice signal transmission.
  • Though not specifically shown, embodiments of the [0367] device 67 comprise substantially the same elements and perform the same functions as those of the device 65 shown in FIG. 67. Thus, a detailed explanation of these embodiments will be omitted.
  • The Device Comprising Separate Display Screens [0368]
  • In another embodiment of the invention, the [0369] communication device 65 comprises a goggle shaped display device 649 as shown in FIG. 68. The goggle shaped display device comprises a set of display screens 645 and 647. In one embodiment of the invention, the display device 649 may be connected to the device 65 through a communication jack 643. In another embodiment of the invention, the display device 649 may have a wireless connection to the device 65.
  • The [0370] device 67 may be applied to the embodiments described with regard to FIGS. 65-67. In one embodiment of the invention, each of the devices 65 and 67 may comprise a head mount display (HMD) device that includes a set of display screens.
  • Other Aspects of the Invention [0371]
  • FIG. 69 illustrates a 3D display system according to another aspect of the invention. In this aspect of the invention, stereoscopic images are produced from three-dimensional structural data. The three-dimensional structural data may comprise 3D game data or 3D animation data. [0372]
  • As one example, the three-dimensional structural data comprise pixel values (e.g., RGB pixel values) ranging from, for example, (0000, 0000, 0000) to (9999, 9999, 9999) in the locations from (000, 000, 000) to (999, 999, 999) in a 3D coordinate system (x, y, z). In this situation, Table 1 exemplifies data #1-data #N of the 3D structural data. [0373]
    TABLE 1
    Data #1: location (001, 004, 002), pixel value (0001, 0003, 1348)
    Data #2: location (001, 004, 004), pixel value (0010, 0033, 1234)
    . . .
    Data #N: location (025, 400, 087), pixel value (0001, 3003, 1274)
  • In one embodiment of the invention, as shown in FIG. 69A, stereoscopic images are produced from three-dimensional [0374] structural data 752 in a remote server. The three-dimensional structural data 752 are projected onto a pair of two-dimensional planes using known projection portions 754 and 756, which are also frequently referred to as imaginary cameras or view points in stereoscopic image display technology. The projection portions may comprise known software that performs the projection function. The projected images are stereoscopic images, each comprising a pair of two-dimensional plane images, and are transmitted to a display site. In the display site, the stereoscopic images are displayed in a pair of display devices.
  • In another embodiment of the invention, as shown in FIG. 69B, stereoscopic images are produced from three-dimensional structural data in a display site. In this embodiment, the three-dimensional structural data may be transmitted or downloaded from a remote server to the display site. The [0375] projection portions 772 and 774 are located in a computing device 770. In one embodiment of the invention, the projection portions 772 and 774 may comprise a software module and be downloaded with the structural data from the remote server to the computing device 770 of the display site. The projected images, i.e., the produced stereoscopic images, are displayed through a pair of display devices 776 and 778. In another embodiment of the invention, the 3D structural data are stored on a recording medium such as an optical or magnetic disk, which is inserted into and retrieved by the computing device 770 as discussed with regard to FIG. 62. In this situation, a software module for the projection portions 772 and 774 may be included in the medium.
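  • A minimal sketch of such projection portions, assuming a simple pinhole model with two horizontally separated view points over Table-1-style data (all numbers and the camera interval are illustrative; the description only says that known projection portions produce the pair of two-dimensional plane images):

    # Table-1-style 3D structural data: (x, y, z) location -> pixel value.
    structural_data = {
        (1, 4, 2): (1, 3, 1348),
        (1, 4, 4): (10, 33, 1234),
        (25, 400, 87): (1, 3003, 1274),
    }

    def project(points, camera_x, focal=1.0):
        """Project 3D points onto a 2D plane as seen from an imaginary
        camera (view point) at (camera_x, 0, 0) looking along +z."""
        image = {}
        for (x, y, z), pixel in points.items():
            if z <= 0:
                continue                      # behind the view point
            u = focal * (x - camera_x) / z
            v = focal * y / z
            image[(u, v)] = pixel
        return image

    # Two horizontally separated view points produce the stereoscopic pair.
    interval = 0.065                          # imaginary camera interval
    left_image = project(structural_data, -interval / 2)
    right_image = project(structural_data, +interval / 2)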
  • A method of producing stereoscopic images from the three-dimensional structural data is, for example, disclosed in U.S. Pat. No. 6,005,607, issued Dec. 21, 1999, which is incorporated by reference herein. [0376]
  • This aspect of the invention may be applied to all of the aspects of the invention described above. In some embodiments, however, some modification may be made. As one example, the photographing ratios of the imaginary cameras (projection portions, view points) may be calculated by calculating horizontal and vertical lengths of a photographed object or scene and the distance between the cameras and the object (scene), using the location of the cameras and object in the projected coordinate system. [0377]
  • As another example, the control of the motions of the imaginary cameras may be performed by computer software that identifies the location of the imaginary cameras and controls the movement of the cameras. [0378]
  • As another example, the control of the space magnification may be performed by adjusting the interval between the imaginary cameras using the identified location of the imaginary cameras in the projected coordinate system. [0379]
  • FIG. 70 illustrates a 3D display system according to another aspect of the invention. This aspect of the invention is directed to display stereoscopic images such that the resolution of each display device is substantially the same as that of each stereoscopic camera. In this aspect of the invention, the locations of the pixels that are photographed in each camera with regard to a camera frame (e.g., 640×480) are substantially the same as those of the pixels that are displayed in each display device with regard to a display screen (e.g., 1280×960). Referring to FIG. 70, the resolution of the display device is double that of the camera. Thus, one pixel at the left top corner photographed in the camera is converted to four pixels of the display screen in the same location as shown in FIG. 70. Similarly, one pixel at the right bottom corner photographed in the camera is converted to four pixels of the display screen in the same location as shown in FIG. 70. This aspect of the invention may be applied to all of the 3D display systems described in this application. [0380]
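  • When the display resolution is an integer multiple of the camera resolution, the conversion amounts to replicating each camera pixel into a block at the same relative location. A sketch for the doubled resolution of FIG. 70 (camera 640×480, display 1280×960), with the function name illustrative:

    import numpy as np

    def match_camera_resolution(frame: np.ndarray, factor: int = 2) -> np.ndarray:
        """Replicate each camera pixel into a factor x factor block so its
        location relative to the display screen is the same as relative
        to the camera frame, as in FIG. 70."""
        return frame.repeat(factor, axis=0).repeat(factor, axis=1)

    camera_frame = np.zeros((480, 640, 3), dtype=np.uint8)
    camera_frame[0, 0] = (255, 0, 0)          # left top corner pixel
    display_frame = match_camera_resolution(camera_frame)
    assert display_frame.shape[:2] == (960, 1280)
    # The one left top camera pixel occupies the four left top display
    # pixels, as illustrated in FIG. 70.
    assert (display_frame[:2, :2] == (255, 0, 0)).all()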
  • The above systems have been described as including a communication link connecting the display site to a remote camera site. However, these various inventions can be practiced without a receiver, a transmitter, and a network, so that the functions are performed at a single site. Some of the above systems also have been described based on a viewer's eye lens motion or location. However, the systems can be practiced based on a viewer's eye pupils or corneas. [0381]
  • While the above description has pointed out novel features of the invention as applied to various embodiments, the skilled person will understand that various omissions, substitutions, and changes in the form and details of the device or process illustrated may be made without departing from the scope of the invention. Therefore, the scope of the invention is defined by the appended claims rather than by the foregoing description. All variations coming within the meaning and range of equivalency of the claims are embraced within their scope. [0382]

Claims (25)

What is claimed is:
1. A method of displaying an image, comprising:
generating a digital image of a scene by a camera;
measuring a photographing ratio (A:B:C) of the camera while the digital image is being generated, wherein parameters A and B are defined as horizontal and vertical lengths of the scene imaged by the camera, respectively, and C is defined as a distance between an object lens of the camera and the scene;
transmitting the image and the photographing ratio (A:B:C) to a display device; and
displaying the transmitted image in the display device such that a screen ratio (D:E:F) of the display device is substantially the same as the photographing ratio (A:B:C), wherein D and E are defined as horizontal and vertical lengths of the image displayed in the display device, respectively, and F is defined as a distance between the display device and a viewing point.
2. The method of claim 1, wherein the viewing point comprises a point where the midpoint between a viewer's eyes is located at a substantially perpendicular angle with regard to the center point of the displayed image.
3. The method of claim 2, wherein the camera comprises a focus adjusting device, the focus adjusting device comprising a plurality of scales, wherein the method further comprises storing maximum and minimum photographing ratios of the camera, and detecting a scale location of the focus adjusting device while generating the digital image, and wherein the parameters A and B are calculated using Equations I and II,
wherein Equation I is:
A = (Amax - Amin/c) × (Scur/Stot) + Amin/c,
and wherein Equation II is:
B = (Bmax - Bmin/c) × (Scur/Stot) + Bmin/c,
wherein: Amax and Bmax are parameters for the maximum photographing ratio, Amin and Bmin are parameters for the minimum photographing ratio, c is equal to 1, and Stot and Scur are the total number of scales and the detected scale location value, respectively.
4. The method of claim 3, further comprising:
providing a pixel resolution of the camera;
providing a pixel resolution of the display device; and
converting the resolution of the camera to the resolution of the display device based on the number of pixels to be displayed by the display device.
5. A method of displaying stereoscopic images, comprising:
producing at least one stereoscopic image of a scene, the stereoscopic image comprising a pair of two-dimensional plane images produced by first and second cameras;
measuring a first photographing ratio (A1:B1:C1) of the first camera and a second photographing ratio (A2:B2:C2) of the second camera while the scene is being imaged, respectively, wherein A1 and A2, and B1 and B2 are defined as horizontal and vertical lengths of the scene imaged by each of the first and second cameras, respectively, and C1 and C2 are defined as distances between object lenses of the cameras and the scene, respectively;
transmitting the stereoscopic image and the photographing ratios (A1:B1:C1, A2:B2:C2) to first and second display devices, respectively; and
displaying the transmitted stereoscopic image in the display devices such that each screen ratio (D1:E1:F1, D2:E2:F2) of the display devices is substantially the same as each photographing ratio (A1:B1:C1, A2:B2:C2), wherein D1 and D2, and E1 and E2 are defined as horizontal and vertical lengths of the two-dimensional plane images displayed in each of the display devices, respectively, and F1 and F2 are defined as distances between the display devices and viewing points, respectively.
6. The method of claim 5, wherein the viewing points comprise points where each of a viewer's eyes is located at a substantially perpendicular angle with regard to the center points of the displayed images, respectively.
7. A system for displaying at least one image in at least one display device, comprising:
a receiver configured to receive at least one image of a scene and at least one photographing ratio, the at least one image being produced by at least one camera, wherein the photographing ratio is defined as a ratio (A:B:C) between three parameters A, B and C, wherein parameters A and B are defined as horizontal and vertical lengths of the scene imaged by the camera, respectively, and C is defined as a distance between an object lens of the camera and the scene;
a signal separator configured to separate the image and the photographing ratio;
an image size adjusting portion configured to adjust a size of the received image to be displayed based on the photographing ratio and at least one screen ratio, wherein the screen ratio is defined as a ratio (D:E:F) between three parameters D, E and F, wherein parameters D and E are defined as horizontal and vertical lengths of the image displayed in the display device, respectively, and F is defined as a distance between the display device and a viewing point; and
at least one display portion configured to display the adjusted image.
8. The system of claim 7, wherein the viewing point comprises a point where the midpoint between a viewer's eyes is located at a substantially perpendicular angle with regard to the center point of the displayed image.
9. The system of claim 8, wherein the image size adjusting portion is further configured to receive the parameter F and calculate the two parameters (D, E) of the screen ratio using Equation V, wherein Equation V is:
D = A × F/C, and E = B × F/C.
10. The system of claim 8, wherein the image size adjusting portion is further configured to calculate image adjustment ratios (d, e) using Equation VI, wherein Equation VI is:
d = D/G, and e = E/H,
wherein the image size adjusting portion is further configured to adjust the displayed image based on the screen ratio, wherein parameters G and H are horizontal and vertical screen sizes of the display device, respectively, and configured to adjust the horizontal and vertical lengths of the displayed image by ratios d and e, respectively.
11. The system of claim 10, wherein the image size adjusting portion is further configured to magnify the display image and cut out a portion of the image greater than the screen size (G, H), where both ratios d and e are greater than 1.
12. The system of claim 10, wherein the image size adjusting portion is further configured to reduce the display image and fill a blank portion of the screen with a background color, where both ratios d and e are less than 1.
13. The system of claim 7, wherein the at least one camera comprises a pair of stereoscopic cameras configured to produce at least one stereoscopic image, the stereoscopic image comprising a pair of two-dimensional plane images produced by the set of stereoscopic cameras, respectively,
wherein the at least one display portion comprises a pair of display portions configured to display the pair of two-dimensional plane images, respectively,
and wherein the at least one photographing ratio comprises a pair of photographing ratios of each of the stereoscopic cameras, and the at least one screen ratio comprises a pair of screen ratios of each of the display portions.
14. The system of claim 13, wherein the display portions are selected from one of the following: a head mount display, a set of projection display devices, a set of LCD devices, a set of CRT devices, and a set of plasma display panel devices.
15. A method of displaying images in at least one display device, comprising:
receiving at least one image of a scene and photographing ratio, the at least one image being produced by at least one camera, wherein the photographing ratio is defined as a ratio (A:B:C) between three parameters A, B and C, wherein parameters A and B are defined as horizontal and vertical lengths of the scene imaged by the camera, respectively, and C is defined as a distance between an object lens of the camera and the scene;
determining a screen ratio of the display device, wherein the screen ratio is defined as a ratio (D:E:F) between three parameters D, E and F, wherein parameters D and E are defined as horizontal and vertical lengths of the image displayed in the device, respectively, and F is defined as a distance between the display device and a viewing point;
adjusting a size of the received image to be displayed such that the photographing ratio (A:B:C) is substantially the same as the screen ratio (D:E:F); and
displaying the adjusted image in the display device.
16. The method of claim 15, wherein the viewing point comprises a point where the midpoint between a viewer's eyes is located at a substantially perpendicular angle with regard to the center point of the displayed image.
17. The method of claim 15, wherein the at least one camera comprises a pair of stereoscopic cameras configured to produce at least one stereoscopic image, the stereoscopic image comprising a pair of two-dimensional plane images produced by the set of stereoscopic cameras, respectively,
wherein the at least one display device comprises a set of display devices configured to display the pair of two-dimensional plane images, respectively,
and wherein the at least one photographing ratio comprises a pair of photographing ratios of each of the stereoscopic cameras, and the at least one screen ratio comprises a pair of screen ratios of each of the display devices.
18. The method of claim 15, wherein the parameter F is determined using a distance detection sensor.
19. The method of claim 15, wherein the parameters D and E are stored in the display device.
20. The method of claim 15, wherein the parameters A1, B1 and C1 are substantially the same as the parameters A2, B2 and C2, respectively.
21. The method of claim 15, wherein the parameter F1 is substantially the same as the parameter F2.
22. A system for displaying images in at least one display device, comprising:
means for receiving at least one image of a scene and photographing ratio, the at least one image being produced by at least one camera, wherein the photographing ratio is defined as a ratio (A:B:C) between three parameters A, B and C, wherein parameters A and B are defined as horizontal and vertical lengths of the scene imaged by the camera, respectively, and C is defined as a distance between an object lens of the camera and the scene;
means for determining a screen ratio of the display device, wherein the screen ratio is defined as a ratio (D:E:F) between three parameters D, E and F, wherein parameters D and E are defined as horizontal and vertical lengths of the image displayed in the device, respectively, and F is defined as a distance between the display device and a viewing point;
means for adjusting a size of the received image to be displayed such that the photographing ratio (A:B:C) is substantially the same as the screen ratio (D:E:F); and means for displaying the adjusted image in the display device.
23. A method of displaying stereoscopic images, comprising:
producing at least one stereoscopic image of a scene from three-dimensional structural data, the stereoscopic image comprising a pair of two-dimensional plane images projected by first and second projection portions;
providing a first photographing ratio (A1:B1:C1) of the first projection portion and a second photographing ratio (A2:B2:C2) of the second projection portion, respectively, wherein A1 and A2, and B1 and B2 are defined as horizontal and vertical lengths of the scene projected by each of the first and second projection portions, respectively, and C1 and C2 are defined as distances between the first and second projection portions and the scene, respectively;
transmitting the produced stereoscopic image and the corresponding photographing ratios (A1:B1:C1, A2:B2:C2) to first and second display devices; and
displaying the transmitted stereoscopic image in the display devices such that each screen ratio (D1:E1:F1, D2:E2:F2) of the display devices is substantially the same as each photographing ratio (A1:B1:C1, A2:B2:C2), wherein D1 and D2, and E1 and E2 are defined as horizontal and vertical lengths of the two-dimensional images displayed in each of the display devices, respectively, and F1 and F2 are defined as distances between the display devices and viewing points, respectively.
24. A method of displaying stereoscopic images, comprising:
producing at least one stereoscopic image of a scene from three-dimensional structural data, the stereoscopic image comprising a pair of two-dimensional plane images projected by first and second projection portions;
providing a first photographing ratio (A1:B1:C1) of the first projection portion and a second photographing ratio (A2:B2:C2) of the second projection portion, respectively, wherein A1 and A2, and B1 and B2 are defined as horizontal and vertical lengths of the scene projected by each of the first and second projection portions, respectively, and C1 and C2 are defined as distances between the first and second projection portions and the scene, respectively; and
displaying the stereoscopic image in a pair of display devices such that each screen ratio (D1:E1:F1, D2:E2:F2) of the display devices is substantially the same as each photographing ratio (A1:B1:C1, A2:B2:C2), wherein D1 and D2, and E1 and E2 are defined as horizontal and vertical lengths of the two-dimensional plane images displayed in each of the display devices, respectively, and F1 and F2 are defined as distances between the display devices and viewing points, respectively.
25. A system for displaying stereoscopic images, comprising:
first and second projection portions configured to produce at least one stereoscopic image of a scene from three-dimensional structural data, the stereoscopic image comprising a pair of two-dimensional plane images projected by first and second projection portions;
a computing device configured to provide a first photographing ratio (A1:B1:C1) of the first projection portion and a second photographing ratio (A2:B2:C2) of the second projection portion, respectively, wherein A1 and A2, and B1 and B2 are defined as horizontal and vertical lengths of the scene projected by each of the first and second projection portions, respectively, and C1 and C2 are defined as distances between each of the projection portions and the scene, respectively; and
a display portion configured to display the stereoscopic image in a pair of display screens such that each screen ratio (D1:E1:F1, D2:E2:F2) of the display screens is substantially the same as each photographing ratio (A1:B1:C1, A2:B2:C2), wherein D1 and D2, and E1 and E2 are defined as horizontal and vertical lengths of the two-dimensional plane images displayed in each of the display screens, respectively, and F1 and F2 are defined as distances between the display screens and viewing points, respectively.
US10/280,246 2001-08-17 2002-10-24 Method and system for controlling a screen ratio based on a photographing ratio Abandoned US20030113012A1 (en)

Applications Claiming Priority (11)

Application Number Priority Date Filing Date Title
PCT/KR2001/001398 WO2002015595A1 (en) 2000-08-18 2001-08-17 A method and system of revision for 3-dimensional image
KR01-67246 2001-10-30
KR10-2001-0067246A KR100397066B1 (en) 2001-10-30 2001-10-30 A method for controlling photo-rate and camera having apparatus for sensing photo-rate
KR01-67245 2001-10-30
KR10-2001-0067245A KR100445799B1 (en) 2001-10-30 2001-10-30 Method and apparatus for image display
KR02-10423 2002-02-27
KR02-10422 2002-02-27
KR20020010423 2002-02-27
KR20020010422 2002-02-27
KR02-10424 2002-02-27
KR20020010424 2002-02-27

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2001/001398 Continuation WO2002015595A1 (en) 2000-08-18 2001-08-17 A method and system of revision for 3-dimensional image

Publications (1)

Publication Number Publication Date
US20030113012A1 true US20030113012A1 (en) 2003-06-19

Family ID=27532378

Family Applications (13)

Application Number Title Priority Date Filing Date
US10/280,465 Expired - Fee Related US7091931B2 (en) 2001-08-17 2002-10-24 Method and system of stereoscopic image display for guiding a viewer's eye motion using a three-dimensional mouse
US10/280,241 Abandoned US20030107645A1 (en) 2001-08-17 2002-10-24 Method and system for controlling the display location of stereoscopic images
US10/280,179 Expired - Fee Related US7190825B2 (en) 2001-08-17 2002-10-24 Portable communication device for stereoscopic image display and transmission
US10/280,344 Abandoned US20030112508A1 (en) 2001-08-17 2002-10-24 Method and system for controlling space magnification for stereoscopic images
US10/280,464 Abandoned US20030112326A1 (en) 2001-08-17 2002-10-24 Method and system for transmitting or storing stereoscopic images and photographing ratios for the images
US10/280,419 Expired - Fee Related US7084838B2 (en) 2001-08-17 2002-10-24 Method and system for controlling the motion of stereoscopic cameras using a three-dimensional mouse
US10/280,251 Abandoned US20030107643A1 (en) 2001-08-17 2002-10-24 Method and system for controlling the motion of stereoscopic cameras based on a viewer's eye motion
US10/280,246 Abandoned US20030113012A1 (en) 2001-08-17 2002-10-24 Method and system for controlling a screen ratio based on a photographing ratio
US10/280,436 Abandoned US20030122925A1 (en) 2001-08-17 2002-10-24 Method and system for providing the motion information of stereoscopic cameras
US10/280,239 Abandoned US20030117395A1 (en) 2001-08-17 2002-10-24 Method and system for calculating a photographing ratio of a camera
US10/280,248 Abandoned US20030107646A1 (en) 2001-08-17 2002-10-24 Method and system for adjusting display angles of a stereoscopic image based on a camera location
US11/253,222 Abandoned US20060050014A1 (en) 2001-08-17 2005-10-18 Method and system for controlling the motion of stereoscopic cameras using a three-dimensional mouse
US11/583,542 Abandoned US20070035619A1 (en) 2001-08-17 2006-10-19 Method and system for controlling space magnification for stereoscopic images

Family Applications Before (7)

Application Number Title Priority Date Filing Date
US10/280,465 Expired - Fee Related US7091931B2 (en) 2001-08-17 2002-10-24 Method and system of stereoscopic image display for guiding a viewer's eye motion using a three-dimensional mouse
US10/280,241 Abandoned US20030107645A1 (en) 2001-08-17 2002-10-24 Method and system for controlling the display location of stereoscopic images
US10/280,179 Expired - Fee Related US7190825B2 (en) 2001-08-17 2002-10-24 Portable communication device for stereoscopic image display and transmission
US10/280,344 Abandoned US20030112508A1 (en) 2001-08-17 2002-10-24 Method and system for controlling space magnification for stereoscopic images
US10/280,464 Abandoned US20030112326A1 (en) 2001-08-17 2002-10-24 Method and system for transmitting or storing stereoscopic images and photographing ratios for the images
US10/280,419 Expired - Fee Related US7084838B2 (en) 2001-08-17 2002-10-24 Method and system for controlling the motion of stereoscopic cameras using a three-dimensional mouse
US10/280,251 Abandoned US20030107643A1 (en) 2001-08-17 2002-10-24 Method and system for controlling the motion of stereoscopic cameras based on a viewer's eye motion

Family Applications After (5)

Application Number Title Priority Date Filing Date
US10/280,436 Abandoned US20030122925A1 (en) 2001-08-17 2002-10-24 Method and system for providing the motion information of stereoscopic cameras
US10/280,239 Abandoned US20030117395A1 (en) 2001-08-17 2002-10-24 Method and system for calculating a photographing ratio of a camera
US10/280,248 Abandoned US20030107646A1 (en) 2001-08-17 2002-10-24 Method and system for adjusting display angles of a stereoscopic image based on a camera location
US11/253,222 Abandoned US20060050014A1 (en) 2001-08-17 2005-10-18 Method and system for controlling the motion of stereoscopic cameras using a three-dimensional mouse
US11/583,542 Abandoned US20070035619A1 (en) 2001-08-17 2006-10-19 Method and system for controlling space magnification for stereoscopic images

Country Status (1)

Country Link
US (13) US7091931B2 (en)

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030107643A1 (en) * 2001-08-17 2003-06-12 Byoungyi Yoon Method and system for controlling the motion of stereoscopic cameras based on a viewer's eye motion
US20060038880A1 (en) * 2004-08-19 2006-02-23 Microsoft Corporation Stereoscopic image display
US20080010169A1 (en) * 2006-07-07 2008-01-10 Dollens Joseph R Method and system for managing and displaying product images
US20080143824A1 (en) * 2003-01-22 2008-06-19 Sony Corporation Three-dimensional image pickup apparatus, three-dimensional display apparatus, three-dimensional image pickup and display apparatus and information recording method
US20110102559A1 (en) * 2009-10-30 2011-05-05 Kazuhiko Nakane Video display control method and apparatus
US20120026158A1 (en) * 2010-02-05 2012-02-02 Sony Computer Entertainment Inc. Three-dimensional image generation device, three-dimensional image generation method, and information storage medium
US20130235086A1 (en) * 2010-03-09 2013-09-12 Panasonic Corporation Electronic zoom device, electronic zoom method, and program
US8554639B2 (en) 2006-07-07 2013-10-08 Joseph R. Dollens Method and system for managing and displaying product images
US9691098B2 (en) 2006-07-07 2017-06-27 Joseph R. Dollens Method and system for managing and displaying product images with cloud computing
US10614513B2 (en) 2006-07-07 2020-04-07 Joseph R. Dollens Method and system for managing and displaying product images with progressive resolution display
US11049175B2 (en) 2006-07-07 2021-06-29 Joseph R. Dollens Method and system for managing and displaying product images with progressive resolution display with audio commands and responses
US11481834B2 (en) 2006-07-07 2022-10-25 Joseph R. Dollens Method and system for managing and displaying product images with progressive resolution display with artificial realities

Families Citing this family (155)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FI114961B (en) * 2001-12-31 2005-01-31 Nokia Corp Method and apparatus for forming an image in an electronic device
US7734085B2 (en) * 2002-06-28 2010-06-08 Sharp Kabushiki Kaisha Image data delivery system, image data transmitting device thereof, and image data receiving device thereof
US20040155971A1 (en) * 2003-02-06 2004-08-12 Manish Sharma Method and system for building a view of an object
US7463823B2 (en) * 2003-07-24 2008-12-09 Brainlab Ag Stereoscopic visualization device for patient image data and video images
US20050046698A1 (en) * 2003-09-02 2005-03-03 Knight Andrew Frederick System and method for producing a selectable view of an object space
GB2405764A (en) * 2003-09-04 2005-03-09 Sharp Kk Guided capture or selection of stereoscopic image pairs.
US8442276B2 (en) 2006-03-03 2013-05-14 Honeywell International Inc. Invariant radial iris segmentation
US8064647B2 (en) 2006-03-03 2011-11-22 Honeywell International Inc. System for iris detection tracking and recognition at a distance
US8705808B2 (en) 2003-09-05 2014-04-22 Honeywell International Inc. Combined face and iris recognition system
US7593550B2 (en) 2005-01-26 2009-09-22 Honeywell International Inc. Distance iris recognition
US8049812B2 (en) 2006-03-03 2011-11-01 Honeywell International Inc. Camera with auto focus capability
US8090157B2 (en) 2005-01-26 2012-01-03 Honeywell International Inc. Approaches and apparatus for eye detection in a digital image
US8098901B2 (en) 2005-01-26 2012-01-17 Honeywell International Inc. Standoff iris recognition system
JP4413065B2 (en) * 2004-01-28 2010-02-10 健爾 西 Image display device and image display system
US7629989B2 (en) * 2004-04-02 2009-12-08 K-Nfb Reading Technology, Inc. Reducing processing latency in optical character recognition for portable reading machine
WO2005101097A2 (en) * 2004-04-05 2005-10-27 Vesely Michael A Horizontal perspective display
US20060033737A1 (en) * 2004-08-16 2006-02-16 Old William M Methods and system for visualizing data sets
US9124877B1 (en) 2004-10-21 2015-09-01 Try Tech Llc Methods for acquiring stereoscopic images of a location
DE102004059729B3 (en) * 2004-12-11 2006-04-13 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Imaging method for the autostereoscopic generation of three-dimensional image data from scanned sub-pixel extracts of left and right views of an image using an optical separating grid
JP2006280731A (en) * 2005-04-01 2006-10-19 Aruze Corp Game program, game apparatus and recording medium
US8717423B2 (en) * 2005-05-09 2014-05-06 Zspace, Inc. Modifying perspective of stereoscopic images based on changes in user viewpoint
US8049962B2 (en) * 2005-06-07 2011-11-01 Reald Inc. Controlling the angular extent of autostereoscopic viewing zones
JP4665167B2 (en) * 2005-06-29 2011-04-06 ソニー株式会社 Stereo image processing apparatus, stereo image processing method, and stereo image processing program
US8885017B2 (en) * 2005-07-14 2014-11-11 3Ality Digital Systems, Llc Real-time process and technology using image processing to maintain and ensure viewer comfort during capture, live transmission, and post-production of stereoscopic 3D imagery
US20070064098A1 (en) * 2005-09-19 2007-03-22 Available For Licensing Systems and methods for 3D rendering
US7697827B2 (en) 2005-10-17 2010-04-13 Konicek Jeffrey C User-friendlier interfaces for a camera
US8130260B2 (en) * 2005-11-09 2012-03-06 Johns Hopkins University System and method for 3-dimensional display of image data
US20070147827A1 (en) * 2005-12-28 2007-06-28 Arnold Sheynman Methods and apparatus for wireless stereo video streaming
WO2007101276A1 (en) 2006-03-03 2007-09-07 Honeywell International, Inc. Single lens splitter camera
GB2450023B (en) 2006-03-03 2011-06-08 Honeywell Int Inc An iris image encoding method
DE602007007062D1 (en) 2006-03-03 2010-07-22 Honeywell Int Inc IRISER IDENTIFICATION SYSTEM WITH IMAGE QUALITY METERING
WO2007103834A1 (en) 2006-03-03 2007-09-13 Honeywell International, Inc. Indexing and database search system
AU2007281940B2 (en) 2006-03-03 2010-12-16 Gentex Corporation Modular biometrics collection system architecture
JP4810295B2 (en) * 2006-05-02 2011-11-09 キヤノン株式会社 Information processing apparatus and control method therefor, image processing apparatus, program, and storage medium
WO2008039252A2 (en) 2006-05-15 2008-04-03 Retica Systems, Inc. Multimodal ocular biometric system
US8170293B2 (en) 2006-09-15 2012-05-01 Identix Incorporated Multimodal ocular biometric system and methods
JP2008096868A (en) 2006-10-16 2008-04-24 Sony Corp Imaging display device, and imaging display method
US11275242B1 (en) 2006-12-28 2022-03-15 Tipping Point Medical Images, Llc Method and apparatus for performing stereoscopic rotation of a volume on a head display unit
US10795457B2 (en) 2006-12-28 2020-10-06 D3D Technologies, Inc. Interactive 3D cursor
US11315307B1 (en) 2006-12-28 2022-04-26 Tipping Point Medical Images, Llc Method and apparatus for performing rotating viewpoints using a head display unit
US11228753B1 (en) 2006-12-28 2022-01-18 Robert Edwin Douglas Method and apparatus for performing stereoscopic zooming on a head display unit
TWI331872B (en) * 2006-12-29 2010-10-11 Quanta Comp Inc Method for displaying stereoscopic image
PL3758381T3 (en) * 2007-04-12 2021-07-05 Dolby International Ab Tiling in video encoding and decoding
US8396321B1 (en) * 2007-04-25 2013-03-12 Marvell International Ltd. Method and apparatus for processing image data from a primary sensor and a secondary sensor
US8063889B2 (en) 2007-04-25 2011-11-22 Honeywell International Inc. Biometric data collection system
US20090119821A1 (en) * 2007-11-14 2009-05-14 Jeffery Neil Stillwell Belt with ball mark repair tool
US8436907B2 (en) 2008-05-09 2013-05-07 Honeywell International Inc. Heterogeneous video capturing system
US8213782B2 (en) 2008-08-07 2012-07-03 Honeywell International Inc. Predictive autofocusing system
US8090246B2 (en) 2008-08-08 2012-01-03 Honeywell International Inc. Image acquisition system
US8542432B2 (en) * 2008-08-14 2013-09-24 Reald Inc. Autostereoscopic display system with efficient pixel layout
JP5268521B2 (en) * 2008-09-25 2013-08-21 キヤノン株式会社 Ophthalmic device and method
CN102246528B (en) * 2008-10-14 2016-01-20 瑞尔D股份有限公司 There is the lens type display system of biased color filter array
US8280119B2 (en) 2008-12-05 2012-10-02 Honeywell International Inc. Iris recognition system using quality metrics
EP2385705A4 (en) * 2008-12-30 2011-12-21 Huawei Device Co Ltd Method and device for generating stereoscopic panoramic video stream, and method and device of video conference
US20100186234A1 (en) 2009-01-28 2010-07-29 Yehuda Binder Electric shaver with imaging capability
KR20100080704A (en) * 2009-01-02 2010-07-12 삼성전자주식회사 Method and apparatus for obtaining image data
CN102439972B (en) * 2009-02-27 2016-02-10 基础制造有限公司 Based on the telecommunication platform of earphone
US8314832B2 (en) * 2009-04-01 2012-11-20 Microsoft Corporation Systems and methods for generating stereoscopic images
JP2010244245A (en) * 2009-04-03 2010-10-28 Sony Corp Information processing apparatus, information processing method and program
JP5510700B2 (en) * 2009-04-03 2014-06-04 ソニー株式会社 Information processing apparatus, information processing method, and program
JP5409107B2 (en) * 2009-05-13 2014-02-05 任天堂株式会社 Display control program, information processing apparatus, display control method, and information processing system
US20100309290A1 (en) * 2009-06-08 2010-12-09 Stephen Brooks Myers System for capture and display of stereoscopic content
US8472681B2 (en) 2009-06-15 2013-06-25 Honeywell International Inc. Iris and ocular recognition system using trace transforms
US8630464B2 (en) 2009-06-15 2014-01-14 Honeywell International Inc. Adaptive iris matching using database indexing
GB2471137B (en) * 2009-06-19 2011-11-30 Sony Comp Entertainment Europe 3D image processing method and apparatus
US8502864B1 (en) * 2009-07-28 2013-08-06 Robert Watkins Systems, devices, and/or methods for viewing images
JP5293500B2 (en) * 2009-08-25 2013-09-18 ソニー株式会社 Display device and control method
JP5405264B2 (en) 2009-10-20 2014-02-05 任天堂株式会社 Display control program, library program, information processing system, and display control method
JP4754031B2 (en) * 2009-11-04 2011-08-24 任天堂株式会社 Display control program, information processing system, and program used for stereoscopic display control
KR101615234B1 (en) * 2009-11-19 2016-04-25 엘지전자 주식회사 Mobile terminal and method for controlling thereof
US8922625B2 (en) * 2009-11-19 2014-12-30 Lg Electronics Inc. Mobile terminal and controlling method thereof
EP2355526A3 (en) 2010-01-14 2012-10-31 Nintendo Co., Ltd. Computer-readable storage medium having stored therein display control program, display control apparatus, display control system, and display control method
US8717360B2 (en) * 2010-01-29 2014-05-06 Zspace, Inc. Presenting a view within a three dimensional scene
DE102010009737A1 (en) * 2010-03-01 2011-09-01 Institut für Rundfunktechnik GmbH Method and arrangement for reproducing 3D image content
US9693039B2 (en) 2010-05-27 2017-06-27 Nintendo Co., Ltd. Hand-held electronic device
US20110304695A1 (en) * 2010-06-10 2011-12-15 Lg Electronics Inc. Mobile terminal and controlling method thereof
US20120001906A1 (en) * 2010-06-30 2012-01-05 Blue Sky Studios, Inc. Methods and systems for 3d animation
KR101645465B1 (en) * 2010-07-23 2016-08-04 삼성전자주식회사 Apparatus and method for generating a three-dimension image data in portable terminal
KR101735612B1 (en) * 2010-08-16 2017-05-15 엘지전자 주식회사 Mobile terminal and operation control method thereof
US8767053B2 (en) * 2010-08-26 2014-07-01 Stmicroelectronics, Inc. Method and apparatus for viewing stereoscopic video material simultaneously with multiple participants
US8704879B1 (en) 2010-08-31 2014-04-22 Nintendo Co., Ltd. Eye tracking enabling 3D viewing on conventional 2D display
US8742887B2 (en) 2010-09-03 2014-06-03 Honeywell International Inc. Biometric visitor check system
KR20120029228A (en) * 2010-09-16 2012-03-26 엘지전자 주식회사 Transparent display device and method for providing object information
JP5263355B2 (en) * 2010-09-22 2013-08-14 株式会社ニコン Image display device and imaging device
US8849011B2 (en) * 2010-10-07 2014-09-30 Himax Media Solutions, Inc. Video processing system and method thereof for compensating boundary of image
WO2012054063A1 (en) 2010-10-22 2012-04-26 Hewlett-Packard Development Company L.P. An augmented reality display system and method of display
US8854802B2 (en) 2010-10-22 2014-10-07 Hewlett-Packard Development Company, L.P. Display with rotatable display screen
JP5699566B2 (en) * 2010-11-29 2015-04-15 ソニー株式会社 Information processing apparatus, information processing method, and program
US9237331B2 (en) * 2011-01-18 2016-01-12 Disney Enterprises, Inc. Computational stereoscopic camera system
US8953242B2 (en) 2011-03-31 2015-02-10 Honeywell International Inc. Variable focus stereoscopic display system and method
US8786529B1 (en) 2011-05-18 2014-07-22 Zspace, Inc. Liquid crystal variable drive voltage
US8885882B1 (en) 2011-07-14 2014-11-11 The Research Foundation For The State University Of New York Real time eye tracking for human computer interaction
JP5860634B2 (en) * 2011-08-23 2016-02-16 任天堂株式会社 Information processing system, information processing method, server program, server device, and server system
US9037354B2 (en) 2011-09-09 2015-05-19 Thales Avionics, Inc. Controlling vehicle entertainment systems responsive to sensed passenger gestures
TWI526706B (en) * 2011-10-05 2016-03-21 原相科技股份有限公司 Image system
CN103959340A (en) * 2011-12-07 2014-07-30 英特尔公司 Graphics rendering technique for autostereoscopic three dimensional display
US8619148B1 (en) * 2012-01-04 2013-12-31 Audience, Inc. Image correction after combining images from multiple cameras
US9060718B2 (en) 2012-02-13 2015-06-23 Massachusetts Institute Of Technology Methods and apparatus for retinal imaging
US9188731B2 (en) 2012-05-18 2015-11-17 Reald Inc. Directional backlight
US9235057B2 (en) 2012-05-18 2016-01-12 Reald Inc. Polarization recovery in a directional display device
EP2850488A4 (en) 2012-05-18 2016-03-02 Reald Inc Directional backlight
JP6508832B2 (en) 2012-05-18 2019-05-08 リアルディー スパーク エルエルシー Control of multiple light sources in directional backlights
US9678267B2 (en) 2012-05-18 2017-06-13 Reald Spark, Llc Wide angle imaging directional backlights
US8526677B1 (en) * 2012-07-16 2013-09-03 Google Inc. Stereoscopic camera with haptic feedback for object and location detection
US9058053B2 (en) * 2012-10-26 2015-06-16 The Boeing Company Virtual reality display system
CN102944935B (en) * 2012-11-13 2014-12-24 京东方科技集团股份有限公司 Binocular head-wearing display device and method thereof for adjusting image spacing
CN102930550A (en) * 2012-11-20 2013-02-13 天津理工大学 Method for determining separation distance of virtual camera in drawing stereo images
US9607011B2 (en) * 2012-12-19 2017-03-28 Intel Corporation Time-shifting image service
US9342145B2 (en) * 2013-01-22 2016-05-17 Kabushiki Kaisha Toshiba Cursor control
WO2014127134A1 (en) * 2013-02-13 2014-08-21 Massachusetts Institute Of Technology Methods and apparatus for retinal imaging
BR112015020160B1 (en) 2013-02-22 2022-01-18 Reald Spark, Llc DIRECTIONAL BACKLIGHTING
WO2014204950A1 (en) 2013-06-17 2014-12-24 Reald Inc. Controlling light sources of a directional backlight
WO2015057625A1 (en) 2013-10-14 2015-04-23 Reald Inc. Control of directional display
CN106062620B (en) 2013-10-14 2020-02-07 瑞尔D斯帕克有限责任公司 Light input for directional backlight
US9772495B2 (en) * 2013-11-04 2017-09-26 Weng-Kong TAM Digital loupe device
JP2015119464A (en) * 2013-11-12 2015-06-25 セイコーエプソン株式会社 Display device and control method of the same
TWI558588B (en) * 2014-04-09 2016-11-21 Papago Inc Driving Recorder with Correcting Shooting Pose and Its Correction
EP3161550A4 (en) 2014-06-26 2018-04-18 RealD Spark, LLC Directional privacy display
WO2016001908A1 (en) * 2014-07-03 2016-01-07 Imagine Mobile Augmented Reality Ltd 3 dimensional anchored augmented reality
US10070120B2 (en) * 2014-09-17 2018-09-04 Qualcomm Incorporated Optical see-through display calibration
EP3204686B1 (en) 2014-10-08 2019-07-17 RealD Spark, LLC Connection unit for a directional backlight
US10356383B2 (en) 2014-12-24 2019-07-16 Reald Spark, Llc Adjustment of perceived roundness in stereoscopic image of a head
JP6582419B2 (en) * 2015-01-27 2019-10-02 セイコーエプソン株式会社 Head-mounted display device, head-mounted display device control method, and computer program
WO2016149446A1 (en) 2015-03-17 2016-09-22 Blue Sky Studios, Inc. Methods, systems and tools for 3d animation
RU2596062C1 (en) 2015-03-20 2016-08-27 Автономная Некоммерческая Образовательная Организация Высшего Профессионального Образования "Сколковский Институт Науки И Технологий" Method for correction of eye image using machine learning and method of machine learning
CN108323187B (en) 2015-04-13 2024-03-08 瑞尔D斯帕克有限责任公司 Wide-angle imaging directional backlight source
WO2016191598A1 (en) 2015-05-27 2016-12-01 Reald Inc. Wide angle imaging directional backlights
CN108351951B (en) 2015-10-26 2023-02-07 瑞尔D斯帕克有限责任公司 Intelligent privacy system, equipment and method thereof
WO2017083526A1 (en) 2015-11-10 2017-05-18 Reald Inc. Distortion matching polarization conversion systems and methods thereof
CN108463667B (en) 2015-11-13 2020-12-01 瑞尔D斯帕克有限责任公司 Wide-angle imaging directional backlight
CN108431670B (en) 2015-11-13 2022-03-11 瑞尔D斯帕克有限责任公司 Surface features for imaging directional backlights
ES2912310T3 (en) 2016-01-05 2022-05-25 Reald Spark Llc Gaze Correction in Multiview Images
WO2017200950A1 (en) 2016-05-19 2017-11-23 Reald Spark, Llc Wide angle imaging directional backlights
WO2017205183A1 (en) 2016-05-23 2017-11-30 Reald Spark, Llc Wide angle imaging directional backlights
WO2018129059A1 (en) 2017-01-04 2018-07-12 Reald Spark, Llc Optical stack for imaging directional backlights
WO2018187154A1 (en) 2017-04-03 2018-10-11 Reald Spark, Llc Segmented imaging directional backlights
WO2019032604A1 (en) 2017-08-08 2019-02-14 Reald Spark, Llc Adjusting a digital representation of a head region
US11115647B2 (en) 2017-11-06 2021-09-07 Reald Spark, Llc Privacy display apparatus
AU2018390994B2 (en) 2017-12-22 2023-11-16 Mirage 3.4D Pty Ltd Camera projection technique system and method
RU2020100251A (en) * 2018-01-02 2022-02-03 Лумус Лтд. ACTIVE ALIGNMENT AUGMENTED REALITY DISPLAYS AND RELATED METHODS
US10802356B2 (en) 2018-01-25 2020-10-13 Reald Spark, Llc Touch screen for privacy display
US11079285B2 (en) 2018-05-04 2021-08-03 Raytheon Technologies Corporation Automated analysis of thermally-sensitive coating and method therefor
US10928362B2 (en) 2018-05-04 2021-02-23 Raytheon Technologies Corporation Nondestructive inspection using dual pulse-echo ultrasonics and method therefor
US10958843B2 (en) * 2018-05-04 2021-03-23 Raytheon Technologies Corporation Multi-camera system for simultaneous registration and zoomed imagery
US11268881B2 (en) 2018-05-04 2022-03-08 Raytheon Technologies Corporation System and method for fan blade rotor disk and gear inspection
US10473593B1 (en) 2018-05-04 2019-11-12 United Technologies Corporation System and method for damage detection by cast shadows
US10685433B2 (en) 2018-05-04 2020-06-16 Raytheon Technologies Corporation Nondestructive coating imperfection detection system and method therefor
US10902664B2 (en) 2018-05-04 2021-01-26 Raytheon Technologies Corporation System and method for detecting damage using two-dimensional imagery and three-dimensional model
US10943320B2 (en) 2018-05-04 2021-03-09 Raytheon Technologies Corporation System and method for robotic inspection
US10488371B1 (en) 2018-05-04 2019-11-26 United Technologies Corporation Nondestructive inspection using thermoacoustic imagery and method therefor
US10914191B2 (en) 2018-05-04 2021-02-09 Raytheon Technologies Corporation System and method for in situ airfoil inspection
WO2021025241A1 (en) * 2019-08-07 2021-02-11 Samsung Electronics Co., Ltd. Method and bendable device for constructing 3d data item
CN112214030B (en) * 2020-09-11 2023-03-14 中国航空工业集团公司成都飞机设计研究所 One-station-control dual-computer display control method for unmanned aerial vehicle
WO2022060673A1 (en) 2020-09-16 2022-03-24 Reald Spark, Llc Vehicle external illumination device
CN113099113B (en) * 2021-03-31 2022-12-27 北京小米移动软件有限公司 Electronic terminal, photographing method and device and storage medium

Citations (45)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4729017A (en) * 1985-02-28 1988-03-01 Canon Kabushiki Kaisha Stereoscopic display method and apparatus therefor
US4734756A (en) * 1981-12-31 1988-03-29 3-D Video Corporation Stereoscopic television system
US4740836A (en) * 1983-12-05 1988-04-26 Craig Dwin R Compatible 3D video display using commercial television broadcast standards and equipment
US4951075A (en) * 1988-01-08 1990-08-21 Minolta Camera Kabushiki Kaisha Zoom camera
US4994898A (en) * 1987-06-22 1991-02-19 Aspex Limited Color television system for processing signals from a television camera to produce a stereoscopic effect
US4999713A (en) * 1988-03-07 1991-03-12 Sharp Kabushiki Kaisha Interlocked zooming apparatus for use in stereoscopic cameras
US5181181A (en) * 1990-09-27 1993-01-19 Triton Technologies, Inc. Computer apparatus input device for three-dimensional information
US5355163A (en) * 1992-09-28 1994-10-11 Sony Corporation Video camera that automatically maintains size and location of an image within a frame
US5357277A (en) * 1991-10-04 1994-10-18 Sony Corporation Stereoscopic imaging using memory effect liquid crystal displays
US5526089A (en) * 1992-09-14 1996-06-11 Nikon Corporation Camera with sight line detecting device
US5532716A (en) * 1991-12-09 1996-07-02 Kabushiki Kaisha Toshiba Resolution conversion system
US5625408A (en) * 1993-06-24 1997-04-29 Canon Kabushiki Kaisha Three-dimensional image recording/reconstructing method and apparatus therefor
US5638461A (en) * 1994-06-09 1997-06-10 Kollmorgen Instrument Corporation Stereoscopic electro-optical system for automated inspection and/or alignment of imaging devices on a production assembly line
US5640222A (en) * 1996-03-15 1997-06-17 Paul; Eddie Method and apparatus for producing stereoscopic images
US5727242A (en) * 1994-05-09 1998-03-10 Image Technology International, Inc. Single-lens multiple aperture camera for 3D photographic/video applications
US5805168A (en) * 1992-06-05 1998-09-08 International Business Machines Corporation Apparatus and method for converting line segment data to three-dimensional data
US5978143A (en) * 1997-09-19 1999-11-02 Carl-Zeiss-Stiftung Stereoscopic recording and display system
US5993004A (en) * 1996-09-19 1999-11-30 Sharp Kabushiki Kaisha Display
US6005607A (en) * 1995-06-29 1999-12-21 Matsushita Electric Industrial Co., Ltd. Stereoscopic computer graphics image generating apparatus and stereoscopic TV apparatus
US6025883A (en) * 1996-10-16 2000-02-15 Samsung Electronics Co., Ltd. Resolution conversion apparatus and method for a display device
US6042231A (en) * 1996-08-02 2000-03-28 Vega Vista, Inc. Methods and systems for relieving eye strain
US6072462A (en) * 1993-08-31 2000-06-06 Zilog, Inc. Technique for generating on-screen display characters using software implementation
US6078423A (en) * 1995-06-07 2000-06-20 Richmond Holographic Research & Development Limited Stereoscopic display device
US6101338A (en) * 1998-10-09 2000-08-08 Eastman Kodak Company Speech recognition camera with a prompting display
US6172700B1 (en) * 1997-01-16 2001-01-09 Ricoh Company, Ltd. Writing device for an image forming apparatus
US6178043B1 (en) * 1998-11-24 2001-01-23 Korea Institute Of Science And Technology Multiview three-dimensional image display system
US6204876B1 (en) * 1996-06-26 2001-03-20 Matsushita Electric Industrial Co., Ltd. Stereoscopic computer graphics moving image generating apparatus
US6208348B1 (en) * 1998-05-27 2001-03-27 In-Three, Inc. System and method for dimensionalization processing of images in consideration of a predetermined image projection format
US6215516B1 (en) * 1997-07-07 2001-04-10 Reveo, Inc. Method and apparatus for monoscopic to stereoscopic image conversion
US6220709B1 (en) * 1996-10-09 2001-04-24 Helmut Tan Projection system, in particular for three dimensional representations on a viewing device
US20010024231A1 (en) * 2000-03-21 2001-09-27 Olympus Optical Co., Ltd. Stereoscopic image projection device, and correction amount computing device thereof
US6307585B1 (en) * 1996-10-04 2001-10-23 Siegbert Hentschke Position-adaptive autostereoscopic monitor (PAM)
US6327381B1 (en) * 1994-12-29 2001-12-04 Worldscape, Llc Image transformation and synthesis methods
US20010052123A1 (en) * 2000-03-08 2001-12-13 Eiji Kawai Electronic information content distribution processing system, information distribution apparatus, information processing apparatus, and electronic information content distribution processing method
US6333757B1 (en) * 1993-11-12 2001-12-25 Reveo, Inc. Method and apparatus for producing and displaying spectrally-multiplexed images of three-dimensional imagery for use in stereoscopic viewing thereof
US20020008906A1 (en) * 2000-05-12 2002-01-24 Seijiro Tomita Stereoscopic picture displaying apparatus
US6388666B1 (en) * 1998-10-27 2002-05-14 Imax Corporation System and method for generating stereoscopic image data
US6414716B1 (en) * 1996-11-29 2002-07-02 Canon Kabushiki Kaisha Method and apparatus for controlling an imaging apparatus, imaging operation control system, and storage medium storing a program implementing such a method
US6449309B1 (en) * 1996-03-12 2002-09-10 Olympus Optical Co., Ltd. Stereoscopic display that controls binocular parallax between two images and controls image reconstitution according to parallax data
US6507359B1 (en) * 1993-09-20 2003-01-14 Canon Kabushiki Kaisha Image display system
US20030063778A1 (en) * 2001-09-28 2003-04-03 Canon Kabushiki Kaisha Method and apparatus for generating models of individuals
US6583808B2 (en) * 2001-10-04 2003-06-24 National Research Council Of Canada Method and system for stereo videoconferencing
US20030206653A1 (en) * 1995-07-28 2003-11-06 Tatsushi Katayama Image sensing and image processing apparatuses
US6747610B1 (en) * 1997-07-22 2004-06-08 Sanyo Electric Co., Ltd. Stereoscopic image display apparatus capable of selectively displaying desired stereoscopic image
US6752498B2 (en) * 2001-05-14 2004-06-22 Eastman Kodak Company Adaptive autostereoscopic display system

Family Cites Families (32)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US2613263A (en) * 1949-04-08 1952-10-07 Earl D Hilburn Plural camera television transmitter with electronic wipeout control
AT286776B (en) * 1968-11-05 1970-12-28 Eumig Photographic or cinematographic camera
US4418993A (en) * 1981-05-07 1983-12-06 Stereographics Corp. Stereoscopic zoom lens system for three-dimensional motion pictures and television
US4910592A (en) * 1988-01-13 1990-03-20 Picker International, Inc. Radiation imaging automatic gain control
US5175616A (en) 1989-08-04 1992-12-29 Her Majesty The Queen In Right Of Canada, As Represented By The Minister Of National Defence Of Canada Stereoscopic video-graphic coordinate specification system
DE58906847D1 (en) 1989-11-10 1994-03-10 Thor Chemie Gmbh Stabilized aqueous solutions of 3-isothiazolinones.
JPH0435395A (en) 1990-05-28 1992-02-06 Shigeo Nakagawa Stereoscopic monitor
JP3030563B2 (en) * 1990-10-29 2000-04-10 オリンパス光学工業株式会社 Real image finder
US5383013A (en) * 1992-09-18 1995-01-17 Nec Research Institute, Inc. Stereoscopic computer vision system
KR100276681B1 (en) 1992-11-07 2001-01-15 이데이 노부유끼 Video camera system
US5422653A (en) * 1993-01-07 1995-06-06 Maguire, Jr.; Francis J. Passive virtual reality
US5394202A (en) 1993-01-14 1995-02-28 Sun Microsystems, Inc. Method and apparatus for generating high resolution 3D images in a head tracked stereo display system
US7019770B1 (en) * 1993-03-12 2006-03-28 Telebuyer, Llc Videophone system for scrutiny monitoring with computer control
CA2148631C (en) 1994-06-20 2000-06-13 John J. Hildin Voice-following video system
WO1996011548A1 (en) * 1994-10-07 1996-04-18 Sony Corporation Video camera and its setup method
US5500671A (en) * 1994-10-25 1996-03-19 At&T Corp. Video conference system and method of providing parallax correction and a sense of presence
US5850352A (en) * 1995-03-31 1998-12-15 The Regents Of The University Of California Immersive video, including video hypermosaicing to generate from multiple video views of a scene a three-dimensional video mosaic from which diverse virtual video scene images are synthesized, including panoramic, scene interactive and stereoscopic images
US5949477A (en) * 1995-04-06 1999-09-07 Hoglin; Irving M. Three dimensional stereoscopic television system
JP3579162B2 (en) 1995-06-29 2004-10-20 松下電器産業株式会社 3D CG image generation device
US6847336B1 (en) * 1996-10-02 2005-01-25 Jerome H. Lemelson Selectively controllable heads-up display system
EP1029282A2 (en) 1997-11-07 2000-08-23 ViA, Inc. Interactive devices and methods
JP4532610B2 (en) * 1998-01-30 2010-08-25 キヤノン株式会社 CAMERA CONTROL SYSTEM AND METHOD, AND STORAGE MEDIUM CONTAINING PROGRAM FOR EXECUTING OPERATION PROCESS
IL138808A0 (en) * 1998-04-02 2001-10-31 Kewazinga Corp A navigable telepresence method and system utilizing an array of cameras
US6476850B1 (en) * 1998-10-09 2002-11-05 Kenneth Erbey Apparatus for the generation of a stereoscopic display
KR100294925B1 (en) 1999-06-03 2001-07-12 윤종용 3-D graphic image manufacturing method and binocular visual disparity adjustment method therefor
US6987532B1 (en) * 1999-09-20 2006-01-17 Canon Kabushiki Kaisha Image sensing apparatus, control, and method of designing an optical system therefor
JP3502796B2 (en) * 1999-10-29 2004-03-02 株式会社スクウェア・エニックス 3D model display method and apparatus in video game, game apparatus, and computer-readable recording medium storing 3D model display program for video game
US6625408B1 (en) * 2000-05-18 2003-09-23 Nexpress Solutions Llc Pin mount for optical writer/image-recording element in a document printer/copier
JP2002092656A (en) 2000-09-11 2002-03-29 Canon Inc Stereoscopic image display device and image data displaying method
KR20020079268A (en) 2001-04-14 2002-10-19 유준희 System and method for composing 3D content with a 3D human body in virtual reality in real time
US7091931B2 (en) * 2001-08-17 2006-08-15 Geo-Rae Co., Ltd. Method and system of stereoscopic image display for guiding a viewer's eye motion using a three-dimensional mouse
KR100434657B1 (en) 2002-02-19 2004-06-04 한국가상현실 (주) 3 dimensional house interior method based on real-time displaying method and recording medium therefor

Patent Citations (49)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4734756A (en) * 1981-12-31 1988-03-29 3-D Video Corporation Stereoscopic television system
US4740836A (en) * 1983-12-05 1988-04-26 Craig Dwin R Compatible 3D video display using commercial television broadcast standards and equipment
US4729017A (en) * 1985-02-28 1988-03-01 Canon Kabushiki Kaisha Stereoscopic display method and apparatus therefor
US4994898A (en) * 1987-06-22 1991-02-19 Aspex Limited Color television system for processing signals from a television camera to produce a stereoscopic effect
US4951075A (en) * 1988-01-08 1990-08-21 Minolta Camera Kabushiki Kaisha Zoom camera
US4999713A (en) * 1988-03-07 1991-03-12 Sharp Kabushiki Kaisha Interlocked zooming apparatus for use in stereoscopic cameras
US5181181A (en) * 1990-09-27 1993-01-19 Triton Technologies, Inc. Computer apparatus input device for three-dimensional information
US5357277A (en) * 1991-10-04 1994-10-18 Sony Corporation Stereoscopic imaging using memory effect liquid crystal displays
US5532716A (en) * 1991-12-09 1996-07-02 Kabushiki Kaisha Toshiba Resolution conversion system
US5805168A (en) * 1992-06-05 1998-09-08 International Business Machines Corporation Apparatus and method for converting line segment data to three-dimensional data
US5526089A (en) * 1992-09-14 1996-06-11 Nikon Corporation Camera with sight line detecting device
US5355163A (en) * 1992-09-28 1994-10-11 Sony Corporation Video camera that automatically maintains size and location of an image within a frame
US5625408A (en) * 1993-06-24 1997-04-29 Canon Kabushiki Kaisha Three-dimensional image recording/reconstructing method and apparatus therefor
US6072462A (en) * 1993-08-31 2000-06-06 Zilog, Inc. Technique for generating on-screen display characters using software implementation
US6507359B1 (en) * 1993-09-20 2003-01-14 Canon Kabushiki Kaisha Image display system
US6333757B1 (en) * 1993-11-12 2001-12-25 Reveo, Inc. Method and apparatus for producing and displaying spectrally-multiplexed images of three-dimensional imagery for use in stereoscopic viewing thereof
US5727242A (en) * 1994-05-09 1998-03-10 Image Technology International, Inc. Single-lens multiple aperture camera for 3D photographic/video applications
US5638461A (en) * 1994-06-09 1997-06-10 Kollmorgen Instrument Corporation Stereoscopic electro-optical system for automated inspection and/or alignment of imaging devices on a production assembly line
US6327381B1 (en) * 1994-12-29 2001-12-04 Worldscape, Llc Image transformation and synthesis methods
US6078423A (en) * 1995-06-07 2000-06-20 Richmond Holographic Research & Development Limited Stereoscopic display device
US6417880B1 (en) * 1995-06-29 2002-07-09 Matsushita Electric Industrial Co., Ltd. Stereoscopic CG image generating apparatus and stereoscopic TV apparatus
US6268880B1 (en) * 1995-06-29 2001-07-31 Matsushita Electric Industrial Co., Ltd. Stereoscopic CG image generating apparatus and stereoscopic TV apparatus
US6005607A (en) * 1995-06-29 1999-12-21 Matsushita Electric Industrial Co., Ltd. Stereoscopic computer graphics image generating apparatus and stereoscopic TV apparatus
US20010033327A1 (en) * 1995-06-29 2001-10-25 Kenya Uomori Stereoscopic computer graphics image generating apparatus and stereoscopic TV apparatus
US6175379B1 (en) * 1995-06-29 2001-01-16 Matsushita Electric Industrial Co., Ltd. Stereoscopic CG image generating apparatus and stereoscopic TV apparatus
US20030206653A1 (en) * 1995-07-28 2003-11-06 Tatsushi Katayama Image sensing and image processing apparatuses
US6449309B1 (en) * 1996-03-12 2002-09-10 Olympus Optical Co., Ltd. Stereoscopic display that controls binocular parallax between two images and controls image reconstitution according to parallax data
US5640222A (en) * 1996-03-15 1997-06-17 Paul; Eddie Method and apparatus for producing stereoscopic images
US6204876B1 (en) * 1996-06-26 2001-03-20 Matsushita Electric Industrial Co., Ltd. Stereoscopic computer graphics moving image generating apparatus
US6042231A (en) * 1996-08-02 2000-03-28 Vega Vista, Inc. Methods and systems for relieving eye strain
US5993004A (en) * 1996-09-19 1999-11-30 Sharp Kabushiki Kaisha Display
US6307585B1 (en) * 1996-10-04 2001-10-23 Siegbert Hentschke Position-adaptive autostereoscopic monitor (PAM)
US6220709B1 (en) * 1996-10-09 2001-04-24 Helmut Tan Projection system, in particular for three dimensional representations on a viewing device
US6025883A (en) * 1996-10-16 2000-02-15 Samsung Electronics Co., Ltd. Resolution conversion apparatus and method for a display device
US6414716B1 (en) * 1996-11-29 2002-07-02 Canon Kabushiki Kaisha Method and apparatus for controlling an imaging apparatus, imaging operation control system, and storage medium storing a program implementing such a method
US6172700B1 (en) * 1997-01-16 2001-01-09 Ricoh Company, Ltd. Writing device for an image forming apparatus
US6215516B1 (en) * 1997-07-07 2001-04-10 Reveo, Inc. Method and apparatus for monoscopic to stereoscopic image conversion
US6747610B1 (en) * 1997-07-22 2004-06-08 Sanyo Electric Co., Ltd. Stereoscopic image display apparatus capable of selectively displaying desired stereoscopic image
US5978143A (en) * 1997-09-19 1999-11-02 Carl-Zeiss-Stiftung Stereoscopic recording and display system
US6208348B1 (en) * 1998-05-27 2001-03-27 In-Three, Inc. System and method for dimensionalization processing of images in consideration of a predetermined image projection format
US6101338A (en) * 1998-10-09 2000-08-08 Eastman Kodak Company Speech recognition camera with a prompting display
US6388666B1 (en) * 1998-10-27 2002-05-14 Imax Corporation System and method for generating stereoscopic image data
US6178043B1 (en) * 1998-11-24 2001-01-23 Korea Institute Of Science And Technology Multiview three-dimensional image display system
US20010052123A1 (en) * 2000-03-08 2001-12-13 Eiji Kawai Electronic information content distribution processing system, information distribution apparatus, information processing apparatus, and electronic information content distribution processing method
US20010024231A1 (en) * 2000-03-21 2001-09-27 Olympus Optical Co., Ltd. Stereoscopic image projection device, and correction amount computing device thereof
US20020008906A1 (en) * 2000-05-12 2002-01-24 Seijiro Tomita Stereoscopic picture displaying apparatus
US6752498B2 (en) * 2001-05-14 2004-06-22 Eastman Kodak Company Adaptive autostereoscopic display system
US20030063778A1 (en) * 2001-09-28 2003-04-03 Canon Kabushiki Kaisha Method and apparatus for generating models of individuals
US6583808B2 (en) * 2001-10-04 2003-06-24 National Research Council Of Canada Method and system for stereo videoconferencing

Cited By (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030112326A1 (en) * 2001-08-17 2003-06-19 Byoungyi Yoon Method and system for transmitting or storing stereoscopic images and photographing ratios for the images
US20030117395A1 (en) * 2001-08-17 2003-06-26 Byoungyi Yoon Method and system for calculating a photographing ratio of a camera
US20030107643A1 (en) * 2001-08-17 2003-06-12 Byoungyi Yoon Method and system for controlling the motion of stereoscopic cameras based on a viewer's eye motion
US8223195B2 (en) * 2003-01-22 2012-07-17 Sony Corporation Three-dimensional image pickup apparatus, three-dimensional display apparatus, three-dimensional image pickup and display apparatus and information recording method
US20080143824A1 (en) * 2003-01-22 2008-06-19 Sony Corporation Three-dimensional image pickup apparatus, three-dimensional display apparatus, three-dimensional image pickup and display apparatus and information recording method
US20060038880A1 (en) * 2004-08-19 2006-02-23 Microsoft Corporation Stereoscopic image display
US9030532B2 (en) * 2004-08-19 2015-05-12 Microsoft Technology Licensing, Llc Stereoscopic image display
US20080010169A1 (en) * 2006-07-07 2008-01-10 Dollens Joseph R Method and system for managing and displaying product images
US8260689B2 (en) 2006-07-07 2012-09-04 Dollens Joseph R Method and system for managing and displaying product images
US8554639B2 (en) 2006-07-07 2013-10-08 Joseph R. Dollens Method and system for managing and displaying product images
US9691098B2 (en) 2006-07-07 2017-06-27 Joseph R. Dollens Method and system for managing and displaying product images with cloud computing
US10614513B2 (en) 2006-07-07 2020-04-07 Joseph R. Dollens Method and system for managing and displaying product images with progressive resolution display
US11049175B2 (en) 2006-07-07 2021-06-29 Joseph R. Dollens Method and system for managing and displaying product images with progressive resolution display with audio commands and responses
US11481834B2 (en) 2006-07-07 2022-10-25 Joseph R. Dollens Method and system for managing and displaying product images with progressive resolution display with artificial realities
US20110102559A1 (en) * 2009-10-30 2011-05-05 Kazuhiko Nakane Video display control method and apparatus
US9066076B2 (en) * 2009-10-30 2015-06-23 Mitsubishi Electric Corporation Video display control method and apparatus
US20120026158A1 (en) * 2010-02-05 2012-02-02 Sony Computer Entertainment Inc. Three-dimensional image generation device, three-dimensional image generation method, and information storage medium
US8749547B2 (en) * 2010-02-05 2014-06-10 Sony Corporation Three-dimensional stereoscopic image generation
US20130235086A1 (en) * 2010-03-09 2013-09-12 Panasonic Corporation Electronic zoom device, electronic zoom method, and program

Also Published As

Publication number Publication date
US7190825B2 (en) 2007-03-13
US20030117395A1 (en) 2003-06-26
US7084838B2 (en) 2006-08-01
US20070035619A1 (en) 2007-02-15
US20030107645A1 (en) 2003-06-12
US20030112508A1 (en) 2003-06-19
US7091931B2 (en) 2006-08-15
US20030107646A1 (en) 2003-06-12
US20030117396A1 (en) 2003-06-26
US20060050014A1 (en) 2006-03-09
US20030107643A1 (en) 2003-06-12
US20030108236A1 (en) 2003-06-12
US20030112328A1 (en) 2003-06-19
US20030112326A1 (en) 2003-06-19
US20030122925A1 (en) 2003-07-03

Similar Documents

Publication Publication Date Title
US7084838B2 (en) Method and system for controlling the motion of stereoscopic cameras using a three-dimensional mouse
KR101313740B1 (en) OSMU( One Source Multi Use)-type Stereoscopic Camera and Method of Making Stereoscopic Video Content thereof
US9864191B2 (en) Viewer with varifocal lens and video display system
US20120176474A1 (en) Rotational adjustment for stereo viewing
US8816939B2 (en) Monocular display apparatus
US9513490B2 (en) Three channel delivery of stereo images
US6788274B2 (en) Apparatus and method for displaying stereoscopic images
JP3783977B2 (en) 3D image device and 3D image display method
US11143876B2 (en) Optical axis control based on gaze detection within a head-mountable display
WO2003073739A2 (en) Method and system for displaying stereoscopic image
WO2002015595A1 (en) A method and system of revision for 3-dimensional image
US9179139B2 (en) Alignment of stereo images pairs for viewing
JP6649010B2 (en) Information processing device
JP2001218231A (en) Device and method for displaying stereoscopic image
US8581962B2 (en) Techniques and apparatus for two camera, and two display media for producing 3-D imaging for television broadcast, motion picture, home movie and digital still pictures
WO2023136073A1 (en) Image display device and image display method
CN108234990B (en) Stereoscopic display device and stereoscopic display method
KR20020046372A (en) 3-dimensional displaying apparatus and method thereof

Legal Events

Date Code Title Description
AS Assignment

Owner name: GEO-RAE CO., LTD., KOREA, REPUBLIC OF

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:YOON, BYOUNGYI;REEL/FRAME:013750/0517

Effective date: 20030124

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO PAY ISSUE FEE