WO1999060529A1 - A display system - Google Patents

A display system

Info

Publication number
WO1999060529A1
Authority
WO
WIPO (PCT)
Application number
PCT/SG1999/000045
Other languages
French (fr)
Inventor
Timothy Poston
Original Assignee
National University Of Singapore
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
1998-05-21
Filing date
1999-05-20
Publication date
1999-11-25
Priority claimed from SG9801129A external-priority patent/SG87768A1/en
Application filed by National University Of Singapore filed Critical National University Of Singapore
Publication of WO1999060529A1 publication Critical patent/WO1999060529A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/02Viewing or reading apparatus
    • G02B27/022Viewing apparatus
    • G02B27/024Viewing apparatus comprising a light source, e.g. for viewing photographic slides, X-ray transparencies
    • G02B27/026Viewing apparatus comprising a light source, e.g. for viewing photographic slides, X-ray transparencies and a display device, e.g. CRT, LCD, for adding markings or signs or to enhance the contrast of the viewed object

Abstract

A 'Virtual Reality' display system through which a user interacts with a computer is provided. The sensory space of the user is brought into correspondence with the numerically defined space of the computer by sloping a reflecting surface (7) downward away from the user, rather than upward. A screen (1) in a standard position, upright or near it, creates a reflected image in the region of maximum user dexterity. Following earlier systems, a six-degree-of-freedom position sensor (3) (optionally supplying force) in each hand controls a virtual tool (10) that appears in the reflected image (8) to be where the sensor (3) is held, and interacts with the other displayed objects in ways controlled by application software using code supplied with the system.

Description

A DISPLAY SYSTEM
FIELD OF INVENTION
This invention relates to the field of user interfaces with computers, and particularly to "Virtual Reality" systems where the sensory space of the user is brought into correspondence with the numerically defined space of the computer.
BACKGROUND OF THE INVENTION
1. Nomenclature and Definition of Terms
In the following description, "virtual image" is used in the sense standard in optics: an image visible to the eye and perceived as being at a certain location, but (unlike the "real image" formed in a camera) for which no relevant light rays pass through that location. In contrast, a "virtual object" is used to mean an entity whose geometry is defined within a computer, which is so displayed by computer graphics as to appear to be at a particular physical location, and whose position and/or properties can be modified by user activity at or near that location. The term "natural work volume" is used to mean the region easily reached by the hands of a user and convenient for the eyes: approximately (less for a smaller user) within a height of 30cm from the table, 50cm wide, and between 15cm and 50cm from the eyes.
2. Description of the Related Art
The currently most common way for a computer user to interact with a three-dimensional dataset in the computer (whether this be a medical or seismic scan of a volume, a representation of the surface of a skull or vehicle or mountain, etc.) is to display it as a three-dimensional picture on the computer monitor, but to control it through the two-dimensional motions of a mouse on the desk surface. Since this cannot simultaneously and independently modify the three point-position variables of depth in the screen, height on the screen and left-right placement, let alone the three orientation variables corresponding to rotation about the three standard axes, every change of position becomes a painful task in indirect management. Other systems use three-dimensional position sensors (with or without report of rotation), still with the controlling hand at a different place from where the controlled object is seen. When an effort is made to give the user consistent position information, with reports from hand and eye agreeing as they do with real objects, this is "Virtual Reality", where the user functions in a space shared with the computer.
Virtual Reality is often implemented using a head-mounted display, but issues of calibration and speed currently make this an unsatisfactory environment for sustained, careful work. We are concerned here with systems in which the user interacts with a fixed 3D display system. In such a system there is a means of delivering distinct images to the left and right eyes; this may be by placing the images on two monitors (or areas of one monitor) and using mirrors to place them appropriately in front of the eyes, or the left and right views may be alternated - over time, or over rows of pixels - on one monitor, using polarization and polarizing filters to determine what reaches each eye.
Where this involves looking directly at the monitor, co-location of the handheld sensor and the visually perceived position of the objects that it controls requires that the hand and sensor be placed between the user's eyes and the display. This allows the hand to block the user's view of displayed objects whose intended position is nearer the user's eyes, rendering the perceived 3D relationships inconsistent. It is therefore better for the user to see an image of the display screen in a reflecting surface, which can hide the hand(s) and sensor(s); the hand-held tool is visually represented within the display as a graphical object, so that normal graphics produce a consistent view of which objects are nearer and which further. The image should appear within the user's natural work volume, preferably with support for the hand and arm to reduce fatigue.
A number of such systems have been proposed, in the patent and open literatures. The earliest is in a paper by Schmandt [1982]; an identical configuration (still without arm support) is included in a patent by Deering [1993]. Several papers by Serra, Poston et al. report work on a similar configuration, as does Blackwell et al. [1995]. All involve a downward-facing monitor supported above the user's head height by a substantial frame, with a reflecting surface that is either horizontal or sloping upward in the direction forward of the user. The result is a wide, high system that requires considerable desk or laboratory space, and does not allow the user access to the monitor for normal desktop-computer purposes. (Its reflected image is more conveniently placed, but upside down; software specialized to this environment adjusts for this, but standard software becomes unusable.)
The invention here disclosed, in contrast, is constructed around a monitor in the normal position (and hence usable both for 3D interaction and for routine editing, etc.), advantageously with movable structure between the screen and user that adds little to the desk. Position sensors and force-feedback interaction tools are advantageously integrated with the display.
SUMMARY OF THE INVENTION
According to the present invention there is provided a display system comprising: a substantially upright display unit; and a reflecting surface disposed between a user of the system and the display unit, the reflecting surface being disposed such that it slopes downwardly away from said user, wherein, in use, a virtual image of said display unit and an image displayed by said display unit is created within a natural work space of said user.
Advantageously, the system further comprises position sensing means for sensing perceived movement of said virtual image by said user, said perceived movement being translated to movement of said image displayed by said display unit. The display unit may provide either a mono or a stereo display. This advantageously facilitates interaction with the virtual image by the user of the system.
According to one embodiment, the position sensing means comprises transmitting means and receiving means, one of said transmitting means and said receiving means, in use, being fixedly disposed relative to said reflecting surface, and the other of said transmitting means and said receiving means, in use, being moveably disposed relative to said one of said transmitting means and said receiving means.
The transmitting means may comprise at least one emitter of ultrasound or electromagnetic energy, in which case the receiving means comprises at least one sensor for sensing ultrasound or electromagnetic energy emitted by said at least one emitter.
In a preferred embodiment the transmitting means comprises a plurality of emitters of ultrasound or electromagnetic energy, each of said emitters being fixedly disposed relative to said reflecting surface, and the receiving means comprises at least one sensor for sensing ultrasound or electromagnetic energy emitted by said plurality of emitters, a hand-held device comprising said at least one sensor, in use, being held in a hand or both hands of said user of said system, the position and orientation of said at least one sensor being determined with reference to ultrasound or electromagnetic energy received from said emitters.
Alternatively, the receiving means comprises a plurality of sensors for receiving ultrasound or electromagnetic energy, each of said sensors being fixedly disposed relative to said reflecting surface, and the transmitting means comprises at least one emitter of ultrasound or electromagnetic energy, a hand-held device comprising said at least one emitter, in use, being held in a hand or both hands of said user of said system, the position and orientation of said emitter being determined with reference to ultrasound or electromagnetic energy received from said emitter.
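Although the specification does not prescribe a position-recovery algorithm, either of the two arrangements above reduces to estimating a point from ranges to fixed references. The following is a minimal illustrative sketch, assuming time-of-flight ranges to known anchor positions and recovering position by linearized least-squares trilateration; orientation would additionally require several sensors or emitters on the hand-held device.

```python
import numpy as np

def trilaterate(anchors, distances):
    """Estimate a 3D position from ranges to known fixed reference points.

    anchors:   (n, 3) positions of the emitters/sensors fixed to the frame
    distances: (n,) measured ranges, e.g. ultrasound time-of-flight x speed of sound

    Subtracting the first range equation from the rest cancels the quadratic
    term, leaving an ordinary linear least-squares problem.
    """
    a = np.asarray(anchors, dtype=float)
    d = np.asarray(distances, dtype=float)
    A = 2.0 * (a[1:] - a[0])
    b = (d[0] ** 2 - d[1:] ** 2
         + np.sum(a[1:] ** 2, axis=1) - np.sum(a[0] ** 2))
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pos

# Hypothetical layout: four emitters fixed relative to the mirror frame (metres).
anchors = [(0.0, 0.0, 0.0), (0.5, 0.0, 0.0), (0.0, 0.5, 0.0), (0.0, 0.0, 0.5)]
true_pos = np.array([0.2, 0.1, 0.3])
ranges = [np.linalg.norm(true_pos - np.array(p)) for p in anchors]
print(trilaterate(anchors, ranges))  # approximately [0.2 0.1 0.3]
```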
The transmitting means and receiving means may also take different forms. For example, in another embodiment the transmitting means comprises a plurality of reference points or objects, each of said points or objects being fixedly disposed relative to said reflecting surface, and the receiving means comprises a camera, a hand-held device comprising said camera, in use, being held in a hand or both hands of said user of said system, the position and orientation of the camera being determined from camera views with reference to the reference points or objects.
Alternatively, the receiving means comprises a plurality of cameras, each of said cameras being fixedly disposed relative to said reflecting surface, and the transmitting means comprises a reference point or object, a hand-held device comprising said reference point or object, in use, being held in a hand or both hands of said user of said system, the position and orientation of said reference point or object being determined with reference to camera views of said reference point or object.
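In the camera-based variants, recovering a pose from an image of known reference points is the standard perspective-n-point problem. A sketch using OpenCV's solvePnP follows; the reference-point layout, pixel coordinates, and camera intrinsics are placeholder values for illustration, not values from the disclosure.

```python
import numpy as np
import cv2

# Reference points fixed relative to the reflecting surface (hypothetical, metres).
object_points = np.array([[0.00, 0.00, 0.00],
                          [0.10, 0.00, 0.00],
                          [0.00, 0.10, 0.00],
                          [0.10, 0.10, 0.00],
                          [0.05, 0.05, 0.02]])

# Pixel coordinates of those points in one frame from the hand-held camera.
image_points = np.array([[320.0, 240.0], [421.0, 238.0], [318.0, 141.0],
                         [419.0, 140.0], [370.0, 191.0]])

# Pinhole intrinsics from a (hypothetical) prior camera calibration.
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])

ok, rvec, tvec = cv2.solvePnP(object_points, image_points, K, None)
if ok:
    R, _ = cv2.Rodrigues(rvec)        # rotation: reference frame -> camera frame
    position = (-R.T @ tvec).ravel()  # camera position in the reference frame
    print("camera position:", position)
```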
In yet another embodiment the position sensing means comprises a hand-held device in use being held in a hand or both hands of said user of said system, said hand-held device being attached to a robotic arm, the position and orientation of said hand-held device being computable with reference to joint angles of said robotic arm at any given moment. In this embodiment, the robotic arm preferably detects forces applied by said user, and/or delivers computer-selected force to said hand-held device.
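The pose computation from joint angles is ordinary forward kinematics. As a minimal illustration, the sketch below chains the transforms of a planar two-link arm with assumed link lengths; a real articulated arm would chain one such transform per joint (e.g. from Denavit-Hartenberg parameters) in exactly the same way.

```python
import numpy as np

def two_link_pose(theta1, theta2, l1=0.30, l2=0.25):
    """Forward kinematics of a planar two-link arm (assumed link lengths, metres).

    Returns the end-effector (hand-held device) position and orientation angle.
    """
    x = l1 * np.cos(theta1) + l2 * np.cos(theta1 + theta2)
    y = l1 * np.sin(theta1) + l2 * np.sin(theta1 + theta2)
    return np.array([x, y]), theta1 + theta2

pos, heading = two_link_pose(np.radians(30.0), np.radians(45.0))
print(pos, np.degrees(heading))  # device pose for these joint angles
```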
In any of the above described embodiments comprising a hand-held device which is held by the user of the system during operation thereof, it is preferred that the handheld device comprises signal means which, on activation by said user, delivers a desired signal to a computer generating said image displayed by said display unit. The signal means may comprise, for example, a button or buttons, and/or rotating or sliding attachments.
Aspects of the system may be varied depending on the desired effect to be obtained when using the system. For example, in one embodiment, the reflecting surface is partially reflecting such that, in use, said image displayed by said display unit is superimposed onto a real object. Further, the display system may additionally comprise filter means for controlling images presented to each eye of said user such that said user perceives said virtual image to be a stereoscopic image. The filter means preferably comprises a passive or a computer-synchronized filter mounted relative to said reflecting surface or on the head of said user.
In a preferred embodiment the reflecting surface is mounted such that it can be retracted out of the way of the user so that the user can see the display unit, or is such that the user can look over the top of the reflecting surface to see the display unit. Preferably, said reflecting surface is mounted on a frame extending from said display unit.
In order to prevent tiring of the arms of the user, the display system preferably further comprises support means for supporting the wrists, forearms and/or elbows of said user.
A screen of the display unit in a standard position, upright or near it, advantageously creates a reflected stereo image in the region of maximum user dexterity. Following earlier systems, a six-degree-of-freedom position sensing means (optionally supplying force) as discussed above, held in a hand or in each hand of the user controls the virtual image or tool. The virtual image appears to be where the sensing means is held, and interacts with the other displayed objects in ways controlled by application software using code supplied with the system.
BRIEF DESCRIPTION OF THE DRAWINGS
An embodiment of the invention will now be described with reference to the accompanying drawings, in which:
Figure 1 shows a schematic side view of the display system, including real and virtual light rays between the user's eye and the real and virtual (reflected) display screen, and the apparent positions of a virtual tool and a virtual object for a user looking at the real or virtual screen.
Figure 2 shows a perspective view of the display system of figure 1 omitting the real and virtual images.
DETAILED DESCRIPTION OF THE INVENTION
The reach-in display system incorporates a display unit (1) in an upright position (that is, in the normal attitude range for desktop computers), capable of stereo display under control of computer graphics hardware. Mounted in front of the display unit (1), in such a position that either (a) the user can look over it and see the display unit in the usual way, or (b) the mounting can be swivelled out of the way, is a light frame (2). Held in each of the user's hands is a position-sensing device (3). The location and orientation of the position-sensing device (3) relative to a fixed reference device or devices (4) may be deduced from controlled signals emitted by, and/or signals received by, the position-sensing device (3) or the reference devices (4) or both; each device (3) carries one or more buttons (5), which report "pressed" or "unpressed" states and transitions between these, or scaled input devices which report user-controlled values in a continuous range. The coupling between devices (3) and (4) may be through one detecting emissions or reflections from the other, or by a mechanical connection (6), which may detect force applied by the user and may transmit force felt by the user. In implementations where position determination is based on camera observation of position-sensing device (3), emission of energy detected by position-sensing device (3), or detection of energy emitted by position-sensing device (3), there may be several of the fixed devices (4). Where the linkage is mechanical, a single attachment point per position sensor is sufficient. In a hybrid system using mechanical linkage for one hand only, in particular to deliver force, an emission-detection system supporting the sensor used in the other hand can also determine the base point of the mechanical system, which need not therefore be rigidly linked to the screen in one permanent position.
The frame (2) supports a mirror (7) sloping downward away from the user, so that the user can see within the natural work volume a displayed stereo image in the virtual image (8) of the screen (1). The two-point attachment of the frame and mirror in Figure 1 is for clarity; other support arrangements, and methods of permitting the structure to be rotated out of the way, are obvious to one skilled in the art. The attitude of the mirror is what matters, and it can be held at that attitude in many ways. A surface sloped like a drafting table, with or without a central hole that allows the hands and the sensors (3) to move below its main surface, may be added for better support of forearms and wrists.
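The placement of the virtual image (8) follows from plane-mirror optics: each screen point appears reflected across the plane of the mirror (7). The sketch below reflects a point across a plane given by a point on the mirror and its normal; the coordinates are placeholders, not dimensions taken from the figures.

```python
import numpy as np

def mirror_image(p, mirror_point, mirror_normal):
    """Virtual-image position of point p for a plane mirror.

    mirror_point:  any point on the mirror surface
    mirror_normal: normal direction of the mirror plane (need not be unit length)
    """
    p = np.asarray(p, dtype=float)
    n = np.asarray(mirror_normal, dtype=float)
    n = n / np.linalg.norm(n)
    return p - 2.0 * np.dot(p - mirror_point, n) * n

# Hypothetical layout (metres): y up, user looking along -z at an upright
# screen, with the mirror sloping downward away from the user at 45 degrees.
screen_point  = np.array([0.0, 0.40, -0.40])   # a point on the real screen (1)
mirror_point  = np.array([0.0, 0.10, -0.20])   # a point on the mirror (7)
mirror_normal = np.array([0.0, 1.0, 1.0])      # 45-degree slope
print(mirror_image(screen_point, mirror_point, mirror_normal))
# -> [ 0.   0.3 -0.5]: the image of the screen lands down in the work volume
```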
Filters (9) control which screen images are visible to which eye, so that software can present the appropriate view to each eye to generate in the brain a stereoscopic combined image, whose elements have perceived depth. The filters (9) may be mounted on the frame, or worn by the user as glasses. Optionally the head may be tracked for better correspondence between the user's eye positions and the viewpoints for which the left and right views are rendered; but if the rendering delay is long relative to head movement speed, this option is counter-productive.
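Generating the stereoscopic pair amounts to rendering the scene twice, from viewpoints separated by the interocular distance, and routing each rendering through the matching filter (9). A schematic sketch of the per-eye view matrices follows; the eye separation and look-at construction are assumptions, and an exact implementation would use off-axis frusta matched to the virtual screen.

```python
import numpy as np

def look_at(eye, target, up=(0.0, 1.0, 0.0)):
    """Right-handed world-to-eye view matrix."""
    eye, target, up = (np.asarray(v, dtype=float) for v in (eye, target, up))
    f = target - eye
    f /= np.linalg.norm(f)
    s = np.cross(f, up); s /= np.linalg.norm(s)
    u = np.cross(s, f)
    m = np.eye(4)
    m[0, :3], m[1, :3], m[2, :3] = s, u, -f
    m[:3, 3] = -m[:3, :3] @ eye
    return m

def stereo_views(head, target, iod=0.065):
    """View matrices for eyes offset left/right by half the interocular distance."""
    head, target = np.asarray(head, dtype=float), np.asarray(target, dtype=float)
    f = target - head
    f /= np.linalg.norm(f)
    right = np.cross(f, (0.0, 1.0, 0.0))
    right /= np.linalg.norm(right)
    half = 0.5 * iod * right
    return look_at(head - half, target), look_at(head + half, target)

# Head above and behind the work volume, looking at the virtual image (8).
left_view, right_view = stereo_views(head=(0.0, 0.45, 0.0), target=(0.0, 0.15, -0.35))
```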
The position reference devices (4) may be mounted on the frame (2), directly on the monitor, or otherwise rigidly linked to the monitor. Software performs the relative position deduction, allowing a virtual object called a "tool" (10) to appear to the user in the mirror (7) at the location corresponding to the physical position of the position-sensing device (3), by creating in the real screen a stereo image (11) at the apparent position shown. An overall control loop allows an application program in the computer (12) to determine when (10) is to interact with a virtual object (13) generated or managed by the application program (for example, when (10) is close to (13) and a button (5) is clicked or held down), upon which the program modifies the position of the object, the relative position of its parts, the properties logically assigned to it (linkage to other objects, stiffness, toxicity, ore concentration, or whatever the application requires), or the way that all or part of it is displayed, according to the needs of the application and the choices made by the user.
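Reduced to pseudocode, one pass of that control loop reads the sensor, maps its pose through the mirror/monitor calibration into scene coordinates, tests the application's interaction rule, and redraws. The sketch below is schematic only; every interface name and the proximity threshold are illustrative, not part of the disclosure.

```python
import numpy as np

GRAB_RADIUS = 0.02  # metres; illustrative proximity threshold

def interaction_loop(sensor, scene, renderer, sensor_to_scene):
    """Schematic reach-in control loop (all interfaces are hypothetical).

    sensor.read()    -> (position, orientation, buttons) of a device (3)
    sensor_to_scene  -> 4x4 calibration matrix from device to scene coordinates
    scene.objects    -> virtual objects (13) managed by the application
    """
    while True:
        pos, orient, buttons = sensor.read()
        tool_pos = (sensor_to_scene @ np.append(pos, 1.0))[:3]   # tool (10) position
        for obj in scene.objects:
            # Interaction rule: tool close to the object with button (5) held down.
            if buttons.get("primary") and np.linalg.norm(tool_pos - obj.position) < GRAB_RADIUS:
                obj.position = tool_pos.copy()    # e.g. drag the object with the tool
        renderer.draw(scene, tool_pos, orient)    # stereo images of objects and tool
```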
Those skilled in the art will appreciate that the invention described herein is susceptible to variations and modifications other than those specifically described. It is to be understood that the invention includes all such variations and modifications which fall within its spirit and scope. The invention also includes all the steps, features, compositions and compounds referred to or indicated in this specification, individually or collectively, and any and all combinations of any two or more of said steps or features.

Claims

1. A display system comprising: a substantially upright stereo display unit; and a reflecting surface disposed between a user of said system and said display unit, said reflecting surface being disposed such that it slopes downwardly away from said user, wherein, in use, a virtual image of said display unit and an image displayed by said display unit is created within a natural work space of said user.
2. A display system according to claim 1, further comprising means for controlling images presented to each eye of said user such that said user perceives said virtual image to be a stereoscopic image.
3. A display system according to claim 2, wherein said means for controlling images comprises a passive or a computer-synchronized filter mounted relative to said reflecting surface or on the head of said user.
4. A display system according to claim 1, further comprising position sensing means for sensing perceived movement of said virtual image by said user, said perceived movement being translated to movement of said image displayed by said display unit.
5. A display system according to claim 4, wherein said position sensing means comprises transmitting means and receiving means, one of said transmitting means and said receiving means, in use, being fixedly disposed relative to said reflecting surface, and the other of said transmitting means and said receiving means, in use, being moveably disposed relative to said one of said transmitting means and said receiving means.
6. A display system according to claim 5, wherein said transmitting means comprises at least one emitter of ultrasound or electromagnetic energy, and wherein said receiving means comprises at least one sensor for sensing ultrasound or electromagnetic energy emitted by said at least one emitter.
7. A display system according to claim 6, wherein said transmitting means comprises a plurality of emitters of ultrasound or electromagnetic energy, each of said emitters being fixedly disposed relative to said reflecting surface, and wherein said receiving means comprises at least one sensor for sensing ultrasound or electromagnetic energy emitted by said plurality of emitters, a hand-held device comprising said at least one sensor, in use, being held in a hand or both hands of said user of said system, the position and orientation of said at least one sensor being determined with reference to ultrasound or electromagnetic energy received from said emitters.
8. A display system according to claim 6, wherein said receiving means comprises a plurality of sensors for receiving ultrasound or electromagnetic energy, each of said sensors being fixedly disposed relative to said reflecting surface, and wherein said transmitting means comprises at least one emitter of ultrasound or electromagnetic energy, a hand-held device comprising said at least one emitter, in use, being held in a hand or both hands of said user of said system, the position and orientation of said emitter being determined with reference to ultrasound or electromagnetic energy received from said emitter.
9. A display system according to claim 5, wherein said transmitting means comprises a plurality of reference points or objects, each of said points or objects being fixedly disposed relative to said reflecting surface, and wherein said receiving means comprises a camera, a hand-held device comprising said camera, in use, being held in a hand or both hands of said user of said system, the position and orientation of the camera being determined from camera views with reference to the reference points or objects.
10. A display system according to claim 5, wherein said receiving means comprises a plurality of cameras, each of said cameras being fixedly disposed relative to said reflecting surface, and wherein said transmitting means comprises a reference point or object, a hand-held device comprising said reference point or object, in use, being held in a hand or both hands of said user of said system, the position and orientation of said reference point or object being determined with reference to camera views of said reference point or object.
11. A display system according to claim 4, wherein the position sensing means comprises a hand-held device in use being held in a hand or both hands of said user of said system, said hand-held device being attached to a robotic arm, the position and orientation of said hand-held device being computable with reference to joint angles of said robotic arm at any given moment.
12. A display system according to claim 11, wherein said robotic arm detects forces applied by said user, and/or delivers computer-selected force to said handheld device.
13. A display system according to any one of claims 7 to 12, wherein said handheld device comprises signal means which, on activation by said user, delivers a desired signal to a computer generating said image displayed by said display unit.
14. A display system according to claim 13, wherein said signal means comprises a button or buttons, and/or rotating or sliding attachments.
15. A display system according to claim 1, wherein said reflecting surface is partially reflecting such that, in use, said image displayed by said display unit is superimposed onto a real object.
16. A display system according to claim 1, wherein said reflecting surface is mounted on a frame extending from said display unit.
17. A display system according to claim 1, further comprising support means for supporting the wrists, forearms and/or elbows of said user.
PCT/SG1999/000045 1998-05-21 1999-05-20 A display system WO1999060529A1 (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
SG9801129A SG87768A1 (en) 1998-05-21 1998-05-21 Compact reach-in display system for two-handed user interaction with virtual objects
SG9801129-9 1998-05-21
SG1999002081A SG77682A1 (en) 1998-05-21 1999-05-06 A display system
SG9902081-0 1999-05-06

Publications (1)

Publication Number Publication Date
WO1999060529A1 (en) 1999-11-25

Family

ID=26665173

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/SG1999/000045 WO1999060529A1 (en) 1998-05-21 1999-05-20 A display system

Country Status (2)

Country Link
SG (1) SG77682A1 (en)
WO (1) WO1999060529A1 (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE4207284A1 (en) * 1992-03-07 1993-09-09 Stefan Reich Image processing for three=dimensional representation with measurement of head movements - employing stationary monitor with ultrasonic measurement of variations in direction of line of vision of observer, where observer wears liq. crystal shutter spectacles
US5287437A (en) * 1992-06-02 1994-02-15 Sun Microsystems, Inc. Method and apparatus for head tracked display of precomputed stereo images
EP0607000A2 (en) * 1993-01-14 1994-07-20 Sun Microsystems, Inc. Method and apparatus for generating high resolution 3D images in a head tracked stereo display system
WO1995019584A1 (en) * 1994-01-14 1995-07-20 Dimensional Media Associates Multi-image compositing

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2002071337A1 (en) * 2001-03-01 2002-09-12 Volume Interactions Pte Ltd A display apparatus
WO2002100284A1 (en) * 2001-06-13 2002-12-19 Volume Interactions Pte Ltd A guide system
WO2002100285A1 (en) * 2001-06-13 2002-12-19 Volume Interactions Pte Ltd A guide system and a probe therefor
US7291456B2 (en) 2002-01-24 2007-11-06 The Regents Of The University Of California Method for determining differences in molecular interactions and for screening a combinatorial library
WO2006104247A1 (en) * 2005-03-29 2006-10-05 Yasui & Co. Magnification observation apparatus
WO2009150190A1 (en) * 2008-06-11 2009-12-17 Vrmagic Gmbh Ophthalmoscope simulator
US8690581B2 (en) 2008-06-11 2014-04-08 Vrmagic Gmbh Opthalmoscope simulator

Also Published As

Publication number Publication date
SG77682A1 (en) 2001-01-16

Similar Documents

Publication Publication Date Title
JP4251673B2 (en) Image presentation device
Deering High resolution virtual reality
KR100327874B1 (en) Method and apparatus for generating high resolution 3D images in head-tracked stereoscopic display systems
Weimer et al. A synthetic visual environment with hand gesturing and voice input
Ware et al. Using the bat: A six-dimensional mouse for object placement
EP3564788B1 (en) Three-dimensional user input
US5446834A (en) Method and apparatus for high resolution virtual reality systems using head tracked display
US7091928B2 (en) Intelligent eye
Ware Using hand position for virtual object placement
JP4883774B2 (en) Information processing apparatus, control method therefor, and program
Poston et al. Dextrous virtual work
Mulder et al. The personal space station: Bringing interaction within reach
WO2000002187A1 (en) Stereoscopic user interface method and apparatus
JP2006301654A (en) Image presentation apparatus
Poston et al. The virtual workbench: Dextrous VR
JP3242079U (en) Floating image display device and floating image display system
WO1999060529A1 (en) A display system
US5841887A (en) Input device and display system
Gigante Virtual reality: Enabling technologies
JP2009087161A (en) Image processor and image processing method
US8767054B2 (en) Viewing system for the manipulation of an object
JP3413145B2 (en) Virtual space editing method and virtual space editing device
CA2534218A1 (en) A display apparatus
JP7450289B2 (en) Interactive operation method for stereoscopic images and stereoscopic image display system
WO2024075565A1 (en) Display control device and display control method

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): US

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE

DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
121 Ep: the epo has been informed by wipo that ep was designated in this application
122 Ep: pct application non-entry in european phase