|Publication number||USRE42336 E1|
|Application number||US 12/574,607|
|Publication date||May 10, 2011|
|Filing date||Oct 6, 2009|
|Priority date||Nov 28, 1995|
|Also published as||EP1116211A1, EP1116211A4, US6184847, WO2000017848A1|
|Inventors||Sina Fateh, James F. Flack, Arthur L. Zwern|
|Original Assignee||Rembrandt Portable Display Technologies, Lp|
This application claims priority to U.S. Provisional Application No. 60/101,433, filed Sep. 22, 1998, and is a continuation-in-part of U.S. patent application Ser. No. 09/264,799, filed on Mar. 9, 1999, which issued as U.S. Pat. No. 6,084,556 on Jul. 4, 2000, which is a continuation of U.S. patent application Ser. No. 09/235,096, filed Jan. 21, 1999, which issued as U.S. Pat. No. 6,127,990 on Oct. 3, 2000, which is a continuation of U.S. patent application Ser. No. 08/563,525, filed Nov. 28, 1995, now abandoned. This application also is a continuation-in-part of U.S. patent application Ser. No. 09/373,186, filed Aug. 12, 1999, which issued as U.S. Pat. No. 6,359,603 on Mar. 19, 2002, which is a continuation of U.S. patent application Ser. No. 09/235,096, filed Jan. 21, 1999, which issued as U.S. Pat. No. 6,127,990 on Oct. 3, 2000, which is a continuation of U.S. patent application Ser. No. 08/563,525, filed Nov. 28, 1995, now abandoned.
The invention relates to human/computer interfaces to visual data and more particularly to systems that must display a larger amount of visual data than may be conveniently displayed on a single conventional computer monitor. The present invention uses virtual reality techniques to provide instantaneous and intuitive access to large fields of visual data, and to provide visually-impaired users with enhanced access to enlarged visual data.
Among the visually-impaired population, the most common approach to computer access is specialized software and/or hardware that enlarges the image displayed on the computer monitor. This is because simpler solutions such as moving closer to the monitor, using a larger monitor, adding an optical screen magnifier, or using a spectacle-mounted telescopic system provide either limited magnification or a very limited viewing field. Examples of commercially-available screen enlargers include LP-DOS by Optelec (Westford, Mass.), Zoomtext by Ai Squared (Manchester Center, Vt.), MAGic by Microsystems Software (Framingham, Mass.), and Magnum by Arctic Technologies (Troy, Mich.). In addition, simplistic screen enlargement modules are included in both the Microsoft Windows and Apple Macintosh operating systems.
These conventional computer display magnification solutions operate by magnifying the original image of a software application's output to a “virtual page” whose size is much larger than the physical monitor. For example, with a magnification of 10, a standard 8.5″×11″ page would be approximately 7 feet wide by 9 feet tall. The visually-impaired user then operates the computer by using a mouse, joystick, or cursor keys to control which portion of the virtual page is shown on the monitor at any given point in time. Since the monitor is fixed, the user is in essence moving the virtual page across the monitor, in a manner analogous to that used in closed-circuit television (CCTV) systems for magnifying book pages.
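The scale-up described above is simple linear arithmetic. As an illustrative sketch (the function name is not from the patent), the virtual page dimensions are just the page dimensions multiplied by the magnification factor:

```python
def virtual_page_size(width_in, height_in, magnification):
    """Return the magnified 'virtual page' dimensions, in inches."""
    return width_in * magnification, height_in * magnification

# A standard 8.5" x 11" page at 10x magnification:
w, h = virtual_page_size(8.5, 11, 10)
print(w / 12, h / 12)  # roughly 7.1 ft wide by 9.2 ft tall
```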
In most cases, conventional screen magnification is performed completely in software running on the host computer's central processing unit (CPU). While this provides a very low-cost solution, the data to be shown on the display must be rendered in its entirety whenever the user pans to a new location within the enlarged image. This can result in lags between commanding the computer to pan and seeing the new image. To overcome this problem, the entire virtual image can be rendered and stored in a video display buffer. Then, as the user selects a portion of the image for viewing, the required portion of the data can be quickly read out of the display buffer and sent to the display device. An example of such a hardware-accelerated screen magnifier is the Vista by Telesensory, Inc. (Mountain View, Calif.). This technique is a form of hardware acceleration known as image deflection.
Unfortunately, there are two basic shortcomings to the conventional approach, even with hardware acceleration. The first problem is spatial orientation, in that it is difficult to determine where on the page one's view is directed at any given time. This occurs because the monitor does not move, and there are no other visual cues to indicate where on the virtual page one's line of sight is facing. This spatial orientation problem is exacerbated at high magnifications and for portable systems employing small display monitors. For example, one study (Goodrich et al.) found mean magnifications of 15.48× for nearly 100 experienced users of closed-circuit television devices. At 15×, a 15″ monitor can only display about 1% of a standard 8.5″×11″ page, making most computer work essentially impossible for such users. The problem is further exacerbated by the emergence of graphically-intensive computing regimes such as Microsoft Windows and the Internet World Wide Web, where individual graphic elements may be magnified to become larger than an instantaneous viewing window, or may be automatically generated outside of the user's instantaneous viewing window without the user's awareness.
The second fundamental problem in the conventional approach is dynamic control, in that all of the various control schemes for navigating about the page are cumbersome, confusing, and slow. This is because the navigation methods are unintuitive, relying on such logic as “use joystick to move cursor around screen, and when cursor reaches the edge of the screen, the next portion of document in that direction will be displayed.” Alternatively, some screen enlargers maintain the cursor at the center of the screen, and require the user to position a desired insertion point over the cursor by moving the entire virtual page with a mouse or joystick. In all cases, dynamic control is not only unintuitive, but requires use of at least one hand, which negatively impacts productivity, and may make use by physically-impaired users difficult or impossible.
Together, these spatial orientation and dynamic control problems were termed the “field navigation” problem in the National Advisory Eye Council's 1994-98 National Plan (Legge et al.), in which the Low Vision and its Rehabilitation Panel identified this area as a particularly promising opportunity for new technologies.
One promising new technology that is now maturing is virtual reality, which is typically defined as a computer-generated three-dimensional environment providing the ability to navigate about the environment, turn one's head to look around the environment, and interact with simulated objects in the environment using a control peripheral.
In a virtual reality system, the user is “immersed” in a synthetic environment, in which virtual objects can be located anywhere in the user's physical space. The user views these objects by wearing a head-mounted display (HMD), which uses an optical system to cause a tiny display source such as a cathode ray tube or liquid crystal display to appear as a large display screen several feet in front of the user. Since the display source (or sources in the case of two eyes) is fixed to the user's head, the display is viewable regardless of where the user points his line-of-sight. The user also wears a head-tracker, which senses the direction the user is facing, and sends this information to the host computer. The computer uses this data to generate graphics corresponding to the user's line of sight in the virtual environment. This approach to human/computer interfaces was first conceived by Ivan Sutherland in 1966 for use in military simulators, and was first commercialized in the form of the Eyephone head-mounted display by VPL Research in the late 1980s.
Prior art in this area includes a wide range of relevant patents describing low-vision aids and improved virtual reality systems and components such as HMDs and head-trackers, but none of these embodies or anticipates the present invention.
In the field of low-vision aids, U.S. Pat. No. 4,227,209 issued Oct. 10, 1980 discloses an electronic sensory aid for visually-impaired users including an image sensor and a display array, wherein the degree of magnification provided in the display array may be adjusted by changing the number of display elements corresponding to each sensor array element. For use in electronic sensory aid applications requiring a large depth of focus, an improved image capture approach is disclosed in U.S. Pat. No. 5,325,123 issued Jun. 28, 1994, in which the imaging camera includes an opaque stop with a small aperture, thus allowing the magnification to be adjusted by moving the camera towards or away from the object to be magnified. A non-electronic sensory aid is disclosed in U.S. Pat. No. 4,548,485 issued Oct. 22, 1985, in which an XY stage is used to move textual material across an optical viewing system that captures a portion of the textual material for enlargement.
In U.S. Pat. No. 5,125,046 issued Jun. 23, 1992, and U.S. Pat. No. 5,267,331 issued Nov. 30, 1993, an improved imaging enhancer for visually-impaired users is disclosed in which an image is captured, digitized, and electronically enhanced to increase contrast before displaying the imagery. An improvement to this approach using a head-mounted display is disclosed in U.S. Pat. No. 5,359,675, issued Oct. 25, 1994.
In the field of virtual reality systems, U.S. Pat. No. 5,367,614 issued Nov. 22, 1994 to Bisey discloses a three-dimensional computer image display system using an ultrasonic transceiver head-tracking system to control a three-dimensional display to cause the image to change its perspective in response to head movements. In addition, U.S. Pat. No. 5,442,734 issued Aug. 15, 1995 to Murakami discloses a virtual reality system incorporating a head-mounted display, head-tracker, and image processing system in which predictive tracking algorithms are used to differentially update portions of the display field to provide more rapid updating of those portions of the display field corresponding to the center of the user's visual field. In U.S. patent application Ser. No. 07/621,446 (pending) filed by VPL Research, Inc., a virtual reality system is disclosed in which spatialized audio cues are generated to provide real-time feedback to users upon successful completion of manual tasks such as grasping virtual objects using a sensor-laden glove input device.
In the head-mounted display field, U.S. Pat. No. 5,003,300 issued Mar. 26, 1991 to Wells discloses a raster-based head-mounted display that may be used to display an image to either eye. U.S. Pat. No. 5,151,722 issued Sep. 29, 1992 to Massof discloses a video-based head-mounted display featuring a unique folding optic configuration so that the device may be worn like a pair of glasses. U.S. Pat. No. 5,281,957 issued Jan. 25, 1994 to Schoolman discloses a portable computer system incorporating a head-mounted display that may be hinge-mounted to an eyeglass frame so that the display may be folded up out of the way for viewing the physical environment. A wide variety of additional patents in the area of specific design improvements for head-mounted display devices exists; however, the specific head-mounted display design approach employed to effect the present invention is not critical, so long as image quality, brightness, contrast, and comfort are maintained at high levels.
In recent years, there have been several attempts made to apply head-mounted displays to the problems of enhancing imagery for visually-impaired users. One such effort has resulted in the Low-Vision Enhancement System (LVES) developed by Johns Hopkins University and marketed commercially by Visionics (Minneapolis, Minn.). The LVES device incorporates a head-mounted display with integrated cameras and an image processing system. The cameras generate an image of whatever is positioned directly in front of the user, and the image processing system enlarges the image and performs enhancement functions such as contrast enhancement. While the LVES device can provide magnified imagery of real-world objects to some visually-impaired users, it suffers from several shortcomings compared to the present invention. First, the LVES does not incorporate a head-tracker to provide a hands-free means for navigating within computer data. Further, the LVES suffers from a jitter problem exactly analogous to that experienced by users of binoculars or telescopes. In simple terms, any jitter in the user's line-of-sight is magnified by the same factor as the imagery, which causes the image provided to the user to appear unsteady.
U.S. Pat. No. 5,109,282 issued Apr. 28, 1992 to Peli discloses a novel image processing method for converting continuous grey tone images into high resolution halftone images, and describes an embodiment of the method applicable to presentation of enlarged imagery to visually-impaired users via a head-mounted display. In this device, the imagery is generated by a conventional camera manually scanned across printed text, as is common in closed-circuit television systems for the visually-impaired. The head-mounted display is a Private Eye by Reflection Technologies (Waltham, Mass.), which uses a scanning mirror to give a linear array of light-emitting diodes the appearance of a rectangular array. In the disclosed device, benefits of using a head-mounted display for low-vision access to printed material in portable situations are discussed, including the larger visual field, higher visual contrast, lighter weight, and smaller physical size provided by an HMD compared to a portable conventional television monitor. However, no connection to a computer for viewing computer-generated imagery is disclosed or anticipated, and no incorporation of a head-tracking device is disclosed or anticipated.
In the tracker art, a variety of tracking approaches and applications have been conceived and constructed. U.S. Pat. No. 5,373,857 issued Dec. 12, 1994 to Travers discloses a head-tracking approach for the yaw degree of freedom in virtual reality applications, consisting of a magnetic sensor disposed on a headset to produce a displacement signal relative to angular displacement of the headset with respect to the earth's magnetic field. A more sophisticated approach has been developed by the Massachusetts Institute of Technology (MIT), in which an analogous magnetic sensor is used to correct drift in a much faster differential sensor such as an accelerometer; together, these sensors provide extremely rapid response and high accuracy within a single package. The MIT approach, believed to be patent-pending, additionally incorporates differential sensors to detect changes in the pitch and roll degrees of freedom, which may also be corrected using slower absolute sensors such as liquid-filled capacitive tilt sensors.
Also within the tracker art, a number of devices have been disclosed which sense head movement for purposes of controlling positioning of a cursor or mouse pointer within the viewable portion of a conventional display monitor. U.S. Pat. No. 4,209,255 issued Jun. 24, 1980 to Heynau discloses a system for pilots employing a light-emitting diode mounted on the pilot's head, with photodiodes located on the display to sense the tapered energy field from the light-emitting diode for purposes of determining the pilot's aim-point within the display.
U.S. Pat. No. 4,565,999 issued Jan. 21, 1986 to King discloses a cursor control system for use with a data terminal wherein a radiation source and a radiation sensor are used to determine changes in a user's head position for purposes of controlling cursor position on the screen.
U.S. Pat. No. 4,567,479 issued Jan. 28, 1986 to Boyd discloses a directional controller for video or computer input by physically-impaired users consisting of a series of mercury switches disposed in proximity to a user's head, wherein movements of the user's head are sensed and converted into cursor control commands. This device also employs a pressure switch activated by the user's mouth which can provide a further control signal such as that generated by clicking a mouse button.
U.S. Pat. No. 4,682,159 issued Jul. 27, 1987 to Davison discloses an apparatus and method for controlling a cursor on a computer display that consists of a headset worn by the user, and a stationary ultrasonic transmitter for emitting sound waves which are picked up by receivers in the headset. These sound waves are compared for phase changes, which are converted into positional change data for controlling the cursor.
U.S. Pat. No. 5,367,315 issued Nov. 22, 1994 to Pan discloses an infrared-light based system that indicates head and eye position in real time, so as to enable a computer user to control cursor movement on a display by moving his or her eyes or head. The device is intended to emulate a standard mouse, thereby allowing use of the presently available software and hardware.
While the above examples demonstrate a well-developed art for controlling computer cursors via head movement, none disclose or anticipate application of head-controlled cursor movement within a head-mounted display, and none anticipate an approach such as the present invention wherein the cursor remains fixed at a particular position within the display while the displayed data is moved instead of the cursor. Movement of displayed data within a head-mounted display in response to head movement has heretofore been used only within virtual reality systems designed for simulating sensory immersion within three-dimensional computer simulations. In such applications, cursors or mouse pointers are not controlled by head movement, but are generated when required through the use of a separate hand-controlled input device.
While virtual reality is still a developmental technology involving exotic graphics hardware, specialized software, and long integration cycles, the concept of closing a control loop between head-tracker data and HMD imagery can be implemented analogously for viewing arbitrary computer data instead of specially-constructed virtual environments. For normally sighted individuals, this could be beneficial by providing a large virtual computer desktop surrounding the user, which can provide simultaneous access to a larger amount of visual data than is possible using the small virtual desktops currently provided on common computing platforms such as Macintosh and Windows. For visually-impaired individuals, head-tracked HMD display techniques can be used to conveniently access a magnified virtual page, and thus enable productive computer use by nearly 1,000,000 new users.
It is therefore an object of the current invention to solve the field navigation problem by combining virtual reality display techniques originally developed for military flight simulators with screen magnification techniques, in order to create a novel and intuitive display interface for visually impaired users.
It is another object of the current invention to provide an intuitive computer display interface allowing the user to automatically achieve proper spatial orientation by directly coupling the user's head orientation to the displayed portion of a magnified virtual page.
It is a further object of the current invention to provide an intuitive computer display interface allowing the user to automatically control the position of a cursor or mouse pointer on a computer-generated virtual page by directly coupling the user's head movements to movements of a cursor across the virtual page, thus freeing the user's hands for other tasks.
It is an additional object of the present invention to provide hands-free instantaneous selection from between many concurrently active computer applications by changing one's line-of-sight from one application window's virtual location to another.
It is yet another object of the present invention to provide and maintain a cursor at a user-selectable position within the user's field-of-view, in order to support use of the virtual computer display by users with arbitrary, non-central preferred retinal loci.
It is still another object of the present invention to alert the user to events occurring outside of the user's instantaneous field-of-view through the use of spatialized audio alerts perceived to originate from the direction of the event, thus causing the user to turn and look in the direction of the event.
It is yet a further object of the present invention to provide effective operation at magnifications much greater than those possible using fixed monitors, by using a novel technique known as spatial compression.
It is still another object of the present invention to provide improved scrolling of imagery across the user's field-of-view, through application of smoothing, thresholding, prediction, and drift compensation algorithms to improve response to raw data representing the user's instantaneous line of sight.
It is still a further object of the present invention to provide a computer display for visually-impaired users that is convenient, lightweight, low-cost, power-efficient, and capable of portable operation without degraded performance.
It is another object of the present invention to provide a means for visually-impaired users to view enlarged video imagery in real time over an expanded field-of-regard, thus reducing jitter compared to head-mounted closed-circuit television systems.
In accordance with the present invention, there has been devised a “virtual computer monitor” (VCM) which is broadly comprised of a head-mounted display means worn by the user, a head-orientation sensing means worn by the user, and software means for interfacing these devices to a host computer such that the user's head orientation data is processed to determine which portion of an arbitrary software application's output imagery to display. By properly matching the angle of head rotation to the extent of scrolling across the magnified image, the image can be made to appear fixed in space. The particular location of the portion of the virtual image which is actually being seen by the user is dependent upon the direction in which the user looks. As the user looks to the right, the portion of the virtual image being seen by the user is to the right of the portion of the virtual image previously being seen by the user. Similarly, as the user looks up, the portion of the virtual image being seen by the user is above the portion of the virtual image previously seen by the user. Upon initialization of the VCM device, the user triggers calibration between the user's straight-ahead line of sight and the center of the virtual page. From then on, the user can rotate her head left, right, up, and down to visually scan across the page in corresponding directions. The overall impression is analogous to a normally sighted person scanning across a newspaper page.
As applied to a computer interface device for the visually-impaired, the VCM software provides a magnification adjustment to allow each user to achieve adequate visual resolution without needlessly reducing his instantaneous viewing field. The software also provides a cursor, which nominally remains positioned at the center of the HMD physical field regardless of head movements so that the cursor can be positioned anywhere upon the virtual page by turning to face that location. A further adjustment allows setting the fixed cursor location to any arbitrary position in the HMD device's physical field, so that users with unusable portions of their visual fields can select an alternative preferred retinal locus instead of the center. A software selection also provides an overview display, which shows a reduced-magnification image of the entire virtual page, with a bold black box highlighting the outline of the instantaneous field within the entire field.
An additional important feature is the ability to temporarily adjust the cursor position in real-time using a controller peripheral such as a joystick or mouse. This feature allows fine positioning of the cursor within the field by temporarily locking the head-tracking system to freeze a portion of the virtual page on the physical display, while the controller is used to move the cursor in small increments.
An additional important feature is the ability to display image components in addition to the cursor at fixed points in the physical display, which allows menus or other icons to remain in the user's instantaneous viewing field at all times while scrolling across image content.
An additional important feature resides in the ability to reduce the lag between a head motion and display of the new direction's image by using image deflection, thresholding, smoothing, prediction, and a novel drift compensation technique to reduce display “swimming”, which is caused whenever imperfect head orientation sensing causes the displayed image to not appear fixed in real-space.
An additional important feature resides in the ability to magnify images by extremely large factors using spatial field compression, where the displayed image is scrolled across the physical display at a faster rate than the head is turned. This enables use by individuals with limited head motion, and allows magnification to levels that would otherwise require turning completely around to view edges of the image.
An additional important feature resides in the use of a partially immersive HMD, which avoids simulation sickness by allowing the user to maintain a constant frame of reference in the physical world since real objects can be seen around one or more edges of the display.
It is therefore an advantage of the current invention that it solves the field navigation problem by combining virtual reality display techniques originally developed for military flight simulators with screen magnification techniques, in order to provide a novel and intuitive display interface for visually impaired users.
It is another advantage of the current invention that it provides an intuitive computer display interface allowing the user to automatically achieve proper spatial orientation by directly coupling the user's head orientation to the displayed portion of a magnified virtual page.
It is a further advantage of the current invention that it provides an intuitive computer display interface allowing the user to automatically control the position of a cursor or mouse pointer on a computer-generated virtual page by directly coupling the user's head movements to movements of a cursor across the virtual page, thus freeing the user's hands for other tasks.
It is an additional advantage of the present invention that it provides hands-free instantaneous selection from between many concurrently active computer applications by changing one's line-of-sight from one application window's virtual location to another.
It is yet another advantage of the present invention that it provides and maintains a cursor at a user-selectable position within the user's field-of-view, in order to support use of the virtual computer display by users with arbitrary, non-central preferred retinal loci.
It is still another advantage of the present invention that it alerts the user to events occurring outside of the user's instantaneous field-of-view through the use of spatialized audio alerts perceived to originate from the direction of the event, thus causing the user to turn and look in the direction of the event.
It is yet a further advantage of the present invention that it provides effective operation at magnifications much greater than those possible using fixed monitors, by using a novel technique known as spatial compression.
It is still another advantage of the present invention that it provides improved scrolling of imagery across the user's field-of-view, through application of smoothing, thresholding, prediction, and drift compensation algorithms to improve response to raw data representing the user's instantaneous line of sight.
It is still a further advantage of the present invention that it provides a computer display for visually-impaired users that is convenient, lightweight, low-cost, power-efficient, and capable of portable operation without degraded performance.
It is another advantage of the present invention that it provides a means for visually-impaired users to view enlarged video imagery in real time over an expanded field-of-regard, thus reducing jitter compared to head-mounted closed-circuit television systems.
The above and other objects, features, and advantages of the present invention will become more readily understood and appreciated from a consideration of the following detailed description of the preferred embodiment when taken together with the accompanying drawings, which, however, should not be taken as limitative of the present invention but as provided for elucidation and explanation only.
The system of this invention 10 concerns a computer 12 controlled by a user through conventional means such as a keyboard 14 and input controller 16, and whose output is viewed on a display monitor 18. Referring to
With respect to the present invention, a wide variety of different head-mounted displays 26 and head-trackers 28 may be used without affecting the fundamental operating concepts embodied therein, and many suitable devices are commercially available. For the head-mounted display 26, it is important to provide adequate field-of-view to ensure that a significant portion of the user's visual field is addressed, and to provide a sufficient number of picture elements, or pixels, so that small text can be resolved. Useful minimums are a twenty-degree field-of-view and 100,000 pixels per eye, although these figures are subjective. In addition, visual contrast must be high (100 to 1 or greater) for visually-impaired users. For some visually-impaired users, maximizing contrast can become sufficiently critical that a color display cannot be used, and a black and white unit must be used instead. In general, parameters such as field-of-view, pixel count, contrast, size/weight, cost, and other factors such as apparent image distance well-known in the art of head-mounted display design may be traded-off to provide a best compromise over a varied population of users, or may be traded-off to optimize performance for a single user.
The simplest embodiment of the invention 10 uses a CyberTrack™ model head-tracker 28 from General Reality Company. This model provides an output signal emulating that of a mouse, which can be read directly by a standard Microsoft mouse driver 30 for purposes of controlling the manner in which the instantaneous viewport 22 is selected from within the virtual display 20. In alternative embodiments using a different head-tracker 28 which cannot so emulate a mouse, an additional software module can be used to interpret the output of the head-tracker 28 and convert the output into “mickeys” that emulate mouse output, or an additional software module can adapt the output of the head-tracker 28 for directly controlling scrolling of the instantaneous viewport 22 without use of an intervening mouse driver. All told, a wide variety of alternative means for converting head-tracker output into scrolling of the instantaneous viewport have been conceived, so the approach selected for the present embodiments should not be considered limitative of the invention.
In any of these embodiments, an improvement to the simple scrolling of the instantaneous viewport 22 can be achieved using hardware or software logic that enables scrolling using a combination of data generated by the head-tracker 28 and the input controller 16. Specifically, when the mouse is not active, head-tracker 28 output is used to perform large-magnitude positioning of the instantaneous viewport 22 with respect to the virtual display 20. Once the instantaneous viewport 22 is positioned at the approximate desired position using head movements, the input controller 16 can then be used to perform fine positioning of the instantaneous viewport. The input controller 16 can also be used to select data items, or click and drag to select multiple data items, as is common within the art. Whenever such actions are taken with the mouse, the instantaneous viewport is moved appropriately, maintaining the mouse pointer 32 at its selected location within the instantaneous viewport. In a preferred embodiment, the input controller 16 and head-tracker 28 can operate simultaneously, which allows “click & drag” functions such as holding down the mouse button to anchor one corner of a selection box, then scrolling the head until an opposing corner is reached, and releasing the mouse button to select all of the items within the resulting selection box.
The present invention has been implemented in two alternative prototype embodiments, with additional embodiments contemplated. The first embodiment is constructed using an Apple Macintosh Duo 230 portable computer 12, a General Reality CyberEye Model 100 head-mounted display 26, and InLARGE screen magnifier software by Berkeley Systems (Berkeley, Calif.). In this embodiment, the head-tracker 28 is an experimental device utilizing Gyrostar ENC-05E solid-state gyroscopes by Murata Manufacturing Company (Kyoto, Japan, and US location at Smyrna, Ga.). Two gyroscopes are used, one each for the head's pitch (elevation) and yaw (direction) degrees of freedom. The output of each gyroscope consists of a differential voltage, with the difference voltage directly proportional to the angular velocity of the sensor. These outputs are fed to the Macintosh computer 12 via the Apple Desktop Bus (ADB) Port, which is used on all Macintosh computers for accepting input from keyboards, mice, and other input control devices. Because the gyroscopes output differential data representing an angular velocity, the data is digitized using a simple analog-to-digital converter integrated circuit, and then used directly for scrolling the imagery, with only a linear scaling factor applied. This scaling factor is dependent on the magnification factor applied to the imagery, and serves to maintain the enlarged image at a fixed position in space as perceived by the user. In the case of an absolute orientation tracker such as a magnetometer, the data must first be converted from orientation to rate of change in orientation by taking the mathematical derivative of the data with respect to time.
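The conversion described above, from a digitized angular-velocity sample to a scroll offset scaled by the magnification factor, and the differentiation step required for an absolute orientation tracker, can be sketched as follows. This is an illustrative sketch, not the patent's actual driver code; the gain constant and function names are assumptions.

```python
def velocity_to_scroll(angular_velocity_dps, dt, magnification, base_gain=10.0):
    """Convert an angular-velocity sample (degrees/second) into a scroll
    offset in pixels.  The linear scaling factor grows with the enlargement
    factor so the enlarged image appears fixed in space as the head turns.
    base_gain (pixels per degree at 1x) is a hypothetical constant."""
    return angular_velocity_dps * dt * magnification * base_gain

def orientation_to_velocity(prev_deg, curr_deg, dt):
    """For an absolute orientation tracker such as a magnetometer, take the
    derivative of orientation with respect to time to obtain the rate of
    change needed by the scrolling logic."""
    return (curr_deg - prev_deg) / dt
```

A 30 degrees/second yaw sampled over 20 ms at 4x magnification thus scrolls the viewport by 30 × 0.02 × 4 × 10 = 24 pixels under these assumed constants.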
In this first embodiment and most conceivable alternative embodiments which utilize differential head tracking devices such as gyroscopes and accelerometers, various tracking errors are introduced by the lack of a stable reference. These errors include drift, temperature instability, hysteresis, cross-axis coupling, and limited dynamic range.
Drift is evidenced by slow motions in the imagery which occur in the absence of any true head motion, and is corrected by incorporating a low-frequency cut-off filter in the tracking data output. Such low-frequency cut-off filters are well-known in the tracking art, and do not affect perceived performance.
Temperature instability is evidenced by drift that occurs following rapid changes in the ambient temperature in which the tracker is used. Some such instability is removed with software which acts like a low-frequency cut-off filter by ignoring D.C. drift, while some is unavoidable and requires a waiting period for the temperature of the system hardware to stabilize. This software ignores any D.C. signal component from the head-tracker 28 and allows a scaling factor to be input to the system to control the magnitude of the shift in the virtual image as a function of the amount of rotation of the user's head.
Hysteresis is evidenced by sensitivity differences between motion in one direction and motion in a direction 180 degrees opposite. This artifact can be addressed by using a different scaling factor depending upon the tracker's direction of travel. The magnitude of this scaling factor can be determined experimentally, depending upon the magnitude and direction of the hysteresis.
Cross-axis coupling is evidenced by the displayed image moving a small amount in one axis when all of the head motion is along an orthogonal axis. This artifact is also controlled by the software which acts like a low-frequency cut-off filter, and may be further controlled by disabling one axis whenever the orthogonal axis rate of motion is greater than an empirically-determined threshold.
Finally, dynamic range limitations result in upper and lower limits to the rate at which the head may be turned while still maintaining the perception that the image is fixed in space. The lower limit is nominally determined by the electronic noise floor of the sensor devices, although it is raised by addition of the low-frequency cut-off filter. The upper limit is determined by the maximum rate of change measurable by the sensor. If this rate of change is exceeded by overly rapid turning of the user's head, the imagery will appear to move in the same direction as the head is turning. This last artifact has not been solved, but may be addressed in a future embodiment through the use of an absolute position tracker.
In this first embodiment, the Apple Macintosh ADB port allows simultaneous operation of multiple input control peripherals. Because of this feature, either the input controller 16 or a variety of secondary controllers may be used in conjunction with the head-tracker 28 to perform navigation within the imagery. Such controllers include joysticks, trackballs, light pens, simple switches, or any other control device which is ADB port compatible.
The second embodiment of the invention has been implemented for the Intel/Microsoft personal computer architecture. In this embodiment, the computer 12 is a 90 MHz Pentium host computer, the head-mounted display 26 is a CyberEye Model 100, and the head-tracker 28 is a 3-axis magnetometer, available as the Model TCM-2 from Precision Navigation, Inc. (Mountain View, Calif.) or the CyberTrack™ from General Reality Company (San Jose, Calif.). This embodiment has been made functional using LP-DOS from Optelec (Westford, Mass.) as the screen enlarger 24, although alternative commercially available screen enlargers may be used without modifying the remaining components of the system.
In this second embodiment, the selected head-tracker 28 is an absolute orientation sensor, although any alternative head-tracking device may be used. The specific 3-axis magnetometer used as the head-tracker 28 in this embodiment connects to the serial port of the computer 12, and provides an internal conversion from absolute position to differential data in the form of mouse “mickeys” compatible with the Intel/Microsoft personal computer architecture. Because of this feature, the output of the head-tracker 28 can be read directly by a standard Microsoft mouse driver, which provides a menu for setting the scaling factor required for maintaining a fixed image as perceived by the user.
In this second embodiment, the Intel/Microsoft personal computer architecture does not make use of the Apple ADB bus, but instead uses RS-232 serial communication ports to connect to control devices such as the head-tracker 28 and the input controller 16. This complicates the system design because the standard Microsoft mouse driver can only access one serial port (and therefore one control device) at any particular moment. Since proper operation of the invention 10 requires simultaneous response to the head-tracker 28 and the input controller 16, hardware or software is required to access two control devices simultaneously.
In the most common case of a conventional computer mouse employed as the input controller 16, this may be accomplished in one of at least five ways. First, an existing mouse driver that includes dual-port capability such as the original Borland mouse driver may be used. Second, the source code for the standard Microsoft mouse driver may be modified to support simultaneous access to two serial ports. Third, a device such as the “WhyMouse” by P.I. Engineering (Williamston, Mich.) may be used. This device serves as a “Y” adapter to connect two mouse-type pointing devices into one serial port. Circuitry internal to the WhyMouse automatically routes one or the other device's data to the serial port based on a priority scheme, wherein the first device to emit data gains control of the input. Fourth, a custom solution can be implemented in the form of a unique software driver, or fifth, in the form of a software module running on an intelligent input/output controller such as the Rocketport32 card from Industrial Computer Source (San Diego, Calif.). Such intelligent input/output controllers are available from several commercial sources in the form of a circuit board that may be inserted in an expansion slot within the computer 12. These circuit boards include two or more serial ports, as well as an on-board processor that can manipulate the inputs from the serial ports prior to delivering the tracking data to the computer's internal bus.
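The priority scheme attributed above to the WhyMouse, in which the first device to emit data gains control of the input, can be sketched as a simple event merge. The idle timeout and event representation are assumptions for the sketch; the actual product implements this in circuitry.

```python
def merge_events(events, idle_timeout=0.5):
    """Priority merge of two pointing devices sharing one logical input, in
    the manner of the WhyMouse: the first device to emit data gains control,
    and the other device is ignored until the owner has been idle longer
    than idle_timeout seconds (timeout value is an assumption)."""
    owner, owner_last, out = None, None, []
    for t, device, data in events:  # events are (time, device, data), time-sorted
        if owner is None or t - owner_last > idle_timeout:
            owner = device  # control passes to whichever device speaks now
        if device == owner:
            out.append(data)
            owner_last = t
    return out
```

With a mouse emitting at t=0.0 and t=0.2 and the head-tracker emitting at t=0.1 and t=1.0, the mouse owns the input until it falls idle, after which the head-tracker's t=1.0 event takes control.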
Of the five potential approaches to dual input device operation, the preferred embodiment exploits the fourth approach. This is because a custom software module avoids hardware costs, while providing the greatest flexibility in terms of application optimization and user convenience. For example, a custom software module allows the user to select whether the input controller 16 and head-tracker 28 can operate simultaneously in the manner preferred by the inventor, or whether the input controller 16 and head-tracker 28 operate in a priority scheme as provided in the WhyMouse product. In addition, a custom software approach can provide optional use of a variety of alternative devices as the input controller 16. For example, some users may prefer a hand-operated joystick to a mouse, while physically-impaired users may require a finger-operated joystick or head-operated directional switches with a puff & suck switch for activating the mouse clicking function.
The first category of functions performed by the tracking formatter 52 is conversion of the data stream emanating from the head-tracker 28 into a format readable by the control driver 48. This conversion is tracking sensor-dependent. In the case of a magnetometer-based tracker with mouse emulation as used in the Intel/Microsoft embodiment, no conversion is required. In the case of a magnetometer without mouse emulation, the tracking output would consist of rapidly-updated azimuth and elevation position figures, in which event the tracking formatter 52 would subtract the prior position sample from the present sample and then convert the format to mouse mickeys to provide the control driver 48 with emulated mouse output consisting of changes in position. In the case of a gyroscopic tracker with output converted to digital form such as that used in the Apple Macintosh embodiment, the output of the head-tracker 28 consists of angular velocity figures. In this event, the angular velocity samples are simply multiplied by the time period of each sample to yield a change in position, with each positional change then converted into mouse mickeys by the tracking formatter 52.
The second category of functions performed by the tracking formatter 52 consists of error correction functions such as those previously described for the Apple Macintosh embodiment of the invention 10. In that embodiment, the tracking formatter 52 performs low-frequency cut-off filtering, applies a directionally-dependent scaling factor, and disables one axis of travel when the orthogonal axis velocity rises above a threshold. These functions could also be performed in hardware such as an application-specific integrated circuit or a field-programmable gate array if higher performance in high-volume production is desirable.
The third category of functions performed by the tracking formatter 52 consists of enhancement functions such as orientation prediction. This function addresses the pipeline delay between the instant in time when the head is turned, and the time when the displayed image is updated to display the new user line-of-sight. This delay can be calculated to be the sum of the tracker sensing time, tracker to computer communication time, tracker formatter processing time, control driver processing time, operating system and application software processing time, screen enlarger processing time, and display refresh time. In a typical embodiment, the sum of these delays can become bothersome, causing a perception of the display “swimming” with respect to the user's line of sight changes. This swimming causes perceptual mismatches between the user's internal proprioceptive cues and external visual cues, which in severe cases can cause disorientation and nausea effects known in the virtual reality field as simulator sickness. To avoid such effects, the current position and velocity of the head in each degree of freedom can be used to predict the future position, in the manner of So and Griffin or Azuma and Bishop. By doing so, the predicted future position can be used as the input to the processing pipeline instead of the current actual position, thus decreasing the average mismatch between the proprioceptive and visual cues.
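The simplest form of the orientation prediction described above is a linear extrapolation: feed the pipeline the orientation the head is expected to have when the frame finally reaches the display, rather than the orientation it has now. This sketch assumes that simple first-order form; So & Griffin and Azuma & Bishop describe more sophisticated predictors.

```python
def predict_orientation(position_deg, velocity_dps, pipeline_delay_s):
    """First-order predictor: extrapolate the current head position along
    its current angular velocity by the total pipeline delay (the sum of
    tracker sensing, communication, processing, enlarger, and display
    refresh times), reducing the proprioceptive/visual mismatch."""
    return position_deg + velocity_dps * pipeline_delay_s
```

For instance, with the head at 10 degrees yaw turning at 100 degrees/second through a 50 ms pipeline, the predictor submits 15 degrees to the pipeline instead of 10.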
The incorporation of the voice recognizer 56 enables use of convenience-enhancing commands for purposes such as positioning the virtual display with respect to the user's line-of-sight, selecting enlargement factors, controlling tracking, selecting between system operating modes, and controlling individual computer applications. For example, position commands include “center me” to center the user's instantaneous viewport 22 within the virtual display 20, “top right” to move the instantaneous viewport 22 to the top right, etc. Enlargement commands include absolute commands such as “Mag 8” to set the screen enlarger 24 to a magnification of 8 to 1, and relative commands such as “zoom double” to temporarily increase the magnification by a factor of two. Tracking control commands include “lock vertical” to lock-out response to the elevation tracking function, which simplifies scrolling horizontally across text. Selecting between system operating modes includes a complete set of commands for operating the screen enlarger 24, such as “scroll text” to enter the enlarger's text scrolling mode. Finally, application control commands are application-dependent and available commercially as libraries, which typically include most or all mouse-accessible functions such as “page down”, “font: Times”, “edit: cut”, etc.
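A dispatch for the spoken commands enumerated above might look like the following sketch. The command phrases come from the text; the state dictionary, its keys, and the handler behavior are assumptions made for illustration.

```python
def handle_command(text, state):
    """Illustrative voice-command dispatch for the recognizer 56.  Phrases
    follow the examples in the specification; state representation is a
    hypothetical stand-in for the real system's internal state."""
    cmd = text.lower()
    if cmd == "center me":            # position: center the viewport 22 in the virtual display 20
        state["viewport"] = "center"
    elif cmd.startswith("mag "):      # absolute magnification, e.g. "Mag 8" -> 8 to 1
        state["magnification"] = int(cmd.split()[1])
    elif cmd == "zoom double":        # relative: temporarily double the magnification
        state["magnification"] *= 2
    elif cmd == "lock vertical":      # lock out elevation tracking for horizontal text scrolling
        state["lock_vertical"] = True
    return state
```

Application control commands ("page down", "edit: cut", etc.) would be routed past this dispatch to the commercially available per-application command libraries mentioned above.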
It is also noted in
A further embodiment of the present invention 10 is illustrated in
This image to be enlarged 66 may be a small object to be magnified such as text in a book, or may be a distant object to be resolved such as a blackboard in a classroom lecture. Each frame of video captured by the video grabber board 64 is output to the system bus as application output data by software included commercially with the video frame grabber board 64, and fed to the screen enlarger 24. The screen enlarger 24 magnifies the imagery, creating a virtual display 20 of the image to be enlarged 66 that occupies a larger angular extent as seen by the user than does the image to be enlarged 66. The head-mounted display 26, head-tracker 28, tracking formatter 52, and control driver 48 are then used as previously described to provide an instantaneous viewport 22, which may be positioned at any convenient point within the virtual display 20 by turning one's head. In this embodiment, improvement is made upon earlier closed-circuit television inventions for the visually-impaired in that the camera captures the entire image to be enlarged 66 at all times, instead of moving with the user's head or hand and capturing just the amount of imagery that can be displayed within the instantaneous viewport 22. By doing this, spatial awareness is maintained, but jitter in camera motion is not magnified to become disruptive to the user. In addition, by interposing a computer within such a closed-circuit television loop, any image may be instantly saved to permanent memory with a single keystroke for later review, editing, or printout.
As will be appreciated, the present invention is useful in a wide range of applications such as text processing and virtual map navigation. To enhance the user's control of the computer system, the present invention teaches entry of computer control commands through intuitive head gestures. In other words, in addition to adjusting the user's field of view by tracking head motion, we define specific head gestures and map these specific head gestures in an intuitive manner to “special discrete commands.”
In step 204, the computer system must distinguish special discrete commands from other head movement simply intended to adjust the user's field of view. This can be accomplished in step 206 through a variety of mechanisms. In some embodiments, certain head gestures could be mapped to corresponding special discrete commands. For specific examples, see the descriptions of
In any event, when the computer system has ascertained in step 204 that a special discrete instruction has occurred, control is passed to a step 208. In step 208, the computer system applies a function associated with the special discrete command to the sensed head motion. These functions can be based on head position and all related derivatives (velocity, acceleration, etc.). These functions may also be piecewise, with discrete portions having varying response characteristics. Once such a function has been applied, control is passed to a step 210 wherein the user's display is adjusted accordingly. Once the display is adjusted, control is passed back to monitor head movement step 202.
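The functions applied in step 208 can be position-based with derivative terms, and may be piecewise with discrete portions having different response characteristics. The sketch below illustrates one such piecewise function; the dead zone, knee, and gains are invented constants, not values from the specification.

```python
def piecewise_response(delta_deg, dead_zone=0.5, slow_gain=1.0,
                       fast_gain=3.0, knee=10.0):
    """Piecewise response function applied to sensed head motion: a dead
    zone suppresses jitter, small movements map linearly, and movement past
    a knee is amplified.  All constants are illustrative assumptions."""
    mag = abs(delta_deg)
    if mag < dead_zone:
        return 0.0                                   # ignore jitter
    if mag < knee:
        out = mag * slow_gain                        # fine positioning region
    else:
        out = knee * slow_gain + (mag - knee) * fast_gain  # coarse region
    return out if delta_deg >= 0 else -out
```

A 12-degree gesture thus produces a 16-degree display adjustment (10 at unity gain plus 2 amplified threefold), while sub-half-degree tremor produces none.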
With reference to
So far the described special discrete commands have been well-known commands such as scrolling, page down, erase, etc. However, it is contemplated that robust control of a computer system through a head-mounted display device requires commands specific to such a computing environment. In particular, there should be a mechanism by which a user can adjust the correspondence between the displayed field of view and the user's head position. For instance, a user may wish to reset his “neutral” field of view display. Imagine a user, initially looking straight ahead at a first display, moving his head 30 or 40 degrees in order to examine or work within a second field of view. It may sometimes make sense to examine this second field of view with the head cocked this way, but often it would be preferable to reset the field of view so that the user may perceive the second field of view while looking straight ahead. The present invention covers all mechanisms that would accomplish this reset feature.
With reference to
Note that the reset command may be initiated and closed by specific head gesture(s). Alternatively, the field of view could be coupled to the viewer's head position with a “weak force.” For example, the “weak force” could operate such that above a certain threshold speed, the displayed field of view would change in accordance with the user's head position. In contrast, when head movement was slower than the certain threshold speed, the field of view would remain constant but the user's head position would change.
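The "weak force" coupling just described can be sketched as a threshold rule on head speed: fast head movement drags the field of view along, while slow movement decouples the view from the head. The threshold value and function shape are assumptions for illustration.

```python
def weak_force_update(view_deg, head_delta_deg, dt, threshold_dps=20.0):
    """'Weak force' coupling: the displayed field of view follows the head
    only when head speed exceeds a threshold; slower movement leaves the
    view fixed, letting the user re-center a 'neutral' view by moving the
    head slowly.  Threshold is a hypothetical constant."""
    speed = abs(head_delta_deg) / dt
    if speed > threshold_dps:
        view_deg += head_delta_deg  # view tracks the fast head motion
    return view_deg                 # slow motion: view stays put
```

Moving the head slowly back to straight-ahead after a fast excursion therefore resets the neutral position without any explicit reset command.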
The above discussion focused on head-mounted display devices. However, the present invention contemplates a variety of portable display devices operable to control a computer system through intuitive body gestures and natural movements. For example, a wrist worn display could be controlled by hand, wrist, and arm movements. This would allow functions such as pan, zoom, and scroll to be effected upon the wrist worn display. The wrist worn display could be coupled remotely with a central computer system controlled by the user through the wrist worn display. Alternatively, the wrist worn display itself could house a computer system controlled by the intuitive gestures. Additionally, the gesture tracking device could be separate from the wearable display device, allowing the user to attach the gesture tracking device and manipulate it as desired. Still further, the user may be provided multiple wearable control devices for controlling the computer system through intuitive body gestures.
A further portable device operable to control a computer system through intuitive body gestures and natural movements is a Personal Digital Assistant (PDA).
As with the head wearable display, we define specific hand gestures that correspond in an intuitive manner with “special discrete commands.”
In step 1404, the computer system must distinguish special discrete commands from other hand movement simply not intended to adjust the user's field of view, such as small natural movements caused by the user's environment. This can be accomplished in step 1406 through a variety of mechanisms. In some embodiments, certain hand gestures could be mapped to corresponding special discrete commands. For specific examples, see the descriptions of
In any event, when the computer system has ascertained in step 1404 that a special discrete instruction has occurred, control is passed to a step 1408. In step 1408, the computer system applies a function associated with the special discrete command to the sensed hand motion. These functions can be based on hand position and all related derivatives (velocity, acceleration, etc.). These functions may also be piecewise, with discrete portions having varying response characteristics. Once such a function has been applied, control is passed to a step 1410 wherein the user's display is adjusted accordingly. Once the display is adjusted, control is passed back to monitor hand movement step 1402.
With reference to
While the present invention has been described in terms of several preferred embodiments, there are many alterations, permutations, and equivalents which may fall within the scope of this invention. It should also be noted that there are many alternative ways of implementing the methods and apparatuses of the present invention. It is therefore intended that the following appended claims be interpreted as including all such alterations, permutations, and equivalents as fall within the true spirit and scope of the present invention.
|Cited Patent||Filing date||Publication date||Applicant||Title|
|US4209255||Mar 30, 1979||Jun 24, 1980||United Technologies Corporation||Single source aiming point locator|
|US4227209||Aug 9, 1978||Oct 7, 1980||The Charles Stark Draper Laboratory, Inc.||Sensory aid for visually handicapped people|
|US4548485||Sep 1, 1983||Oct 22, 1985||Stewart Dean||Reading device for the visually handicapped|
|US4565999||Apr 1, 1983||Jan 21, 1986||Prime Computer, Inc.||Light pencil|
|US4567479||Dec 23, 1982||Jan 28, 1986||Boyd Barry S||Directional controller apparatus for a video or computer input|
|US4682159||Jun 20, 1984||Jul 21, 1987||Personics Corporation||Apparatus and method for controlling a cursor on a computer display|
|US4790028||Sep 12, 1986||Dec 6, 1988||Westinghouse Electric Corp.||Method and apparatus for generating variably scaled displays|
|US5003300||May 31, 1988||Mar 26, 1991||Reflection Technology, Inc.||Head mounted display for miniature video display system|
|US5109282||Jun 20, 1990||Apr 28, 1992||Eye Research Institute Of Retina Foundation||Halftone imaging method and apparatus utilizing pyramidal error convergence|
|US5125046||Jul 26, 1990||Jun 23, 1992||Ronald Siwoff||Digitally enhanced imager for the visually impaired|
|US5151722||Nov 5, 1990||Sep 29, 1992||The Johns Hopkins University||Video display on spectacle-like frame|
|US5195180||Jun 21, 1989||Mar 16, 1993||Sharp Kabushiki Kaisha||Method for displaying an image including characters and a background|
|US5267331||Apr 29, 1992||Nov 30, 1993||Ronald Siwoff||Digitally enhanced imager for the visually impaired|
|US5281957||Jul 10, 1991||Jan 25, 1994||Schoolman Scientific Corp.||Portable computer and head mounted display|
|US5283560||Jun 25, 1991||Feb 1, 1994||Digital Equipment Corporation||Computer system and method for displaying images with superimposed partially transparent menus|
|US5320538||Sep 23, 1992||Jun 14, 1994||Hughes Training, Inc.||Interactive aircraft training system and method|
|US5322441||Aug 21, 1992||Jun 21, 1994||Texas Instruments Incorporated||Method and apparatus for providing a portable visual display|
|US5325123||Apr 16, 1992||Jun 28, 1994||Bettinardi Edward R||Method and apparatus for variable video magnification|
|US5359675||Jun 9, 1992||Oct 25, 1994||Ronald Siwoff||Video spectacles|
|US5367315||Nov 15, 1990||Nov 22, 1994||Eyetech Corporation||Method and apparatus for controlling cursor movement|
|US5367614||Apr 1, 1992||Nov 22, 1994||Grumman Aerospace Corporation||Three-dimensional computer image variable perspective display system|
|US5373857||Jun 18, 1993||Dec 20, 1994||Forte Technologies, Inc.||Head tracking apparatus|
|US5422653||Jan 7, 1993||Jun 6, 1995||Maguire, Jr.; Francis J.||Passive virtual reality|
|US5442734||Mar 6, 1992||Aug 15, 1995||Fujitsu Limited||Image processing unit and method for executing image processing of a virtual environment|
|US5450596||Jul 18, 1991||Sep 12, 1995||Redwear Interactive Inc.||CD-ROM data retrieval system using a hands-free command controller and headwear monitor|
|US5526481||Sep 7, 1995||Jun 11, 1996||Dell Usa L.P.||Display scrolling system for personal digital assistant|
|US5526812||Oct 27, 1995||Jun 18, 1996||General Electric Company||Display system for enhancing visualization of body structures during medical procedures|
|US5579026||May 13, 1994||Nov 26, 1996||Olympus Optical Co., Ltd.||Image display apparatus of head mounted type|
|US5581271||Dec 5, 1994||Dec 3, 1996||Hughes Aircraft Company||Head mounted visual display|
|US5581670||Jul 21, 1993||Dec 3, 1996||Xerox Corporation||User interface having movable sheet with click-through tools|
|US5587936||Oct 5, 1993||Dec 24, 1996||Vpl Research, Inc.||Method and apparatus for creating sounds in a virtual world by simulating sound in specific locations in space and generating sounds as touch feedback|
|US5590062||Jun 21, 1994||Dec 31, 1996||Matsushita Electric Industrial Co., Ltd.||Simulator for producing various living environments mainly for visual perception|
|US5602566||Aug 23, 1994||Feb 11, 1997||Hitachi, Ltd.||Small-sized information processor capable of scrolling screen in accordance with tilt, and scrolling method therefor|
|US5617114||May 24, 1995||Apr 1, 1997||Xerox Corporation||User interface having click-through tools that can be composed with other tools|
|US5645077||Jun 16, 1994||Jul 8, 1997||Massachusetts Institute Of Technology||Inertial orientation tracker apparatus having automatic drift compensation for tracking human head and other similarly sized body|
|US5661632||Sep 29, 1995||Aug 26, 1997||Dell Usa, L.P.||Hand held computer with dual display screen orientation capability controlled by toggle switches having first and second non-momentary positions|
|US5666499||Aug 4, 1995||Sep 9, 1997||Silicon Graphics, Inc.||Clickaround tool-based graphical interface with two cursors|
|US5675746||Sep 30, 1992||Oct 7, 1997||Marshall; Paul S.||Virtual reality generator for use with financial information|
|US5683297||Dec 16, 1994||Nov 4, 1997||Raviv; Roni||Head mounted modular electronic game system|
|US5686940||Dec 23, 1994||Nov 11, 1997||Rohm Co., Ltd.||Display apparatus|
|US5689287||Jan 22, 1996||Nov 18, 1997||Xerox Corporation||Context-preserving display system using a perspective sheet|
|US5689619||Aug 9, 1996||Nov 18, 1997||The United States Of America As Represented By The Secretary Of The Army||Eyetracker control of heads-up displays|
|US5689667||Jun 6, 1995||Nov 18, 1997||Silicon Graphics, Inc.||Methods and system of controlling menus with radial and linear portions|
|US5734421||May 30, 1995||Mar 31, 1998||Maguire, Jr.; Francis J.||Apparatus for inducing attitudinal head movements for passive virtual reality|
|US5742264 *||Jan 23, 1996||Apr 21, 1998||Matsushita Electric Industrial Co., Ltd.||Head-mounted display|
|US5777715||Jan 21, 1997||Jul 7, 1998||Allen Vision Systems, Inc.||Low vision rehabilitation system|
|US5790769||Aug 4, 1995||Aug 4, 1998||Silicon Graphics Incorporated||System for editing time-based temporal digital media including a pointing device toggling between temporal and translation-rotation modes|
|US5835077||Mar 15, 1996||Nov 10, 1998||Remec, Inc.,||Computer control device|
|US5844544||Jun 13, 1995||Dec 1, 1998||H. K. Eyecan Ltd.||Visual communications apparatus employing eye-position monitoring|
|US5844824||May 22, 1997||Dec 1, 1998||Xybernaut Corporation||Hands-free, portable computer and system|
|US5923307||Jan 27, 1997||Jul 13, 1999||Microsoft Corporation||Logical monitor configuration in a multiple monitor environment|
|US5926178||Apr 1, 1997||Jul 20, 1999||Silicon Graphics, Inc.||Display and control of menus with radial and linear portions|
|US5959605||Nov 22, 1995||Sep 28, 1999||Picker International, Inc.||Video magnifier|
|US5973669||Aug 22, 1996||Oct 26, 1999||Silicon Graphics, Inc.||Temporal data control system|
|US5977935 *||Aug 12, 1994||Nov 2, 1999||Seiko Epson Corporation||Head-mounted image display device and data processing apparatus including the same|
|US5991085||Jul 12, 1996||Nov 23, 1999||I-O Display Systems Llc||Head-mounted personal visual display apparatus with image generator and holder|
|US6005482||Sep 17, 1998||Dec 21, 1999||Xerox Corporation||Surface mounted information collage|
|US6061064||Aug 27, 1996||May 9, 2000||Sun Microsystems, Inc.||System and method for providing and using a computer user interface with a view space having discrete portions|
|US6084556 *||Mar 9, 1999||Jul 4, 2000||Vega Vista, Inc.||Virtual computer monitor|
|US6115025||Sep 30, 1997||Sep 5, 2000||Silicon Graphics, Inc.||System for maintaining orientation of a user interface as a display changes orientation|
|US6115028||Aug 22, 1996||Sep 5, 2000||Silicon Graphics, Inc.||Three dimensional input system using tilt|
|US6118427||Apr 18, 1996||Sep 12, 2000||Silicon Graphics, Inc.||Graphical user interface with optimal transparency thresholds for maximizing user performance and system efficiency|
|US6127990||Jan 21, 1999||Oct 3, 2000||Vega Vista, Inc.||Wearable display and methods for controlling same|
|US6148271||Jan 14, 1998||Nov 14, 2000||Silicon Pie, Inc.||Speed, spin rate, and curve measuring device|
|US6151563||Sep 15, 1999||Nov 21, 2000||Silicon Pie, Inc.||Speed, spin rate, and curve measuring device using magnetic field sensors|
|US6184847||Sep 22, 1999||Feb 6, 2001||Vega Vista, Inc.||Intuitive control of portable data displays|
|US6184859||Apr 19, 1996||Feb 6, 2001||Sony Corporation||Picture display apparatus|
|US6292158||May 1, 1998||Sep 18, 2001||Shimadzu Corporation||Display system|
|US6353436||May 8, 2000||Mar 5, 2002||Sun Microsystems, Inc.||Graphical user interface|
|US6359603 *||Aug 12, 1999||Mar 19, 2002||Vega Vista, Inc.||Portable display and methods of controlling same|
|US6361507||Apr 5, 2000||Mar 26, 2002||Massachusetts Institute Of Technology||Inertial orientation tracker having gradual automatic drift compensation for tracking human head and other similarly sized body|
|US6445364 *||Jun 28, 2001||Sep 3, 2002||Vega Vista, Inc.||Portable game display and method for controlling same|
|US6457024||Jul 14, 1998||Sep 24, 2002||Lee Felsenstein||Wearable hypermedium system|
|US6590583||Jan 16, 2002||Jul 8, 2003||Planetweb, Inc.||Method for context-preserving magnification of digital image regions|
|1||Article entitled "Compensating Lags in Head-Coupled Displays Using Head Position Prediction and Image Deflection," Journal of Aircraft, vol. 29, No. 6, Nov.-Dec. 1992, by Richard H.Y. So and Michael J. Griffin (pp. 1064 to 1068).|
|2||Article entitled "Improving Static and Dynamic Registration in an Optical See-through HMD," by Ronald Azuma and Gary Bishop, Computer Graphics Proceedings Annual Conference Series 1994, Jul. 24, 1994 (pp. 197 to 203).|
|3||Article entitled "Priority Rendering with a Virtual Reality Address Recalculation Pipeline" Computer Graphics Proceedings, Annual Conference Series, 1994 (pp. 155 to 162).|
|4||Chameleon apparatus (1990) as depicted in http://www.dgp.toronto.edu/~gf/videos.htm at links http://www.dgp.toronto.edu/~gf/videos/Chameleon.mpg 1994 and http://www.dgp.toronto.edu/~gf/videos/Spatially-aware%20palmtop%20-%20Chameleon.mpg (screenshots and transcription of audio associated with video are provided).|
|6||Examiner's Interview Summary dated Jun. 12, 2000 for U.S. Appl. No. 09/235,096.|
|7||Final Office Action dated Sep. 7, 2000 for U.S. Appl. No. 09/373,186.|
|8||Flyer, "1995 Master Source-Book," Industrial Computer Source, 1995 (2 pages).|
|9||Flyer, "Computer Magnification Systems," TeleSensory, 1995 (4 pages).|
|10||Flyer, "Digital Audio Soundblaster," Creative Labs, Inc., Sep. 19, 1995 (one page).|
|11||Flyer, "Dragon Dictate the Premier PC Dictation Program," Dragon Systems, Inc., Dec. 1994 (4 pages).|
|12||Flyer, "Introducing Head Master Plus the Mouse and Keyboard Alternative for Personal Computers," Prentke Romich Company, Mar. 1995 (2 pages).|
|13||Flyer, "Magnify your screen and your possibilities," ZoomText, Mar. 1995 (two pages).|
|14||Flyer, "Magnum GT Graphics & Text Screen Enlarger," Artic Technologies, Jan. 1, 1995 (one page).|
|15||Flyer, "MGA Power Family," Matrox Graphics Inc., Nov. 1995 (2 pages).|
|16||Flyer, "OPTELEC Independence Solutions for People with Low Vision" Optelec, 1993 (6 pages).|
|17||Flyer, "Talk to your PC Just Voice: Professional Speech Recognition for Windows," Integrated Wave Technologies, Inc., Nov. 1995 (3 pages).|
|18||Flyer, "Ultra-small angular velocity sensor with Murata's unique triangular prism vibrating unit" Gyrostar, Murata Mfg. Co., Ltd., Aug. 29, 1995 (2 pages).|
|19||Flyer, "Virtual Reality Products That Work As Hard As You Do," General Reality Company, Mar. 1995, (6 pages).|
|20||Flyer, "Why Mouse Dual Input Adapters," P.I. Engineering, 1995, (2 pages).|
|21||Flyer, A Brighter Picture A Fuller Life-the Visionics Low Vision Enhancing, Visionics Corporation, Mar. 1995 (4 pages).|
|23||Flyer, MAGic Deluxe, Microsystems Software, Inc., Mar. 1995 (two pages).|
|24||Flyer, Virtual Computer Monitor, General Reality Corporation, Mar. 1995 (2 pages).|
|25||Goodrich GL, Mehr EB, and Darling NC: Parameters in The Use of CCTV's and Optical Aids. Am Jour Optom, vol. 57, No. 12, pp. 881-892, 1980.|
|26||Grant of Petition for Revival of an Application for Patent Abandoned Unintentionally dated Jun. 25, 2001 for U.S. Appl. No. 09/373,186.|
|27||IDS dated Aug. 12, 1999 for U.S. Appl. No. 09/373,186.|
|28||IDS dated Dec. 11, 1998 for U.S. Appl. No. 08/563,525.|
|29||IDS dated Feb. 16, 1996 and Supplemental IDS dated Feb. 27, 1996 for U.S. Appl. No. 08/563,525.|
|30||IDS dated Jan. 21, 1999 for U.S. Appl. No. 09/235,096.|
|31||IDS dated Jan. 30, 2001 for U.S. Appl. No. 09/373,186 and petition under 37 C.F.R. 1.97(d)(2).|
|32||IDS dated Jun. 25, 1999 for U.S. Appl. No. 09/264,799.|
|33||IDS dated Mar. 28, 2001 for U.S. Appl. No. 09/373,186.|
|34||IDS dated Mar. 8, 1999 for U.S. Appl. No. 09/235,096.|
|35||IDS dated Mar. 8, 1999 for U.S. Appl. No. 09/264,799.|
|36||IDS dated Oct. 16, 2000 for U.S. Appl. No. 09/373,186.|
|37||IDS dated Oct. 27, 2000 for U.S. Appl. No. 09/373,186.|
|38||Inertial proprioceptive devices: Self-motion-sensing toys and tools by C. Verplaetse, IBM Systems Journal, vol. 35, Nos. 3&4, 1996, pp. 639-650.|
|39||International Search Report dated Feb. 2, 2000 for PCT Patent Application No. PCT/US99/21235.|
|40||International Search Report dated Sep. 19, 2000 for PCT Patent Application No. PCT/US00/15210.|
|41||Legge G, Pelli D, et al. Report of the Low Vision and its Rehabilitation Panel. Vision Research-A National Plan 1994-1998, A Report of the National Advisory Eye Council, 1994, pp. 304-321.|
|43||Notice of Abandonment dated Dec. 2, 2003 for U.S. Appl. No. 10/183,181.|
|44||Notice of Abandonment dated Dec. 27, 1999 for U.S. Appl. No. 08/563,525.|
|45||Notice of Abandonment dated Jun. 15, 2004 for U.S. Appl. No. 09/895,576.|
|46||Notice of Allowance dated Aug. 13, 2001 for U.S. Appl. No. 09/373,186.|
|47||Notice of Allowance dated Jul. 19, 1999 for U.S. Appl. No. 09/235,096.|
|48||Notice of Allowance dated Jul. 29, 1999 for U.S. Appl. No. 09/264,799.|
|49||Notice of Allowance dated May 21, 2002 for U.S. Appl. No. 09/895,765.|
|50||Notice of Allowance dated Nov. 2, 1998 for U.S. Appl. No. 08/563,525.|
|51||Office Action dated Apr. 6, 1998 for U.S. Appl. No. 08/563,525.|
|52||Office Action dated Apr. 9, 2003 for U.S. Appl. No. 10/183,181.|
|53||Office Action dated Dec. 5, 2003 for U.S. Appl. No. 09/895,576.|
|54||Office Action dated Feb. 15, 2002 for U.S. Appl. No. 09/895,765.|
|55||Office Action dated Jun. 10, 1999 for U.S. Appl. No. 09/235,096.|
|56||Office Action dated Mar. 21, 2000 for U.S. Appl. No. 09/373,186.|
|57||Office Action dated Oct. 3, 2002 for U.S. Appl. No. 10/183,181.|
|58||Petition for Revival of an Application for Patent Abandoned Unintentionally and RCE dated May 11, 2001 for U.S. Appl. No. 09/373,186.|
|59||Preliminary Amendment dated Aug. 12, 1999 for U.S. Appl. No. 09/373,186.|
|60||Preliminary Amendment dated Jan. 21, 1999 for U.S. Appl. No. 09/235,096.|
|61||Preliminary Amendment dated Mar. 8, 1999 for U.S. Appl. No. 09/264,799.|
|62||Preliminary Amendment dated Sep. 24, 2002 for U.S. Appl. No. 10/183,181.|
|63||Publication entitled "Virtual Computer Monitor For Visually Impaired Users" by Arthur L. Zwern, General Reality Company and Michael R. Clark, Apple Computer Corporation, Nov. 30, 1994 (9 pages).|
|64||Publication entitled "Virtual Computer Monitor for Visually Impaired Users" by Arthur L. Zwern, General Reality Company and Michael R. Clark, Apple Computer, Inc., Advanced Technology Group, Aug. 28, 1995 (10 pages).|
|65||Request for Certificate of Correction dated Apr. 30, 2002 for U.S. Appl. No. 09/373,186.|
|66||Request for Corrected Filing Receipt dated Jun. 12, 2001 for U.S. Appl. No. 09/373,186.|
|67||Response to Apr. 6, 1998 Office Action for U.S. Appl. No. 08/563,525 dated Aug. 11, 1998.|
|68||Response to Jun. 10, 1999 Office Action for U.S. Appl. No. 09/235,096 dated Jun. 29, 1999.|
|69||Response to Mar. 21, 2000 Office Action for U.S. Appl. No. 09/373,186 dated Jun. 21, 2000.|
|70||Response to Oct. 3, 2002 Office Action for U.S. Appl. No. 10/183,181 dated Jan. 16, 2003.|
|71||Response to Sep. 7, 2000 Final Office Action for U.S. Appl. No. 09/373,186 dated Dec. 7, 2000.|
|72||Re-submission of Response to Sep. 7, 2000 Final Office Action for U.S. Appl. No. 09/373,186 dated Apr. 30, 2001.|
|73||Slides on "Virtual Computer Display for Visually-Impaired Users," CyberEye by General Reality, Nov. 30, 1994 (12 pages).|
|74||Slides, "Anta" poster paper, General Reality Company, Feb. 13, 1996 (12 pages).|
|75||Slides, "Virtual Computer Monitor for Visually-Impaired Users" by Arthur Zwern, General Reality Company and Michael Clark, Apple Computer Corporation, Aug. 28, 1995 (12 pages).|
|76||Terminal Disclaimer dated Apr. 16, 2002 for U.S. Appl. No. 09/895,765.|
|Citing Patent||Filing date||Publication date||Applicant||Title|
|US8606316 *||Oct 21, 2009||Dec 10, 2013||Xerox Corporation||Portable blind aid device|
|US8643951||Mar 15, 2012||Feb 4, 2014||Google Inc.||Graphical menu and interaction therewith through a viewing window|
|US8957910 *||Nov 2, 2011||Feb 17, 2015||Nintendo Co., Ltd.||Game system, game device, storage medium storing game program, and game process method|
|US8970692||Mar 2, 2012||Mar 3, 2015||Industrial Technology Research Institute||Head mount personal computer and interactive system using the same|
|US9013264||Mar 12, 2012||Apr 21, 2015||Perceptive Devices, Llc||Multipurpose controller for electronic devices, facial expressions management and drowsiness detection|
|US9035878||Feb 29, 2012||May 19, 2015||Google Inc.||Input system|
|US9091851||Jan 25, 2012||Jul 28, 2015||Microsoft Technology Licensing, Llc||Light control in head mounted displays|
|US9097890||Mar 25, 2012||Aug 4, 2015||Microsoft Technology Licensing, Llc||Grating in a light transmissive illumination system for see-through near-eye display glasses|
|US9097891||Mar 26, 2012||Aug 4, 2015||Microsoft Technology Licensing, Llc||See-through near-eye display glasses including an auto-brightness control for the display brightness based on the brightness in the environment|
|US9128281||Sep 14, 2011||Sep 8, 2015||Microsoft Technology Licensing, Llc||Eyepiece with uniformly illuminated reflective display|
|US9129295||Mar 26, 2012||Sep 8, 2015||Microsoft Technology Licensing, Llc||See-through near-eye display glasses with a fast response photochromic film system for quick transition from dark to clear|
|US9134534||Mar 26, 2012||Sep 15, 2015||Microsoft Technology Licensing, Llc||See-through near-eye display glasses including a modular image source|
|US9182596||Mar 26, 2012||Nov 10, 2015||Microsoft Technology Licensing, Llc||See-through near-eye display glasses with the optical assembly including absorptive polarizers or anti-reflective coatings to reduce stray light|
|US9223134||Mar 25, 2012||Dec 29, 2015||Microsoft Technology Licensing, Llc||Optical imperfections in a light transmissive illumination system for see-through near-eye display glasses|
|US9229227||Mar 25, 2012||Jan 5, 2016||Microsoft Technology Licensing, Llc||See-through near-eye display glasses with a light transmissive wedge shaped illumination system|
|US9245364 *||Jan 15, 2013||Jan 26, 2016||Lenovo (Beijing) Co., Ltd.||Portable device and display processing method for adjustment of images|
|US9268136||Sep 28, 2012||Feb 23, 2016||Google Inc.||Use of comparative sensor data to determine orientation of head relative to body|
|US9285589||Jan 3, 2012||Mar 15, 2016||Microsoft Technology Licensing, Llc||AR glasses with event and sensor triggered control of AR eyepiece applications|
|US9329689||Mar 16, 2011||May 3, 2016||Microsoft Technology Licensing, Llc||Method and apparatus for biometric data capture|
|US9341843||Mar 26, 2012||May 17, 2016||Microsoft Technology Licensing, Llc||See-through near-eye display glasses with a small scale image source|
|US9366862||Mar 26, 2012||Jun 14, 2016||Microsoft Technology Licensing, Llc||System and method for delivering content to a group of see-through near eye display eyepieces|
|US9448687||Feb 5, 2014||Sep 20, 2016||Google Inc.||Zoomable/translatable browser interface for a head mounted device|
|US20110092249 *||Oct 21, 2009||Apr 21, 2011||Xerox Corporation||Portable blind aid device|
|US20120165099 *||Nov 2, 2011||Jun 28, 2012||Nintendo Co., Ltd.||Game system, game device, storage medium storing game program, and game process method|
|US20120194549 *||Dec 30, 2011||Aug 2, 2012||Osterhout Group, Inc.||Ar glasses specific user interface based on a connected external device type|
|US20130117707 *||Nov 8, 2011||May 9, 2013||Google Inc.||Velocity-Based Triggering|
|US20130182016 *||Jan 15, 2013||Jul 18, 2013||Beijing Lenovo Software Ltd.||Portable device and display processing method|
|CN103180893A *||Jul 5, 2012||Jun 26, 2013||Sony Corporation||Method and system for use in providing three dimensional user interface|
|CN103180893B *||Jul 5, 2012||Jan 20, 2016||Sony Corporation||Method and system for use in providing three dimensional user interface|
|WO2013028268A1 *||Jul 5, 2012||Feb 28, 2013||Sony Corporation||Method and system for use in providing three dimensional user interface|
|U.S. Classification||345/8, 715/863, 715/729|
|International Classification||G09G5/00, G09B21/00, G06F3/01, G06F3/033, G06F3/048, G06F3/00|
|Cooperative Classification||G06F3/012, G06F2200/1637, G06F3/011, G06F3/0346|
|European Classification||G06F3/01B, G06F3/01B2, G06F3/0346|
|Nov 11, 2010||AS||Assignment|
Owner name: REMBRANDT PORTABLE DISPLAY TECHNOLOGIES, LP, VIRGINIA
Effective date: 20101109
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:REMBRANDT TECHNOLOGIES, LP;REEL/FRAME:025348/0041
|Jun 28, 2012||AS||Assignment|
Owner name: VEGA VISTA, INC., CALIFORNIA
Effective date: 20120329
Free format text: PATENT ACQUISITION AGREEMENT;ASSIGNOR:REMBRANDT PORTABLE DISPLAY TECHNOLOGIES, LP;REEL/FRAME:028466/0229
|Sep 17, 2012||REMI||Maintenance fee reminder mailed|
|Feb 3, 2013||REIN||Reinstatement after maintenance fee payment confirmed|
|Feb 3, 2013||LAPS||Lapse for failure to pay maintenance fees|
|May 22, 2013||AS||Assignment|
Owner name: VEGA VISTA, INC., CALIFORNIA
Effective date: 20120329
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:REMBRANDT PORTABLE DISPLAY TECHNOLOGIES, LP;REEL/FRAME:030469/0811
|Jun 3, 2013||PRDP||Patent reinstated due to the acceptance of a late maintenance fee|
Effective date: 20110510
|Jun 4, 2013||FPAY||Fee payment|
Year of fee payment: 12
|Jul 25, 2013||AS||Assignment|
Owner name: VEGA VISTA, INC., CALIFORNIA
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:FATEH, SINA;FLACK, JAMES F.;ZWERN, ARTHUR L.;SIGNING DATES FROM 20000118 TO 20000905;REEL/FRAME:030876/0458
Effective date: 20071018
Owner name: REMBRANDT TECHNOLOGIES, LP, PENNSYLVANIA
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:VEGA VISTA, INC.;REEL/FRAME:030879/0322