USRE42336E1 - Intuitive control of portable data displays - Google Patents
Intuitive control of portable data displays
- Publication number
- USRE42336E1 (application US 12/574,607)
- Authority
- US
- United States
- Prior art keywords
- user
- implemented method
- display
- recited
- computer implemented
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Expired - Lifetime
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/012—Head tracking input arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0346—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2200/00—Indexing scheme relating to G06F1/04 - G06F1/32
- G06F2200/16—Indexing scheme relating to G06F1/16 - G06F1/18
- G06F2200/163—Indexing scheme relating to constructional details of the computer
- G06F2200/1637—Sensing arrangement for detection of housing movement or orientation, e.g. for controlling scrolling or cursor movement on the display of an handheld computer
Definitions
- the invention relates to human/computer interfaces to visual data and more particularly to systems that must display a larger amount of visual data than may be conveniently displayed in a single conventional computer monitor.
- the present invention uses virtual reality techniques to provide instantaneous and intuitive access to large fields of visual data, and to provide visually-impaired users with enhanced access to enlarged visual data.
- the first problem is spatial orientation, in that it is difficult to determine where on the page one's view is directed at any given time. This occurs because the monitor does not move, and there are no other visual cues to indicate where on the virtual page one's line of sight is facing.
- This spatial orientation problem is exacerbated at high magnifications and for portable systems employing small display monitors. For example, one study (Goodrich et al.) found mean magnifications of 15.48× for nearly 100 experienced users of closed-circuit television devices. At 15×, a 15″ monitor can only display about 1% of a standard 8.5″×11″ page, making most computer work essentially impossible for such users.
- the second fundamental problem in the conventional approach is dynamic control, in that all of the various control schemes for navigating about the page are cumbersome, confusing, and slow. This is because the navigation methods are unintuitive, relying on such logic as “use joystick to move cursor around screen, and when cursor reaches the edge of the screen, the next portion of document in that direction will be displayed.” Alternatively, some screen enlargers maintain the cursor at the center of the screen, and require the user to position a desired insertion point over the cursor by moving the entire virtual page with a mouse or joystick. In all cases, dynamic control is not only unintuitive, but requires use of at least one hand, which negatively impacts productivity, and may make use by physically-impaired users difficult or impossible.
- virtual reality is typically defined as a computer-generated three-dimensional environment providing the ability to navigate about the environment, turn one's head to look around the environment, and interact with simulated objects in the environment using a control peripheral.
- the user is “immersed” in a synthetic environment, in which virtual objects can be located anywhere in the user's physical space.
- the user views these objects by wearing a head-mounted display (HMD), which uses an optical system to cause a tiny display source such as a cathode ray tube or liquid crystal display to appear as a large display screen several feet in front of the user. Since the display source (or sources in the case of two eyes) is fixed to the user's head, the display is viewable regardless of where the user points his line-of-sight.
- the user also wears a head-tracker, which senses the direction the user is facing, and sends this information to the host computer.
- the computer uses this data to generate graphics corresponding to the user's line of sight in the virtual environment.
- This approach to human/computer interfaces was first conceived by Ivan Sutherland in 1966 for use in military simulators, and was first commercialized in the form of the Eyephone head-mounted display by VPL Research in the late 1980s.
- U.S. Pat. No. 4,227,209 issued Oct. 10, 1980 discloses an electronic sensory aid for visually-impaired users including an image sensor and a display array, wherein the degree of magnification provided in the display array may be adjusted by changing the number of display elements corresponding to each sensor array element.
- an improved image capture approach is disclosed in U.S. Pat. No. 5,325,123 issued Jun. 28, 1994, in which the imaging camera includes an opaque stop with a small aperture, thus allowing the magnification to be adjusted by moving the camera towards or away from the object to be magnified.
- a non-electronic sensory aid is disclosed in U.S. Pat. No. 4,548,485 issued Oct. 22, 1985, in which an XY stage is used to move textual material across an optical viewing system that captures a portion of the textual material for enlargement.
- U.S. Pat. No. 5,367,614 issued Nov. 22, 1994 to Bisey discloses a three-dimensional computer image display system using an ultrasonic transceiver head-tracking system to control a three-dimensional display to cause the image to change its perspective in response to head movements.
- U.S. Pat. No. 5,442,734 issued Aug. 15, 1995 to Murakami discloses a virtual reality system incorporating a head-mounted display, head-tracker, and image processing system in which predictive tracking algorithms are used to differentially update portions of the display field to provide more rapid updating of those portions of the display field corresponding to the center of the user's visual field.
- U.S. Pat. No. 5,003,300 issued Mar. 26, 1991 to Wells discloses a raster-based head-mounted display that may be used to display an image to either eye.
- U.S. Pat. No. 5,151,722 issued Sep. 29, 1992 to Massof discloses a video-based head-mounted display featuring a unique folding optic configuration so that the device may be worn like a pair of glasses.
- U.S. Pat. No. 5,281,957 issued Jan. 25, 1994 to Schoolman discloses a portable computer system incorporating a head-mounted display that may be hinge-mounted to an eyeglass frame so that the display may be folded up out of the way for viewing the physical environment.
- LVES (Low-Vision Enhancement System)
- the LVES device incorporates a head-mounted display with integrated cameras and an image processing system.
- the cameras generate an image of whatever is positioned directly in front of the user, and the image processing system enlarges the image and performs enhancement functions such as contrast enhancement.
- although the LVES device can provide magnified imagery of real-world objects to some visually-impaired users, it suffers from several shortcomings compared to the present invention.
- the LVES does not incorporate a head-tracker to provide a hands-free means for navigating within computer data. Further, the LVES suffers from a jitter problem exactly analogous to that experienced by users of binoculars or telescopes. In simple terms, any jitter in the user's line-of-sight is magnified by the same factor as the imagery, which causes the image provided to the user to appear unsteady.
- U.S. Pat. No. 5,109,282 issued Apr. 28, 1992 to Peli discloses a novel image processing method for converting continuous grey tone images into high resolution halftone images, and describes an embodiment of the method applicable to presentation of enlarged imagery to visually-impaired users via a head-mounted display.
- the imagery is generated by a conventional camera manually scanned across printed text as is common in closed-circuit television systems for the visually-impaired.
- the head-mounted display is a Private Eye by Reflection Technologies (Waltham, Mass.), which employs a linear array of light-emitting diodes converted to the impression of a rectangular array by means of a scanning mirror.
- U.S. Pat. No. 5,373,857 issued Dec. 12, 1994 to Travers discloses a head-tracking approach for the yaw degree of freedom in virtual reality applications consisting of a magnetic sensor disposed on a headset to produce a displacement signal relative to angular displacement of the head set with respect to the earth's magnetic field.
- a more sophisticated approach has been developed by the Massachusetts Institute of Technology (MIT), in which an analogous magnetic sensor is used to correct drift in a much faster differential sensor such as an accelerometer, which sensors together provide extremely rapid response and high accuracy within a single package.
- the MIT approach, believed to be patent-pending, additionally incorporates differential sensors to detect changes in the pitch and roll degrees of freedom, which sensors may also be corrected using slower absolute sensors such as liquid-filled capacitive tilt sensors.
- U.S. Pat. No. 4,565,999 issued Jan. 21, 1986 to King discloses a cursor control system for use with a data terminal wherein a radiation source and a radiation sensor are used to determine changes in a user's head position for purposes of controlling cursor position on the screen.
- U.S. Pat. No. 4,567,479 issued Jan. 28, 1986 to Boyd discloses a directional controller for video or computer input by physically-impaired users consisting of a series of mercury switches disposed in proximity to a user's head, wherein movements of the user's head are sensed and converted into cursor control commands.
- This device also employs a pressure switch activated by the user's mouth which can provide a further control signal such as that generated by clicking a mouse button.
- U.S. Pat. No. 4,682,159 issued Jul. 27, 1987 to Davison discloses an apparatus and method for controlling a cursor on a computer display that consists of a headset worn by the user, and a stationary ultrasonic transmitter for emitting sound waves which are picked up by receivers in the headset. These sound waves are compared for phase changes, which are converted into positional change data for controlling the cursor.
- U.S. Pat. No. 5,367,315 issued Nov. 22, 1994 to Pan discloses an infrared-light based system that indicates head and eye position in real time, so as to enable a computer user to control cursor movement on a display by moving his or her eyes or head.
- the device is intended to emulate a standard mouse, thereby allowing use of the presently available software and hardware.
- VCM (virtual computer monitor)
- a “virtual computer monitor” which is broadly comprised of a head-mounted display means worn by the user, a head-orientation sensing means worn by the user, and software means for interfacing these devices to a host computer such that the user's head orientation data is processed to determine which portion of an arbitrary software application's output imagery to display.
- when the user rotates his or her head to the right, the portion of the virtual image being seen is to the right of the portion previously being seen
- when the user rotates his or her head upward, the portion of the virtual image being seen is above the portion previously being seen
- Upon initialization of the VCM device, the user triggers calibration between the user's straight-ahead line of sight and the center of the virtual page. From then on, the user can rotate her head left, right, up, and down to visually scan across the page in corresponding directions. The overall impression is analogous to a normally sighted person scanning across a newspaper page.
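As a sketch of the calibration-and-scan behavior described above, the following maps head angles to a viewport position on the virtual page; the function names, page dimensions, and pixels-per-degree constant are illustrative assumptions, not values from the patent.

```python
def calibrate(yaw_deg, pitch_deg):
    """Record the straight-ahead line of sight as the virtual page center."""
    return {"yaw0": yaw_deg, "pitch0": pitch_deg}

def viewport_center(cal, yaw_deg, pitch_deg, px_per_deg=40.0,
                    page_w=3400, page_h=4400, view_w=640, view_h=480):
    """Map head rotation to the viewport's top-left corner on the virtual page.

    Rotating right moves the viewport right; rotating up moves it up.
    The result is clamped so the viewport stays within the page.
    """
    cx = page_w / 2 + (yaw_deg - cal["yaw0"]) * px_per_deg
    cy = page_h / 2 - (pitch_deg - cal["pitch0"]) * px_per_deg
    x = min(max(cx - view_w / 2, 0), page_w - view_w)
    y = min(max(cy - view_h / 2, 0), page_h - view_h)
    return x, y
```

With the assumed constants, looking straight ahead centers the viewport on the page, and turning right or looking up scrolls it accordingly.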
- the VCM software provides a magnification adjustment to allow each user to achieve adequate visual resolution without needlessly reducing his instantaneous viewing field.
- the software also provides a cursor, which nominally remains positioned at the center of the HMD physical field regardless of head movements so that the cursor can be positioned anywhere upon the virtual page by turning to face that location.
- a further adjustment allows setting the fixed cursor location to any arbitrary position in the HMD device's physical field, so that users with unusable portions of their visual fields can select an alternative preferred retinal locus instead of the center.
- a software selection also provides an overview display, which shows a reduced-magnification image of the entire virtual page, with a bold black box highlighting the outline of the instantaneous field within the entire field.
- An additional important feature is the ability to temporarily adjust the cursor position in real-time using a controller peripheral such as a joystick or mouse. This feature allows fine positioning of the cursor within the field by temporarily locking the head-tracking system to freeze a portion of the virtual page on the physical display, while the controller is used to move the cursor in small increments.
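A minimal sketch of this freeze-and-fine-tune scheme, assuming a simple lock flag and incremental deltas (all names are illustrative):

```python
class CursorControl:
    """Coarse positioning by head tracker; fine positioning by controller.

    While `locked` is True, head motion is ignored so the viewport is
    frozen on the physical display and small controller increments move
    the cursor instead.
    """
    def __init__(self):
        self.viewport = [0.0, 0.0]    # viewport origin on the virtual page
        self.cursor = [320.0, 240.0]  # cursor within the physical display
        self.locked = False

    def head_delta(self, dx, dy):
        """Scroll the viewport, unless head tracking is temporarily locked."""
        if not self.locked:
            self.viewport[0] += dx
            self.viewport[1] += dy

    def controller_delta(self, dx, dy):
        """Move the cursor in small increments within the frozen viewport."""
        self.cursor[0] += dx
        self.cursor[1] += dy
```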
- An additional important feature is the ability to display image components in addition to the cursor at fixed points in the physical display, which allows menus or other icons to remain in the user's instantaneous viewing field at all times while scrolling across image content.
- An additional important feature resides in the ability to reduce the lag between a head motion and display of the new direction's image by using image deflection, thresholding, smoothing, prediction, and a novel drift compensation technique to reduce display “swimming”, which is caused whenever imperfect head orientation sensing causes the displayed image to not appear fixed in real-space.
- An additional important feature resides in the ability to magnify images by extremely large factors using spatial field compression, where the displayed image is scrolled across the physical display at a faster rate than the head is turned. This enables use by individuals with limited head motion, and allows magnification to levels that would otherwise require turning completely around to view edges of the image.
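The spatial field compression described above amounts to a gain applied to head rotation; a hedged sketch, with an assumed pixels-per-degree constant:

```python
def scroll_delta(head_delta_deg, px_per_deg=40.0, compression=1.0):
    """Pixels to scroll the image for a given head rotation.

    compression > 1 scrolls the image faster than the head turns, so
    users with limited head motion can still reach the image edges at
    very large magnifications.
    """
    return head_delta_deg * px_per_deg * compression
```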
- An additional important feature resides in the use of a partially immersive HMD, which avoids simulation sickness by allowing the user to maintain a constant frame of reference in the physical world since real objects can be seen around one or more edges of the display.
- FIG. 1 is a conceptual sketch illustrating operation of a conventional screen enlarger
- FIG. 2 is a block diagram of the hardware components of the virtual computer monitor
- FIG. 3 is a conceptual sketch illustrating operation of a virtual computer monitor, and intuitive field navigation via head rotation
- FIG. 4 illustrates various means for configuring the virtual computer monitor display, including A) typical configuration, B) typical configuration in combination with a blockage of the user's foveal vision, C) mouse pointer/cursor offset to a non-central preferred retinal locus, and D) entire display field offset to be centered about a non-central preferred retinal locus;
- FIG. 5 is a block diagram of the logical flow of data processing in an advanced embodiment of the virtual computer monitor
- FIG. 6 is a pictorial illustration showing the present invention applied to magnification of real-time imagery
- FIG. 7 is a pictorial illustration showing several intuitive head gestures that correspond to special discrete functions
- FIG. 8 is a flow chart illustrating one computer implemented method for controlling a computer system with a head-mounted display device
- FIGS. 9-11 are flow charts illustrating methods for performing magnification and scrolling commands with intuitive head gestures
- FIG. 12 is a flow chart illustrating one method for controlling the correspondence between the displayed field of view and the user's head position
- FIG. 13 is an illustration showing a PDA operable by intuitive body gestures, in accordance with an embodiment of the present invention.
- FIG. 14 is a flowchart showing a computer implemented method for responding to a user's hand movement
- FIG. 15 is a flowchart showing a method for discrete magnification in accordance with one aspect of the present invention.
- FIG. 16 is a flowchart showing a method for discrete de-magnification in accordance with another aspect of the present invention.
- the system of this invention 10 concerns a computer 12 controlled by a user through conventional means such as a keyboard 14 and input controller 16 , and whose output is viewed on a display monitor 18 .
- the invention 10 is specifically intended for use in applications where the total amount of data to be viewed can be configured as a virtual display 20 , which is significantly greater in extent than the amount of data that can be conveniently viewed within an instantaneous viewport 22 provided by the display monitor 18 .
- An example of such an application is when the virtual display 20 consists of a large computer desktop running several application windows, while the amount of data that can be visually resolved within the instantaneous viewport 22 consists of a single application window.
- the virtual display 20 consists of a word-processing document magnified for a visually-impaired user by a screen enlarger 24 , which may consist of software or a combination of software and hardware.
- conventional control means such as a keyboard 14 or input controller 16 may be used to select which portion of the virtual display 20 is shown within the display monitor 18 at any given moment, as described in the prior art.
- the most basic embodiment of the invention 10 is achieved by implementing the display monitor as a head-mounted display 26 , wherein tiny display sources such as LCDs are held within close proximity to the user's eyes, and optically coupled to the eyes with a lensing system such that the image of the computer display appears to float in space several feet in front of the user.
- Such devices are well known in the art, and are commercially available from sources including General Reality Company (San Jose, Calif.), Optics 1 (Westlake, Calif.), and Virtual I/O (Seattle, Wash.).
- the user wears a head-tracker 28 , which senses changes in orientation of the user's head and reports them to the computer 12 to result in the perception of scrolling the instantaneous viewport 22 across the virtual display 20 .
- Head-trackers are also well-known in the art, and a variety of different devices are available from sources including General Reality Company, Precision Navigation (Mountain View, Calif.), and Polhemus Inc. (Colchester, Vt.).
- head-mounted displays 26 and head-trackers 28 may be used without affecting the fundamental operating concepts embodied therein, and many suitable devices are commercially available.
- for the head-mounted display 26 , it is important to provide adequate field-of-view to ensure that a significant portion of the user's visual field is addressed, and to provide a sufficient number of picture elements, or pixels, so that small text can be resolved.
- Useful minimums are a twenty-degree field-of-view and 100,000 pixels per eye, although these figures are subjective.
- visual contrast must be high (100 to 1 or greater) for visually-impaired users. For some visually-impaired users, maximizing contrast can become sufficiently critical that a color display cannot be used, and a black and white unit must be used instead.
- parameters such as field-of-view, pixel count, contrast, size/weight, cost, and other factors such as apparent image distance well-known in the art of head-mounted display design may be traded-off to provide a best compromise over a varied population of users, or may be traded-off to optimize performance for a single user.
- the simplest embodiment of the invention 10 uses a CyberTrack™ model head-tracker 28 from General Reality Company.
- This model provides an output signal emulating that of a mouse, which can be read directly by a standard Microsoft mouse driver 30 for purposes of controlling the manner in which the instantaneous viewport 22 is selected from within the virtual display 20 .
- an additional software module can be used to interpret the output of the head-tracker 28 and convert the output into “mickeys” that emulate mouse output, or an additional software module can adapt the output of the head-tracker 28 for directly controlling scrolling of the instantaneous viewport 22 without use of an intervening mouse driver. All told, a wide variety of alternative means for converting head-tracker output into scrolling of the instantaneous viewport have been conceived, so the approach selected for the present embodiments should not be considered limitative of the invention.
- the result of the invention 10 is to provide the user with the perception that the virtual display 20 is fixed in space in front of the user, and that the user can position the instantaneous viewport 22 provided by the head-mounted display 26 at any point within the virtual display 20 merely by rotating his or her head to look in the desired direction.
- because the user's nervous system provides proprioceptive feedback which constantly provides the user with a sense of direction, and because turning to look in a particular direction is a natural and intuitive means for viewing objects in that direction, the invention 10 provides a solution for both the spatial orientation and dynamic control aspects of the field navigation problem.
- a mouse pointer 32 is shown.
- the mouse pointer 32 is typically maintained in the center of the instantaneous viewport 22 as the viewport is scrolled, and is used to allow selection of particular data items by the user. Such selection may be performed by clicking a button on the input controller 16 .
- the mouse pointer may also be adjusted to remain at a non-central position 34 within the instantaneous viewport 22 while the viewport is scrolled.
- Such a non-central position 34 may be used in a case where the user suffers from a visual impairment such as macular degeneration, which can cause the foveal (central) portion of the user's visual field to be blocked, as illustrated by the visual blockage 36 .
- An alternative approach in the event of a visual blockage is to physically rotate the head-mounted display 26 with respect to the user's line-of-sight, so that the instantaneous viewport 22 is no longer centered around the user's line-of-sight, but is instead skewed into an offset position 38 .
- an improvement to the simple scrolling of the instantaneous viewport 22 can be achieved using hardware or software logic that enables scrolling using a combination of data generated by the head-tracker 28 and the input controller 16 .
- head-tracker 28 output is used to perform large-magnitude positioning of the instantaneous viewport 22 with respect to the virtual display 20 .
- the input controller 16 can then be used to perform fine positioning of the instantaneous viewport.
- the input controller 16 can also be used to select data items, or click and drag to select multiple data items, as is common within the art.
- the instantaneous viewport is moved appropriately, maintaining the mouse pointer 32 at its selected location within the instantaneous viewport.
- the input controller 16 and head-tracker 28 can operate simultaneously, which allows “click & drag” functions such as holding down the mouse button to anchor one corner of a selection box, then scrolling the head until an opposing corner is reached, and releasing the mouse button to select all of the items within the resulting selection box.
- the present invention has been implemented in two alternative prototype embodiments, with additional embodiments contemplated.
- the first embodiment is constructed using an Apple Macintosh Duo 230 portable computer 12 , a General Reality CyberEye Model 100 head-mounted display 26 , and InLARGE screen magnifier software by Berkeley Systems (Berkeley, Calif.).
- the head-tracker 28 is an experimental device utilizing Gyrostar ENC-05E solid-state gyroscopes by Murata Manufacturing Company (Kyoto, Japan, and US location at Smyrna, Ga.). Two gyroscopes are used, one each for the head's pitch (elevation) and yaw (direction) degrees of freedom.
- the output of each gyroscope consists of a differential voltage, with the difference voltage directly proportional to the angular velocity of the sensor. These outputs are fed to the Macintosh computer 12 via the Apple Desktop Bus (ADB) port, which is used on all Macintosh computers for accepting input from keyboards, mice, and other input control devices.
- ADB (Apple Desktop Bus)
- because the gyroscopes output differential data representing an angular velocity, the data is digitized using a simple analog-to-digital converter integrated circuit and then used directly for scrolling the imagery, with only a linear scaling factor applied. This scaling factor is dependent on the magnification factor applied to the imagery, and serves to maintain the enlarged image at a fixed position in space as perceived by the user.
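The velocity-to-pixel-shift scaling described above might be sketched as follows; the parameter names and example values are illustrative assumptions:

```python
def image_shift_px(angular_rate_dps, dt_s, px_per_deg, magnification):
    """Pixel shift that keeps the enlarged image fixed in space.

    The gyro reports angular velocity (degrees per second); integrating
    over the sample interval and scaling by the display's pixels-per-degree
    times the magnification factor cancels the apparent image motion.
    """
    return angular_rate_dps * dt_s * px_per_deg * magnification
```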
- if an absolute orientation tracker such as a magnetometer is used instead, the data must first be converted from orientation to rate of change in orientation by taking the mathematical derivative of the data with respect to time.
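Such a conversion from absolute orientation samples to angular rate can be sketched as below; the wrap-around handling is an implementation assumption, not from the patent:

```python
def orientation_to_rate(prev_deg, curr_deg, dt_s):
    """Convert successive absolute orientation samples to angular rate (deg/s).

    Wraps the difference into [-180, 180) so a small rotation across the
    0/360 degree boundary does not appear as a huge rate spike.
    """
    delta = (curr_deg - prev_deg + 180.0) % 360.0 - 180.0
    return delta / dt_s
```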
- Drift is evidenced by slow motions in the imagery which occur in the absence of any true head motion, and is corrected by incorporating a low-frequency cut-off filter in the tracking data output.
- Such low-frequency cut-off filters are well-known in the tracking art, and do not affect perceived performance.
- Temperature instability is evidenced by drift that occurs following rapid changes in the ambient temperature in which the tracker is used. Some such instability is removed with software which acts like a low-frequency cut-off filter by ignoring D.C. drift, while some is unavoidable and requires a waiting period for temperature of the system hardware to stabilize. This software ignores any D.C. signal component from the head tracker 28 and allows a scaling factor to be input to the system to control the magnitude of the shift in the virtual image as a function of the amount of rotation of the user's head.
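A low-frequency cut-off (high-pass) filter of the kind described can be sketched as a one-pole filter; the filter form and coefficient are illustrative assumptions:

```python
def high_pass(samples, alpha=0.95):
    """One-pole high-pass filter: passes head motion, rejects slow drift.

    y[n] = alpha * (y[n-1] + x[n] - x[n-1]); a constant (D.C.) input
    produces an output that decays toward zero, which suppresses slow
    tracker drift while preserving genuine head motion.
    """
    out, y, x_prev = [], 0.0, samples[0]
    for x in samples:
        y = alpha * (y + x - x_prev)
        x_prev = x
        out.append(y)
    return out
```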
- Hysteresis is evidenced by sensitivity differences between motion in one direction and motion in a direction 180 degrees opposite. This artifact can be addressed by using a different scaling factor depending upon the tracker's direction of travel. The magnitude of this scaling factor can be determined experimentally, depending upon the magnitude and direction of the hysteresis.
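As an illustrative sketch, the direction-dependent scaling might look like this; the gain values are hypothetical placeholders for the experimentally determined ones:

```python
def hysteresis_scale(rate, gain_pos=1.00, gain_neg=1.08):
    """Apply a direction-dependent scale factor to cancel hysteresis.

    A sensor that responds less strongly in one direction of travel is
    given a larger gain for that direction.
    """
    return rate * (gain_pos if rate >= 0 else gain_neg)
```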
- Cross-axis coupling is evidenced by the displayed image moving a small amount in one axis when all of the head motion is along an orthogonal axis.
- This artifact is also controlled by the software which acts like a low-frequency cut-off filter, and may be further controlled by disabling one axis whenever the orthogonal axis rate of motion is greater than an empirically-determined threshold.
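The axis-disabling logic described above can be sketched as follows; the threshold value is an assumed placeholder for the empirically-determined one:

```python
def suppress_cross_axis(rate_x, rate_y, threshold=50.0):
    """Zero the minor axis when the orthogonal axis moves fast.

    If one axis exceeds the rate threshold and dominates the other, the
    orthogonal axis is disabled to hide cross-axis coupling artifacts.
    """
    if abs(rate_x) > threshold and abs(rate_x) > abs(rate_y):
        rate_y = 0.0
    elif abs(rate_y) > threshold and abs(rate_y) > abs(rate_x):
        rate_x = 0.0
    return rate_x, rate_y
```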
- the Apple Macintosh ADB port allows simultaneous operation of multiple input control peripherals. Because of this feature, either the input controller 16 or a variety of secondary controllers may be used in conjunction with the head-tracker 28 to perform navigation within the imagery. Such controllers include joysticks, trackballs, light pens, simple switches, or any other control device which is ADB port compatible.
- the second embodiment of the invention has been implemented for the Intel/Microsoft personal computer architecture.
- the computer 12 is a 90 MHz Pentium host computer
- the head-mounted display 26 is a CyberEye Model 100
- the head-tracker 28 is a 3-axis magnetometer, available as the Model TCM-2 from Precision Navigation, Inc. (Mountain View, Calif.) or the CyberTrack™ from General Reality Company (San Jose, Calif.).
- This embodiment has been made functional using LP-DOS from Optelec (Westford, Mass.) as the screen enlarger 24 , although alternative commercially available screen enlargers may be used without modifying the remaining components of the system.
- the selected head-tracker 28 is an absolute orientation sensor, although any alternative head-tracking device may be used.
- the specific 3-axis magnetometer used as the head-tracker 28 in this embodiment connects to the serial port of the computer 12 , and provides an internal conversion from absolute position to differential data in the form of mouse “mickeys” compatible with the Intel/Microsoft personal computer architecture. Because of this feature, the output of the head-tracker 28 can be read directly by a standard Microsoft mouse driver, which provides a menu for setting the scaling factor required for maintaining a fixed image as perceived by the user.
- the Intel/Microsoft personal computer architecture does not make use of the Apple ADB bus, but instead uses RS-232 serial communication ports to connect to control devices such as the head-tracker 28 and the input controller 16 .
- this may be accomplished in one of at least five ways.
- an existing mouse driver that includes dual-port capability such as the original Borland mouse driver may be used.
- the source code for the standard Microsoft mouse driver may be modified to support simultaneous access to two serial ports.
- a device such as the “WhyMouse” by P.I. Engineering (Williamston, Mich.) may be used. This device serves as a “Y” adapter to connect two mouse-type pointing devices into one serial port. Circuitry internal to the WhyMouse automatically routes one or the other device's data to the serial port based on a priority scheme, wherein the first device to emit data gains control of the input.
- fourth, a custom solution can be implemented in the form of a unique software driver, or fifth, in the form of a software module running on an intelligent input/output controller such as the Rocketport32 card from Industrial Computer Source (San Diego, Calif.).
- intelligent input/output controllers are available from several commercial sources in the form of a circuit board that may be inserted in an expansion slot within the computer 12 .
- circuit boards include two or more serial ports, as well as an on-board processor that can manipulate the inputs from the serial ports prior to delivering the tracking data to the computer's internal bus.
- a custom software module avoids hardware costs, while providing the greatest flexibility in terms of application optimization and user convenience.
- a custom software module allows the user to select whether the input controller 16 and head-tracker 28 can operate simultaneously in the manner preferred by the inventor, or whether the input controller 16 and head-tracker 28 operate in a priority scheme as provided in the WhyMouse product.
- a custom software approach can provide optional use of a variety of alternative devices as the input controller 16 . For example, some users may prefer a hand-operated joystick to a mouse, while physically-impaired users may require a finger-operated joystick or head-operated directional switches with a puff & suck switch for activating the mouse clicking function.
- Referring to FIG. 5, a block diagram of an advanced embodiment of the invention 10 is shown.
- the computer 12 , keyboard 14 , input controller 16 , screen enlarger 24 , head-mounted display 26 , and head-tracker 28 are illustrated as previously defined, while a standard computer operating system such as Microsoft Windows is conceptually shown as 42 , a typical computer application such as Microsoft Word is shown as 44 , and a typical display driver such as a VGA graphics board is shown as 46 .
- the software module used for combining the inputs of the input controller 16 and head-tracker 28 is shown as the control driver 48 .
- An additional software module called the input remapper 50 is also shown interposed between the input controller 16 and the control driver 48 .
- This input remapper 50 is a program that converts inputs from a variety of potential devices that may be used as the input controller 16 into a single convenient data format such as mouse mickeys.
- the output of a joystick used as the input controller 16 can be remapped by the input remapper 50 so that pressing the joystick trigger button results in a mouse click signal being sent to the control driver 48 , moving the joystick to the left results in emulation of sliding the mouse to the left, etc.
- the control driver 48 can be made standard, with only the input remapper 50 modified whenever it is desirable to support a new type of input controller 16 within the invention 10 .
- the use of an input remapper 50 is a common approach in CD-ROM personal computer games, where the user can select between the mouse, joystick, keyboard, or other devices for purposes of controlling game play.
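As a sketch of how such an input remapper 50 might translate joystick activity into mouse mickeys: the event format, field names, and scaling constant below are illustrative assumptions, not details from the specification.

```python
# Hypothetical sketch of the input remapper 50: joystick events are
# translated into mouse-style events (relative "mickeys" plus button
# states) so that the control driver 48 sees a single data format.
# Event field names and the scaling constant are assumptions.

MICKEYS_PER_UNIT = 8  # assumed scale: full joystick deflection -> 8 mickeys

def remap_joystick_event(event):
    """Convert one joystick event dict into a mouse-style event dict."""
    kind = event["kind"]
    if kind == "axis":
        # Sliding the stick left emulates sliding the mouse left, etc.
        return {"kind": "move",
                "dx": int(event["x"] * MICKEYS_PER_UNIT),
                "dy": int(event["y"] * MICKEYS_PER_UNIT)}
    if kind == "trigger":
        # Pressing the joystick trigger emulates a mouse click.
        return {"kind": "button", "button": "left", "down": event["down"]}
    raise ValueError("unsupported joystick event: " + kind)
```

With a layer of this shape in place, only the remapper changes when a new type of input controller 16 is supported; the control driver 48 remains standard.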
- FIG. 5 also illustrates use of a tracking formatter 52 , which is a software module interposed between the head-tracker 28 and the control driver 48 .
- the tracking formatter 52 performs various functions depending upon the particular sensing means employed within the head-tracker 28 . These functions can be separated into three categories.
- the first category of functions performed by the tracking formatter 52 is conversion of the data stream emanating from the head-tracker 28 into a format readable by the control driver 48 .
- This conversion is tracking sensor-dependent. In the case of a magnetometer-based tracker with mouse emulation as used in the Intel/Microsoft embodiment, no conversion is required. In the case of a magnetometer without mouse emulation, the tracking output would consist of rapidly-updated azimuth and elevation position figures, in which event the tracking formatter 52 would subtract the prior position sample from the present sample and then convert the format to mouse mickeys to provide the control driver 48 with emulated mouse output consisting of changes in position.
- for head-trackers that sense angular rate rather than absolute orientation, the output of the head-tracker 28 consists of angular velocity figures.
- the angular velocity samples are simply multiplied by the time period of each sample to yield a change in position, with each positional change then converted into mouse mickeys by the tracking formatter 52 .
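Both conversions amount to a subtraction or a multiplication followed by scaling. A minimal sketch, assuming an arbitrary mickeys-per-degree calibration constant:

```python
MICKEYS_PER_DEGREE = 10.0  # assumed calibration constant

def absolute_to_mickeys(prev, curr):
    """Magnetometer without mouse emulation: subtract the prior
    (azimuth, elevation) sample from the present one, then scale
    the positional change into emulated mouse mickeys."""
    return ((curr[0] - prev[0]) * MICKEYS_PER_DEGREE,
            (curr[1] - prev[1]) * MICKEYS_PER_DEGREE)

def velocity_to_mickeys(az_rate, el_rate, dt):
    """Rate-output tracker: multiply each angular velocity sample
    (degrees/second) by the sample period to yield a change in
    position, then convert that change into mickeys."""
    return (az_rate * dt * MICKEYS_PER_DEGREE,
            el_rate * dt * MICKEYS_PER_DEGREE)
```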
- the second category of functions performed by the tracking formatter 52 consists of error correction functions such as those previously described for the Apple Macintosh embodiment of the invention 10 .
- the tracking formatter 52 performs low-frequency cut-off filtering, applies a directionally-dependent scaling factor, and disables one axis of travel when the orthogonal axis velocity rises above a threshold.
- These functions could also be performed in hardware such as an application-specific integrated circuit or a field-programmable gate array if higher-performance at high-volume production is desirable.
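The three corrections can be sketched as a single per-sample pass; every threshold and scale factor below is an assumed value chosen for illustration, not one taken from the specification:

```python
DRIFT_CUTOFF = 0.5          # deg/s; motion below this treated as drift (assumed)
AXIS_LOCK_THRESHOLD = 20.0  # deg/s; fast motion on one axis locks the other (assumed)
SCALE_AZ, SCALE_EL = 1.0, 0.8  # directionally-dependent scaling (assumed)

def correct_sample(az_rate, el_rate):
    """Apply low-frequency cut-off, directional scaling, and axis locking."""
    # 1. Low-frequency cut-off: suppress very slow motion (sensor drift).
    if abs(az_rate) < DRIFT_CUTOFF:
        az_rate = 0.0
    if abs(el_rate) < DRIFT_CUTOFF:
        el_rate = 0.0
    # 2. Directionally-dependent scaling factor.
    az_rate *= SCALE_AZ
    el_rate *= SCALE_EL
    # 3. Disable one axis of travel when the orthogonal axis velocity
    #    rises above the threshold, so a brisk horizontal scan does
    #    not drift vertically.
    if abs(az_rate) > AXIS_LOCK_THRESHOLD:
        el_rate = 0.0
    elif abs(el_rate) > AXIS_LOCK_THRESHOLD:
        az_rate = 0.0
    return az_rate, el_rate
```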
- the third category of functions performed by the tracking formatter 52 consists of enhancement functions such as orientation prediction.
- This function addresses the pipeline delay between the instant in time when the head is turned, and the time when the displayed image is updated to display the new user line-of-sight.
- This delay can be calculated to be the sum of the tracker sensing time, tracker to computer communication time, tracker formatter processing time, control driver processing time, operating system and application software processing time, screen enlarger processing time, and display refresh time.
- the sum of these delays can become bothersome, causing a perception of the display “swimming” with respect to the user's line of sight changes.
- This swimming causes perceptual mismatches between the user's internal proprioceptive cues and external visual cues, which in severe cases can cause disorientation and nausea effects known in the virtual reality field as simulator sickness.
- the current position and velocity of the head in each degree of freedom can be used to predict the future position, in the manner of So and Griffin or Azuma and Bishop. By doing so, the predicted future position can be used as the input to the processing pipeline instead of the current actual position, thus decreasing the average mismatch between the proprioceptive and visual cues.
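A first-order version of this prediction simply extrapolates each degree of freedom forward by the measured pipeline delay; the latency figure below is an assumed placeholder, and So/Griffin and Azuma/Bishop describe more sophisticated predictors:

```python
PIPELINE_LATENCY = 0.050  # seconds; assumed total pipeline delay

def predict_orientation(position, velocity, latency=PIPELINE_LATENCY):
    """Extrapolate (azimuth, elevation) forward by the pipeline delay,
    so the predicted future position, rather than the current actual
    position, feeds the processing pipeline."""
    return tuple(p + v * latency for p, v in zip(position, velocity))
```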
- FIG. 5 also illustrates the use of a voice recognition system as a means for inputting control commands and application data into the invention 10 .
- the voice recognition system consists of a microphone 54 disposed near the user's mouth, such as by mounting onto or within the head-mounted display.
- the output of the microphone is input to the computer's audio input port, which digitizes the audio data.
- the digital data is then analyzed by a voice recognizer 56 , which may consist of hardware, software, or a combination of the two.
- a typical embodiment of the voice recognizer 56 for an Intel/Microsoft architecture would consist of Dragon Dictate software by Dragon Systems (Newton, Mass.), running on a SoundBlaster audio board by Creative Laboratories (Milpitas, Calif.).
- the output is sent to the operating system in the form of digital data interpreted as either commands or content depending upon the state of the operating system.
- position commands include “center me” to center the user's instantaneous viewport 22 within the virtual display 20 , “top right” to move the instantaneous viewport 22 to the top right, etc.
- Enlargement commands include absolute commands such as “Mag 8” to set the screen enlarger 24 to a magnification of 8 to 1, and relative commands such as “zoom double” to temporarily increase the magnification by a factor of two.
- Tracking control commands include “lock vertical” to lock-out response to the elevation tracking function, which simplifies scrolling horizontally across text. Selecting between system operating modes includes a complete set of commands for operating the screen enlarger 24 , such as “scroll text” to enter the enlarger's text scrolling mode. Finally, application control commands are application-dependent and available commercially as libraries, which typically include most or all mouse-accessible functions such as “page down”, “font: Times”, “edit: cut”, etc.
- FIG. 5 additionally illustrates a spatialized audio generator 58 , which is used to alert the user to computer-generated events occurring outside the user's instantaneous viewport 22 .
- This is done by providing the user with slightly different signals in each ear via a pair of loudspeakers or stereo headphones 60, with the differences calculated to simulate directionality via slight delays between the nearer ear's signal and the farther ear's signal, slight reduction in high-frequency content in the farther ear's signal, and other spatial processing as is commonly known in the art.
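The two interaural cues described (a slight delay to the farther ear and a slight high-frequency reduction) can be approximated with standard formulas. In the sketch below, the head radius and speed of sound are typical textbook values, the attenuation curve is an illustrative assumption, and the whole thing is only a sketch of the commonly known processing, not the commercial board's implementation:

```python
import math

HEAD_RADIUS = 0.0875     # meters; typical adult head radius
SPEED_OF_SOUND = 343.0   # m/s in air

def interaural_cues(azimuth_deg):
    """Return (extra_delay_seconds, farther_ear_gain) for a virtual
    source at the given azimuth (0 = straight ahead, 90 = fully to
    one side). The delay uses the simple path-difference
    approximation d*sin(theta)/c; the gain standing in for
    high-frequency shading is an assumed curve."""
    theta = math.radians(azimuth_deg)
    delay = (2 * HEAD_RADIUS * abs(math.sin(theta))) / SPEED_OF_SOUND
    gain = 1.0 - 0.3 * abs(math.sin(theta))  # assumed attenuation curve
    return delay, gain
```

A source fully to one side yields roughly a half-millisecond interaural delay, on the order of the cues the human auditory system actually uses for localization.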
- the spatialized audio generator 58 can be constructed from commercially-available components such as a SoundBlaster audio board from Creative Laboratories (Milpitas, Calif.), which includes audio spatialization software as a standard feature.
- the input to the spatialized audio generator 58 is provided by the operating system 42 for simple alerts such as “beeps” signifying an error or other message window, and may be provided by the application software 44 or the screen enlarger 24 for more advanced messages such as synthesized voice messages or text-to-speech conversion.
- control driver 48 contains a scaling factor used to adjust the amount by which the instantaneous viewport 22 moves across the virtual display 20 per degree of head rotation.
- this scaling factor is set so that the virtual display 20 appears fixed in space while the instantaneous viewport is scanned across it.
- fixing the virtual display can be problematic, as the user's head may be required to rotate more than is comfortable to scan from one edge of the virtual display to the opposing edge.
- the present invention 10 may be configured by the user with a different scaling factor, which increases the amount by which the instantaneous viewport 22 moves across the virtual display 20 for each degree of head rotation.
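The scaling reduces to one multiplication per sample. In the sketch below, the pixels-per-degree figure and virtual display width are assumed values chosen only to show the trade-off:

```python
PIXELS_PER_DEGREE_FIXED = 32.0  # assumed scale at which the display appears fixed
VIRTUAL_WIDTH = 4096            # assumed virtual display width in pixels

def viewport_offset(head_rotation_deg, gain=1.0):
    """Pixels the instantaneous viewport 22 moves for a given head
    rotation. gain == 1.0 keeps the virtual display fixed in space;
    gain > 1.0 reaches the far edge with less head rotation."""
    return head_rotation_deg * PIXELS_PER_DEGREE_FIXED * gain

# At gain 1.0 the user must sweep 128 degrees to cross the whole
# 4096-pixel display; at gain 2.0 a 64-degree sweep suffices.
```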
- a snap-back function may be included within the control driver 48 , wherein data from the input controller 16 is used only for temporary repositioning of the mouse pointer 32 and instantaneous viewport 22 within the virtual display 20 .
- this function records activity of the input controller 16 while such activity is being used to control the displayed imagery. Once such activity ceases, the inverse of the recorded activity is fed to the operating system 42 , which snaps-back the image displayed in the instantaneous viewport 22 to that which would be viewed in the absence of the input controller 16 .
- the result of this snap-back function is that the virtual display 20 is maintained at a fixed location in space, which may be temporarily modified by use of the input controller 16 , but is returned to following use of the input controller 16 .
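The snap-back function can be sketched as an accumulator whose inverse is replayed once controller activity ceases; this is a hypothetical rendering of the behaviour described above, not code from the specification:

```python
class SnapBack:
    """Accumulate input-controller motion while it drives the display,
    then emit its inverse so the viewport returns to the position head
    tracking alone would have produced."""

    def __init__(self):
        self.dx = 0
        self.dy = 0

    def record(self, dx, dy):
        # Controller activity temporarily repositions the viewport.
        self.dx += dx
        self.dy += dy

    def release(self):
        # On cessation of activity, the inverse of the recorded
        # motion is fed back, snapping the viewport into place.
        inverse = (-self.dx, -self.dy)
        self.dx = self.dy = 0
        return inverse
```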
- additional image processing may be performed by the screen enlarger 24 or elsewhere in the processing pipeline to incorporate additional functions which may be desirable for visually-impaired users or other applications.
- a common feature in commercial screen enlargers consists of contrast reversal, where instead of displaying black text on a white background, white text can be displayed on a black background. This improves text readability for some users.
- Another common feature is image enhancement, wherein the imagery is digitally enhanced to strengthen edges, which improves resolution ability for some users.
- Regarding FIG. 5, it is noted that if the screen enlarger 24 is set to an enlargement factor of one-to-one or omitted entirely, and a display driver 46 providing a virtual desktop function such as the MGA Millenium by Matrox (Dorval, QC, Canada) is used, then the present invention 10 can be used in an identical fashion by a non-visually-impaired user for purposes of accessing large areas of a virtual desktop, which enhances tasks such as simultaneous use of many individual computer applications.
- FIG. 6 shows the invention 10 applied to magnification of real-time imagery.
- a video camera 62 and a video frame grabber board 64 are added to any of the previously described embodiments.
- the video camera is then mounted in a stationary position and aimed at an image to be enlarged 66 .
- This image to be enlarged 66 may be a small object to be magnified such as text in a book, or may be a distant object to be resolved such as a blackboard in a classroom lecture.
- Each frame of video captured by the video grabber board 64 is output to the system bus as application output data by software included commercially with the video frame grabber board 64 , and fed to the screen enlarger 24 .
- the screen enlarger 24 magnifies the imagery, creating a virtual display 20 of the image to be enlarged 66 that occupies a larger angular extent as seen by the user than does the image to be enlarged 66 .
- the head-mounted display 26 , head-tracker 28 , tracking formatter 52 , and control driver 48 are then used as previously described to provide an instantaneous viewport 22 , which may be positioned at any convenient point within the virtual display 20 by turning one's head.
- improvement is made upon earlier closed-circuit television inventions for the visually-impaired in that the camera captures the entire image to be enlarged 66 at all times, instead of moving with the user's head or hand and capturing just the amount of imagery that can be displayed within the instantaneous viewport 22 .
- spatial awareness is maintained, but jitter in camera motion is not magnified to become disruptive to the user.
- any image may be instantly saved to permanent memory with a single keystroke for later review, editing, or printout.
- the present invention is useful in a wide range of applications such as text processing and virtual map navigation.
- the present invention teaches entry of computer control commands through intuitive head gestures.
- in addition to adjusting the user's field of view by tracking head motion, we define specific head gestures and correspond these specific head gestures in an intuitive manner with “special discrete commands.”
- FIG. 7 illustrates some possible head gestures that may be useful.
- a two-tailed motion arrow 170 illustrates forward or backward head motion and such gestures may correspond to increasing or decreasing display magnification.
- a two-tailed motion arrow 172 illustrates head-nodding motion, which could control document scrolling.
- the user could begin nodding with a downward or upward motion to initiate downward or upward scrolling, respectively.
- Another two-tailed motion arrow 174 indicates side-to-side head motion. This side-to-side motion could bring about a panning action.
- the last two-tailed motion arrow 176 illustrates brisk or abrupt head shaking motion, which could cause erasure or screen clearing.
- a first step 202 represents monitoring the user's head movement.
- the user is supplied a head-mounted display device which provides at least visual feedback.
- the computer system, e.g., through the display device, has the capability to track the user's head movement.
- the computer system responds to sensed user head movement by determining whether a special discrete command has been entered. If not, control is passed to a step 206 , which updates the virtual space such that the user's field of view is maintained in accordance with the head position.
- in step 204 the computer system must distinguish special discrete commands from other head movement simply intended to adjust the user's field of view. This can be accomplished in step 204 through a variety of mechanisms.
- certain head gestures could be mapped to corresponding special discrete commands. For specific examples, see the descriptions of FIG. 7 above, and FIGS. 9-11 below. These head motions should, if possible, be distinct from motions a user might be required to make to use the head-mounted display.
- alternatively, a first head gesture (e.g., a very abrupt nod) could be defined as an attention cue: the first head gesture would operate like a control character, with subsequent head gestures being interpreted as special discrete commands.
- control is passed to a step 208 .
- step 208 the computer system applies a function associated with the special discrete command to the sensed head motion. These functions can be based on head position and all related derivatives (velocity, acceleration, etc.). These functions may also be piecewise, with discrete portions having varying response characteristics. Once such a function has been applied, control is passed to a step 210 wherein the user's display is adjusted accordingly. Once the display is adjusted, control is passed back to monitor head movement step 202 .
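Steps 202 through 210 can be sketched as a dispatch loop; the gesture labels, sample format, and the particular magnification handler below are illustrative assumptions:

```python
def process_sample(view, sample, commands):
    """One pass through steps 202-210: decide whether the sensed
    motion is a special discrete command (step 204); if so, apply its
    associated function (step 208); otherwise pan the field of view
    (step 206); the display is then adjusted accordingly (step 210)."""
    gesture = sample.get("gesture")
    if gesture in commands:
        commands[gesture](view, sample)
    else:
        view["x"] += sample["dx"]
        view["y"] += sample["dy"]
    return view

def magnify(view, sample):
    # Assumed handler form: magnification grows with gesture speed.
    view["mag"] *= 1.0 + 0.1 * sample.get("speed", 1.0)
```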
- FIG. 9 illustrates the implementation of a discrete magnification instruction in accordance with one embodiment of the present invention.
- the computer system detects a forward head motion intended to cause magnification. Control is thus passed to a step 1208 (a specific case of step 208 of FIG. 8) where the magnification function is implemented. This function may increase magnification as a function of the change in user's head position, the speed of the user's head gesture, and/or the acceleration of the user's head gesture. After the magnification has been adjusted, control is passed back to step 202 of FIG. 8.
- Steps 2204 and 2208 of FIG. 10 implement a process similar to that of FIG. 9 , the difference being that the method of FIG. 10 applies to reverse head motion and a corresponding decrease in magnification.
- FIG. 11 illustrates a method for scrolling through the virtual display space.
- the computer system detects either up or down head motion defined as corresponding to special discrete scrolling commands.
- the computer system scrolls through the virtual display space accordingly. When finished, control is passed back to step 202 .
- a method 300 for controlling the correspondence between the displayed field of view and the user's head position will now be described.
- a first step 302 the user initiates a correspondence reset command.
- the user will be in a first field of view with the user's head in a first head position.
- the computer preserves this information.
- a next step 304 the user moves his head to a second position in order to perceive a second field of view.
- a step 306 the user closes the reset command.
- the computer system resets the virtual space mapping so that the second field of view is perceived at the user's first head position.
- the reset command may be initiated and closed by specific head gesture(s).
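Method 300 amounts to storing an offset between head position and displayed field of view, and rebinding that offset when the reset command closes; a minimal one-axis sketch, with degrees as an assumed unit:

```python
class ViewMapping:
    """One-axis sketch of method 300: map head azimuth (degrees) to
    the centre of the displayed field of view."""

    def __init__(self):
        self.offset = 0.0  # view centre = head position + offset

    def view_for(self, head):
        return head + self.offset

    def reset(self, first_head, second_head):
        """Steps 302-306: the user opens the reset command at
        first_head, turns to second_head to perceive the second field
        of view, and closes the command; thereafter the second field
        of view is perceived at the first head position."""
        self.offset += second_head - first_head
```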
- the field of view could be coupled to the viewer's head position with a “weak force.”
- the “weak force” could operate such that above a certain threshold speed, the displayed field of view would change in accordance with the user's head position.
- below the threshold speed, the field of view would remain constant while the user's head position changed.
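The "weak force" coupling reduces to a speed gate on the view update; the threshold below is an assumed figure:

```python
WEAK_FORCE_THRESHOLD = 15.0  # deg/s; assumed gating speed

def weak_force_update(view_az, head_speed, dt):
    """Above the threshold speed the displayed field of view follows
    the head; below it the view stays constant even though the head
    position changes."""
    if abs(head_speed) > WEAK_FORCE_THRESHOLD:
        return view_az + head_speed * dt
    return view_az
```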
- a wrist worn display could be controlled by hand, wrist, and arm movements. This would allow functions such as pan, zoom, and scroll to be effected upon the wrist worn display.
- the wrist worn display could be coupled remotely with a central computer system controlled by the user through the wrist worn display.
- the wrist worn display itself could house a computer system controlled by the intuitive gestures.
- the gesture tracking device could be separate from the wearable display device, allowing the user to attach the gesture tracking device and manipulate it as desired.
- the user may be provided multiple wearable control devices for controlling the computer system through intuitive body gestures.
- FIG. 13 is an illustration showing a PDA 1300 operable by intuitive body gestures, in accordance with an embodiment of the present invention.
- the PDA 1300 includes a display 1302, and is quite small, lightweight, and relatively inexpensive.
- FIG. 13 illustrates some possible hand gestures that may be useful.
- a two-tailed motion arrow 1304 illustrates forward or backward hand motion along the z-axis and may correspond to increasing or decreasing display magnification.
- a two-tailed motion arrow 1306 illustrates up and down hand motion along the x-axis, which could control document scrolling. For example, the user could begin rotating with a downward or upward motion to initiate downward or upward scrolling, respectively.
- Another two-tailed motion arrow 1308 indicates side-to-side hand motion along the y-axis. This side-to-side motion could bring about a panning action.
- the last two-tailed motion arrow 1310 illustrates brisk or abrupt hand shaking motion, which could cause erasure or screen clearing.
- a first step 1402 represents monitoring the user's hand movement.
- the user is supplied a hand-portable display device which provides at least visual feedback.
- the computer system through the display device, gyros and/or accelerometers has the capability to track the user's hand movement.
- the computer system responds to sensed user hand movement by determining whether a special discrete command has been entered. If not, control is passed to a step 1406 , which updates the virtual space such that the user's field of view is maintained in accordance with the hand position.
- in step 1404 the computer system must distinguish special discrete commands from other hand movement, including movement simply intended to adjust the user's field of view and small natural movements caused by the user's environment. This can be accomplished in step 1404 through a variety of mechanisms.
- certain hand gestures could be mapped to corresponding special discrete commands. For specific examples, see the descriptions of FIG. 13 above, and FIGS. 14 and 15 below. These hand motions preferably are distinct from motions a user might be required to make to use the hand-portable display.
- alternatively, a first hand gesture (e.g., a very abrupt rotation) could be defined as an attention cue: the first hand gesture would operate like a control character, with subsequent hand gestures being interpreted as special discrete commands.
- when the computer system has ascertained in step 1404 that a special discrete instruction has occurred, control is passed to a step 1408.
- step 1408 the computer system applies a function associated with the special discrete command to the sensed hand motion. These functions can be based on hand position and all related derivatives (velocity, acceleration, etc.). These functions may also be piecewise, with discrete portions having varying response characteristics. Once such a function has been applied, control is passed to a step 1410 wherein the user's display is adjusted accordingly. Once the display is adjusted, control is passed back to monitor hand movement step 1402 .
- FIG. 15 illustrates the implementation of a discrete magnification instruction in accordance with one embodiment of the present invention.
- the computer system detects a forward hand motion intended to cause magnification.
- Control is thus passed to a step 1508 (a specific case of step 1408 of FIG. 14 ) where the magnification function is implemented.
- This function may increase magnification as a function of the change in user's hand position, the speed of the user's hand gesture, and/or the acceleration of the user's hand gesture.
- control is passed back to step 1402 of FIG. 14 .
- Steps 1604 and 1608 of FIG. 16 implement a process similar to that of FIG. 15 , the difference being that the method of FIG. 16 applies to reverse hand motion and a corresponding decrease in magnification.
- control is passed back to step 1402 .
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US12/574,607 USRE42336E1 (en) | 1995-11-28 | 2009-10-06 | Intuitive control of portable data displays |
Applications Claiming Priority (7)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US56352595A | 1995-11-28 | 1995-11-28 | |
US10143398P | 1998-09-22 | 1998-09-22 | |
US09/235,096 US6127990A (en) | 1995-11-28 | 1999-01-21 | Wearable display and methods for controlling same |
US09/264,799 US6084556A (en) | 1995-11-28 | 1999-03-09 | Virtual computer monitor |
US09/373,186 US6359603B1 (en) | 1995-11-28 | 1999-08-12 | Portable display and methods of controlling same |
US09/404,051 US6184847B1 (en) | 1998-09-22 | 1999-09-22 | Intuitive control of portable data displays |
US12/574,607 USRE42336E1 (en) | 1995-11-28 | 2009-10-06 | Intuitive control of portable data displays |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US09/404,051 Reissue US6184847B1 (en) | 1995-11-28 | 1999-09-22 | Intuitive control of portable data displays |
Publications (1)
Publication Number | Publication Date |
---|---|
USRE42336E1 true USRE42336E1 (en) | 2011-05-10 |
Family
ID=26798255
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US09/404,051 Ceased US6184847B1 (en) | 1995-11-28 | 1999-09-22 | Intuitive control of portable data displays |
US12/574,607 Expired - Lifetime USRE42336E1 (en) | 1995-11-28 | 2009-10-06 | Intuitive control of portable data displays |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US09/404,051 Ceased US6184847B1 (en) | 1995-11-28 | 1999-09-22 | Intuitive control of portable data displays |
Country Status (4)
Country | Link |
---|---|
US (2) | US6184847B1 (en) |
EP (1) | EP1116211A4 (en) |
JP (1) | JP2002525769A (en) |
WO (1) | WO2000017848A1 (en) |
US7066781B2 (en) | 2000-10-20 | 2006-06-27 | Denise Chapman Weston | Children's toy with wireless tag/transponder |
US20020109673A1 (en) * | 2001-01-04 | 2002-08-15 | Thierry Valet | Method and apparatus employing angled single accelerometer sensing multi-directional motion |
US6704447B2 (en) * | 2001-02-21 | 2004-03-09 | Justsystem Corporation | Method and apparatus for using illumination from a display for computer vision based user interfaces and biometric authentication |
US6798429B2 (en) * | 2001-03-29 | 2004-09-28 | Intel Corporation | Intuitive mobile device interface to virtual spaces |
US6826477B2 (en) * | 2001-04-23 | 2004-11-30 | Ecole Polytechnique Federale De Lausanne (Epfl) | Pedestrian navigation method and apparatus operative in a dead reckoning mode |
US7365734B2 (en) * | 2002-08-06 | 2008-04-29 | Rembrandt Ip Management, Llc | Control of display content by movement on a fixed spherical space |
USRE47457E1 (en) * | 2001-08-07 | 2019-06-25 | Facebook, Inc. | Control of display content by movement on a fixed spherical space |
US7190378B2 (en) * | 2001-08-16 | 2007-03-13 | Siemens Corporate Research, Inc. | User interface for augmented and virtual reality systems |
US7194148B2 (en) * | 2001-09-07 | 2007-03-20 | Yavitz Edward Q | Technique for providing simulated vision |
US20030149822A1 (en) * | 2002-02-01 | 2003-08-07 | Bryan Scott | Method for integrating an intelligent docking station with a handheld personal computer |
US20030172217A1 (en) * | 2002-03-08 | 2003-09-11 | Bryan Scott | Method for implementing communication drivers in an intelligent docking station/handheld personal computer system |
US6967566B2 (en) | 2002-04-05 | 2005-11-22 | Creative Kingdoms, Llc | Live-action interactive adventure game |
US20070066396A1 (en) | 2002-04-05 | 2007-03-22 | Denise Chapman Weston | Retail methods for providing an interactive product to a consumer |
US7203911B2 (en) * | 2002-05-13 | 2007-04-10 | Microsoft Corporation | Altering a display on a viewing device based upon a user proximity to the viewing device |
US8797260B2 (en) * | 2002-07-27 | 2014-08-05 | Sony Computer Entertainment Inc. | Inertially trackable hand-held controller |
US8947347B2 (en) * | 2003-08-27 | 2015-02-03 | Sony Computer Entertainment Inc. | Controlling actions in a video game unit |
US10086282B2 (en) * | 2002-07-27 | 2018-10-02 | Sony Interactive Entertainment Inc. | Tracking device for use in obtaining information for controlling game program execution |
US9174119B2 (en) | 2002-07-27 | 2015-11-03 | Sony Computer Entertainment America, LLC | Controller for providing inputs to control execution of a program when inputs are combined |
US8686939B2 (en) * | 2002-07-27 | 2014-04-01 | Sony Computer Entertainment Inc. | System, method, and apparatus for three-dimensional input control |
US7854655B2 (en) * | 2002-07-27 | 2010-12-21 | Sony Computer Entertainment America Inc. | Obtaining input for controlling execution of a game program |
JP2004109994A (en) * | 2002-08-30 | 2004-04-08 | Olympus Corp | Head mounted image display system device and image processing method therefor |
US20050156817A1 (en) * | 2002-08-30 | 2005-07-21 | Olympus Corporation | Head-mounted display system and method for processing images |
US20040100484A1 (en) * | 2002-11-25 | 2004-05-27 | Barrett Peter T. | Three-dimensional television viewing environment |
US7511710B2 (en) | 2002-11-25 | 2009-03-31 | Microsoft Corporation | Three-dimensional program guide |
US7515156B2 (en) * | 2003-01-08 | 2009-04-07 | Hrl Laboratories, Llc | Method and apparatus for parallel speculative rendering of synthetic images |
US9063633B2 (en) * | 2006-03-30 | 2015-06-23 | Arjuna Indraeswaran Rajasingham | Virtual navigation system for virtual and real spaces |
US7426329B2 (en) | 2003-03-06 | 2008-09-16 | Microsoft Corporation | Systems and methods for receiving, storing, and rendering digital video, music, and pictures on a personal media player |
US9446319B2 (en) | 2003-03-25 | 2016-09-20 | Mq Gaming, Llc | Interactive gaming toy |
US9165478B2 (en) | 2003-04-18 | 2015-10-20 | International Business Machines Corporation | System and method to enable blind people to have access to information printed on a physical document |
US7872635B2 (en) * | 2003-05-15 | 2011-01-18 | Optimetrics, Inc. | Foveated display eye-tracking system and method |
US7391888B2 (en) | 2003-05-30 | 2008-06-24 | Microsoft Corporation | Head pose assessment methods and systems |
US20070223732A1 (en) * | 2003-08-27 | 2007-09-27 | Mao Xiao D | Methods and apparatuses for adjusting a visual image based on an audio signal |
US11033821B2 (en) | 2003-09-02 | 2021-06-15 | Jeffrey D. Mullen | Systems and methods for location based games and employment of the same on location enabled devices |
EP1679689B1 (en) * | 2003-10-28 | 2014-01-01 | Panasonic Corporation | Image display device and image display method |
US7093034B2 (en) * | 2003-11-18 | 2006-08-15 | Microsoft Corporation | Method and apparatus for input management having a plurality of input provider types wherein staging area holds and allows access by external components |
US7707039B2 (en) * | 2004-02-15 | 2010-04-27 | Exbiblio B.V. | Automatic modification of web pages |
US8442331B2 (en) | 2004-02-15 | 2013-05-14 | Google Inc. | Capturing text from rendered documents using supplemental information |
US10635723B2 (en) | 2004-02-15 | 2020-04-28 | Google Llc | Search engines and systems with handheld document data capture devices |
US7812860B2 (en) * | 2004-04-01 | 2010-10-12 | Exbiblio B.V. | Handheld device for capturing text from both a document printed on paper and a document displayed on a dynamic display device |
US20060041605A1 (en) * | 2004-04-01 | 2006-02-23 | King Martin T | Determining actions involving captured information and electronic content associated with rendered documents |
US20060041484A1 (en) * | 2004-04-01 | 2006-02-23 | King Martin T | Methods and systems for initiating application processes by data capture from rendered documents |
US20060122983A1 (en) * | 2004-12-03 | 2006-06-08 | King Martin T | Locating electronic instances of documents based on rendered instances, document fragment digest generation, and digest based document fragment determination |
US20060053097A1 (en) * | 2004-04-01 | 2006-03-09 | King Martin T | Searching and accessing documents on private networks for use with captures from rendered documents |
US7176888B2 (en) * | 2004-03-23 | 2007-02-13 | Fujitsu Limited | Selective engagement of motion detection |
US7301527B2 (en) * | 2004-03-23 | 2007-11-27 | Fujitsu Limited | Feedback based user interface for motion controlled handheld devices |
US7365736B2 (en) * | 2004-03-23 | 2008-04-29 | Fujitsu Limited | Customizable gesture mappings for motion controlled handheld devices |
US7301529B2 (en) * | 2004-03-23 | 2007-11-27 | Fujitsu Limited | Context dependent gesture response |
US7173604B2 (en) * | 2004-03-23 | 2007-02-06 | Fujitsu Limited | Gesture identification of controlled devices |
US7180502B2 (en) * | 2004-03-23 | 2007-02-20 | Fujitsu Limited | Handheld device with preferred motion selection |
US20050212753A1 (en) * | 2004-03-23 | 2005-09-29 | Marvit David L | Motion controlled remote controller |
US7365737B2 (en) * | 2004-03-23 | 2008-04-29 | Fujitsu Limited | Non-uniform gesture precision |
US7903084B2 (en) * | 2004-03-23 | 2011-03-08 | Fujitsu Limited | Selective engagement of motion input modes |
US7365735B2 (en) * | 2004-03-23 | 2008-04-29 | Fujitsu Limited | Translation controlled cursor |
US7301528B2 (en) * | 2004-03-23 | 2007-11-27 | Fujitsu Limited | Distinguishing tilt and translation motion components in handheld devices |
US20050212760A1 (en) * | 2004-03-23 | 2005-09-29 | Marvit David L | Gesture based user interface supporting preexisting symbols |
US7301526B2 (en) * | 2004-03-23 | 2007-11-27 | Fujitsu Limited | Dynamic adaptation of gestures for motion controlled handheld devices |
US7180501B2 (en) * | 2004-03-23 | 2007-02-20 | Fujitsu Limited | Gesture based navigation of a handheld user interface |
US7176887B2 (en) * | 2004-03-23 | 2007-02-13 | Fujitsu Limited | Environmental modeling for motion controlled handheld devices |
US7180500B2 (en) * | 2004-03-23 | 2007-02-20 | Fujitsu Limited | User definable gestures for motion controlled handheld devices |
US7176886B2 (en) * | 2004-03-23 | 2007-02-13 | Fujitsu Limited | Spatial signatures |
US7280096B2 (en) * | 2004-03-23 | 2007-10-09 | Fujitsu Limited | Motion sensor engagement for a handheld device |
US20080313172A1 (en) * | 2004-12-03 | 2008-12-18 | King Martin T | Determining actions involving captured information and electronic content associated with rendered documents |
US9008447B2 (en) * | 2004-04-01 | 2015-04-14 | Google Inc. | Method and system for character recognition |
US7894670B2 (en) | 2004-04-01 | 2011-02-22 | Exbiblio B.V. | Triggering actions in response to optically or acoustically capturing keywords from a rendered document |
US20060081714A1 (en) | 2004-08-23 | 2006-04-20 | King Martin T | Portable scanning device |
US9116890B2 (en) | 2004-04-01 | 2015-08-25 | Google Inc. | Triggering actions in response to optically or acoustically capturing keywords from a rendered document |
US8146156B2 (en) | 2004-04-01 | 2012-03-27 | Google Inc. | Archive of text captures from rendered documents |
US20070300142A1 (en) * | 2005-04-01 | 2007-12-27 | King Martin T | Contextual dynamic advertising based upon captured rendered text |
US7990556B2 (en) | 2004-12-03 | 2011-08-02 | Google Inc. | Association of a portable scanner with input/output and storage devices |
WO2008028674A2 (en) | 2006-09-08 | 2008-03-13 | Exbiblio B.V. | Optical scanners, such as hand-held optical scanners |
US8081849B2 (en) * | 2004-12-03 | 2011-12-20 | Google Inc. | Portable scanning and memory device |
US9143638B2 (en) | 2004-04-01 | 2015-09-22 | Google Inc. | Data capture from rendered documents using handheld device |
US20060098900A1 (en) * | 2004-09-27 | 2006-05-11 | King Martin T | Secure data gathering from rendered documents |
US8713418B2 (en) * | 2004-04-12 | 2014-04-29 | Google Inc. | Adding value to a rendered document |
US8489624B2 (en) * | 2004-05-17 | 2013-07-16 | Google, Inc. | Processing techniques for text capture from a rendered document |
US8620083B2 (en) | 2004-12-03 | 2013-12-31 | Google Inc. | Method and system for character recognition |
US8874504B2 (en) | 2004-12-03 | 2014-10-28 | Google Inc. | Processing techniques for visual capture data from a rendered document |
WO2005119356A2 (en) | 2004-05-28 | 2005-12-15 | Erik Jan Banning | Interactive direct-pointing system and calibration method |
US20090033630A1 (en) * | 2004-06-04 | 2009-02-05 | Koninklijke Philips Electronics, N.V. | hand-held device for content navigation by a user |
US8346620B2 (en) | 2004-07-19 | 2013-01-01 | Google Inc. | Automatic modification of web pages |
WO2006023153A1 (en) * | 2004-08-23 | 2006-03-02 | Gamecaster, Inc. | Apparatus, methods and systems for viewing and manipulating a virtual environment |
US20060066567A1 (en) * | 2004-09-29 | 2006-03-30 | Scharenbroch Gregory K | System and method of controlling scrolling text display |
KR100641182B1 (en) * | 2004-12-30 | 2006-11-02 | 엘지전자 주식회사 | Apparatus and method for moving virtual screen in a mobile terminal |
US20080153591A1 (en) * | 2005-03-07 | 2008-06-26 | Leonidas Deligiannidis | Teleportation Systems and Methods in a Virtual Environment |
US9285897B2 (en) | 2005-07-13 | 2016-03-15 | Ultimate Pointer, L.L.C. | Easily deployable interactive direct-pointing system and calibration method therefor |
US20070035563A1 (en) * | 2005-08-12 | 2007-02-15 | The Board Of Trustees Of Michigan State University | Augmented reality spatial interaction and navigational system |
US7647175B2 (en) * | 2005-09-09 | 2010-01-12 | Rembrandt Technologies, Lp | Discrete inertial display navigation |
US20070057911A1 (en) * | 2005-09-12 | 2007-03-15 | Sina Fateh | System and method for wireless network content conversion for intuitively controlled portable displays |
US20080082363A1 (en) * | 2005-10-07 | 2008-04-03 | Nader Habashi | On-line healthcare consultation services system and method of using same |
US20070091037A1 (en) * | 2005-10-21 | 2007-04-26 | Yee-Chun Lee | Energy Efficient Compact Display For Mobile Device |
US7606552B2 (en) * | 2005-11-10 | 2009-10-20 | Research In Motion Limited | System and method for activating an electronic device |
WO2007060604A2 (en) * | 2005-11-25 | 2007-05-31 | Koninklijke Philips Electronics N.V. | Filtering pointer coordinates |
KR101107538B1 (en) * | 2006-03-15 | 2012-02-08 | 퀄컴 인코포레이티드 | Sensor-based orientation system |
JP4684147B2 (en) * | 2006-03-28 | 2011-05-18 | 任天堂株式会社 | Inclination calculation device, inclination calculation program, game device, and game program |
US7928926B2 (en) * | 2006-06-27 | 2011-04-19 | Panasonic Corporation | Display apparatus and method for hands free operation that selects a function when window is within field of view |
US20080055315A1 (en) * | 2006-09-05 | 2008-03-06 | Dale Ducharme | Method and System to Establish and Animate a Coordinate System for Content on a Display |
KR101443404B1 (en) * | 2006-09-15 | 2014-10-02 | 구글 인코포레이티드 | Capture and display of annotations in paper and electronic documents |
US9767599B2 (en) * | 2006-12-29 | 2017-09-19 | X-Rite Inc. | Surface appearance simulation |
US9235262B2 (en) * | 2009-05-08 | 2016-01-12 | Kopin Corporation | Remote control of host application using motion and voice commands |
TWI377055B (en) * | 2007-08-10 | 2012-11-21 | Ind Tech Res Inst | Interactive rehabilitation method and system for upper and lower extremities |
US8638363B2 (en) | 2009-02-18 | 2014-01-28 | Google Inc. | Automatically capturing information, such as capturing information using a document-aware device |
WO2009120984A1 (en) | 2008-03-28 | 2009-10-01 | Kopin Corporation | Handheld wireless display device having high-resolution display suitable for use as a mobile internet device |
US7953462B2 (en) | 2008-08-04 | 2011-05-31 | Vartanian Harry | Apparatus and method for providing an adaptively responsive flexible display device |
DE102008055180A1 (en) * | 2008-12-30 | 2010-07-01 | Sennheiser Electronic Gmbh & Co. Kg | Control system, handset and control methods |
US8447066B2 (en) | 2009-03-12 | 2013-05-21 | Google Inc. | Performing actions based on capturing information from rendered documents, such as documents under copyright |
WO2010105246A2 (en) | 2009-03-12 | 2010-09-16 | Exbiblio B.V. | Accessing resources based on capturing information from a rendered document |
US8849570B2 (en) * | 2009-03-19 | 2014-09-30 | Microsoft Corporation | Projected way-finding |
US8121640B2 (en) | 2009-03-19 | 2012-02-21 | Microsoft Corporation | Dual module portable devices |
US20100241999A1 (en) * | 2009-03-19 | 2010-09-23 | Microsoft Corporation | Canvas Manipulation Using 3D Spatial Gestures |
US20100315329A1 (en) * | 2009-06-12 | 2010-12-16 | Southwest Research Institute | Wearable workspace |
US20100321482A1 (en) * | 2009-06-17 | 2010-12-23 | Lc Technologies Inc. | Eye/head controls for camera pointing |
WO2011044680A1 (en) * | 2009-10-13 | 2011-04-21 | Recon Instruments Inc. | Control systems and methods for head-mounted information systems |
US8890657B2 (en) * | 2009-10-30 | 2014-11-18 | Symbol Technologies, Inc. | System and method for operating an RFID system with head tracking |
US9081799B2 (en) | 2009-12-04 | 2015-07-14 | Google Inc. | Using gestalt information to identify locations in printed information |
US9323784B2 (en) | 2009-12-09 | 2016-04-26 | Google Inc. | Image search using text-based elements within the contents of images |
EP2679013A2 (en) | 2010-02-23 | 2014-01-01 | MUV Interactive Ltd. | A system for projecting content to a display surface having user-controlled size, shape and location/direction and apparatus and methods useful in conjunction therewith |
US9880619B2 (en) | 2010-02-23 | 2018-01-30 | MUV Interactive Ltd. | Virtual reality system with a finger-wearable control |
US20120001932A1 (en) * | 2010-07-02 | 2012-01-05 | Burnett William R | Systems and methods for assisting visually-impaired users to view visual content |
US9491560B2 (en) * | 2010-07-20 | 2016-11-08 | Analog Devices, Inc. | System and method for improving headphone spatial impression |
US8780014B2 (en) | 2010-08-25 | 2014-07-15 | Eastman Kodak Company | Switchable head-mounted display |
US9111498B2 (en) | 2010-08-25 | 2015-08-18 | Eastman Kodak Company | Head-mounted display with environmental state detection |
US8619005B2 (en) * | 2010-09-09 | 2013-12-31 | Eastman Kodak Company | Switchable head-mounted display transition |
US9377862B2 (en) | 2010-09-20 | 2016-06-28 | Kopin Corporation | Searchlight navigation using headtracker to reveal hidden or extra document data |
US10013976B2 (en) | 2010-09-20 | 2018-07-03 | Kopin Corporation | Context sensitive overlays in voice controlled headset computer displays |
US9122307B2 (en) * | 2010-09-20 | 2015-09-01 | Kopin Corporation | Advanced remote control of host application using motion and voice commands |
US9316827B2 (en) | 2010-09-20 | 2016-04-19 | Kopin Corporation | LifeBoard—series of home pages for head mounted displays (HMD) that respond to head tracking |
US8953570B2 (en) | 2010-11-23 | 2015-02-10 | Symbol Technologies, Inc. | Radio frequency identification system and related operating methods |
DE102010062607A1 (en) * | 2010-12-08 | 2012-06-14 | Robert Bosch Gmbh | Device for generating an input signal |
WO2012154938A1 (en) | 2011-05-10 | 2012-11-15 | Kopin Corporation | Headset computer that uses motion and voice commands to control information display and remote devices |
US8754786B2 (en) * | 2011-06-30 | 2014-06-17 | General Electric Company | Method of operating a synthetic vision system in an aircraft |
US8912979B1 (en) * | 2011-07-14 | 2014-12-16 | Google Inc. | Virtual window in head-mounted display |
TWI524258B (en) * | 2011-08-19 | 2016-03-01 | 鴻海精密工業股份有限公司 | Electronic book display adjustment system and method |
US8990682B1 (en) | 2011-10-05 | 2015-03-24 | Google Inc. | Methods and devices for rendering interactions between virtual and physical objects on a substantially transparent display |
US9081177B2 (en) | 2011-10-07 | 2015-07-14 | Google Inc. | Wearable computer with nearby object response |
WO2013101438A1 (en) | 2011-12-29 | 2013-07-04 | Kopin Corporation | Wireless hands-free computing head mounted video eyewear for local/remote diagnosis and repair |
EP2800993A2 (en) | 2012-01-06 | 2014-11-12 | HPO Assets LLC | Eyewear docking station and electronic module |
US20160011724A1 (en) * | 2012-01-06 | 2016-01-14 | Google Inc. | Hands-Free Selection Using a Ring-Based User-Interface |
US9213185B1 (en) * | 2012-01-06 | 2015-12-15 | Google Inc. | Display scaling based on movement of a head-mounted display |
US9153043B1 (en) | 2012-02-16 | 2015-10-06 | Google, Inc. | Systems and methods for providing a user interface in a field of view of a media item |
JP5880115B2 (en) * | 2012-02-17 | 2016-03-08 | ソニー株式会社 | Head mounted display, head mounted display control program, and head mounted display control method |
US8947322B1 (en) | 2012-03-19 | 2015-02-03 | Google Inc. | Context detection and context-based user-interface population |
US8947323B1 (en) * | 2012-03-20 | 2015-02-03 | Hayes Solos Raffle | Content display methods |
US9710056B2 (en) | 2012-03-21 | 2017-07-18 | Google Inc. | Methods and systems for correlating movement of a device with state changes of the device |
US9170648B2 (en) * | 2012-04-03 | 2015-10-27 | The Boeing Company | System and method for virtual engineering |
US8854415B2 (en) | 2012-04-03 | 2014-10-07 | Cisco Technology, Inc. | Motion responsive video capture during a video conference |
US8929954B2 (en) | 2012-04-25 | 2015-01-06 | Kopin Corporation | Headset computer (HSC) as auxiliary display with ASR and HT input |
JP6289448B2 (en) | 2012-04-25 | 2018-03-07 | コピン コーポレーション | Instant translation system |
US9218526B2 (en) | 2012-05-24 | 2015-12-22 | HJ Laboratories, LLC | Apparatus and method to detect a paper document using one or more sensors |
US20140136960A1 (en) * | 2012-11-13 | 2014-05-15 | Microsoft Corporation | Content-Aware Scrolling |
JP6155622B2 (en) * | 2012-12-18 | 2017-07-05 | セイコーエプソン株式会社 | Display device, head-mounted display device, display device control method, and head-mounted display device control method |
US10528135B2 (en) | 2013-01-14 | 2020-01-07 | Ctrl-Labs Corporation | Wearable muscle interface systems, devices and methods that interact with content displayed on an electronic display |
JP6179412B2 (en) * | 2013-01-31 | 2017-08-16 | 株式会社Jvcケンウッド | Input display device |
EP2960896B1 (en) | 2013-02-22 | 2022-07-20 | Sony Group Corporation | Head-mounted display and image display device |
US20140280502A1 (en) | 2013-03-15 | 2014-09-18 | John Cronin | Crowd and cloud enabled virtual reality distributed location network |
US20140280506A1 (en) | 2013-03-15 | 2014-09-18 | John Cronin | Virtual reality enhanced through browser connections |
US20140280505A1 (en) | 2013-03-15 | 2014-09-18 | John Cronin | Virtual reality interaction with 3d printing |
US20140282113A1 (en) | 2013-03-15 | 2014-09-18 | John Cronin | Personal digital assistance and virtual reality |
US20140280503A1 (en) | 2013-03-15 | 2014-09-18 | John Cronin | System and methods for effective virtual reality visitor interface |
US9838506B1 (en) | 2013-03-15 | 2017-12-05 | Sony Interactive Entertainment America Llc | Virtual reality universe representation changes viewing based upon client side parameters |
US20140280644A1 (en) | 2013-03-15 | 2014-09-18 | John Cronin | Real time unified communications interaction of a predefined location in a virtual reality location |
US9507426B2 (en) | 2013-03-27 | 2016-11-29 | Google Inc. | Using the Z-axis in user interfaces for head mountable displays |
US9213403B1 (en) | 2013-03-27 | 2015-12-15 | Google Inc. | Methods to pan, zoom, crop, and proportionally move on a head mountable display |
US10152082B2 (en) | 2013-05-13 | 2018-12-11 | North Inc. | Systems, articles and methods for wearable electronic devices that accommodate different user forms |
US10262462B2 (en) | 2014-04-18 | 2019-04-16 | Magic Leap, Inc. | Systems and methods for augmented and virtual reality |
US9146618B2 (en) | 2013-06-28 | 2015-09-29 | Google Inc. | Unlocking a head mounted device |
US10042422B2 (en) | 2013-11-12 | 2018-08-07 | Thalmic Labs Inc. | Systems, articles, and methods for capacitive electromyography sensors |
US10188309B2 (en) | 2013-11-27 | 2019-01-29 | North Inc. | Systems, articles, and methods for electromyography sensors |
US11921471B2 (en) | 2013-08-16 | 2024-03-05 | Meta Platforms Technologies, Llc | Systems, articles, and methods for wearable devices having secondary power sources in links of a band for providing secondary power in addition to a primary power source |
US11426123B2 (en) | 2013-08-16 | 2022-08-30 | Meta Platforms Technologies, Llc | Systems, articles and methods for signal routing in wearable electronic devices that detect muscle activity of a user using a set of discrete and separately enclosed pod structures |
US20150124566A1 (en) | 2013-10-04 | 2015-05-07 | Thalmic Labs Inc. | Systems, articles and methods for wearable electronic devices employing contact sensors |
US9788789B2 (en) | 2013-08-30 | 2017-10-17 | Thalmic Labs Inc. | Systems, articles, and methods for stretchable printed circuit boards |
US9256072B2 (en) * | 2013-10-02 | 2016-02-09 | Philip Scott Lyren | Wearable electronic glasses that detect movement of a real object copies movement of a virtual object |
WO2015084227A1 (en) * | 2013-12-06 | 2015-06-11 | Telefonaktiebolaget L M Ericsson (Publ) | Optical head mounted display, television portal module and methods for controlling graphical user interface |
WO2015094191A1 (en) * | 2013-12-17 | 2015-06-25 | Intel Corporation | Controlling vision correction using eye tracking and depth detection |
US9588343B2 (en) | 2014-01-25 | 2017-03-07 | Sony Interactive Entertainment America Llc | Menu navigation in a head-mounted display |
US9437159B2 (en) * | 2014-01-25 | 2016-09-06 | Sony Interactive Entertainment America Llc | Environmental interrupt in a head-mounted display and utilization of non field of view real estate |
US9488833B2 (en) * | 2014-02-07 | 2016-11-08 | International Business Machines Corporation | Intelligent glasses for the visually impaired |
JP2017509386A (en) | 2014-02-14 | 2017-04-06 | サルミック ラブス インコーポレイテッド | System, product and method for elastic electrical cable and wearable electronic device using the same |
JP6307627B2 (en) | 2014-03-14 | 2018-04-04 | 株式会社ソニー・インタラクティブエンタテインメント | Game console with space sensing |
US10199008B2 (en) | 2014-03-27 | 2019-02-05 | North Inc. | Systems, devices, and methods for wearable electronic devices as state machines |
JP2017118159A (en) * | 2014-12-17 | 2017-06-29 | 株式会社テレパシージャパン | Wearable image display device including flexible support member |
US20150325202A1 (en) * | 2014-05-07 | 2015-11-12 | Thalmic Labs Inc. | Systems, devices, and methods for wearable computers with heads-up displays |
WO2015170520A1 (en) * | 2014-05-09 | 2015-11-12 | ソニー株式会社 | Information processing system and information processing method |
CN103984659B (en) * | 2014-05-15 | 2017-07-21 | 华为技术有限公司 | Method and apparatus for time-shared use of a serial port |
US9880632B2 (en) | 2014-06-19 | 2018-01-30 | Thalmic Labs Inc. | Systems, devices, and methods for gesture identification |
US9766449B2 (en) | 2014-06-25 | 2017-09-19 | Thalmic Labs Inc. | Systems, devices, and methods for wearable heads-up displays |
KR101728408B1 (en) | 2014-09-22 | 2017-04-19 | (주)에프엑스기어 | Apparatus and method for low latency simulation using estimation of orientation, and computer program for the same |
US10585485B1 (en) | 2014-11-10 | 2020-03-10 | Amazon Technologies, Inc. | Controlling content zoom level based on user head movement |
KR102178298B1 (en) | 2014-11-21 | 2020-11-12 | 삼성전자주식회사 | Method for controlling display and apparatus supplying the same |
US9807221B2 (en) | 2014-11-28 | 2017-10-31 | Thalmic Labs Inc. | Systems, devices, and methods effected in response to establishing and/or terminating a physical communications link |
DE102014018056A1 (en) | 2014-12-05 | 2016-06-09 | Audi Ag | Method of operating a virtual reality glasses and virtual reality glasses |
US9563270B2 (en) * | 2014-12-26 | 2017-02-07 | Microsoft Technology Licensing, Llc | Head-based targeting with pitch amplification |
CN105807798B (en) * | 2014-12-31 | 2018-11-30 | 上海乐相科技有限公司 | Vibration control method and device for head-mounted smart glasses |
CN107820578A (en) | 2015-02-17 | 2018-03-20 | 赛尔米克实验室公司 | Systems, devices, and methods for eyebox expansion in wearable heads-up displays |
US9958682B1 (en) | 2015-02-17 | 2018-05-01 | Thalmic Labs Inc. | Systems, devices, and methods for splitter optics in wearable heads-up displays |
US10379604B2 (en) | 2015-04-10 | 2019-08-13 | Virzoom, Inc. | Virtual reality exercise game |
US10078435B2 (en) | 2015-04-24 | 2018-09-18 | Thalmic Labs Inc. | Systems, methods, and computer program products for interacting with electronically displayed presentation materials |
US10133075B2 (en) | 2015-05-04 | 2018-11-20 | Thalmic Labs Inc. | Systems, devices, and methods for angle- and wavelength-multiplexed holographic optical elements |
US10078220B2 (en) | 2015-05-28 | 2018-09-18 | Thalmic Labs Inc. | Wearable heads-up display with integrated eye tracker |
CN106310655A (en) * | 2015-06-22 | 2017-01-11 | 鸿富锦精密工业(深圳)有限公司 | An electronic device and a head-mounted display device |
CA2996721A1 (en) | 2015-09-04 | 2017-03-09 | Thalmic Labs Inc. | Systems, articles, and methods for integrating holographic optical elements with eyeglass lenses |
US10139902B2 (en) * | 2015-09-16 | 2018-11-27 | Colopl, Inc. | Method and apparatus for changing a field of view without synchronization with movement of a head-mounted display |
US20170092002A1 (en) * | 2015-09-30 | 2017-03-30 | Daqri, Llc | User interface for augmented reality system |
CA3007196A1 (en) | 2015-10-01 | 2017-04-06 | Thalmic Labs Inc. | Systems, devices, and methods for interacting with content displayed on head-mounted displays |
US9904051B2 (en) | 2015-10-23 | 2018-02-27 | Thalmic Labs Inc. | Systems, devices, and methods for laser eye tracking |
US10802190B2 (en) | 2015-12-17 | 2020-10-13 | Covestro Llc | Systems, devices, and methods for curved holographic optical elements |
US10303246B2 (en) | 2016-01-20 | 2019-05-28 | North Inc. | Systems, devices, and methods for proximity-based eye tracking |
US10151926B2 (en) | 2016-01-29 | 2018-12-11 | North Inc. | Systems, devices, and methods for preventing eyebox degradation in a wearable heads-up display |
US10568502B2 (en) | 2016-03-23 | 2020-02-25 | The Chinese University Of Hong Kong | Visual disability detection system using virtual reality |
JP2019518979A (en) | 2016-04-13 | 2019-07-04 | ノース インコーポレイテッドNorth Inc. | System, device and method for focusing a laser projector |
JP6563592B2 (en) * | 2016-05-02 | 2019-08-21 | 株式会社ソニー・インタラクティブエンタテインメント | Display control apparatus, display control method, and program |
CN106200899A (en) * | 2016-06-24 | 2016-12-07 | 北京奇思信息技术有限公司 | Method and system for controlling virtual reality interaction according to the user's head movements |
US11216069B2 (en) | 2018-05-08 | 2022-01-04 | Facebook Technologies, Llc | Systems and methods for improved speech recognition using neuromuscular information |
CN110300542A (en) | 2016-07-25 | 2019-10-01 | 开创拉布斯公司 | Use the method and apparatus of wearable automated sensor prediction muscle skeleton location information |
US10277874B2 (en) | 2016-07-27 | 2019-04-30 | North Inc. | Systems, devices, and methods for laser projectors |
WO2018027326A1 (en) | 2016-08-12 | 2018-02-15 | Thalmic Labs Inc. | Systems, devices, and methods for variable luminance in wearable heads-up displays |
JP6779715B2 (en) * | 2016-09-02 | 2020-11-04 | 株式会社Living Anywhere Garage | Information processing system |
US10345596B2 (en) | 2016-11-10 | 2019-07-09 | North Inc. | Systems, devices, and methods for astigmatism compensation in a wearable heads-up display |
WO2018098579A1 (en) | 2016-11-30 | 2018-06-07 | Thalmic Labs Inc. | Systems, devices, and methods for laser eye tracking in wearable heads-up displays |
US10663732B2 (en) | 2016-12-23 | 2020-05-26 | North Inc. | Systems, devices, and methods for beam combining in wearable heads-up displays |
US10437073B2 (en) | 2017-01-25 | 2019-10-08 | North Inc. | Systems, devices, and methods for beam combining in laser projectors |
US10281977B2 (en) * | 2017-05-25 | 2019-05-07 | Acer Incorporated | Virtual reality systems with human interface device emulation and related methods |
WO2019075134A1 (en) * | 2017-10-10 | 2019-04-18 | Baudisch, Patrick | A haptic device that allows blind users to interact in real-time in virtual worlds |
EP3697297A4 (en) | 2017-10-19 | 2020-12-16 | Facebook Technologies, Inc. | Systems and methods for identifying biological structures associated with neuromuscular source signals |
US20190121133A1 (en) | 2017-10-23 | 2019-04-25 | North Inc. | Free space multiple laser diode modules |
WO2019084325A1 (en) * | 2017-10-27 | 2019-05-02 | Magic Leap, Inc. | Virtual reticle for augmented reality systems |
US10937414B2 (en) | 2018-05-08 | 2021-03-02 | Facebook Technologies, Llc | Systems and methods for text input using neuromuscular information |
US11150730B1 (en) | 2019-04-30 | 2021-10-19 | Facebook Technologies, Llc | Devices, systems, and methods for controlling computing devices via neuromuscular signals of users |
US11481030B2 (en) | 2019-03-29 | 2022-10-25 | Meta Platforms Technologies, Llc | Methods and apparatus for gesture detection and classification |
US11493993B2 (en) | 2019-09-04 | 2022-11-08 | Meta Platforms Technologies, Llc | Systems, methods, and interfaces for performing inputs based on neuromuscular control |
US11907423B2 (en) | 2019-11-25 | 2024-02-20 | Meta Platforms Technologies, Llc | Systems and methods for contextualized interactions with an environment |
US10592001B2 (en) | 2018-05-08 | 2020-03-17 | Facebook Technologies, Llc | Systems and methods for improved speech recognition using neuromuscular information |
CN109166433B (en) * | 2018-08-16 | 2021-09-28 | 医博士医教科技(深圳)有限公司 | Medical anthropomorphic dummy system |
US10905350B2 (en) | 2018-08-31 | 2021-02-02 | Facebook Technologies, Llc | Camera-guided interpretation of neuromuscular signals |
CN112789577B (en) | 2018-09-20 | 2024-04-05 | 元平台技术有限公司 | Neuromuscular text input, writing and drawing in augmented reality systems |
US11797087B2 (en) | 2018-11-27 | 2023-10-24 | Meta Platforms Technologies, Llc | Methods and apparatus for autocalibration of a wearable electrode sensor system |
US10893127B1 (en) | 2019-07-26 | 2021-01-12 | Arkade, Inc. | System and method for communicating interactive data between heterogeneous devices |
US10773157B1 (en) | 2019-07-26 | 2020-09-15 | Arkade, Inc. | Interactive computing devices and accessories |
US10946272B2 (en) | 2019-07-26 | 2021-03-16 | Arkade, Inc. | PC blaster game console |
US11315326B2 (en) * | 2019-10-15 | 2022-04-26 | At&T Intellectual Property I, L.P. | Extended reality anchor caching based on viewport prediction |
US11868531B1 (en) | 2021-04-08 | 2024-01-09 | Meta Platforms Technologies, Llc | Wearable device providing for thumb-to-finger-based input gestures detected based on neuromuscular signals, and systems and methods of use thereof |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4836670A (en) * | 1987-08-19 | 1989-06-06 | Center For Innovative Technology | Eye movement detector |
US6351261B1 (en) * | 1993-08-31 | 2002-02-26 | Sun Microsystems, Inc. | System and method for a virtual reality system having a frame buffer that stores a plurality of view points that can be selected and viewed by the user |
- 1999
- 1999-09-22 WO PCT/US1999/021235 patent/WO2000017848A1/en not_active Application Discontinuation
- 1999-09-22 US US09/404,051 patent/US6184847B1/en not_active Ceased
- 1999-09-22 JP JP2000571431A patent/JP2002525769A/en active Pending
- 1999-09-22 EP EP99948245A patent/EP1116211A4/en not_active Withdrawn
- 2009
- 2009-10-06 US US12/574,607 patent/USRE42336E1/en not_active Expired - Lifetime
Patent Citations (74)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4227209A (en) | 1978-08-09 | 1980-10-07 | The Charles Stark Draper Laboratory, Inc. | Sensory aid for visually handicapped people |
US4209255A (en) | 1979-03-30 | 1980-06-24 | United Technologies Corporation | Single source aiming point locator |
US4567479A (en) | 1982-12-23 | 1986-01-28 | Boyd Barry S | Directional controller apparatus for a video or computer input |
US4565999A (en) | 1983-04-01 | 1986-01-21 | Prime Computer, Inc. | Light pencil |
US4548485A (en) | 1983-09-01 | 1985-10-22 | Stewart Dean | Reading device for the visually handicapped |
US4682159A (en) | 1984-06-20 | 1987-07-21 | Personics Corporation | Apparatus and method for controlling a cursor on a computer display |
US5281957A (en) | 1984-11-14 | 1994-01-25 | Schoolman Scientific Corp. | Portable computer and head mounted display |
US4790028A (en) | 1986-09-12 | 1988-12-06 | Westinghouse Electric Corp. | Method and apparatus for generating variably scaled displays |
US5003300A (en) | 1987-07-27 | 1991-03-26 | Reflection Technology, Inc. | Head mounted display for miniature video display system |
US5195180A (en) | 1988-06-23 | 1993-03-16 | Sharp Kabushiki Kaisha | Method for displaying an image including characters and a background |
US5109282A (en) | 1990-06-20 | 1992-04-28 | Eye Research Institute Of Retina Foundation | Halftone imaging method and apparatus utilizing pyramidal error convergence |
US5267331A (en) | 1990-07-26 | 1993-11-30 | Ronald Siwoff | Digitally enhanced imager for the visually impaired |
US5125046A (en) | 1990-07-26 | 1992-06-23 | Ronald Siwoff | Digitally enhanced imager for the visually impaired |
US5359675A (en) | 1990-07-26 | 1994-10-25 | Ronald Siwoff | Video spectacles |
US5322441A (en) | 1990-10-05 | 1994-06-21 | Texas Instruments Incorporated | Method and apparatus for providing a portable visual display |
US5151722A (en) | 1990-11-05 | 1992-09-29 | The Johns Hopkins University | Video display on spectacle-like frame |
US5367315A (en) | 1990-11-15 | 1994-11-22 | Eyetech Corporation | Method and apparatus for controlling cursor movement |
US5587936A (en) | 1990-11-30 | 1996-12-24 | Vpl Research, Inc. | Method and apparatus for creating sounds in a virtual world by simulating sound in specific locations in space and generating sounds as touch feedback |
US5442734A (en) | 1991-03-06 | 1995-08-15 | Fujitsu Limited | Image processing unit and method for executing image processing of a virtual environment |
US5283560A (en) | 1991-06-25 | 1994-02-01 | Digital Equipment Corporation | Computer system and method for displaying images with superimposed partially transparent menus |
US5450596A (en) | 1991-07-18 | 1995-09-12 | Redwear Interactive Inc. | CD-ROM data retrieval system using a hands-free command controller and headwear monitor |
US6457024B1 (en) | 1991-07-18 | 2002-09-24 | Lee Felsenstein | Wearable hypermedium system |
US5367614A (en) | 1992-04-01 | 1994-11-22 | Grumman Aerospace Corporation | Three-dimensional computer image variable perspective display system |
US5325123A (en) | 1992-04-16 | 1994-06-28 | Bettinardi Edward R | Method and apparatus for variable video magnification |
US5320538A (en) | 1992-09-23 | 1994-06-14 | Hughes Training, Inc. | Interactive aircraft training system and method |
US5675746A (en) | 1992-09-30 | 1997-10-07 | Marshall; Paul S. | Virtual reality generator for use with financial information |
US5422653A (en) | 1993-01-07 | 1995-06-06 | Maguire, Jr.; Francis J. | Passive virtual reality |
US5579026A (en) | 1993-05-14 | 1996-11-26 | Olympus Optical Co., Ltd. | Image display apparatus of head mounted type |
US5373857A (en) | 1993-06-18 | 1994-12-20 | Forte Technologies, Inc. | Head tracking apparatus |
US5526812A (en) | 1993-06-21 | 1996-06-18 | General Electric Company | Display system for enhancing visualization of body structures during medical procedures |
US5590062A (en) | 1993-07-02 | 1996-12-31 | Matsushita Electric Industrial Co., Ltd. | Simulator for producing various living environments mainly for visual perception |
US5581670A (en) | 1993-07-21 | 1996-12-03 | Xerox Corporation | User interface having movable sheet with click-through tools |
US5617114A (en) | 1993-07-21 | 1997-04-01 | Xerox Corporation | User interface having click-through tools that can be composed with other tools |
US5526481A (en) | 1993-07-26 | 1996-06-11 | Dell Usa L.P. | Display scrolling system for personal digital assistant |
US5977935A (en) * | 1993-08-12 | 1999-11-02 | Seiko Epson Corporation | Head-mounted image display device and data processing apparatus including the same |
US5602566A (en) | 1993-08-24 | 1997-02-11 | Hitachi, Ltd. | Small-sized information processor capable of scrolling screen in accordance with tilt, and scrolling method therefor |
US6353436B1 (en) | 1993-08-31 | 2002-03-05 | Sun Microsystems, Inc. | Graphical user interface |
US6061064A (en) | 1993-08-31 | 2000-05-09 | Sun Microsystems, Inc. | System and method for providing and using a computer user interface with a view space having discrete portions |
US5689287A (en) | 1993-10-27 | 1997-11-18 | Xerox Corporation | Context-preserving display system using a perspective sheet |
US5686940A (en) | 1993-12-24 | 1997-11-11 | Rohm Co., Ltd. | Display apparatus |
US5661632A (en) | 1994-01-04 | 1997-08-26 | Dell Usa, L.P. | Hand held computer with dual display screen orientation capability controlled by toggle switches having first and second non-momentary positions |
US6361507B1 (en) | 1994-06-16 | 2002-03-26 | Massachusetts Institute Of Technology | Inertial orientation tracker having gradual automatic drift compensation for tracking human head and other similarly sized body |
US5645077A (en) | 1994-06-16 | 1997-07-08 | Massachusetts Institute Of Technology | Inertial orientation tracker apparatus having automatic drift compensation for tracking human head and other similarly sized body |
US5844544A (en) | 1994-06-17 | 1998-12-01 | H. K. Eyecan Ltd. | Visual communications apparatus employing eye-position monitoring |
US5581271A (en) | 1994-12-05 | 1996-12-03 | Hughes Aircraft Company | Head mounted visual display |
US5683297A (en) | 1994-12-16 | 1997-11-04 | Raviv; Roni | Head mounted modular electronic game system |
US5835077A (en) | 1995-01-13 | 1998-11-10 | Remec, Inc. | Computer control device |
US5742264A (en) * | 1995-01-24 | 1998-04-21 | Matsushita Electric Industrial Co., Ltd. | Head-mounted display |
US6184859B1 (en) | 1995-04-21 | 2001-02-06 | Sony Corporation | Picture display apparatus |
US5991085A (en) | 1995-04-21 | 1999-11-23 | I-O Display Systems Llc | Head-mounted personal visual display apparatus with image generator and holder |
US5734421A (en) | 1995-05-30 | 1998-03-31 | Maguire, Jr.; Francis J. | Apparatus for inducing attitudinal head movements for passive virtual reality |
US5689667A (en) | 1995-06-06 | 1997-11-18 | Silicon Graphics, Inc. | Methods and system of controlling menus with radial and linear portions |
US5926178A (en) | 1995-06-06 | 1999-07-20 | Silicon Graphics, Inc. | Display and control of menus with radial and linear portions |
US5666499A (en) | 1995-08-04 | 1997-09-09 | Silicon Graphics, Inc. | Clickaround tool-based graphical interface with two cursors |
US5790769A (en) | 1995-08-04 | 1998-08-04 | Silicon Graphics Incorporated | System for editing time-based temporal digital media including a pointing device toggling between temporal and translation-rotation modes |
US5844824A (en) | 1995-10-02 | 1998-12-01 | Xybernaut Corporation | Hands-free, portable computer and system |
US5959605A (en) | 1995-11-22 | 1999-09-28 | Picker International, Inc. | Video magnifier |
US6084556A (en) * | 1995-11-28 | 2000-07-04 | Vega Vista, Inc. | Virtual computer monitor |
US6445364B2 (en) * | 1995-11-28 | 2002-09-03 | Vega Vista, Inc. | Portable game display and method for controlling same |
US6127990A (en) | 1995-11-28 | 2000-10-03 | Vega Vista, Inc. | Wearable display and methods for controlling same |
US6359603B1 (en) * | 1995-11-28 | 2002-03-19 | Vega Vista, Inc. | Portable display and methods of controlling same |
US6118427A (en) | 1996-04-18 | 2000-09-12 | Silicon Graphics, Inc. | Graphical user interface with optimal transparency thresholds for maximizing user performance and system efficiency |
US6590583B2 (en) | 1996-05-14 | 2003-07-08 | Planetweb, Inc. | Method for context-preserving magnification of digital image regions |
US5689619A (en) | 1996-08-09 | 1997-11-18 | The United States Of America As Represented By The Secretary Of The Army | Eyetracker control of heads-up displays |
US6115028A (en) | 1996-08-22 | 2000-09-05 | Silicon Graphics, Inc. | Three dimensional input system using tilt |
US5973669A (en) | 1996-08-22 | 1999-10-26 | Silicon Graphics, Inc. | Temporal data control system |
US5777715A (en) | 1997-01-21 | 1998-07-07 | Allen Vision Systems, Inc. | Low vision rehabilitation system |
US5923307A (en) | 1997-01-27 | 1999-07-13 | Microsoft Corporation | Logical monitor configuration in a multiple monitor environment |
US6292158B1 (en) | 1997-05-08 | 2001-09-18 | Shimadzu Corporation | Display system |
US6115025A (en) | 1997-09-30 | 2000-09-05 | Silicon Graphics, Inc. | System for maintaining orientation of a user interface as a display changes orientation |
US6151563A (en) | 1998-01-14 | 2000-11-21 | Silicon Pie, Inc. | Speed, spin rate, and curve measuring device using magnetic field sensors |
US6148271A (en) | 1998-01-14 | 2000-11-14 | Silicon Pie, Inc. | Speed, spin rate, and curve measuring device |
US6005482A (en) | 1998-09-17 | 1999-12-21 | Xerox Corporation | Surface mounted information collage |
US6184847B1 (en) | 1998-09-22 | 2001-02-06 | Vega Vista, Inc. | Intuitive control of portable data displays |
Non-Patent Citations (74)
Title |
---|
Article entitled "Compensating lags in Head-Coupled Displays Using Head Position Prediction and Image Deflection," Journal of Aircraft, vol. 29, No. 6, Nov.-Dec. 1992, by Richard H.Y. So and Michael J. Griffin (pp. 1064 to 1068). |
Article entitled "Improving Static and Dynamic Registration in an Optical See-through HMD," by Ronald Azuma and Gary Bishop, Computer Graphics Proceedings Annual Conference Series 1994, Jul. 24, 1994 (pp. 197 to 203). |
Article entitled "Priority Rendering with a Virtual Reality Address Recalculation Pipeline" Computer Graphics Proceedings, Annual Conference Series, 1994 (pp. 155 to 162). |
Chameleon apparatus (1990) as depicted in http://www.dgp.toronto.edu/~gf/videos.htm at links http://www.dgp.toronto.edu/~gf/videos/Chameleon.mpg 1994 and http://www.dgp.toronto.edu/~gf/videos/Spatially-aware%20palmtop%20-%20Chameleon.mpg (screenshots and transcription of audio associated with video are provided). |
Examiner's Interview Summary dated Jun. 12, 2000 for U.S. Appl. No. 09/235,096. |
Final Office Action dated Sep. 7, 2000 for U.S. Appl. No. 09/373,186. |
Flyer, "1995 Master Source-Book," Industrial Computer Source, 1995 (2 pages). |
Flyer, "Computer Magnification Systems," TeleSensory, 1995 (4 pages). |
Flyer, "Digital Audio Soundblaster," Creative Labs, Inc., Sep. 19, 1995 (one page). |
Flyer, "Dragon Dictate the Premier PC Dictation Program," Dragon Systems, Inc., Dec. 1994 (4 pages). |
Flyer, "Introducing Head Master Plus the Mouse and Keyboard Alternative for Personal Computers," Prentke Romich Company, Mar. 1995 (2 pages). |
Flyer, "Magnify your screen and your possibilities," ZoomText, Mar. 1995 (two pages). |
Flyer, "Magnum GT Graphics & Text Screen Enlarger," Artic Technologies, Jan. 1, 1995 (one page). |
Flyer, "MGA Power Family," Matrox Graphics Inc., Nov. 1995 (2 pages). |
Flyer, "OPTELEC Independence Solutions for People with Low Vision" Optelec, 1993 (6 pages). |
Flyer, "Talk to your PC Just Voice: Professional Speech Recognition for Windows," Integrated Wave Technologies, Inc., Nov. 1995 (3 pages). |
Flyer, "Ultra-small angular velocity sensor with Murata's unique triangular prism vibrating unit" Gyrostar, Murata Mfg. Co., Ltd., Aug. 29, 1995 (2 pages). |
Flyer, "Virtual Reality Products That Work As Hard As You Do," General Reality Company, Mar. 1995, (6 pages). |
Flyer, "Why Mouse Dual Input Adapters," P.I. Engineering, 1995, (2 pages). |
Flyer, "A Brighter Picture A Fuller Life-the Visionics Low Vision Enhancing," Visionics Corporation, Mar. 1995 (4 pages). |
Flyer, "MAGic Deluxe," Microsystems Software, Inc., Mar. 1995 (two pages). |
Flyer, "Virtual Computer Monitor," General Reality Corporation, Mar. 1995 (2 pages). |
Goodrich GL, Mehr EB, and Darling NC: Parameters in The Use of CCTV's and Optical Aids. Am Jour Optom, vol. 57, No. 12, pp. 881-892, 1980. |
Grant of Petition for Revival of an Application for Patent Abandoned Unintentionally dated Jun. 25, 2001 for U.S. Appl. No. 09/373,186. |
IDS dated Aug. 12, 1999 for U.S. Appl. No. 09/373,186. |
IDS dated Dec. 11, 1998 for U.S. Appl. No. 08/563,525. |
IDS dated Feb. 16, 1996 and Supplemental IDS dated Feb. 27, 1996 for U.S. Appl. No. 08/563,525. |
IDS dated Jan. 21, 1999 for U.S. Appl. No. 09/235,096. |
IDS dated Jan. 30, 2001 for U.S. Appl. No. 09/373,186 and petition under 37 C.F.R. 1.97(d)(2). |
IDS dated Jun. 25, 1999 for U.S. Appl. No. 09/264,799. |
IDS dated Mar. 28, 2001 for U.S. Appl. No. 09/373,186. |
IDS dated Mar. 8, 1999 for U.S. Appl. No. 09/235,096. |
IDS dated Mar. 8, 1999 for U.S. Appl. No. 09/264,799. |
IDS dated Oct. 16, 2000 for U.S. Appl. No. 09/373,186. |
IDS dated Oct. 27, 2000 for U.S. Appl. No. 09/373,186. |
Inertial proprioceptive devices: Self-motion-sensing toys and tools by C. Verplaetse, IBM Systems Journal, vol. 35, Nos. 3&4, 1996, pp. 639-650. |
International Search Report dated Feb. 2, 2000 for PCT Patent Application No. PCT/US99/21235. |
International Search Report dated Sep. 19, 2000 for PCT Patent Application No. PCT/US00/15210. |
Legge G, Pelli D, et al. Report of the Low Vision and its Rehabilitation Panel. Vision Research-A National Plan 1994-1998, A Report of the National Advisory Eye Council, 1994, pp. 304-321. |
Notice of Abandonment dated Dec. 2, 2003 for U.S. Appl. No. 10/183,181. |
Notice of Abandonment dated Dec. 27, 1999 for U.S. Appl. No. 08/563,525. |
Notice of Abandonment dated Jun. 15, 2004 for U.S. Appl. No. 09/895,576. |
Notice of Allowance dated Aug. 13, 2001 for U.S. Appl. No. 09/373,186. |
Notice of Allowance dated Jul. 19, 1999 for U.S. Appl. No. 09/235,096. |
Notice of Allowance dated Jul. 29, 1999 for U.S. Appl. No. 09/264,799. |
Notice of Allowance dated May 21, 2002 for U.S. Appl. No. 09/895,765. |
Notice of Allowance dated Nov. 2, 1998 for U.S. Appl. No. 08/563,525. |
Office Action dated Apr. 6, 1998 for U.S. Appl. No. 08/563,525. |
Office Action dated Apr. 9, 2003 for U.S. Appl. No. 10/183,181. |
Office Action dated Dec. 5, 2003 for U.S. Appl. No. 09/895,576. |
Office Action dated Feb. 15, 2002 for U.S. Appl. No. 09/895,765. |
Office Action dated Jun. 10, 1999 for U.S. Appl. No. 09/235,096. |
Office Action dated Mar. 21, 2000 for U.S. Appl. No. 09/373,186. |
Office Action dated Oct. 3, 2002 for U.S. Appl. No. 10/183,181. |
Petition for Revival of an Application for Patent Abandoned Unintentionally and RCE dated May 11, 2001 for U.S. Appl. No. 09/373,186. |
Preliminary Amendment dated Aug. 12, 1999 for U.S. Appl. No. 09/373,186. |
Preliminary Amendment dated Jan. 21, 1999 for U.S. Appl. No. 09/235,096. |
Preliminary Amendment dated Mar. 8, 1999 for U.S. Appl. No. 09/264,799. |
Preliminary Amendment dated Sep. 24, 2002 for U.S. Appl. No. 10/183,181. |
Publication entitled "Virtual Computer Monitor For Visually Impaired Users" by Arthur L. Zwern, General Reality Company and Michael R. Clark, Apple Computer Corporation, Nov. 30, 1994 (9 pages). |
Publication entitled "Virtual Computer Monitor for Visually Impaired Users" by Arthur L. Zwern, General Reality Company and Michael R. Clark, Apple Computer, Inc., Advanced Technology Group, Aug. 28, 1995 (10 pages). |
Request for Certificate of Correction dated Apr. 30, 2002 for U.S. Appl. No. 09/373,186. |
Request for Corrected Filing Receipt dated Jun. 12, 2001 for U.S. Appl. No. 09/373,186. |
Response to Apr. 6, 1998 Office Action for U.S. Appl. No. 08/563,525 dated Aug. 11, 1998. |
Response to Jun. 10, 1999 Office Action for U.S. Appl. No. 09/235,096 dated Jun. 29, 1999. |
Response to Mar. 21, 2000 Office Action for U.S. Appl. No. 09/373,186 dated Jun. 21, 2000. |
Response to Oct. 3, 2002 Office Action for U.S. Appl. No. 10/183,181 dated Jan. 16, 2003. |
Response to Sep. 7, 2000 Final Office Action for U.S. Appl. No. 09/373,186 dated Dec. 7, 2000. |
Re-submission of Response to Sep. 7, 2000 Final Office Action for U.S. Appl. No. 09/373,186 dated Apr. 30, 2001. |
Slides on "Virtual Computer Display for Visually-Impaired Users," CyberEye by General Reality, Nov. 30, 1994 (12 pages). |
Slides, "Anta" poster paper, General Reality Company, Feb. 13, 1996 (12 pages). |
Slides, "Virtual Computer Monitor for Visually-Impaired Users" by Arthur Zwern, General Reality Company and Michael Clark, Apple Computer Corporation, Aug. 28, 1995 (12 pages). |
Terminal Disclaimer dated Apr. 16, 2002 for U.S. Appl. No. 09/895,765. |
Cited By (44)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8606316B2 (en) * | 2009-10-21 | 2013-12-10 | Xerox Corporation | Portable blind aid device |
US20110092249A1 (en) * | 2009-10-21 | 2011-04-21 | Xerox Corporation | Portable blind aid device |
US9134534B2 (en) | 2010-02-28 | 2015-09-15 | Microsoft Technology Licensing, Llc | See-through near-eye display glasses including a modular image source |
US9366862B2 (en) | 2010-02-28 | 2016-06-14 | Microsoft Technology Licensing, Llc | System and method for delivering content to a group of see-through near eye display eyepieces |
US10539787B2 (en) | 2010-02-28 | 2020-01-21 | Microsoft Technology Licensing, Llc | Head-worn adaptive display |
US9341843B2 (en) | 2010-02-28 | 2016-05-17 | Microsoft Technology Licensing, Llc | See-through near-eye display glasses with a small scale image source |
US10268888B2 (en) | 2010-02-28 | 2019-04-23 | Microsoft Technology Licensing, Llc | Method and apparatus for biometric data capture |
US20120194549A1 (en) * | 2010-02-28 | 2012-08-02 | Osterhout Group, Inc. | Ar glasses specific user interface based on a connected external device type |
US10180572B2 (en) | 2010-02-28 | 2019-01-15 | Microsoft Technology Licensing, Llc | AR glasses with event and user action control of external applications |
US9091851B2 (en) | 2010-02-28 | 2015-07-28 | Microsoft Technology Licensing, Llc | Light control in head mounted displays |
US9875406B2 (en) | 2010-02-28 | 2018-01-23 | Microsoft Technology Licensing, Llc | Adjustable extension for temple arm |
US9759917B2 (en) | 2010-02-28 | 2017-09-12 | Microsoft Technology Licensing, Llc | AR glasses with event and sensor triggered AR eyepiece interface to external devices |
US9285589B2 (en) | 2010-02-28 | 2016-03-15 | Microsoft Technology Licensing, Llc | AR glasses with event and sensor triggered control of AR eyepiece applications |
US10860100B2 (en) | 2010-02-28 | 2020-12-08 | Microsoft Technology Licensing, Llc | AR glasses with predictive control of external device based on event input |
US9129295B2 (en) | 2010-02-28 | 2015-09-08 | Microsoft Technology Licensing, Llc | See-through near-eye display glasses with a fast response photochromic film system for quick transition from dark to clear |
US9097891B2 (en) | 2010-02-28 | 2015-08-04 | Microsoft Technology Licensing, Llc | See-through near-eye display glasses including an auto-brightness control for the display brightness based on the brightness in the environment |
US9097890B2 (en) | 2010-02-28 | 2015-08-04 | Microsoft Technology Licensing, Llc | Grating in a light transmissive illumination system for see-through near-eye display glasses |
US9229227B2 (en) | 2010-02-28 | 2016-01-05 | Microsoft Technology Licensing, Llc | See-through near-eye display glasses with a light transmissive wedge shaped illumination system |
US9329689B2 (en) | 2010-02-28 | 2016-05-03 | Microsoft Technology Licensing, Llc | Method and apparatus for biometric data capture |
US9182596B2 (en) | 2010-02-28 | 2015-11-10 | Microsoft Technology Licensing, Llc | See-through near-eye display glasses with the optical assembly including absorptive polarizers or anti-reflective coatings to reduce stray light |
US9223134B2 (en) | 2010-02-28 | 2015-12-29 | Microsoft Technology Licensing, Llc | Optical imperfections in a light transmissive illumination system for see-through near-eye display glasses |
US9128281B2 (en) | 2010-09-14 | 2015-09-08 | Microsoft Technology Licensing, Llc | Eyepiece with uniformly illuminated reflective display |
US20120165099A1 (en) * | 2010-12-22 | 2012-06-28 | Nintendo Co., Ltd. | Game system, game device, storage medium storing game program, and game process method |
US8957910B2 (en) * | 2010-12-22 | 2015-02-17 | Nintendo Co., Ltd. | Game system, game device, storage medium storing game program, and game process method |
US10300383B2 (en) | 2010-12-22 | 2019-05-28 | Nintendo Co., Ltd. | Game system, game device, storage medium storing game program, and game process method |
US9808717B2 (en) | 2010-12-22 | 2017-11-07 | Nintendo Co., Ltd. | Game system, game device, storage medium storing game program, and game process method |
US9013264B2 (en) | 2011-03-12 | 2015-04-21 | Perceptive Devices, Llc | Multipurpose controller for electronic devices, facial expressions management and drowsiness detection |
WO2013028268A1 (en) * | 2011-08-23 | 2013-02-28 | Sony Corporation | Method and system for use in providing three dimensional user interface |
CN103180893B (en) * | 2011-08-23 | 2016-01-20 | 索尼公司 | For providing the method and system of three-dimensional user interface |
CN103180893A (en) * | 2011-08-23 | 2013-06-26 | 索尼公司 | Method and system for use in providing three dimensional user interface |
US8970692B2 (en) | 2011-09-01 | 2015-03-03 | Industrial Technology Research Institute | Head mount personal computer and interactive system using the same |
US9547406B1 (en) | 2011-10-31 | 2017-01-17 | Google Inc. | Velocity-based triggering |
US20130117707A1 (en) * | 2011-11-08 | 2013-05-09 | Google Inc. | Velocity-Based Triggering |
US10067559B2 (en) | 2011-11-30 | 2018-09-04 | Google Llc | Graphical interface having adjustable borders |
US20130182016A1 (en) * | 2012-01-16 | 2013-07-18 | Beijing Lenovo Software Ltd. | Portable device and display processing method |
US9245364B2 (en) * | 2012-01-16 | 2016-01-26 | Lenovo (Beijing) Co., Ltd. | Portable device and display processing method for adjustment of images |
US9035878B1 (en) | 2012-02-29 | 2015-05-19 | Google Inc. | Input system |
US8643951B1 (en) | 2012-03-15 | 2014-02-04 | Google Inc. | Graphical menu and interaction therewith through a viewing window |
US9557152B2 (en) | 2012-09-28 | 2017-01-31 | Google Inc. | Use of comparative sensor data to determine orientation of head relative to body |
US9268136B1 (en) | 2012-09-28 | 2016-02-23 | Google Inc. | Use of comparative sensor data to determine orientation of head relative to body |
US9448687B1 (en) | 2014-02-05 | 2016-09-20 | Google Inc. | Zoomable/translatable browser interface for a head mounted device |
US9996149B1 (en) | 2016-02-22 | 2018-06-12 | Immersacad Corporation | Method for one-touch translational navigation of immersive, virtual reality environments |
US20220155853A1 (en) * | 2020-11-19 | 2022-05-19 | Beijing Boe Optoelectronics Technology Co., Ltd. | Augmented reality information prompting system, display control method, equipment and medium |
US11703945B2 (en) * | 2020-11-19 | 2023-07-18 | Beijing Boe Optoelectronics Technology Co., Ltd. | Augmented reality information prompting system, display control method, equipment and medium |
Also Published As
Publication number | Publication date |
---|---|
EP1116211A1 (en) | 2001-07-18 |
JP2002525769A (en) | 2002-08-13 |
WO2000017848A1 (en) | 2000-03-30 |
US6184847B1 (en) | 2001-02-06 |
EP1116211A4 (en) | 2001-11-21 |
Similar Documents
Publication | Title |
---|---|
USRE42336E1 (en) | Intuitive control of portable data displays | |
US6359603B1 (en) | Portable display and methods of controlling same | |
RU2288512C2 (en) | Method and system for viewing information on display | |
US10318017B2 (en) | Viewing images with tilt control on a hand-held device | |
US9628783B2 (en) | Method for interacting with virtual environment using stereoscope attached to computing device and modifying view of virtual environment based on user input in order to be displayed on portion of display | |
US6124843A (en) | Head mounting type image display system | |
US20100128112A1 (en) | Immersive display system for interacting with three-dimensional content | |
CN114080585A (en) | Virtual user interface using peripheral devices in an artificial reality environment | |
US8179366B2 (en) | Systems and methods for using a movable object to control a computer | |
US20130154913A1 (en) | Systems and methods for a gaze and gesture interface | |
US20040095311A1 (en) | Body-centric virtual interactive apparatus and method | |
US20020158908A1 (en) | Web browser user interface for low-resolution displays | |
JP2014135086A (en) | Three dimensional user interface effects on display by using properties of motion | |
US20160334884A1 (en) | Remote Sensitivity Adjustment in an Interactive Display System | |
CN110489027B (en) | Handheld input device and display position control method and device of indication icon of handheld input device | |
US11183141B2 (en) | Display device and method for controlling same | |
JP2024026103A (en) | Devices, methods, and graphical user interfaces for gaze-based navigation | |
US20060119574A1 (en) | Systems and methods for using a movable object to control a computer | |
Jay et al. | Amplifying head movements with head-mounted displays | |
US6297803B1 (en) | Apparatus and method for image display | |
CN110717993A (en) | Interaction method, system and medium of split type AR glasses system | |
KR100320297B1 (en) | Virtual Space Navigation Interface Method Using Body Icons | |
JP2001337645A (en) | Display system and storage medium | |
CN114253389B (en) | Augmented reality system integrating motion sensor and augmented reality display method | |
TW201925989A (en) | Interactive system |
Legal Events
Code | Title | Description |
---|---|---|
AS | Assignment | Owner name: REMBRANDT PORTABLE DISPLAY TECHNOLOGIES, LP, VIRGINIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:REMBRANDT TECHNOLOGIES, LP;REEL/FRAME:025348/0041 Effective date: 20101109 |
AS | Assignment | Owner name: VEGA VISTA, INC., CALIFORNIA Free format text: PATENT ACQUISITION AGREEMENT;ASSIGNOR:REMBRANDT PORTABLE DISPLAY TECHNOLOGIES, LP;REEL/FRAME:028466/0229 Effective date: 20120329 |
REMI | Maintenance fee reminder mailed | |
FEPP | Fee payment procedure | Free format text: PETITION RELATED TO MAINTENANCE FEES FILED (ORIGINAL EVENT CODE: PMFP); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY |
FEPP | Fee payment procedure | Free format text: PETITION RELATED TO MAINTENANCE FEES GRANTED (ORIGINAL EVENT CODE: PMFG); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY |
LAPS | Lapse for failure to pay maintenance fees | |
REIN | Reinstatement after maintenance fee payment confirmed | |
AS | Assignment | Owner name: VEGA VISTA, INC., CALIFORNIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:REMBRANDT PORTABLE DISPLAY TECHNOLOGIES, LP;REEL/FRAME:030469/0811 Effective date: 20120329 |
PRDP | Patent reinstated due to the acceptance of a late maintenance fee | Effective date: 20110510 |
FPAY | Fee payment | Year of fee payment: 12 |
AS | Assignment | Owner name: REMBRANDT TECHNOLOGIES, LP, PENNSYLVANIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:VEGA VISTA, INC.;REEL/FRAME:030879/0322 Effective date: 20071018 Owner name: VEGA VISTA, INC., CALIFORNIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:FATEH, SINA;FLACK, JAMES F.;ZWERN, ARTHUR L.;SIGNING DATES FROM 20000118 TO 20000905;REEL/FRAME:030876/0458 |
FEPP | Fee payment procedure | Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY |
AS | Assignment | Owner name: FACEBOOK, INC., CALIFORNIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:VEGA VISTA, INC.;REEL/FRAME:042803/0921 Effective date: 20140724 |
AS | Assignment | Owner name: META PLATFORMS, INC., CALIFORNIA Free format text: CHANGE OF NAME;ASSIGNOR:FACEBOOK, INC.;REEL/FRAME:058897/0824 Effective date: 20211028 |