US4034401A - Observer-identification of a target or other point of interest in a viewing field - Google Patents

Observer-identification of a target or other point of interest in a viewing field

Info

Publication number
US4034401A
Authority
US
United States
Prior art keywords
eye
observer
point
display
head
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Lifetime
Application number
US05/678,795
Inventor
George Mann
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Smiths Group PLC
Original Assignee
Smiths Group PLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Smiths Group PLC filed Critical Smiths Group PLC
Application granted granted Critical
Publication of US4034401A publication Critical patent/US4034401A/en
Anticipated expiration legal-status Critical
Expired - Lifetime legal-status Critical Current

Classifications

    • FMECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F41WEAPONS
    • F41GWEAPON SIGHTS; AIMING
    • F41G3/00Aiming or laying means
    • F41G3/22Aiming or laying means for vehicle-borne armament, e.g. on aircraft
    • F41G3/225Helmet sighting systems

Abstract

A head-up display system in a military aircraft provides projection of flight and weapon-aiming information into the pilot's line-of-sight through a partially-transparent reflector. The pilot operates a button firstly when he recognizes a target while viewing through the display and then again when he fixes his eye on an aiming marker in the display. On each occasion of button operation a measure of his eye position is entered into the system to derive coordinates of the identified target related to the marker position in the display, for generation of the appropriate shifts to move the marker into register with the identified target. The measure of eye position is provided in each case by analysis of video-signal waveforms derived by a vidicon camera that is carried on the pilot's helmet and scans the reflection in the helmet-visor of the pilot's eye as illuminated, also by reflection in the visor, from an infra-red source. The analysis involves determination, from the video signals of successive line-scans of the eye-image, of the boundaries of the iris (or pupil) in relation to both the frame and line time-bases, and computation therefrom of coordinates of the centerpoint of the pupil. Compensation for head movement in the interval between the two measurements of eye position is provided in accordance with variation during that interval of the phasing (in the time-bases) of response to two arrays of point-sources mounted on said reflector.

Description

This invention relates to display or other systems that involve observer-identification of a target or other point of interest. The invention is applicable to weapon-aiming systems, but is not limited to such systems.
There is a requirement in weapon-aiming systems as used in aircraft or other military operations for an observer, having recognized a potential target, to enter identification of the location of this target into the system rapidly. An existing technique involves the use of a hand controller for positioning a display marker in register with the target so as thereby to identify the coordinates of the target-location in the viewing field. In most situations a high degree of skill is required to position the marker accurately, and the tactical advantage of the system may also be degraded by the time delay in performing the marking operation. It is one of the objects of the present invention to provide a system by which these disadvantages can be overcome or reduced.
According to one aspect of the present invention there is provided a system in which the location of a selected point within the field of view of an observer is identified within the system in accordance with a measure of eye position of the observer when looking at that point. The origin or reference to which the measurement is related is derived from a corresponding measure of eye position when the observer is looking at a reference point defined in relation to the field.
The reference point may be a point within the same field of view as that of the selected point, and in this respect its location may be indicated to the observer by a display symbol that appears, or is caused to appear, in the field. Where the observer is to view an external scene directly, an image of the display symbol may be projected, for example onto a partially-transparent reflector within the observer's line of sight, to appear against the background of the external scene. However where a display of such a scene, as provided for example by radar or television equipment, is to be viewed, then the symbol may be superimposed appropriately in the generation of that display.
The measurements of eye position may be derived during operations that immediately follow one after the other. More especially, the observer may first search for a potential target or other point of interest in the field of view and once having recognized such point and fixed his eye on it, may then signify this to the system, for example by operation of a pushbutton. The response of the system may be to record an arbitrarily-related measure of the eye position and then present to the observer, or otherwise direct his attention towards, a reference marker that has a known location within the field of view. Once the observer has transferred his eye to the reference marker and signified the fact to the system, the system may compute appropriate coordinates of location of the identified target or other point of interest within the field, by reference both to the recorded measure of eye position and the corresponding measure made of the eye position when the reference marker is viewed.
It is possible for the measure of eye position to the reference marker to be derived before, rather than after, the measure of eye position to the target or other point of interest. In either case the value or values appropriate to such measure may be recorded for use with the measure of eye position made in relation to a plurality of targets or other points of interest.
Measurement of eye position may be achieved using an imaging sensor, such as a vidicon tube or a charge-coupled semiconductor device, that is arranged to view the eye, and means for analyzing the video signal derived to determine coordinates of the centre of the eye-pupil. The eye may be illuminated with radiation within a narrow band of wavelengths, for example with infra-red radiation, and the imaging sensor may then be arranged to be especially responsive to that band of wavelengths. If gated or other modulation of the illumination is also provided, a high signal-to-noise ratio can be achieved to enable accurate determination of the coordinates of the eye-centre in the field of view of the sensor. Illumination of the eye, or viewing of the eye by the imaging sensor, or both, may be by reflection from, for example, a visor that is carried on a helmet worn by the observer. Further, illumination of the eye may be confined to the period of measurement.
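The benefit of gated illumination can be pictured with a short sketch. The Python fragment below is purely illustrative and not taken from the patent: it assumes two frames are available, one captured with the infra-red source gated on and one with it gated off, and shows how differencing them suppresses ambient background before the eye image is analysed. The function name, threshold value and array handling are all assumptions made for the example.

    import numpy as np

    def gated_eye_image(frame_lit: np.ndarray, frame_dark: np.ndarray,
                        threshold: int = 30) -> np.ndarray:
        """Illustrative only: subtract a frame taken with the IR source gated off
        from one taken with it gated on. Ambient light common to both frames
        cancels, leaving a high signal-to-noise image of the eye as lit by the
        narrow-band source, ready for pupil-boundary detection."""
        diff = frame_lit.astype(np.int16) - frame_dark.astype(np.int16)
        diff = np.clip(diff, 0, 255).astype(np.uint8)
        return np.where(diff > threshold, diff, 0)  # keep only strongly lit pixels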
It may be possible in certain circumstances to assume that there is no movement of the head of the observer in the interval between the measurement of eye position to the target or other point of interest, and the measurement of eye position to the reference marker. Where this is not the case it will be necessary to determine either the position or orientation, or in most cases both, of the head when each measurement of eye-position is made, and to compensate for the difference accordingly in the output representation of the location of the target or other point.
The present invention according to another of its aspects provides a system that may be used in the above context for determining the position or orientation, or both, of the head. The equipment, which is applicable where eye-position is to be determined also, but is not limited to this, comprises means for presenting a distinctive optical pattern, imaging-sensor means for viewing said pattern to produce video signals in accordance therewith, one of the two said means being adapted to be carried by the head, and means for analyzing said video signals to determine therefrom the position or orientation, or both, of the head relative to the other of said two means.
Systems in accordance with either of the above-identified aspects of the present invention may be used in a military context, but are equally applicable outside this. For example, such systems are applicable where there is to be observer-identification of one or more items selected from within an alpha-numeric or other information display. However they are especially applicable in the context of head-up display systems as used in military aircraft.
A head-up display system for use in a military aircraft, in accordance with the present invention, will now be described, by way of example, with reference to the accompanying drawings, in which:
FIG. 1 is illustrative of the head-up display system as installed in the aircraft;
FIG. 2 illustrates symbology as projected by the pilot's display unit of the system of FIG. 1, for viewing against the background of the external scene from the cockpit of the aircraft;
FIG. 3 illustrates details of the system of FIG. 1 on an enlarged scale;
FIG. 4 is illustrative of the image field scanned by a camera of the system and also of waveforms derived during such scanning; and
FIGS. 5 and 6 are block schematic representations of circuitry that may be used in the system.
Referring to FIG. 1, a partially-transparent reflector 1 of the pilot's display unit is mounted in front of the pilot inclined to his line-of-sight 2 through the aircraft-windscreen 3. A display of flight and weapon-aiming information is projected onto the reflector 1 so that the pilot sees the display image in his line of sight 2 against the background of the external scene through the windscreen 3. The display is projected from the screen 4 of a cathode-ray tube 5 within the pilot's display unit, by an optical system 6 that serves to focus the image seen by the pilot, substantially at infinity.
The information displayed in the reflector 1 includes, as illustrated in FIG. 2, analogue presentation of aircraft attitude involving an horizon symbol 7 (in the form of two spaced and aligned bars) and a flight-vector symbol 8 (in the form of a circle with short laterally-extending arms). The flight-vector symbol 8 remains stationary on the screen 4 of the cathode-ray tube 5 and so its image remains stationary in the pilot's field of view through the reflector 1. The horizon symbol 7, however, moves so as to be seen by the pilot to be displaced angularly, and also up and down, relative to the symbol 8, in accordance with bank and pitching movements respectively of the aircraft. The weapon-aiming information on the other hand, involves a marker symbol 9 (illustrated in the form of a cross) that is moved in the display on the screen 4 so as to be seen by the pilot in image against the external scene through the windscreen 3. The symbol 9 denotes in relation to the external scene the line of aim of the aircraft weapon-system (or a selected part of it) to an identified target, and the pilot's task is to manoeuvre the aircraft to bring the symbol 9 within the flight-vector symbol 8 and accordingly align the aircraft appropriately for firing of the weapon system.
The electric time-base and video signals required to produce the display of flight and weapon-aiming information on the screen 4, are supplied to the cathode-ray tube 5 via a multi-lead cable 10 from a waveform generator 11. The waveform generator 11 generates the relevant video signals in accordance with signals it receives from appropriate attitude, and other, sensors 12 and a weapon-aiming computer 13. In this respect it is to be understood that the display as generated and embodied in the video signals supplied via the cable 10 to the cathode-ray tube 5, may embrace a wider variety of information than that involved in the basic form illustrated in FIG. 2. Any of the information may be presented in digital or analogue form, or both.
To the extent that the system shown in FIG. 1 has so far been described, it constitutes a head-up display system of conventional form used in military aircraft. With such a system it is necessary for the pilot, having once recognized a desired target in the external scene, to identify the location of this target to the weapon-aiming computer 13. The weapon-aiming computer 13 responds to the identification of target-location to maintain the symbol 9 of the projected display in register with the target as seen by the pilot in the external scene through the reflector 1. The ability of the computer 13 to maintain such registration throughout the motion of the aircraft and any changes of its attitude depends on the accuracy with which the location of the target is initially identified to the computer by the pilot. Additionally, the usefulness of the facility and its tactical advantage depend very much on the speed with which this identification of target location is made.
An existing technique for identifying target location involves the use of a hand controller that is manipulated by the pilot to move a marker such as the marker 9, into register with the desired target. Once this has been achieved the pilot operates a pushbutton switch to enter the coordinates of the marker into the computer as identification of the target location. Although accuracy in target identification to the computer can be achieved in this way, manipulation of the hand controller to position the display marker precisely onto the target in the moving aircraft, necessitates exercise of substantial skill by the pilot. A significant loss of time can also occur between the moment of visual recognition of the target and identification of its location to the computer.
With the system of the present invention the requirement for manipulative skill can be avoided, and loss of time between visual recognition and identification of the target to the computer, can be very much reduced. In the latter respect identification of target location to the computer is provided in the present system in accordance with measurements of the pilot's eye-position at the instant he signifies his recognition of a target. The equipment involved in these measurements is illustrated in greater detail in FIG. 3.
Referring more especially to FIG. 3, a source 20 of near infra-red radiation (for example a gallium-arsenide light-emitting diode) is mounted on the helmet 21 worn by the pilot so that the light it emits illuminates one of his eyes by reflection from the inside of the helmet visor 22. The image in the visor 22 of the illuminated eye is scanned via a narrow-band infra-red filter 23 by a vidicon camera 24 that is also mounted on the helmet 21.
The video signals derived by the camera 24 are supplied to a video-signal processor unit 25 within the computer 13 and are there analyzed to derive measurements of the position of the centre of the eye within the scanned field. The analysis carried out is based on the variation in degree of reflected radiation that occurs from sclera to iris, or from iris to pupil, of the eye. In this analysis the position of the eye-centre is determined by reference to the phasing of the maximum `dark` or `black` pulse within the line and frame time-bases of the camera-scanning raster defined by a time-base generator 26. The scanning of the eye within the field of the camera 24, together with waveforms of the consequent signals derived during successive line-scans, is illustrated diagrammatically in simplified form in FIG. 4.
Referring to FIG. 4, the images I and P of the iris and pupil of the pilot's eye appearing within the field F of the camera 24 are scanned repeatedly as part of the conventional scanning of the whole field F performed by the camera 24. The line scans that intersect the iris image I (illustrated generally by lines S1 to S5) give rise to video signals having waveforms W1 to W5 with distinct `dark` pulses DP, of the general form illustrated to the right-hand side of FIG. 4. Certain of these waveforms (W2, W3, W4) are characterised by a secondary `black` pulse SP depending upon whether the relevant scan intersects the pupil image P. The video signal for which the duration of the pulse DP, or of the secondary pulse SP, is the longest (W3) may be readily identified by computation or a comparison process carried out in the unit 25. More particularly, the phasing within the scanning frame A of the line scan (S3) that intersects the eye-pupil image P across, or most closely across, a diameter is determined so as thereby to derive a coordinate Y of the centre-point of the eye. The other coordinate X of the eye-centre is determined from the instant in the identified line scan at which the mid-point of the relevant pulse occurs.
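As a concrete illustration of the FIG. 4 scheme, the sketch below (Python, assuming the scanned field is available as a two-dimensional intensity array; the dark-level threshold and the assumption that the dark pulse on each line is a single contiguous run are introduced here, not stated in the patent) finds the line-scan with the longest dark pulse and takes its mid-point as the eye-centre.

    import numpy as np

    def eye_centre_from_scans(field: np.ndarray, dark_level: int = 40):
        """Follow the FIG. 4 scheme: on each line-scan locate the 'dark' pulse
        produced by the iris/pupil, keep the scan whose pulse is longest (the
        one crossing a diameter) for the Y coordinate, and take that pulse's
        mid-point for the X coordinate. Returns (X, Y) or None."""
        best = None                                   # (pulse length, y, x mid-point)
        for y, line in enumerate(field):
            dark = np.flatnonzero(line < dark_level)  # samples inside the pulse
            if dark.size == 0:
                continue
            leading, trailing = dark[0], dark[-1]     # pulse leading/trailing edges
            length = trailing - leading
            if best is None or length > best[0]:
                best = (length, y, (leading + trailing) / 2.0)
        return None if best is None else (best[2], best[1])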
Referring again more especially to FIG. 3, the coordinates X, Y of the centre-point of the eye are derived in the processor unit 25, and are entered into the computing unit 27 of the computer 13 only in response to depression of a pushbutton 28 that, together with the camera 24, is connected to the computer 13 via a multi-lead cable 29 (FIG. 1). The pilot operates the pushbutton 28 as soon as he recognizes a suitable target and while he has his eye fixed on it. The unit 25 responds to operation of the button 28 to command via the waveform generator 11 immediate introduction of a symbol (for example, the symbol 9) into the projected display, or emphasis of an existing symbol (for example, by brightening up the symbol 9 already displayed), to which the pilot then rapidly transfers his attention. Once his eyes are trained on the introduced or emphasized symbol the pilot again operates the pushbutton 28 to enter the coordinates of the centre-point of the eye in the new position, into the computing unit 27. The position in the display corresponding to these latter coordinates is known and is used as the reference or origin from which the unit 27 computes (using the first-entered coordinates) the appropriate shifts that are to be applied to the symbol 9 for appropriate registration with the identified target. The waveform generator 11 responds to the computed shifts to bring the symbol 9 rapidly into register with the target in the display, and also to apply the appropriate corrections required to maintain it in such relationship irrespective of the movement of the aircraft and changes in its attitude, signalled by the sensors 12.
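The computation performed by the unit 27 reduces to a coordinate difference referred to the known display position of the symbol. The fragment below is a minimal sketch of that step; the scale factors mapping eye movement in the camera field to display units are assumed calibration constants and are not given in the patent.

    def symbol_shift(eye_on_target, eye_on_symbol, symbol_display_xy,
                     scale_x=1.0, scale_y=1.0):
        """Difference the two eye-position measures (target fixation minus
        symbol fixation), scale from camera-field units to display units, and
        offset by the known display position of the symbol to obtain where the
        symbol 9 must be moved to register with the identified target."""
        dx = (eye_on_target[0] - eye_on_symbol[0]) * scale_x
        dy = (eye_on_target[1] - eye_on_symbol[1]) * scale_y
        return symbol_display_xy[0] + dx, symbol_display_xy[1] + dy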
The function of the processor unit 25 in deriving the coordinates X, Y of the centre of the eye may be implemented using appropriate programming of a general-purpose processor, or alternatively using special circuits such as those illustrated in FIGS. 5 and 6.
Referring to FIG. 5, operation of the pushbutton 28 acts via a switch unit 40 to enable supply of the line-synchronization pulses LS to two latch circuits 41 and 42 throughout one complete scanning frame. The latch circuits 41 and 42 are set by the trailing edge of each synchronization pulse LS to the condition in which clock pulses CP are supplied via gates 43 and 44 to two counters 45 and 46 respectively. The line-synchronization pulses LS are applied to clear the counters 45 and 46 so that from the beginning of each line-scan there is a gradual increase of count in each counter 45 and 46 in accordance with the progress of the scan.
The video signals derived in each line-scan are applied to the latch circuits 41 and 42. Both circuits 41 and 42 are reset by any pulse DP (FIG. 4) occurring in the scan, the circuit 41 by the leading edge and the circuit 42 by the trailing edge. (In the present example the determination of the coordinates X and Y is to be related to the pulses DP rather than to the alternative pulses SP; however the principle of operation would be exactly the same for the alternative case, the latch circuits 41 and 42 being in that event responsive for resetting purposes, to the leading and trailing edges respectively of the secondary pulses SP.) Resetting of the latch circuit 41 halts counting by the counter 45, and the subsequent resetting of the latch circuit 42 halts counting by the counter 46. The counts accumulated by the counters 45 and 46 in these circumstances are accordingly representative of the X-coordinates of the boundary of the iris image I on the relevant scan line.
A gate 47 detects the condition in which the resetting of both latch circuits 41 and 42 is signalled by a gate 48 within the period of the line-scan (more particularly outside the period of any line-synchronization pulse LS). The response of the gate 47 to this condition is signalled to gating units 49 and 50 which thereupon transfer the contents of the counters 45 and 46 respectively into registers 51 and 52. It is only in the event that a pulse DP occurs within the line-scan that both latch circuits 41 and 42 are reset and give rise to the condition in which the contents of the two counters 45 and 46 are transferred into the registers 51 and 52. Thus throughout the period of the one scanning frame there are transferred into the registers 51 and 52 pairs of counts related solely to successive line-scans of the iris image I in that frame.
The two counts of each successive pair transferred into the registers 51 and 52 are added together in an adder 53. The resultant sum is transferred into a register 54 to be accumulated with the sums derived from the other pairs of counts during the frame. The total accumulated by the register 54 is transferred via a gating unit 55 into a divider unit 56 upon the next frame-synchronization pulse FS at the end of the frame period. The count of a counter 57 is at the same time transferred into the unit 56 via a gating unit 58, this count, which is derived from the output signals of the gate 48, being representative of the number N of counts that during the period have been accumulated in the register 54. The divider unit 56 divides the total transferred from the register 54 by the number N, and thereby derives an output value of the X-coordinate of the centre of the eye in terms of the mean of all the X-coordinate representations of the boundary of the iris image I derived during the frame-scan.
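A software equivalent of the FIG. 5 circuitry, written here only to make the counter arithmetic concrete (the dark-level threshold and the array representation of the frame are assumptions), accumulates the leading- and trailing-edge positions of the dark pulse on every qualifying line-scan and averages them:

    import numpy as np

    def x_coordinate_of_eye_centre(field: np.ndarray, dark_level: int = 40):
        """Counters 45/46 record the leading and trailing boundary positions of
        the pulse DP on each line-scan; adder 53 and register 54 accumulate
        their sums; counter 57 counts the boundary values; divider 56 forms the
        mean, which is the X coordinate of the eye-centre."""
        total, n = 0, 0
        for line in field:
            dark = np.flatnonzero(line < dark_level)
            if dark.size == 0:                     # no pulse DP on this scan
                continue
            total += int(dark[0]) + int(dark[-1])  # pair of boundary X values
            n += 2
        return total / n if n else None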
The Y-coordinate corresponding to the derived X-coordinate of the eye-centre is derived in accordance with a count of the number of line-scans made from the beginning of the one frame period until the occurrence of the first output signal from the gate 48 signifying that the line-scanning has reached the iris image I. To this end, and referring to FIG. 6, a latch circuit 61 is set by the frame-synchronization pulse FS to the condition in which the line-synchronization pulses LS of the frame are supplied via a gate 62 to a counter 63. Counting of the pulses LS by the counter 63 is halted when the latch circuit 61 is reset by the first output signal from the gate 48, and counting is then begun by a counter 64. The counter 64 receives the pulses LS via a gate 65 throughout that part of the frame-scan for which there are output signals from the gate 48, namely throughout the period of line-scanning of the iris image I. Thus the counters 63 and 64 accumulate counts that are in accordance respectively with the Y-coordinate representative of the boundary at the uppermost part of the iris image I and the diametral distance measured in the Y-coordinate direction to the lowermost part of that image I.
The counts accumulated in the counters 63 and 64 are transferred into registers 66 and 67 via respective gating units 68 and 69 in response to the leading edge of the next frame-synchronization pulse FS at the end of the frame period. The count of the counter 64 is transferred into the register 67 with a shift of one digit place in order to effect division by two, and the content of the register 67 is then added to that of the register 66 in an adder 70. This sum as entered into a register 71 provides the Y-coordinate of the eye-centre.
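The corresponding FIG. 6 logic, again sketched in software under the same assumptions as the previous fragment, counts the scans down to the first intersection with the iris image and adds half of its vertical extent:

    import numpy as np

    def y_coordinate_of_eye_centre(field: np.ndarray, dark_level: int = 40):
        """Counter 63 counts line-scans until the iris image is first met;
        counter 64 counts the scans that cross it; halving the latter (the
        one-digit shift into register 67) and adding (adder 70) gives the Y
        coordinate of the eye-centre."""
        rows = [y for y, line in enumerate(field) if (line < dark_level).any()]
        if not rows:
            return None
        top = rows[0]                 # count held by counter 63
        extent = rows[-1] - top + 1   # count held by counter 64
        return top + extent / 2.0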
The computing unit 27 responds to the two sets of coordinates of eye position entered in succession from the processor unit 25 to compute the effective difference between them. Compensation for movement of the pilot's head between entry of the two sets of coordinates is introduced into the computation. This is achieved by reference to a pattern of point-source images that is projected into the image plane of the camera 24 from two upright parallel arrays 30 of equally-spaced points of infra-red light (for example arrays of gallium-arsenide diodes) located to either side of the reflector 1. Throughout the range of possible movement of the pilot's head when viewing through the reflector 1, there are at least two points of each array within the field of the camera 24. This is illustrated in FIG. 4, where the images D of the points in the two arrays are shown within the field F.
The unit 25 acts to determine the head-position parameters required for compensation of the computed eye-position coordinates, by reference to those components of the video signals from the camera 24 that arise from the images D. Lateral and vertical translations of the pattern of images D within the field F between the two measurements of eye-position correspond to the lateral and vertical movements of the head in the intervening period, whereas a change in the distance between successive images D in each array-pattern (or between the two patterns) corresponds to a change in distance between the head and the display reflector 1. A change in alignment between the images D of the two array-patterns, on the other hand, is indicative of rotation of the head to one side or the other about the line-of-sight 2. Whichever the case, the unit 25 responds to the change by analysis of the video signals it receives, to apply the relevant compensation: linear shifts in the case of either lateral or vertical translation of the head; a change of scaling in the case of movement towards or away from the reflector 1; and a rotation of axes in the case of rotation of the head.
The components of the video signals from the camera 24 corresponding to the images D are readily detected in the unit 25 on the basis of their `white` level and limited signal-duration. The spacing between successive images D of each array can be readily measured using one or more circuits corresponding to that of FIG. 6, adapted to count the number of line scans between the detected image-signals. Similarly, circuits corresponding to those of FIGS. 5 and 6 can be adapted to provide measurements of the lateral and vertical translations of the pattern of images D within the field F, and of changes of alignment between the two array-patterns. The measurements derived are all conveyed to the computing unit 27 from the unit 25 and are there applied, using straightforward trigonometrical techniques in the shift-computation, to effect the desired compensation for head-position movement.
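To make the compensation step concrete, the sketch below applies the three corrections named above to a measured eye coordinate. The order of operations and the sign conventions are assumptions made for illustration and are not taken from the patent.

    import math

    def compensate_for_head_movement(eye_xy, pattern_shift, spacing_ratio, tilt_rad):
        """Undo the head movement inferred from the images D: a linear shift for
        lateral/vertical translation of the pattern, a change of scale for the
        change in spacing between the point-source images, and a rotation of
        axes for the change in alignment between the two arrays."""
        x = (eye_xy[0] - pattern_shift[0]) / spacing_ratio
        y = (eye_xy[1] - pattern_shift[1]) / spacing_ratio
        c, s = math.cos(-tilt_rad), math.sin(-tilt_rad)   # rotate axes back
        return c * x - s * y, s * x + c * y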
It is important to note that absolute measurement of head position is not necessary. It is only necessary that the measurements are made at the same time as those of eye-position. Furthermore, an important feature of the system as a whole is the use of the display-defined referencing of the coordinate system immediately after the initial target identification measurement. In this manner there is no requirement for an absolute system of measurement with long-term stability as otherwise required. The techniques described enhance the capability of the weapon-delivery system in regard to targets of opportunity and for the designation of multiple targets in a short time period and the storage of such designations for second-pass attacks. Although these techniques have been described above in relation to a head-up display system they are also applicable to designation and marking of visually-recognized targets on a head-down display, and also to systems where the external scene is itself presented as a display derived from, say, a television or infra-red camera or from radar.

Claims (14)

I claim:
1. A system for providing identification of a point of interest in a field of view of an observer comprising means operable to provide a measure of eye position of the observer when the observer is looking at said point, means to define a reference point relative to said field of view, means operable to provide a measure of eye position of the observer when the observer is looking at said reference point, and means responsive to the two said measures of eye position to provide a representation identifying said point of interest relative to said reference point.
2. A system according to claim 1 including camera means to scan the observer's eye to derive electric video-signals in accordance therewith, said camera means scanning said eye with a raster scan according to frame and line time-bases, means responsive to said video signals to detect in each line-scan the occurrence of signal changes due to variation in reflectance along the scan of the eye, and means for deriving coordinates of the center of the eye-pupil in accordance with analysis of the said signal changes detected, in relation to said frame and line time-bases.
3. A system according to claim 1, including an imaging sensor for viewing the observer's eye to derive electric video-signals in accordance therewith, and means for analyzing the said signals to determine coordinates of the center of the eye pupil.
4. A system according to claim 3 including means to illuminate the observer's eye with infra-red radiation, and wherein said imaging sensor is responsive to infra-red radiation reflected from the eye.
5. A system according to claim 3 including means responsive to movement of the observer's head in the interval between the said two measurements, and means to introduce into said identifying-representation compensation for such movement.
6. A system comprising means to define a field of view of an observer, means to provide a display within said field to be viewed by said observer, means defining a reference point with respect to said display, means operable by the observer when viewing a point in the said field to signify that point as a selected point, means to provide two measures of eye position of the observer, one of said measures relating to eye-position when the observer is viewing the said reference point and the other relating to eye-position when the observer is viewing said selected point, means responsive to said two measures to compute therefrom coordinate identification of said selected point with respect to said display, and further means responsive to said coordinate identification to perform a predetermined function in respect of said selected point.
7. A display system according to claim 6 wherein said reference point is a marker moveable in said display, and wherein said further means includes means to move said marker to said selected point in accordance with said coordinate identification.
8. A display system according to claim 6 wherein the said means to provide measures of eye-position includes an imaging-sensor for viewing the observer's eye and deriving electric video-signals in accordance therewith, and means for analyzing said signals to derive coordinate representation of the center of the eye-pupil.
9. A display system according to claim 6 wherein the said means to provide measures of eye-position includes camera means to scan the observer's eye to derive electric video-signals in accordance therewith, said camera means scanning said eye with a raster scan according to frame and line time-bases, means responsive to said video signals to detect in each line-scan the occurrence of signal changes due to variation in reflectance along the scan of the eye, and means for analyzing said video signals to derive in accordance with said signal changes coordinates of the center of the eye-pupil related to said frame and line time-bases.
10. A system according to claim 6 including means responsive to movement of the observer's head in the interval between successive measurements of eye position, and means to introduce into said coordinate identification compensation for such movement.
11. A system for providing a measure of movement of the head of an observer, comprising first means for presenting to the observer a distinctive optical pattern, second means for viewing said pattern to produce electric video-signals in accordance therewith, means for mounting one of the said first and second means on the observer's head, and means for analyzing video signals supplied by said second means to detect changes therein and thereby provide said measure of head movement.
12. A system according to claim 11 wherein said mounting means is means for mounting said second means on the observer's head, and wherein said second means is mounted to view said pattern as reflected in an eye of the observer.
13. A method for providing identification of a point of interest in an observer's field of view, comprising the steps of deriving successive measures of eye position of the observer, one of the measures being derived when the observer views a reference point defined relative to said field and the other when the observer views said point of interest, and deriving from the two said measures of eye position a representation identifying said point of interest relative to said reference point.
14. A method according to claim 13 including the step of providing a measure of head movement of the observer in the interval between the successive measurements of eye position, and applying the said measure of head movement to compensate in said identifying representation for such movement.
US05/678,795 1975-04-22 1976-04-21 Observer-identification of a target or other point of interest in a viewing field Expired - Lifetime US4034401A (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
UK16701/75 1975-04-22
GB16701/75A GB1540992A (en) 1975-04-22 1975-04-22 Display or other systems and equipment for use in such systems

Publications (1)

Publication Number Publication Date
US4034401A true US4034401A (en) 1977-07-05

Family

ID=10082107

Family Applications (1)

Application Number Title Priority Date Filing Date
US05/678,795 Expired - Lifetime US4034401A (en) 1975-04-22 1976-04-21 Observer-identification of a target or other point of interest in a viewing field

Country Status (2)

Country Link
US (1) US4034401A (en)
GB (1) GB1540992A (en)

Cited By (126)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4075657A (en) * 1977-03-03 1978-02-21 Weinblatt Lee S Eye movement monitoring apparatus
US4145122A (en) * 1977-05-31 1979-03-20 Colorado Seminary Method and apparatus for monitoring the position of the eye
US4287809A (en) * 1979-08-20 1981-09-08 Honeywell Inc. Helmet-mounted sighting system
US4310851A (en) * 1978-05-22 1982-01-12 Avions Marcel Dassault-Breguet Aviation Sighting device for rear firing on combat aircraft
EP0055338A1 (en) * 1980-12-31 1982-07-07 International Business Machines Corporation Eye controlled user-machine communication
US4395731A (en) * 1981-10-16 1983-07-26 Arnold Schoolman Television microscope surgical method and apparatus therefor
US4559555A (en) * 1982-02-24 1985-12-17 Arnold Schoolman Stereoscopic remote viewing system
US4574314A (en) * 1982-05-28 1986-03-04 Weinblatt Lee S Camera autofocus technique
WO1986001963A1 (en) * 1984-09-24 1986-03-27 The University Of Adelaide Improvements relating to an apparatus and method related to control through eye gaze direction
US4581633A (en) * 1982-03-04 1986-04-08 International Standard Electric Corporation Data compression
US4595990A (en) * 1980-12-31 1986-06-17 International Business Machines Corporation Eye controlled information transfer
WO1986003863A1 (en) * 1984-12-24 1986-07-03 The University Of Adelaide Improvements relating to eye-gaze-direction controlled apparatus
US4620318A (en) * 1983-04-18 1986-10-28 Eye-D Development Ii Ltd. Fovea-centered eye fundus scanner
US4623230A (en) * 1983-07-29 1986-11-18 Weinblatt Lee S Media survey apparatus and method using thermal imagery
US4651201A (en) * 1984-06-01 1987-03-17 Arnold Schoolman Stereoscopic endoscope arrangement
US4652917A (en) * 1981-10-28 1987-03-24 Honeywell Inc. Remote attitude sensor using single camera and spiral patterns
US4702575A (en) * 1981-05-11 1987-10-27 The United States Of America As Represented By The Secretary Of The Navy Helmet mounted eye tracker using a position sensing detector
US4706117A (en) * 1984-06-01 1987-11-10 Arnold Schoolman Stereo laser disc viewing system
US4737972A (en) * 1982-02-24 1988-04-12 Arnold Schoolman Stereoscopic fluoroscope arrangement
US4755045A (en) * 1986-04-04 1988-07-05 Applied Science Group, Inc. Method and system for generating a synchronous display of a visual presentation and the looking response of many viewers
US4768088A (en) * 1985-12-04 1988-08-30 Aisin Seiki Kabushikikaisha Apparatus for commanding energization of electrical device
US4798214A (en) * 1987-11-24 1989-01-17 The United States Of America As Represented By The Secretary Of The Air Force Stimulator for eye tracking oculometer
US4806011A (en) * 1987-07-06 1989-02-21 Bettinger David S Spectacle-mounted ocular display apparatus
US4838681A (en) * 1986-01-28 1989-06-13 George Pavlidis Method and means for detecting dyslexia
US4852988A (en) * 1988-09-12 1989-08-01 Applied Science Laboratories Visor and camera providing a parallax-free field-of-view image for a head-mounted eye movement measurement system
US4859050A (en) * 1986-04-04 1989-08-22 Applied Science Group, Inc. Method and system for generating a synchronous display of a visual presentation and the looking response of many viewers
US4903309A (en) * 1988-05-25 1990-02-20 The United States Of America As Represented By The Secretary Of The Army Field programmable aided target recognizer trainer
US4931865A (en) * 1988-08-24 1990-06-05 Sebastiano Scarampi Apparatus and methods for monitoring television viewers
US5001650A (en) * 1989-04-10 1991-03-19 Hughes Aircraft Company Method and apparatus for search and tracking
US5231674A (en) * 1989-06-09 1993-07-27 Lc Technologies, Inc. Eye tracking method and apparatus
US5246965A (en) * 1991-06-11 1993-09-21 Ciba-Geigy Arylethers, their manufacture and methods of treatment
WO1993018428A3 (en) * 1992-03-13 1994-02-17 Kopin Corp Head-mounted display system
US5331149A (en) * 1990-12-31 1994-07-19 Kopin Corporation Eye tracking system having an array of photodetectors aligned respectively with an array of pixels
US5422653A (en) * 1993-01-07 1995-06-06 Maguire, Jr.; Francis J. Passive virtual reality
US5451976A (en) * 1993-09-14 1995-09-19 Sony Corporation Image display apparatus
US5488952A (en) * 1982-02-24 1996-02-06 Schoolman Scientific Corp. Stereoscopically display three dimensional ultrasound imaging
US5541655A (en) * 1991-11-05 1996-07-30 Canon Kabushiki Kaisha Image pickup device
US5583795A (en) * 1995-03-17 1996-12-10 The United States Of America As Represented By The Secretary Of The Army Apparatus for measuring eye gaze and fixation duration, and method therefor
US5594500A (en) * 1991-10-17 1997-01-14 Canon Kabushiki Kaisha Image pickup apparatus
US5644324A (en) * 1993-03-03 1997-07-01 Maguire, Jr.; Francis J. Apparatus and method for presenting successive images
USD381346S (en) * 1995-09-13 1997-07-22 Kopin Corporation Head-mountable matrix display
US5691737A (en) * 1993-09-21 1997-11-25 Sony Corporation System for explaining an exhibit using spectacle-type displays
USD388426S (en) * 1996-05-07 1997-12-30 Kopin Corporation Head-mounted display device
US5734421A (en) * 1995-05-30 1998-03-31 Maguire, Jr.; Francis J. Apparatus for inducing attitudinal head movements for passive virtual reality
US5751260A (en) * 1992-01-10 1998-05-12 The United States Of America As Represented By The Secretary Of The Navy Sensory integrated data interface
US5815126A (en) * 1993-10-22 1998-09-29 Kopin Corporation Monocular portable communication and display system
US6043800A (en) * 1990-12-31 2000-03-28 Kopin Corporation Head mounted liquid crystal display system
US6072445A (en) * 1990-12-31 2000-06-06 Kopin Corporation Head mounted color display system
US6091899A (en) * 1988-09-16 2000-07-18 Canon Kabushiki Kaisha Apparatus for detecting the direction of visual axis and information selecting apparatus utilizing the same
US6181371B1 (en) 1995-05-30 2001-01-30 Francis J Maguire, Jr. Apparatus for inducing attitudinal head movements for passive virtual reality
US6411266B1 (en) 1993-08-23 2002-06-25 Francis J. Maguire, Jr. Apparatus and method for providing images of real and virtual objects in a head mounted display
US6424376B1 (en) 1993-07-30 2002-07-23 Canon Kabushiki Kaisha Selection apparatus using an observer's line of sight
US6424321B1 (en) 1993-10-22 2002-07-23 Kopin Corporation Head-mounted matrix display
US6448944B2 (en) 1993-10-22 2002-09-10 Kopin Corporation Head-mounted matrix display
US6545650B1 (en) * 1998-06-23 2003-04-08 Nec Corporation Apparatus for three-dimensionally displaying object and method of doing the same
US6683584B2 (en) 1993-10-22 2004-01-27 Kopin Corporation Camera display system
US6690338B1 (en) 1993-08-23 2004-02-10 Francis J. Maguire, Jr. Apparatus and method for providing images of real and virtual objects in a head mounted display
EP1403680A1 (en) * 2002-09-27 2004-03-31 The Boeing Company Gaze tracking system, eye-tracking system and an associated method of calibration
US6798443B1 (en) 1995-05-30 2004-09-28 Francis J. Maguire, Jr. Apparatus for inducing attitudinal head movements for passive virtual reality
US20050041100A1 (en) * 1995-05-30 2005-02-24 Maguire Francis J. Apparatus for inducing attitudinal head movements for passive virtual reality
US20050243277A1 (en) * 2004-04-28 2005-11-03 Nashner Lewis M Isolating and quantifying functional impairments of the gaze stabilization system
US20070066916A1 (en) * 2005-09-16 2007-03-22 Imotions Emotion Technology Aps System and method for determining human emotion by analyzing eye properties
US20070121066A1 (en) * 2004-04-28 2007-05-31 Neurocom International, Inc. Diagnosing and Training the Gaze Stabilization System
US20100039353A1 (en) * 2008-08-14 2010-02-18 Honeywell International Inc. Near-to-eye display artifact reduction system and method
FR2935810A1 (en) * 2008-09-09 2010-03-12 Airbus France METHOD FOR ADJUSTING A HARMONIZATION COMPENSATION BETWEEN VIDEO SENSOR AND HIGH HEAD VISUALIZATION DEVICE, AND DEVICES THEREOF
US20100185113A1 (en) * 2009-01-21 2010-07-22 Teledyne Scientific & Imaging, Llc Coordinating System Responses Based on an Operator's Cognitive Response to a Relevant Stimulus and to the Position of the Stimulus in the Operator's Field of View
DE102011050942A1 (en) 2010-06-16 2012-03-08 Visteon Global Technologies, Inc. Reconfigure an ad based on face / eye tracking
US8136944B2 (en) 2008-08-15 2012-03-20 iMotions - Eye Tracking A/S System and method for identifying the existence and position of text in visual media content and for determining a subjects interactions with the text
US8265743B2 (en) 2007-12-27 2012-09-11 Teledyne Scientific & Imaging, Llc Fixation-locked measurement of brain responses to stimuli
DE102012109622A1 (en) 2011-10-12 2013-04-18 Visteon Global Technologies, Inc. Method for controlling a display component of an adaptive display system
RU2516857C2 (en) * 2012-03-28 2014-05-20 Открытое акционерное общество "НПО "Геофизика-НВ" Method to detect orientation of pilot helmet and device for helmet system of target designation and indication
US8758018B2 (en) 2009-12-31 2014-06-24 Teledyne Scientific & Imaging, Llc EEG-based acceleration of second language learning
US8986218B2 (en) 2008-07-09 2015-03-24 Imotions A/S System and method for calibrating and normalizing eye data in emotional testing
US20150130714A1 (en) * 2012-05-25 2015-05-14 Sony Computer Entertainment Inc. Video analysis device, video analysis method, and point-of-gaze display system
US20150205105A1 (en) * 2014-01-21 2015-07-23 Osterhout Group, Inc. Eye imaging in head worn computing
US20160044376A1 (en) * 2000-02-01 2016-02-11 Swisscom Ag System and method for distribution of picture objects
US9295806B2 (en) 2009-03-06 2016-03-29 Imotions A/S System and method for determining emotional response to olfactory stimuli
US9401540B2 (en) 2014-02-11 2016-07-26 Osterhout Group, Inc. Spatial location presentation in head worn computing
US9423612B2 (en) 2014-03-28 2016-08-23 Osterhout Group, Inc. Sensor dependent content position in head worn computing
US9436006B2 (en) 2014-01-21 2016-09-06 Osterhout Group, Inc. See-through computer display systems
US9494800B2 (en) 2014-01-21 2016-11-15 Osterhout Group, Inc. See-through computer display systems
US9523856B2 (en) 2014-01-21 2016-12-20 Osterhout Group, Inc. See-through computer display systems
US9529195B2 (en) 2014-01-21 2016-12-27 Osterhout Group, Inc. See-through computer display systems
US9547465B2 (en) 2014-02-14 2017-01-17 Osterhout Group, Inc. Object shadowing in head worn computing
US9575321B2 (en) 2014-06-09 2017-02-21 Osterhout Group, Inc. Content presentation in head worn computing
US9615742B2 (en) 2014-01-21 2017-04-11 Osterhout Group, Inc. Eye imaging in head worn computing
US9651784B2 (en) 2014-01-21 2017-05-16 Osterhout Group, Inc. See-through computer display systems
US9651787B2 (en) 2014-04-25 2017-05-16 Osterhout Group, Inc. Speaker assembly for headworn computer
US9671613B2 (en) 2014-09-26 2017-06-06 Osterhout Group, Inc. See-through computer display systems
US9672210B2 (en) 2014-04-25 2017-06-06 Osterhout Group, Inc. Language translation with head-worn computing
US9684172B2 (en) 2014-12-03 2017-06-20 Osterhout Group, Inc. Head worn computer display systems
USD792400S1 (en) 2014-12-31 2017-07-18 Osterhout Group, Inc. Computer glasses
US9715112B2 (en) 2014-01-21 2017-07-25 Osterhout Group, Inc. Suppression of stray light in head worn computing
US9720234B2 (en) 2014-01-21 2017-08-01 Osterhout Group, Inc. See-through computer display systems
USD794637S1 (en) 2015-01-05 2017-08-15 Osterhout Group, Inc. Air mouse
US9740280B2 (en) 2014-01-21 2017-08-22 Osterhout Group, Inc. Eye imaging in head worn computing
US9746686B2 (en) 2014-05-19 2017-08-29 Osterhout Group, Inc. Content position calibration in head worn computing
US9753288B2 (en) 2014-01-21 2017-09-05 Osterhout Group, Inc. See-through computer display systems
US9766463B2 (en) 2014-01-21 2017-09-19 Osterhout Group, Inc. See-through computer display systems
US9784973B2 (en) 2014-02-11 2017-10-10 Osterhout Group, Inc. Micro doppler presentations in head worn computing
US9811152B2 (en) 2014-01-21 2017-11-07 Osterhout Group, Inc. Eye imaging in head worn computing
US9810906B2 (en) 2014-06-17 2017-11-07 Osterhout Group, Inc. External user interface for head worn computing
US9829707B2 (en) 2014-08-12 2017-11-28 Osterhout Group, Inc. Measuring content brightness in head worn computing
US9836122B2 (en) 2014-01-21 2017-12-05 Osterhout Group, Inc. Eye glint imaging in see-through computer display systems
US9841599B2 (en) 2014-06-05 2017-12-12 Osterhout Group, Inc. Optical configurations for head-worn see-through displays
US9939934B2 (en) 2014-01-17 2018-04-10 Osterhout Group, Inc. External user interface for head worn computing
US9939646B2 (en) 2014-01-24 2018-04-10 Osterhout Group, Inc. Stray light suppression for head worn computing
US9952664B2 (en) 2014-01-21 2018-04-24 Osterhout Group, Inc. Eye imaging in head worn computing
US9965681B2 (en) 2008-12-16 2018-05-08 Osterhout Group, Inc. Eye imaging in head worn computing
US10062182B2 (en) 2015-02-17 2018-08-28 Osterhout Group, Inc. See-through computer display systems
US10191279B2 (en) 2014-03-17 2019-01-29 Osterhout Group, Inc. Eye imaging in head worn computing
US10254856B2 (en) 2014-01-17 2019-04-09 Osterhout Group, Inc. External user interface for head worn computing
US10558050B2 (en) 2014-01-24 2020-02-11 Mentor Acquisition One, Llc Haptic systems for head-worn computers
US10649220B2 (en) 2014-06-09 2020-05-12 Mentor Acquisition One, Llc Content presentation in head worn computing
US10663740B2 (en) 2014-06-09 2020-05-26 Mentor Acquisition One, Llc Content presentation in head worn computing
US10684687B2 (en) 2014-12-03 2020-06-16 Mentor Acquisition One, Llc See-through computer display systems
US10853589B2 (en) 2014-04-25 2020-12-01 Mentor Acquisition One, Llc Language translation with head-worn computing
US10874357B2 (en) * 2016-10-11 2020-12-29 Tokai Optical Co., Ltd. Eye movement measuring device and eye movement analysis system
US11104272B2 (en) 2014-03-28 2021-08-31 Mentor Acquisition One, Llc System for assisted operator safety using an HMD
US11103122B2 (en) 2014-07-15 2021-08-31 Mentor Acquisition One, Llc Content presentation in head worn computing
US11227294B2 (en) 2014-04-03 2022-01-18 Mentor Acquisition One, Llc Sight information collection in head worn computing
US11269182B2 (en) 2014-07-15 2022-03-08 Mentor Acquisition One, Llc Content presentation in head worn computing
US11487110B2 (en) 2014-01-21 2022-11-01 Mentor Acquisition One, Llc Eye imaging in head worn computing
US11669163B2 (en) 2014-01-21 2023-06-06 Mentor Acquisition One, Llc Eye glint imaging in see-through computer display systems
US11737666B2 (en) 2014-01-21 2023-08-29 Mentor Acquisition One, Llc Eye imaging in head worn computing
US11892644B2 (en) 2014-01-21 2024-02-06 Mentor Acquisition One, Llc See-through computer display systems

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB8701288D0 (en) * 1987-01-21 1987-02-25 Waldern J D Perception of computer-generated imagery

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3262210A (en) * 1963-05-06 1966-07-26 Sperry Rand Corp Control system

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3262210A (en) * 1963-05-06 1966-07-26 Sperry Rand Corp Control system

Cited By (229)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4075657A (en) * 1977-03-03 1978-02-21 Weinblatt Lee S Eye movement monitoring apparatus
US4145122A (en) * 1977-05-31 1979-03-20 Colorado Seminary Method and apparatus for monitoring the position of the eye
US4310851A (en) * 1978-05-22 1982-01-12 Avions Marcel Dassault-Breguet Aviation Sighting device for rear firing on combat aircraft
US4287809A (en) * 1979-08-20 1981-09-08 Honeywell Inc. Helmet-mounted sighting system
US4595990A (en) * 1980-12-31 1986-06-17 International Business Machines Corporation Eye controlled information transfer
EP0055338A1 (en) * 1980-12-31 1982-07-07 International Business Machines Corporation Eye controlled user-machine communication
US4702575A (en) * 1981-05-11 1987-10-27 The United States Of America As Represented By The Secretary Of The Navy Helmet mounted eye tracker using a position sensing detector
US4395731A (en) * 1981-10-16 1983-07-26 Arnold Schoolman Television microscope surgical method and apparatus therefor
US4652917A (en) * 1981-10-28 1987-03-24 Honeywell Inc. Remote attitude sensor using single camera and spiral patterns
US4559555A (en) * 1982-02-24 1985-12-17 Arnold Schoolman Stereoscopic remote viewing system
US5488952A (en) * 1982-02-24 1996-02-06 Schoolman Scientific Corp. Stereoscopically display three dimensional ultrasound imaging
US4737972A (en) * 1982-02-24 1988-04-12 Arnold Schoolman Stereoscopic fluoroscope arrangement
US4581633A (en) * 1982-03-04 1986-04-08 International Standard Electric Corporation Data compression
US4574314A (en) * 1982-05-28 1986-03-04 Weinblatt Lee S Camera autofocus technique
US4620318A (en) * 1983-04-18 1986-10-28 Eye-D Development Ii Ltd. Fovea-centered eye fundus scanner
US4623230A (en) * 1983-07-29 1986-11-18 Weinblatt Lee S Media survey apparatus and method using thermal imagery
US4651201A (en) * 1984-06-01 1987-03-17 Arnold Schoolman Stereoscopic endoscope arrangement
US4706117A (en) * 1984-06-01 1987-11-10 Arnold Schoolman Stereo laser disc viewing system
WO1986001963A1 (en) * 1984-09-24 1986-03-27 The University Of Adelaide Improvements relating to an apparatus and method related to control through eye gaze direction
GB2177276A (en) * 1984-09-24 1987-01-14 Univ Adelaide Improvements relating to an apparatus and method related to control through eye gaze direction
GB2179147A (en) * 1984-12-24 1987-02-25 Univ Adelaide Improvements relating to eye-gaze-direction controlled apparatus
WO1986003863A1 (en) * 1984-12-24 1986-07-03 The University Of Adelaide Improvements relating to eye-gaze-direction controlled apparatus
US4768088A (en) * 1985-12-04 1988-08-30 Aisin Seiki Kabushikikaisha Apparatus for commanding energization of electrical device
US4838681A (en) * 1986-01-28 1989-06-13 George Pavlidis Method and means for detecting dyslexia
US4859050A (en) * 1986-04-04 1989-08-22 Applied Science Group, Inc. Method and system for generating a synchronous display of a visual presentation and the looking response of many viewers
US4755045A (en) * 1986-04-04 1988-07-05 Applied Science Group, Inc. Method and system for generating a synchronous display of a visual presentation and the looking response of many viewers
US4806011A (en) * 1987-07-06 1989-02-21 Bettinger David S Spectacle-mounted ocular display apparatus
US4798214A (en) * 1987-11-24 1989-01-17 The United States Of America As Represented By The Secretary Of The Air Force Stimulator for eye tracking oculometer
US4903309A (en) * 1988-05-25 1990-02-20 The United States Of America As Represented By The Secretary Of The Army Field programmable aided target recognizer trainer
US4931865A (en) * 1988-08-24 1990-06-05 Sebastiano Scarampi Apparatus and methods for monitoring television viewers
US4852988A (en) * 1988-09-12 1989-08-01 Applied Science Laboratories Visor and camera providing a parallax-free field-of-view image for a head-mounted eye movement measurement system
US6091899A (en) * 1988-09-16 2000-07-18 Canon Kabushiki Kaisha Apparatus for detecting the direction of visual axis and information selecting apparatus utilizing the same
US5001650A (en) * 1989-04-10 1991-03-19 Hughes Aircraft Company Method and apparatus for search and tracking
US5231674A (en) * 1989-06-09 1993-07-27 Lc Technologies, Inc. Eye tracking method and apparatus
US6072445A (en) * 1990-12-31 2000-06-06 Kopin Corporation Head mounted color display system
US6043800A (en) * 1990-12-31 2000-03-28 Kopin Corporation Head mounted liquid crystal display system
US5583335A (en) * 1990-12-31 1996-12-10 Kopin Corporation Method of making an eye tracking system having an active matrix display
US7075501B1 (en) 1990-12-31 2006-07-11 Kopin Corporation Head mounted display system
US5331149A (en) * 1990-12-31 1994-07-19 Kopin Corporation Eye tracking system having an array of photodetectors aligned respectively with an array of pixels
US5246965A (en) * 1991-06-11 1993-09-21 Ciba-Geigy Arylethers, their manufacture and methods of treatment
US5594500A (en) * 1991-10-17 1997-01-14 Canon Kabushiki Kaisha Image pickup apparatus
US5541655A (en) * 1991-11-05 1996-07-30 Canon Kabushiki Kaisha Image pickup device
US5751260A (en) * 1992-01-10 1998-05-12 The United States Of America As Represented By The Secretary Of The Navy Sensory integrated data interface
WO1993018428A3 (en) * 1992-03-13 1994-02-17 Kopin Corp Head-mounted display system
US20030117369A1 (en) * 1992-03-13 2003-06-26 Kopin Corporation Head-mounted display system
US6636185B1 (en) 1992-03-13 2003-10-21 Kopin Corporation Head-mounted display system
US6140980A (en) * 1992-03-13 2000-10-31 Kopin Corporation Head-mounted display system
US6307589B1 (en) 1993-01-07 2001-10-23 Francis J. Maquire, Jr. Head mounted camera with eye monitor and stereo embodiments thereof
US7439940B1 (en) 1993-01-07 2008-10-21 Maguire Jr Francis J Passive virtual reality
US5422653A (en) * 1993-01-07 1995-06-06 Maguire, Jr.; Francis J. Passive virtual reality
US5644324A (en) * 1993-03-03 1997-07-01 Maguire, Jr.; Francis J. Apparatus and method for presenting successive images
US6094182A (en) * 1993-03-03 2000-07-25 Maguire, Jr.; Francis J. Apparatus and method for providing images for viewing at various distances
US6424376B1 (en) 1993-07-30 2002-07-23 Canon Kabushiki Kaisha Selection apparatus using an observer's line of sight
US6411266B1 (en) 1993-08-23 2002-06-25 Francis J. Maguire, Jr. Apparatus and method for providing images of real and virtual objects in a head mounted display
US6690338B1 (en) 1993-08-23 2004-02-10 Francis J. Maguire, Jr. Apparatus and method for providing images of real and virtual objects in a head mounted display
US5451976A (en) * 1993-09-14 1995-09-19 Sony Corporation Image display apparatus
US5691737A (en) * 1993-09-21 1997-11-25 Sony Corporation System for explaining an exhibit using spectacle-type displays
US6448944B2 (en) 1993-10-22 2002-09-10 Kopin Corporation Head-mounted matrix display
US6424321B1 (en) 1993-10-22 2002-07-23 Kopin Corporation Head-mounted matrix display
US6452572B1 (en) 1993-10-22 2002-09-17 Kopin Corporation Monocular head-mounted display system
US8040292B2 (en) 1993-10-22 2011-10-18 Kopin Corporation Portable communication display device
US5815126A (en) * 1993-10-22 1998-09-29 Kopin Corporation Monocular portable communication and display system
US20080122736A1 (en) * 1993-10-22 2008-05-29 Kopin Corporation Portable communication display device
US6683584B2 (en) 1993-10-22 2004-01-27 Kopin Corporation Camera display system
US7310072B2 (en) 1993-10-22 2007-12-18 Kopin Corporation Portable communication display device
US5583795A (en) * 1995-03-17 1996-12-10 The United States Of America As Represented By The Secretary Of The Army Apparatus for measuring eye gaze and fixation duration, and method therefor
US5734421A (en) * 1995-05-30 1998-03-31 Maguire, Jr.; Francis J. Apparatus for inducing attitudinal head movements for passive virtual reality
US6798443B1 (en) 1995-05-30 2004-09-28 Francis J. Maguire, Jr. Apparatus for inducing attitudinal head movements for passive virtual reality
US20050041100A1 (en) * 1995-05-30 2005-02-24 Maguire Francis J. Apparatus for inducing attitudinal head movements for passive virtual reality
USRE45114E1 (en) 1995-05-30 2014-09-09 Susan C. Maguire Apparatus with moveable headrest for viewing images from a changing direction-of-view
US7724278B2 (en) 1995-05-30 2010-05-25 Maguire Francis J Jr Apparatus with moveable headrest for viewing images from a changing direction-of-view
USRE45062E1 (en) 1995-05-30 2014-08-05 Susan C. Maguire Apparatus for inducing attitudinal head movements for passive virtual reality
US6181371B1 (en) 1995-05-30 2001-01-30 Francis J Maguire, Jr. Apparatus for inducing attitudinal head movements for passive virtual reality
USD381346S (en) * 1995-09-13 1997-07-22 Kopin Corporation Head-mountable matrix display
USD388426S (en) * 1996-05-07 1997-12-30 Kopin Corporation Head-mounted display device
US6545650B1 (en) * 1998-06-23 2003-04-08 Nec Corporation Apparatus for three-dimensionally displaying object and method of doing the same
US20180367846A1 (en) * 2000-02-01 2018-12-20 Swisscom Ag System and method for distribution of picture objects
US10097887B2 (en) * 2000-02-01 2018-10-09 Swisscom Ag System and method for distribution of picture objects
US20160044376A1 (en) * 2000-02-01 2016-02-11 Swisscom Ag System and method for distribution of picture objects
EP1403680A1 (en) * 2002-09-27 2004-03-31 The Boeing Company Gaze tracking system, eye-tracking system and an associated method of calibration
US20040061831A1 (en) * 2002-09-27 2004-04-01 The Boeing Company Gaze tracking system, eye-tracking assembly and an associated method of calibration
US7130447B2 (en) 2002-09-27 2006-10-31 The Boeing Company Gaze tracking system, eye-tracking assembly and an associated method of calibration
US20050280603A1 (en) * 2002-09-27 2005-12-22 Aughey John H Gaze tracking system, eye-tracking assembly and an associated method of calibration
US6943754B2 (en) 2002-09-27 2005-09-13 The Boeing Company Gaze tracking system, eye-tracking assembly and an associated method of calibration
US20070121066A1 (en) * 2004-04-28 2007-05-31 Neurocom International, Inc. Diagnosing and Training the Gaze Stabilization System
US7500752B2 (en) 2004-04-28 2009-03-10 Natus Medical Incorporated Diagnosing and training the gaze stabilization system
US20050243277A1 (en) * 2004-04-28 2005-11-03 Nashner Lewis M Isolating and quantifying functional impairments of the gaze stabilization system
US7195355B2 (en) * 2004-04-28 2007-03-27 Neurocom International, Inc. Isolating and quantifying functional impairments of the gaze stabilization system
US20070066916A1 (en) * 2005-09-16 2007-03-22 Imotions Emotion Technology Aps System and method for determining human emotion by analyzing eye properties
US8265743B2 (en) 2007-12-27 2012-09-11 Teledyne Scientific & Imaging, Llc Fixation-locked measurement of brain responses to stimuli
US8986218B2 (en) 2008-07-09 2015-03-24 Imotions A/S System and method for calibrating and normalizing eye data in emotional testing
US20100039353A1 (en) * 2008-08-14 2010-02-18 Honeywell International Inc. Near-to-eye display artifact reduction system and method
US9389419B2 (en) 2008-08-14 2016-07-12 Honeywell International Inc. Near-to-eye display artifact reduction system and method
US8814357B2 (en) 2008-08-15 2014-08-26 Imotions A/S System and method for identifying the existence and position of text in visual media content and for determining a subject's interactions with the text
US8136944B2 (en) 2008-08-15 2012-03-20 iMotions - Eye Tracking A/S System and method for identifying the existence and position of text in visual media content and for determining a subject's interactions with the text
US8537214B2 (en) 2008-09-09 2013-09-17 Airbus Operations Sas Method of regulating a harmonization compensation between video sensor and head up display device, and corresponding devices
FR2935810A1 (en) * 2008-09-09 2010-03-12 Airbus France METHOD FOR ADJUSTING A HARMONIZATION COMPENSATION BETWEEN A VIDEO SENSOR AND A HEAD-UP DISPLAY DEVICE, AND CORRESPONDING DEVICES
US9965681B2 (en) 2008-12-16 2018-05-08 Osterhout Group, Inc. Eye imaging in head worn computing
US20100185113A1 (en) * 2009-01-21 2010-07-22 Teledyne Scientific & Imaging, Llc Coordinating System Responses Based on an Operator's Cognitive Response to a Relevant Stimulus and to the Position of the Stimulus in the Operator's Field of View
US9295806B2 (en) 2009-03-06 2016-03-29 Imotions A/S System and method for determining emotional response to olfactory stimuli
US8758018B2 (en) 2009-12-31 2014-06-24 Teledyne Scientific & Imaging, Llc EEG-based acceleration of second language learning
DE102011050942A1 (en) 2010-06-16 2012-03-08 Visteon Global Technologies, Inc. Reconfiguring a display based on face/eye tracking
US9383579B2 (en) 2011-10-12 2016-07-05 Visteon Global Technologies, Inc. Method of controlling a display component of an adaptive display system
DE102012109622A1 (en) 2011-10-12 2013-04-18 Visteon Global Technologies, Inc. Method for controlling a display component of an adaptive display system
RU2516857C2 (en) * 2012-03-28 2014-05-20 Открытое акционерное общество "НПО "Геофизика-НВ" Method to detect orientation of pilot helmet and device for helmet system of target designation and indication
US20150130714A1 (en) * 2012-05-25 2015-05-14 Sony Computer Entertainment Inc. Video analysis device, video analysis method, and point-of-gaze display system
US9727130B2 (en) * 2012-05-25 2017-08-08 Sony Interactive Entertainment Inc. Video analysis device, video analysis method, and point-of-gaze display system
US11507208B2 (en) 2014-01-17 2022-11-22 Mentor Acquisition One, Llc External user interface for head worn computing
US11231817B2 (en) 2014-01-17 2022-01-25 Mentor Acquisition One, Llc External user interface for head worn computing
US11169623B2 (en) 2014-01-17 2021-11-09 Mentor Acquisition One, Llc External user interface for head worn computing
US10254856B2 (en) 2014-01-17 2019-04-09 Osterhout Group, Inc. External user interface for head worn computing
US11782529B2 (en) 2014-01-17 2023-10-10 Mentor Acquisition One, Llc External user interface for head worn computing
US9939934B2 (en) 2014-01-17 2018-04-10 Osterhout Group, Inc. External user interface for head worn computing
US9615742B2 (en) 2014-01-21 2017-04-11 Osterhout Group, Inc. Eye imaging in head worn computing
US10321821B2 (en) 2014-01-21 2019-06-18 Mentor Acquisition One, Llc Eye imaging in head worn computing
US11947126B2 (en) 2014-01-21 2024-04-02 Mentor Acquisition One, Llc See-through computer display systems
US9651788B2 (en) 2014-01-21 2017-05-16 Osterhout Group, Inc. See-through computer display systems
US9651784B2 (en) 2014-01-21 2017-05-16 Osterhout Group, Inc. See-through computer display systems
US9651783B2 (en) 2014-01-21 2017-05-16 Osterhout Group, Inc. See-through computer display systems
US11892644B2 (en) 2014-01-21 2024-02-06 Mentor Acquisition One, Llc See-through computer display systems
US9651789B2 (en) 2014-01-21 2017-05-16 Osterhout Group, Inc. See-Through computer display systems
US9658457B2 (en) 2014-01-21 2017-05-23 Osterhout Group, Inc. See-through computer display systems
US9658458B2 (en) 2014-01-21 2017-05-23 Osterhout Group, Inc. See-through computer display systems
US11796805B2 (en) 2014-01-21 2023-10-24 Mentor Acquisition One, Llc Eye imaging in head worn computing
US11796799B2 (en) 2014-01-21 2023-10-24 Mentor Acquisition One, Llc See-through computer display systems
US9684171B2 (en) 2014-01-21 2017-06-20 Osterhout Group, Inc. See-through computer display systems
US20150205105A1 (en) * 2014-01-21 2015-07-23 Osterhout Group, Inc. Eye imaging in head worn computing
US9684165B2 (en) 2014-01-21 2017-06-20 Osterhout Group, Inc. Eye imaging in head worn computing
US11737666B2 (en) 2014-01-21 2023-08-29 Mentor Acquisition One, Llc Eye imaging in head worn computing
US9715112B2 (en) 2014-01-21 2017-07-25 Osterhout Group, Inc. Suppression of stray light in head worn computing
US11669163B2 (en) 2014-01-21 2023-06-06 Mentor Acquisition One, Llc Eye glint imaging in see-through computer display systems
US9720227B2 (en) 2014-01-21 2017-08-01 Osterhout Group, Inc. See-through computer display systems
US9720235B2 (en) 2014-01-21 2017-08-01 Osterhout Group, Inc. See-through computer display systems
US9720234B2 (en) 2014-01-21 2017-08-01 Osterhout Group, Inc. See-through computer display systems
US11650416B2 (en) 2014-01-21 2023-05-16 Mentor Acquisition One, Llc See-through computer display systems
US11619820B2 (en) 2014-01-21 2023-04-04 Mentor Acquisition One, Llc See-through computer display systems
US9740280B2 (en) 2014-01-21 2017-08-22 Osterhout Group, Inc. Eye imaging in head worn computing
US9740012B2 (en) 2014-01-21 2017-08-22 Osterhout Group, Inc. See-through computer display systems
US11622426B2 (en) 2014-01-21 2023-04-04 Mentor Acquisition One, Llc See-through computer display systems
US9746676B2 (en) 2014-01-21 2017-08-29 Osterhout Group, Inc. See-through computer display systems
US9753288B2 (en) 2014-01-21 2017-09-05 Osterhout Group, Inc. See-through computer display systems
US9766463B2 (en) 2014-01-21 2017-09-19 Osterhout Group, Inc. See-through computer display systems
US9772492B2 (en) 2014-01-21 2017-09-26 Osterhout Group, Inc. Eye imaging in head worn computing
US11487110B2 (en) 2014-01-21 2022-11-01 Mentor Acquisition One, Llc Eye imaging in head worn computing
US9811159B2 (en) 2014-01-21 2017-11-07 Osterhout Group, Inc. Eye imaging in head worn computing
US9811152B2 (en) 2014-01-21 2017-11-07 Osterhout Group, Inc. Eye imaging in head worn computing
US9811153B2 (en) 2014-01-21 2017-11-07 Osterhout Group, Inc. Eye imaging in head worn computing
US11353957B2 (en) 2014-01-21 2022-06-07 Mentor Acquisition One, Llc Eye glint imaging in see-through computer display systems
US9436006B2 (en) 2014-01-21 2016-09-06 Osterhout Group, Inc. See-through computer display systems
US9829703B2 (en) 2014-01-21 2017-11-28 Osterhout Group, Inc. Eye imaging in head worn computing
US9836122B2 (en) 2014-01-21 2017-12-05 Osterhout Group, Inc. Eye glint imaging in see-through computer display systems
US11126003B2 (en) 2014-01-21 2021-09-21 Mentor Acquisition One, Llc See-through computer display systems
US11103132B2 (en) 2014-01-21 2021-08-31 Mentor Acquisition One, Llc Eye imaging in head worn computing
US11099380B2 (en) 2014-01-21 2021-08-24 Mentor Acquisition One, Llc Eye imaging in head worn computing
US9885868B2 (en) 2014-01-21 2018-02-06 Osterhout Group, Inc. Eye imaging in head worn computing
US9927612B2 (en) 2014-01-21 2018-03-27 Osterhout Group, Inc. See-through computer display systems
US11054902B2 (en) 2014-01-21 2021-07-06 Mentor Acquisition One, Llc Eye glint imaging in see-through computer display systems
US9933622B2 (en) 2014-01-21 2018-04-03 Osterhout Group, Inc. See-through computer display systems
US9529195B2 (en) 2014-01-21 2016-12-27 Osterhout Group, Inc. See-through computer display systems
US11002961B2 (en) 2014-01-21 2021-05-11 Mentor Acquisition One, Llc See-through computer display systems
US9952664B2 (en) 2014-01-21 2018-04-24 Osterhout Group, Inc. Eye imaging in head worn computing
US9958674B2 (en) 2014-01-21 2018-05-01 Osterhout Group, Inc. Eye imaging in head worn computing
US9529192B2 (en) * 2014-01-21 2016-12-27 Osterhout Group, Inc. Eye imaging in head worn computing
US10001644B2 (en) 2014-01-21 2018-06-19 Osterhout Group, Inc. See-through computer display systems
US10890760B2 (en) 2014-01-21 2021-01-12 Mentor Acquisition One, Llc See-through computer display systems
US9529199B2 (en) 2014-01-21 2016-12-27 Osterhout Group, Inc. See-through computer display systems
US10866420B2 (en) 2014-01-21 2020-12-15 Mentor Acquisition One, Llc See-through computer display systems
US9523856B2 (en) 2014-01-21 2016-12-20 Osterhout Group, Inc. See-through computer display systems
US10698223B2 (en) 2014-01-21 2020-06-30 Mentor Acquisition One, Llc See-through computer display systems
US9494800B2 (en) 2014-01-21 2016-11-15 Osterhout Group, Inc. See-through computer display systems
US9594246B2 (en) 2014-01-21 2017-03-14 Osterhout Group, Inc. See-through computer display systems
US10579140B2 (en) 2014-01-21 2020-03-03 Mentor Acquisition One, Llc Eye glint imaging in see-through computer display systems
US11822090B2 (en) 2014-01-24 2023-11-21 Mentor Acquisition One, Llc Haptic systems for head-worn computers
US9939646B2 (en) 2014-01-24 2018-04-10 Osterhout Group, Inc. Stray light suppression for head worn computing
US10558050B2 (en) 2014-01-24 2020-02-11 Mentor Acquisition One, Llc Haptic systems for head-worn computers
US9843093B2 (en) 2014-02-11 2017-12-12 Osterhout Group, Inc. Spatial location presentation in head worn computing
US9841602B2 (en) 2014-02-11 2017-12-12 Osterhout Group, Inc. Location indicating avatar in head worn computing
US9784973B2 (en) 2014-02-11 2017-10-10 Osterhout Group, Inc. Micro doppler presentations in head worn computing
US9401540B2 (en) 2014-02-11 2016-07-26 Osterhout Group, Inc. Spatial location presentation in head worn computing
US9547465B2 (en) 2014-02-14 2017-01-17 Osterhout Group, Inc. Object shadowing in head worn computing
US9928019B2 (en) 2014-02-14 2018-03-27 Osterhout Group, Inc. Object shadowing in head worn computing
US10191279B2 (en) 2014-03-17 2019-01-29 Osterhout Group, Inc. Eye imaging in head worn computing
US9423612B2 (en) 2014-03-28 2016-08-23 Osterhout Group, Inc. Sensor dependent content position in head worn computing
US11104272B2 (en) 2014-03-28 2021-08-31 Mentor Acquisition One, Llc System for assisted operator safety using an HMD
US11227294B2 (en) 2014-04-03 2022-01-18 Mentor Acquisition One, Llc Sight information collection in head worn computing
US11727223B2 (en) 2014-04-25 2023-08-15 Mentor Acquisition One, Llc Language translation with head-worn computing
US10853589B2 (en) 2014-04-25 2020-12-01 Mentor Acquisition One, Llc Language translation with head-worn computing
US9672210B2 (en) 2014-04-25 2017-06-06 Osterhout Group, Inc. Language translation with head-worn computing
US11474360B2 (en) 2014-04-25 2022-10-18 Mentor Acquisition One, Llc Speaker assembly for headworn computer
US11880041B2 (en) 2014-04-25 2024-01-23 Mentor Acquisition One, Llc Speaker assembly for headworn computer
US9651787B2 (en) 2014-04-25 2017-05-16 Osterhout Group, Inc. Speaker assembly for headworn computer
US10634922B2 (en) 2014-04-25 2020-04-28 Mentor Acquisition One, Llc Speaker assembly for headworn computer
US9746686B2 (en) 2014-05-19 2017-08-29 Osterhout Group, Inc. Content position calibration in head worn computing
US11402639B2 (en) 2014-06-05 2022-08-02 Mentor Acquisition One, Llc Optical configurations for head-worn see-through displays
US9841599B2 (en) 2014-06-05 2017-12-12 Osterhout Group, Inc. Optical configurations for head-worn see-through displays
US10877270B2 (en) 2014-06-05 2020-12-29 Mentor Acquisition One, Llc Optical configurations for head-worn see-through displays
US11663794B2 (en) 2014-06-09 2023-05-30 Mentor Acquisition One, Llc Content presentation in head worn computing
US10663740B2 (en) 2014-06-09 2020-05-26 Mentor Acquisition One, Llc Content presentation in head worn computing
US9575321B2 (en) 2014-06-09 2017-02-21 Osterhout Group, Inc. Content presentation in head worn computing
US10649220B2 (en) 2014-06-09 2020-05-12 Mentor Acquisition One, Llc Content presentation in head worn computing
US10139635B2 (en) 2014-06-09 2018-11-27 Osterhout Group, Inc. Content presentation in head worn computing
US11327323B2 (en) 2014-06-09 2022-05-10 Mentor Acquisition One, Llc Content presentation in head worn computing
US11790617B2 (en) 2014-06-09 2023-10-17 Mentor Acquisition One, Llc Content presentation in head worn computing
US11887265B2 (en) 2014-06-09 2024-01-30 Mentor Acquisition One, Llc Content presentation in head worn computing
US11360318B2 (en) 2014-06-09 2022-06-14 Mentor Acquisition One, Llc Content presentation in head worn computing
US9720241B2 (en) 2014-06-09 2017-08-01 Osterhout Group, Inc. Content presentation in head worn computing
US10976559B2 (en) 2014-06-09 2021-04-13 Mentor Acquisition One, Llc Content presentation in head worn computing
US11022810B2 (en) 2014-06-09 2021-06-01 Mentor Acquisition One, Llc Content presentation in head worn computing
US11054645B2 (en) 2014-06-17 2021-07-06 Mentor Acquisition One, Llc External user interface for head worn computing
US10698212B2 (en) 2014-06-17 2020-06-30 Mentor Acquisition One, Llc External user interface for head worn computing
US11789267B2 (en) 2014-06-17 2023-10-17 Mentor Acquisition One, Llc External user interface for head worn computing
US9810906B2 (en) 2014-06-17 2017-11-07 Osterhout Group, Inc. External user interface for head worn computing
US11294180B2 (en) 2014-06-17 2022-04-05 Mentor Acquisition One, Llc External user interface for head worn computing
US11103122B2 (en) 2014-07-15 2021-08-31 Mentor Acquisition One, Llc Content presentation in head worn computing
US11269182B2 (en) 2014-07-15 2022-03-08 Mentor Acquisition One, Llc Content presentation in head worn computing
US11786105B2 (en) 2014-07-15 2023-10-17 Mentor Acquisition One, Llc Content presentation in head worn computing
US10908422B2 (en) 2014-08-12 2021-02-02 Mentor Acquisition One, Llc Measuring content brightness in head worn computing
US11360314B2 (en) 2014-08-12 2022-06-14 Mentor Acquisition One, Llc Measuring content brightness in head worn computing
US9829707B2 (en) 2014-08-12 2017-11-28 Osterhout Group, Inc. Measuring content brightness in head worn computing
US11630315B2 (en) 2014-08-12 2023-04-18 Mentor Acquisition One, Llc Measuring content brightness in head worn computing
US9671613B2 (en) 2014-09-26 2017-06-06 Osterhout Group, Inc. See-through computer display systems
US11809628B2 (en) 2014-12-03 2023-11-07 Mentor Acquisition One, Llc See-through computer display systems
US10684687B2 (en) 2014-12-03 2020-06-16 Mentor Acquisition One, Llc See-through computer display systems
US9684172B2 (en) 2014-12-03 2017-06-20 Osterhout Group, Inc. Head worn computer display systems
US11262846B2 (en) 2014-12-03 2022-03-01 Mentor Acquisition One, Llc See-through computer display systems
USD792400S1 (en) 2014-12-31 2017-07-18 Osterhout Group, Inc. Computer glasses
USD794637S1 (en) 2015-01-05 2017-08-15 Osterhout Group, Inc. Air mouse
US10062182B2 (en) 2015-02-17 2018-08-28 Osterhout Group, Inc. See-through computer display systems
US10874357B2 (en) * 2016-10-11 2020-12-29 Tokai Optical Co., Ltd. Eye movement measuring device and eye movement analysis system

Also Published As

Publication number Publication date
GB1540992A (en) 1979-02-21

Similar Documents

Publication Publication Date Title
US4034401A (en) Observer-identification of a target or other point of interest in a viewing field
US3793481A (en) Range scoring system
US7391887B2 (en) Eye tracking systems
US7046215B1 (en) Head tracker system
EP0459295B1 (en) Aircraft docking guidance system which takes position reference in anti-collision light of aircraft
US7682026B2 (en) Eye location and gaze detection system and method
US4274609A (en) Target and missile angle tracking method and system for guiding missiles on to targets
EP0055338A1 (en) Eye controlled user-machine communication
US20030067537A1 (en) System and method for three-dimensional data acquisition
CN107894189B (en) Photoelectric sighting system and automatic target-point tracking method therefor
US3953669A (en) Video tracking system
US4893922A (en) Measurement system and measurement method
CN109215063A (en) Method for registering an event-triggered camera with a three-dimensional laser radar
JP2003065812A (en) System for forming numerical data based on image information in a measuring instrument display
US6980210B1 (en) 3D stereo real-time sensor system, method and computer program therefor
US4812713A (en) Automatic closed loop scaling and drift correcting system and method
US10176375B2 (en) High speed pupil detection system and method
Frobin et al. Automatic measurement of body surfaces using rasterstereography
GB2041689A (en) Vehicle movement sensing
CN112762763B (en) Visual perception system
JPS62194413A (en) Three-dimensional coordinate measuring instrument
CN110069131A (en) Multi-fingertip localization method based on near-infrared light spot detection
CN108715233A (en) Unmanned aerial vehicle flight precision determination method
KR102317913B1 (en) Billiards System Capable of Contents Image Presentation
Hughett Projectile velocity and spin rate by image processing of synchro-ballistic photography