CA2171314C - Electro-optic vision systems which exploit position and attitude - Google Patents

Electro-optic vision systems which exploit position and attitude

Info

Publication number
CA2171314C
Authority
CA
Canada
Prior art keywords
image
scene
computer
attitude
display
Prior art date
Legal status
Expired - Lifetime
Application number
CA002171314A
Other languages
French (fr)
Other versions
CA2171314A1 (en)
Inventor
John Ellenby
Thomas W. Ellenby
Current Assignee
Geovector Corp
Original Assignee
Geovector Corp
Priority date
Filing date
Publication date
Application filed by Geovector Corp
Publication of CA2171314A1
Application granted
Publication of CA2171314C
Anticipated expiration
Status: Expired - Lifetime


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 3D [Three Dimensional] image rendering
    • G06T15/10 Geometric effects
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C17/00 Compasses; Devices for ascertaining true or magnetic north for navigation or surveying purposes
    • G01C17/34 Sun- or astro-compasses
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20 Instruments for performing navigational calculations
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04815 Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 Manipulating 3D models or images for computer graphics
    • G06T19/006 Mixed reality
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/62 Control of parameters via user interfaces
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00 Program-control systems
    • G05B2219/30 Nc systems
    • G05B2219/32 Operator till task planning
    • G05B2219/32014 Augmented reality assists operator in maintenance, repair, programming, assembly, use of head mounted display with 2-D 3-D display and voice feedback, voice and gesture command

Abstract

The present invention is generally concerned with electronic vision devices (1) and methods, and is specifically concerned with image augmentation in combination with navigation, position, and attitude devices. In the simplest form, devices (1) of the invention include six major components: a camera, a computer processor, devices to measure the position and attitude of the camera, a data base for storing information associated with various scenes, and a display (1) where an image (3) is continuously aligned to a real scene as viewed by a user (2). In the present invention, an image of some real scene is altered by a computer processor to include stored information of a scene in a storage location identified by the real time position and attitude of the vision system (1). The primary function of the invention is to present augmented real images (3), or images that represent a real scene but have deletions, additions, and supplements, and data that is continuously aligned with the real scene as naturally viewed by the user (2) of the system.

Description

"Electro-Optic Vision Systems which Exploit Position and Attitude"

Specification for a Letters Patent

Background of the Invention

The present invention is generally concerned with electronic vision devices and methods, and is specifically concerned with image augmentation in combination with navigation, position, and attitude devices.
One may have to look quite far into the annals of history to find the first uses of maps. Maps generally provide information to alert a user to things that are not readily apparent from simple viewing of a real scene from the user's location. For example, a user of a city road map may not be able to see a tunnel on Elm Street if the user is currently seven miles away on First Street and looking in the direction of the Elm Street tunnel. However, from the First Street location, the user could determine from a road map that there is a tunnel on Elm Street. He could learn that the tunnel is three miles long, starts on Eighth Street and ends on Eleventh Street. There may even be an indication of the size of the tunnel, such that it could accommodate four traffic lanes and a bicycle lane.
Unfortunately, it is not always possible to translate the information from a map to the real scene that the information represents as the scene is actually viewed. It is common for users of maps to attempt to align the map to reality to get a better "feel" of where things are in relation to the real world. Those who are familiar with maps can verify that the convention of drawing maps with north generally toward the top of the map is of little use when translating the information to the scene of interest. Regardless of where north is, one tends to turn the map so that the direction ahead of the user, or in the direction of travel, in a real scene matches that direction on the map. This may result in the condition of an "upside down" map that is quite difficult to read (the case when the user is traveling south). Although translating the directions of the map to reality is a formidable task, it is an even greater problem to translate the symbols on the map to those objects in reality which they represent. The tunnel symbol on the map does not show what the real tunnel actually looks like. The fact that the appearance of the tunnel from infinitely many points of view is prohibitively difficult to represent on a map accounts for the use of a simple symbol. Furthermore, the map does not have any indication from which point of view the user will first see the tunnel, nor any indication of the path which the user will take to approach the tunnel.

It is now possible to computerize city road map information and display the maps according to the path taken by a user. The map is updated in "real time" according to the progress of the user through the city streets. It is therefore possible to relieve the problem of upside-down maps, as the computer could redraw the map with the text in correct orientation relative to the user even when one is traveling in a southerly direction. The computer generated map is displayed at a monitor that can be easily refreshed with new information as the user progresses along his journey. Maps of this type for automobiles are well known in the art. Even very sophisticated maps with computer generated indicia to assist the user in decision making are available and described in patents such as DeJong, US #5,115,398. This device can display a local scene as it may appear and superimpose onto the scene symbolic information that suggests an action to be taken by the user, for example a left turn as is shown in figure 3 of the disclosure. Even in these advanced systems, a high level of translation is required of the user. The computer generated map does not attempt to present an accurate alignment of displayed images to the real objects which they represent. Devices employing image supplementation are known and include Head Up Displays, HUDs, and Helmet Mounted Displays, HMDs.
A HUD is a useful vision system which allows a user to view a real scene, usually through an optical image combiner such as a holographic mirror or a dichroic beamsplitter, and have superimposed thereon navigational information, for example symbols of real or imaginary objects, vehicle speed and altitude data, et cetera. It is a primary goal of the HUD to maximize the time that the user is looking into the scene of interest. For a fighter pilot, looking at a display device located nearby on an instrument panel, changing the focus of one's eyes to read that device, and returning to the scene of interest requires a critically long time and could cause a fatal error. A HUD allows a fighter pilot to maintain continuous concentration on a scene at optical infinity while reading instruments that appear to the eye to also be located at optical infinity, thereby eliminating the need to refocus one's eyes. A HUD allows a pilot to maintain a "head-up" position at all times. For the airline industry, HUDs have been used to land airplanes in low visibility conditions. HUDs are particularly useful in a landing situation where the boundaries of a runway are obscured in the pilot's field of view by fog, but artificial boundaries can be projected onto the optical combiner of the HUD system to show where in the user's vision field the real runway boundaries are. The virtual runway projection is positioned in the vision field according to data generated by communication between a computer and the airport instrument landing system, ILS, which employs a VHF radio beam. The system provides the computer with two data figures: first, a glide slope figure, and second, a localizer, which is a lateral position figure. With these data, the computer is able to generate an optical (photon) image to be projected and combined with the real (photon) scene that passes through the combiner, thereby enhancing certain features of the real scene, for example runway boundaries. The positioning of the overlay depends on the accuracy of the airplane boresight being in alignment with the ILS beam, and on other physical limitations. The computer is not able to recognize images in the real scene and does not attempt to manipulate the real scene except for highlighting parts thereof. HUDs are particularly characterized in that they are an optical combination of two photon scenes: a first scene, the one normally viewed by the user's eyes, passes through an optical combiner, and a second, computer generated photon image is combined with the real image at an optical element. In a HUD device it is not possible for the computer to address objects of the real scene, for example to alter or delete them. The system only adds enhancement to a feature of the real image by drawing interesting features thereon.
Finally, HUDs are very bulky and are typically mounted into an airplane or automobile; they require a great deal of space and complex optics including holograms and specially designed lenses.
Helmet Mounted Displays, HMDs, are similar to HUDs in that they also combine enhancement images with real scene photon images, but they typically have very portable components. Micro CRTs and small combiners make the entire system helmet mountable. It is a complicated matter to align computer generated images to a real scene in relation to a fast moving helmet. HUDs can align the data generated image that is indexed to the slow moving airplane axis, which moves slowly in relation to a runway. For this reason, HMDs generally display data that does not change with the pilot's head movements, such as altitude and airspeed. HMDs suffer the same limitation as HUDs in that they do not provide the capacity to remove or augment elements of the real image.
Another related concept that has resulted in a rapidly developing field of computer assisted vision systems is known as virtual reality, VR. Probably best embodied in the fictional television program "Star Trek: The Next Generation," the "Holodeck" is a place where a user can go to have all of his surroundings generated by a computer so as to appear to the user to be another place, or another place and time.
Virtual reality systems are useful in particular as a training means, for example in aircraft simulation devices. A student pilot can be surrounded by a virtual "cockpit," which is essentially a computer interface whereby the user "feels" the environment that may be present in a real aircraft, in a very real way, perhaps enhanced with computer generated sounds, images and even mechanical stimuli.
Actions taken by the user may be interpreted by the computer, and the computer can respond to those actions to control the stimuli that surround the user. VR machines can create an entire visual scene, and there is no effort to superimpose a computer generated scene onto a real scene. A VR device generally does not have any communication between its actual location in reality and the stimuli being presented to the user. The location of the VR machine and the location of the scene being generated generally have no physical relationship.
VR systems can be used to visualize things that do not yet exist. For example, a home can be completely modeled with a computer so that a potential buyer can "walk through" before it is even built. The buyer could enter the VR atmosphere and proceed through computer generated images and stimuli that accurately represent what a home would be like once it is built. In this way, one could know if a particular style of home is likable before the large cost of building the home is incurred. The VR machine, being entirely programmed with information from a designer, does not anticipate things that presently exist, and there is no communication between the elements presented in the VR system and those elements existing in reality.
While the systems and inventions of the prior art are designed to achieve particular goals, features, advantages, and objectives, some of those being no less than remarkable, these systems and inventions have limitations and faults that prevent their use in ways that are only possible by way of the present invention. The prior art systems and inventions cannot be used to realize the advantages and objectives of the present invention.

Summary of the Invention

Comes now an invention of a vision system including devices and methods of augmented reality, wherein an image of some real scene is altered by a computer processor to include information from a data base having stored information of that scene in a storage location that is identified by the real time position and attitude of the vision system. It is a primary function of the vision system of the invention, and a contrast to the prior art, to present augmented real images and data that are continuously aligned with the real scene as that scene is naturally viewed by the user of the vision system. An augmented image is one that represents a real scene but has deletions, additions and supplements. The camera of the device has an optical axis which defines the direction of viewing, as in a simple "camcorder" type video camera where the image displayed accurately represents the real scene as it appears from the point of view of one looking along the optical axis. In this way, one easily orients the information displayed to the world as it exists. A fundamental difference between the vision system of the invention and that of a camcorder can be found in the image augmentation. While a camcorder may present the superposition of an image and data, such as a "low battery" indicator, et cetera, it has no "knowledge" of the scene that is being viewed; the data displayed usually relate to the vision device or to something independent of the scene, such as the time and date. Image augmentation of the invention, by contrast, can include information particular to the scene being viewed with the invention.
The vision system of the invention can include a data base with prerecorded information regarding various scenes. The precise position and attitude of the vision system indicate to the data base the scene that is being viewed. A computer processor can receive information about the particular scene from the data base and can then augment an image of the scene generated by the camera of the vision system, presenting a final image at the display which includes a combination of information from the optical input and information that was stored in the data base. Particularly important is the possibility of communication between the data from the data base and the real image. Analyzing and processing routines may include recognition of items in the real scene and comparisons with artifacts of the stored data. This could be useful in alignment of the real images to the recalled data. In a situation where the optical input of a scene is entirely blocked from the camera of the system, for example by dense fog, an image of the scene can be generated which includes only information from the data base. Alternatively, the data base, being of finite size, may not have any information about a particular scene. In this case, the image presented at the display would be entirely from the optical input of the real scene. This special case reduces the vision system to the equivalent of a simple camcorder or electronic binocular. It is also a very special case where the features of a real scene are selectively removed. If the bright lights of a city-scape obstruct the more subtle navigation lights of a marine port of entry, then it is possible for the processing routines to discriminate between the city lights and the navigation lights. The undesirable city lights could be removed in the processor before the final image is displayed. In the final image, the important navigation lights show clearly and the city lights are not present at all. Therefore, a final image of the invention can be comprised of information from two sources, in various combinations, superimposed together to form a single, high information-density image. The information of the two sources is compared and combined to form a single augmented image that is presented at a display and is aligned to the real scene as the scene is viewed by the user.
In the simplest form, devices of the invention can be envisioned to include six major components: 1) a camera to collect optical information about a real scene and present that information as an electronic signal to 2) a computer processor; 3) a device to measure the position of the camera; and 4) a device to measure the attitude of the camera (the direction of the optic axis), thus uniquely identifying the scene being viewed, and thus identifying a location in 5) a data base where information associated with various scenes is stored; the computer processor combines the data from the camera and the data base and perfects a single image to be presented at 6) a display whose image is continuously aligned to the real scene as it is viewed by the user.
The camera is related to the position measuring device and the attitude measuring device in that the measurements of position and attitude are made at the camera with respect to arbitrary references. The position and attitude measurement means are related to the data base in that the values of those measurements specify particular data base locations where particular image data are stored. We can think of the position and attitude measurements as defining the data base pointer of two orthogonal variables. The camera is related to the computer processor in that the image generated at the camera is an electronic image and is processed by the computer processor. The data base is related to the computer in that the data base furnishes the processor information, including images, for use in processing routines. The display is related to the computer as it receives the final processed image and converts the computer's electric image signal into an optical image that can be viewed by the user. The display is boresight aligned with the optical axis of the camera such that the information corresponds to reality and appears to the user in a way that allows the user to view the final augmented image without needing to translate the image to the orientation of the real scene. A sketch of these relationships appears below.
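By way of illustration only, the six components and their relationships might be sketched as Python interfaces as follows. The class names, method names, and coordinate conventions are assumptions made for this sketch; the specification itself prescribes no software design.

    from dataclasses import dataclass
    from typing import Optional, Protocol, Tuple

    import numpy as np

    @dataclass
    class Pose:
        # Position in a local metric grid (x, y, z) and attitude as
        # (heading, tilt, roll) in degrees; both conventions are assumptions.
        position: Tuple[float, float, float]
        attitude: Tuple[float, float, float]

    class Camera(Protocol):
        def capture(self) -> np.ndarray: ...        # 1) electronic image of the scene

    class PositionSensor(Protocol):
        def read_position(self) -> Tuple[float, float, float]: ...  # 3) e.g. a GPS fix

    class AttitudeSensor(Protocol):
        def read_attitude(self) -> Tuple[float, float, float]: ...  # 4) optic axis direction

    class SceneDatabase(Protocol):
        def recall(self, pose: Pose) -> Optional[np.ndarray]: ...   # 5) stored scene data

    class Display(Protocol):
        def show(self, image: np.ndarray) -> None: ...  # 6) boresight-aligned output

The computer processor (component 2) is then simply the code that reads from the first five interfaces and writes to the sixth.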
In the simplest form, methods of the invention can be envisioned to include seven major steps: 1) an acquire step whereby the light from a scene is imaged by a lens; 2) a conversion step whereby optical information of the acquire step is converted into an electrical signal; 3) a position determining step in which the position of the camera is measured; 4) an attitude determining step in which the attitude of the camera is measured; 5) a data recall step where a data location, selected in accordance with the measurements in steps 3 and 4 and user input data, is recalled by a computer processor; 6) a processing step wherein data from the data store and the electronic image are combined and processed; and 7) a display step wherein a processed final image is displayed.
The product of the acquire step, an optical image, is converted to an electric signal in the conversion step. The electronic image of the conversion step is transmitted to the processor in the processing step. The products of the position determining step and attitude determining step are values that are used in the data recall step. The result of the data recall step is also transmitted to the processor to be combined with the electronic image of the conversion step in the processing step. One pass through these steps is sketched below.
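Continuing the component sketch above, one pass through the seven steps could look like this; the combination rule shown is a deliberately crude placeholder for the comparison, analysis, and processing routines described here.

    def run_frame(camera: Camera,
                  position_sensor: PositionSensor,
                  attitude_sensor: AttitudeSensor,
                  database: SceneDatabase,
                  display: Display) -> None:
        """One pass through the seven steps, using the interfaces sketched above."""
        frame = camera.capture()                       # steps 1-2: acquire and convert
        pose = Pose(position_sensor.read_position(),   # step 3: position determination
                    attitude_sensor.read_attitude())   # step 4: attitude determination
        overlay = database.recall(pose)                # step 5: data recall
        if overlay is None:
            final = frame        # no stored data for this scene: plain camera view
        else:
            final = combine(frame, overlay)            # step 6: processing
        display.show(final)                            # step 7: display, aligned to scene

    def combine(frame: np.ndarray, overlay: np.ndarray) -> np.ndarray:
        # Placeholder rule: where the overlay defines a pixel (non-NaN), it
        # replaces the camera pixel; real processing would be far richer.
        return np.where(np.isnan(overlay), frame, overlay)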

The product of the processing step, a final electronic representation of an augmented image, is transmitted to, and displayed in, optical format in the display step. The invention will be summarized further by presenting six examples of the invention, wherein a description of the devices, methods, and uses thereof follows.
In a first summary example of the invention, the reader is to imagine a scenario where a boat is approaching a port of entry and the user of the invention is a navigation officer of that boat. It is quite common for a navigation officer to require many aids to guide his course through a shipping channel. Charts, a compass, lighted buoys, sonic devices, ranges, radar, and binoculars are some of the instruments that one may use to navigate a boat. Recent advances in position determining technologies, in particular the Global Positioning System, or GPS, have simplified the task of navigation. With the GPS, a navigation officer can rely on knowing the position of the craft to within approximately ±300 feet, north and east, and in some special cases within less. Even with such a good position determination, the navigation officer must locate where on the chart his position corresponds, and identify symbols on the chart to create a mental image of his surroundings. Then the navigation officer must look about at the real scene before him for identifiable objects to determine how what he sees corresponds to the symbols on the chart. Frequently, visibility is limited by darkness or weather conditions, and particular lights must be recognized to identify chart markings. These can be colored flashing lights and can easily be mistaken for the bright lights of a city skyline. In other cases, the markers may be unlit and may be impossible to find in the dark. Dangerous objects, for example sunken ships, kelp, and reefs, are generally marked on the chart but cannot be seen by the navigation officer because they can be partially or entirely submerged. The navigation officer must imagine in his mind his position with respect to objects in the real scene and those on the chart, and must also imagine where in the real scene the chart is warning of dangers. This procedure requires many complex translations and interpretations between the real scene, the markers of the chart, the scene as it is viewed by the navigation officer, and the chart as understood by the navigation officer. Obviously, there is great potential for mistakes. Many very skilled and experienced naval navigators have failed the complicated task of safely navigating into a port, resulting in tragic consequences. With the system of the invention, a navigation officer can look with certainty at a scene and locate exactly known marks. The system of the invention eliminates the need for determining where in a real scene the symbols of a chart correspond. The user of the invention can position the display between his eyes and the real scene to see an image of the real scene with the symbols of a chart superimposed thereon. In the navigator's mind it becomes very concrete where the otherwise invisible reefs are located, which lights are the real navigation lights, and which lights are simply street lights. It is possible for the computer to remove information such as stray lights from the image as it is recorded from the real scene and to present only those lights that are used for navigation in the display. This is possible because the data base of the invention "knows" of all navigation lights and the processor can eliminate any others. The display that the user views includes a representation of a scene with complicated undesirable objects removed and useful data and objects added thereto. Whenever the navigator points the device of the invention in some direction, the device records the optical image of the real scene, simultaneously determines the position and attitude of the device, and calls on a data base for information regarding the scene being observed. The processor analyzes the image and any data recalled and combines them to form the final displayed image. One way such light filtering could be realized is sketched below.
In a further summary example of the invention, a city planning commission may wish to know how a proposed building may look in the skyline of the city. Of course it is possible to make a photograph of the skyline and to airbrush the proposed building into the photograph. This commonly used method has some shortfalls. It shows only a single perspective of the proposed building, which may be presented in the "best light" by a biased developer (or an "undesirable light" by a biased opponent or competitor). The building may be presented to appear very handsome next to the city hall as shown in the developer's rendition. Since only one perspective is generally shown in a photograph, it may be impossible to determine the full impact the building may have with respect to other points of view. It may not be clear from the prepared photograph that the beautiful bay view enjoyed by users of city hall would be blocked after the building is constructed. With the current invention, the details of every perspective could be easily visualized. Data that accurately represents the proposed building could be entered into a data base of a device of the invention.
When the camera of the invention is pointed in the direction of the new building, the camera portion records the real scene as it appears and transmits that signal to a processor. The device accurately determines the position and attitude of the camera with respect to the scene and recalls data from the data base that properly represents the perspective of the building from that point of view. The processor then combines the real scene with the data of the proposed building to create a final image of the building from that particular perspective. It would even be possible for a helicopter to fly in a circle around the location of the building and for a user to see it from all possible points of view. A council member could see what the future structure would be like in real life from any perspective before voting to approve the plan.
In a still further summary example, we choose a scenario where an engineer uses products of the invention for analysis and troubleshooting of an engineering problem: in particular, the case where a problem has been detected in the plumbing aboard a submarine. The complicated works, including pipes, tubes, pumps, cables, wires, et cetera, of a submarine may be extremely difficult to understand by looking at a design plan and translating the information from the plan to the real world. Immediate and positive identification of a particular element may be critical to survival of the ship in an emergency. The following illustrative engineering use of the invention provides for a greatly simplified way of positive identification of engineering features.
An engineer aboard a submarine is tasked to work in the torpedo room on a saltwater pump used to pump down the torpedo tubes after use. In preparation for the job, the data base of a portable electro-optic vision device is updated from the ship's central computers with information regarding the details of the pump in question and the details of the torpedo room where the pump is located. In the event of a battle damaged ship, or in case of limited visibility due to fire or power outages, the vision device can provide guidance to the location of the pump through visual and audio clues. As the various pipes may be routed through bulkheads and behind walls, the vision system can be used to see through walls and to "fill in" the locations of the pipes so that the otherwise hidden pipe can be followed continuously and without ambiguity. Upon arriving at the pump in question, the engineer points the camera axis of the vision device in the direction of the pump. In the display of the device, a real image of the pump is superimposed with data such as the part number of the pump and clues to features of the pump, such as the type of material being pumped and the flow direction. The vision device can receive inputs from the user to select advanced display modes. This input may be made by way of mechanical devices like a "mouse" or a simple button, or may be verbal commands that are recognized electronically by speech recognition means. If the pump is to be disassembled, the user may be able to instruct the device to display sequential disassembly steps by a verbal "step" command. Disassembly could then be simplified by clues provided by the vision system's display. A particular part can be highlighted and the motion required for its proper removal can be simulated, such that the user can learn the correct disassembly procedure in real time. Similarly, re-assembly can be expedited because prior knowledge of the re-assembly sequence rests in the data base and is easily presented to the user in a way that does not require translation from a parts book to reality. A highlighted part and the motion to assemble that part can be superimposed onto a real image of the workpiece.
In a still further summary example, one can imagine the complexity of a situation where a structure fire is being fought. It may be typical for a fire captain to be reading the detailed features of the building interior from a blueprint and to transmit that information by radio to firefighters on the inside of the building. The problem with this method is that it depends on multiple translations between various media. With each translation, the information is distorted according to some transfer function, thus limiting its accuracy. The present invention offers an alternative which allows detailed information about a structure to be directly displayed to a firefighter inside the building.
A video camera and goggle set could be installed into a standard firefighter helmet, the goggle set having therein the requisite display. The display unit would be in communication, either by hard wire or radio, with a computer processor that is also in communication with a data store having information prerecorded therein regarding features of the structure. A locating system could determine at all times where the firefighter is positioned and where she is looking. A real image from the camera can then be combined with the recorded data to realize a final augmented image. If the bright light generated by the fire in the real image is blocking important features such as a door handle, the processor can delete the fire from the scene and "draw in" the door handle. The firefighter could easily navigate her way through a complicated building interior full of smoke and fire that would otherwise prohibit one's progress through that building. First hand information provided by a vision system of the invention directly to a firefighter can greatly increase the speed and effectiveness of fire fighting tasks. A device mounted into a helmet would necessarily be small and compact. It may be arranged such that the camera, display, and the locating (position and attitude) means are in radio communication with a computer and processing means that may be bulky but located remotely. This possibility is considered to be a subset of, and included within, the scope of the invention.
It should be pointed out that in the case of very heavy smoke, where no useful real image can be detected by the camera and the displayed image consists entirely of data from the data store, the system becomes quite similar to the virtual reality devices. The critical distinction is that the images displayed are exactly aligned to reality and oriented to the direction in which the user is looking. If the firefighter pushes on a door, a real door will open. In VR applications, it will appear and feel like a real door is being opened, but the door of a VR system is completely simulated and there is no real door being opened.
In a still further summary example, the invention provides a viewing device that allows a tourist to "see" into a previous time. One can imagine a busload of tourists arriving at the grounds of Pompeii. Each person, equipped with a portable vision system, could be free to roam the grounds at will. It is possible to have predetermined points to which the user could be directed, and to have a sound track that corresponds to the scenes as presented to the user. Looking into the display of the vision system, a user can see the buildings as they are now and a simulation of how they appeared prior to the eruption of Mt. Vesuvius. He sees a simulation of people going about their daily lives in ancient Pompeii. Without warning, the viewer sees the ash falling on the buildings and people running for cover. The simulated events of the past can be easily viewed in a modern time with provisions of the present invention. In this version, each portable device can be in radio communication with a shared data and processing center that communicates with each individual personal device independently.
In a final summary example of the invention, we consider the need for surveyors to "see" into the ground to locate pipes and wires that have been buried. When construction projects require that such pipes and wires be removed or altered, or that new works be installed such that they do not interfere with existing works, then surveyors must attempt to locate those works. A surveyor relies on documentation of the previous construction projects and attempts to plot onto the real world the locations of the various works as indicated by those maps. Again, the surveyor's translation from maps and other documentation to the real world and to the operators of the digging equipment requires skill and experience. Even with recent and accurate maps, the task can become a nightmare and failures very expensive. If one could look at a display aligned to the real scene which included an image of the scene with the positions of the buried works superimposed thereon, then it would be a trivial matter to mark the locations for caution. With the system of the invention, the buried pipes and wires, et cetera, could be recorded in the data base. The locations of construction projects could then be surveyed in a very short time.
It should be noted that most mapping systems of the art, even very sophisticated computerized maps with data and data symbols combined with images, require a translation of the presented image to the images of the real world as viewed by a user. Although HUDs combine a real image and data images that are aligned to reality, the invention can be distinguished in that it has the capacity for communication between the real image and the data or data images. Those prior systems cannot provide for recognition, alignment, undesirable feature extraction, and other advanced features of the invention. Only the present invention provides a system of producing augmented images aligned to a scene as the scene is naturally viewed.
It is a primary object of the invention to provide a versatile vision system having capabilities completely unknown to systems in the art.
It is a further primary object of the invention to introduce to surveying, engineering, touring, navigation and other arts a device and method of augmented reality whereby a real scene is imaged with augmentation which provides additional information about the scene to the user.


It is a further object of the invention to provide a vision system for seeing a topography that includes objects that do not exist at the time of use, but are intended to be a part of the scene in some future time.
It is a further object of the invention to provide a vision system for seeing a topography that includes objects that cannot be seen in conventional viewing systems including cameras, binoculars, telescopes, video recorders, head up displays, helmet mounted displays, et cetera.
It is a still further object of the invention to provide a vision system that allows undesirable objects of a real scene to be removed and replaced with computer generated objects that represent objects that are important to the user.

Brief Description of the Drawings

These and other features, aspects, and advantages of the present invention will become better understood with regard to the following description, appended claims and drawings, where:
figure one is a drawing of a user of the invention, a scene as it appears in reality, and the scene as it appears in the augmented display of the device of the invention;
figure two is a drawing of a city street scenario (2a) whereby the invention allows a surveyor to see otherwise invisible features such as buried pipes and sewer lines (2b);
figure three is a block diagram arrangement of components and their relationship to each other that sets forth the novel combination of the invention;
figure four is a further detailed block diagram;
figure five is an electro-optical schematic drawing of some of the system components.

Preferred Embodiments of the Invention

The invention provides for a vision system device operable for producing an augmented image. In drawing figure one, a user 2 is shown to be looking into a canyon through a portable assembly 1 of the invention. An augmented image 3 is presented to the user and is aligned with the real scene. The augmented image may include: elements that do not yet exist, for example a bridge 4, representing information added to a real scene; elements that exist in the real scene but do not appear in the final image, for example trees at the right side of the bridge, representing information that has been deleted; and elements that have been changed from their actual appearance. In drawing figure two, the example of construction site surveying is illustrated. A scene as viewed normally would look like figure two(a). In figure two(b), the fresh water pipes 5 leading to homes 7 and fire hydrants 8, and the buried sewer lines, can be easily "seen" in the display of the invention.
In a first preferred embodiment of the invention, an electro-optic apparatus comprising a camera 9, a position determining means 16, an attitude determining means, a data store 12, a computer 14, and a display 13 is provided. The camera is an electro-optic device operable for converting a photon input from a field of view into an optical image at an image field, for converting that photon image into an electronic image, and for transmitting that electronic image to the computer. The positioning means is operable for determining the position of the camera. The attitude means is operable for determining the pointing attitude of the camera, defined by the symmetry axis of the camera field of view. The data store is in communication with the position and attitude determining means, whereby the values of position and attitude indicate a location or locations in the data store; the data store is further in communication 39 with the computer, whereby image data can be transmitted thereto. At the computer, the electronic image from the camera and the electronic image 4 of the data store are compared, analyzed, processed and combined for display as a single image. The computer is in communication with the display, whereby camera position and attitude information particular to the scene is used to augment a real image of a scene that is particular to the position and attitude of the camera. A detailed description of each of the main device elements follows.
camera
In preferred embodiments of the invention it is anticipated that the camera portion of the device will be comprised of four main subsystems integrated together into a portable assembly 1 that is easy for a user 2 to hold. These include: an optical input lens assembly 38, an image stabilization system, at least one charge coupled device, CCD 20, and an optical ranging system 34.
A photon input imaging lens assembly can be mounted into the portable camera unit 1. Single axis, or "monocular," systems are commercially available and are commonly used in devices such as camcorders and digital photographic cameras. It is preferred that the lens have zoom and autofocus capability; these systems are well developed in the art. An autofocus system can be computer driven but releasable for user override. Lens zoom operations can also be power driven and controlled manually by the user at an electronic control switch, or could be driven by the computer based on known features particular to a chosen scene. The physical dimensions of the input lens define the field of view of the device, which is variable with the zoom properties of the lens. Information regarding the field of view can be supplied to the processor to be used in image combination routines, as in the sketch below.
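Under a simple pinhole lens model (an assumption; the disclosure does not prescribe the optics), the field of view supplied to the processor follows directly from the sensor width and the focal length set by the zoom:

    import math

    def horizontal_fov_deg(sensor_width_mm: float, focal_length_mm: float) -> float:
        """Horizontal field of view of a pinhole-model lens, in degrees."""
        return 2.0 * math.degrees(math.atan(sensor_width_mm / (2.0 * focal_length_mm)))

    # Example: a 6.4 mm wide CCD behind an 8 mm lens sees about 43.6 degrees;
    # zooming to 24 mm narrows this to about 15.2 degrees.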


Image stabilization systems are also well developed in the electronic imaging arts, and the known systems are useful for preferred embodiments of this invention. Image stabilization can be realized by way of solid state piezo-electric devices that drive distortable prisms. Motion information from sensitive gyros 35 could detect movements and provide a driving signal to the prisms. The system of the invention can be modified so that output from each of the gyros is made available to the control computer for calculations of the attitude and position of the portable assembly. This can be particularly important for systems not having access to a GPS signal, for example in a submarine, or in a system designed for microscale applications such as surgery or micro manipulation. As an alternative to mechanical stabilization systems, an electronic image stabilization system could be used, but there could be a loss of image quality and, in certain situations of rapid motion, a loss of continuity in image input. Also, an electronic image stabilization system could not be readily modified to provide motion information to the control computer.
After the photon optical input from a scene is stabilized by the image stabilization system, it is then focused onto at least one charge coupled device, CCD, that is positioned in the image plane of the input lens. A possible alternative could be a silicon intensified camera or other image converting device, but CCDs are preferred because their output can be compatible with electronic computers using digital data.
The light field from the scene is converted to a digital electronic image signal which represents the scene. CCD devices are also quite commonly available and used in modern imaging devices, and standard CCDs are sufficient for use in the present invention without modification or with only minor changes. The camera can be color sensitive, and two common methods for producing color images are possible. It is also anticipated that CCD devices that are sensitive to portions of the spectrum other than the human-visible region may be useful. For example, some systems may benefit from CCD devices that are tuned for infrared, IR, imaging. Use of IR devices may be important for applications such as fire fighting and agronomy, where the important image information may be in a spectrum other than the visible region.
A ranging system is useful for a plurality of system functions. A first system function is to provide for the focus of the input lens. For images to be properly focused onto the CCD, it is necessary to adjust the input lens according to the distance of the objects being imaged. As a second system function, range information is also useful in the processing stage, as it gives hints to the proper scaling of the real image with respect to the data from the data store. The camera image and data store inputs will be combined such that scale is an important consideration for proper appearance of the final image. A third use of optical ranging can be to eliminate unwanted image features. For example, if a ship's rigging tends to block the view of the shore, then it is possible to input to the image processor the range that is of interest. The processor can then remove objects that are not within the selected range.
A ranging system, for example a sonic or ultrasound device, can measure the distance from the camera to the objects of a scene: the range. It is also possible to use laser or radar systems, but they tend to be more complicated and more expensive.
The computer can receive range information directly from a ranging system, or can receive range information from the focus position and zoom condition 18 of the input lens, which can be interrogated by a computer routine. This is possible if the lens is designed with transducer devices that convert the focus and zoom lens conditions into electronic signals that are readable by the computer. As an alternative to active ranging systems, or as a supplement to them, it is possible to determine the range to objects of a scene from comparisons of image features to known elements in the data base that correspond to those image features. A disclosure of this method is presented in US patent #5,034,812, Rawlings. Once position and attitude of the portable camera are established, a real image may be input and analyzed by the computer. By reference to a computerized topographic data base, the position, and therefore the distance and bearing, to user selected features may be established. For example, an object such as a lighthouse whose height was previously stored in the data base may be measured from the real image where the magnification conditions of the lens are known, thereby making a range determination possible (see the sketch below). Each of the ranging measurements may be made independently or in combination with a plurality of other measurements to increase accuracy.
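The lighthouse measurement reduces, under a pinhole model with known magnification conditions, to similar triangles. A minimal sketch of that arithmetic, with illustrative parameter names, is:

    def range_from_known_height(true_height_m: float,
                                image_height_px: float,
                                focal_length_mm: float,
                                pixel_pitch_mm: float) -> float:
        """Distance to an object of known height from its apparent image size.

        Similar triangles give range = focal_length * true_height / image_height,
        where image_height is the object's height as measured on the sensor.
        """
        image_height_mm = image_height_px * pixel_pitch_mm
        return focal_length_mm * true_height_m / image_height_mm

    # Example: a 20 m lighthouse spanning 100 pixels of 0.01 mm pitch behind a
    # 50 mm lens is at roughly 50 * 20 / 1.0 = 1000 m.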
position determining means
In preferred embodiments of the invention it is anticipated that the position determining means of the device will be a Global Positioning System, GPS, receiver 16. There are many alternative positioning systems that could be used effectively for other embodiments. These include, but are not limited to, Loran, Glonass, and Omega. Since GPS has very high accuracy and has altitude capability, it is an obvious choice for the invention, which benefits from the extra precision of that system. In self contained environments, such as a subterranean site or a submarine, the positioning means may have to be a simplified version of triangulation from known locations or some other position determination means. For applications on a microscale, for example semiconductor inspection or microdevice manufacture, it may be useful to have a laser interferometer position measurement means that has accuracy at the sub-nanometer level. The invention does not depend on the particulars of any positioning means, so long as the position can be determined and input into a computer processor.


One objective of the invention is to have a small, lightweight, portable camera. To achieve this, it may be necessary to put the bulk of the computing power in a separate unit that is in communication with the portable camera. In this embodiment, the antenna 36 for the GPS receiver would preferably be put within the portable camera or in close proximity thereto, and the more bulky processing portion of the GPS receiver could be combined with other system computing facilities in a backpack or ground unit. For units to be used in obstructed locations where a GPS signal may not be available, such as within the bridge of a ship, the antenna can be placed in some known position relative to the portable camera, and the known and constant displacement therebetween removed in the processing electronics, as sketched below.
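Removing that constant displacement is a single rigid-body correction. The sketch below assumes a local east/north/up coordinate frame and a rotation matrix built from the attitude measurements; both conventions are assumptions of the sketch, not of the disclosure.

    import numpy as np

    def camera_position(antenna_pos_enu: np.ndarray,
                        lever_arm_body: np.ndarray,
                        body_to_enu: np.ndarray) -> np.ndarray:
        """Correct a GPS fix for the fixed antenna-to-camera offset.

        antenna_pos_enu -- antenna position in east/north/up coordinates [m]
        lever_arm_body  -- constant camera-minus-antenna offset, body frame [m]
        body_to_enu     -- 3x3 rotation matrix from the body frame to ENU,
                           built from the measured heading, tilt and roll
        """
        return antenna_pos_enu + body_to_enu @ lever_arm_body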
attitude determining means
In preferred embodiments of the invention it is anticipated that the attitude determining means of the device will be one of three alternatives. Each provides the heading, tilt and roll of the portable camera unit. A flux gate compass 37 located within the portable camera provides heading information to an accuracy of ±1 degree. The output from this flux gate compass can be made available to the computer. Outputs from piezo-electric gyros located in the image stabilization system can also be used by the computer to calculate motion from a datum established by the user. The datum can be from rest in a known level position, or can be established by observing the horizon by looking through the portable camera. An alternative is a triaxial magnetometer system incorporating a biaxial electrolytic inclinometer. This transducer is located within the camera and is operable for providing the computer with a complete attitude determination. It is a further alternative to compute attitude (including heading) from a known datum using the piezo-electric gyros. Selection of this alternative may be for reasons of lowered cost, as well as for permitting application where magnetic fields may preclude accurate heading readings (i.e. close to electric arc welding equipment, large transformers, et cetera). Note that since the image stabilization system already contains the required piezo-electric gyros, it would be possible and desirable to provide this last alternative as an optional method for establishing heading in all above alternatives. A heading and tilt reading converts to a pointing direction as sketched below.
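For illustration, a heading and tilt reading determines the direction of the optical axis as follows; the east/north/up convention is an assumption of the sketch.

    import math
    import numpy as np

    def optic_axis_enu(heading_deg: float, tilt_deg: float) -> np.ndarray:
        """Unit vector along the camera's optical axis in east/north/up coordinates.

        Heading is clockwise from north, as a flux gate compass reports it; tilt
        is elevation above the horizon. Roll rotates the image about this axis
        and so does not change the pointing direction itself.
        """
        h, t = math.radians(heading_deg), math.radians(tilt_deg)
        return np.array([math.sin(h) * math.cos(t),   # east
                         math.cos(h) * math.cos(t),   # north
                         math.sin(t)])                # up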
computer
In preferred embodiments of the invention it is anticipated that the computer processor of the device will be a microcomputer with very fast graphic and video abilities. Because of the advanced state of computer development, many of the necessary elements of such a computer, for example fast frame grabbers 22 and massive cache memories 23, have already been developed. The computers and software used in video games and computer generated animation applications lend themselves to the tasks at hand and can readily be converted to those tasks. As the functions of the present invention are very unique, the necessary software to achieve those functions will also necessarily be unique. It is anticipated that the complicated system instruction set design, whether implemented in hardware such as ROM or in software, will be proprietary. It is likely that there will be task specific instruction sets, and that many devices of the invention will each have their own particular instruction set.
data store
In preferred embodiments of the invention it is anticipated that the data store means of the device will be a mass memory device, for example a CD-ROM. The data store can have pre-programmed information regarding particular scenes that are of interest to a specific user. The position determining means and the attitude determining means control the pointer of the data store. In the simplest embodiment, the data store has locations defined by an orthogonal array of two variables. The values of position Pn and of attitude An uniquely define a corresponding scene SCENE{Pn, An}. In advanced versions, the range can also be important, and a range variable, a third orthogonal variable, defines a three dimensional array: SCENE{Pn, An, Rn}. The variable defined by R tells the computer in which plane normal to the axis of the camera the information of interest lies. In even further advanced versions, a magnification variable Mn can be similarly used to give a four dimensional array: SCENE{Pn, An, Rn, Mn}. Many other factors can be used to control the way that data is stored and recalled. Each of these methods, being particular to the task at hand, is a subset of the general case in which data regarding a scene is selected from a data store in accordance with identification of the scene being viewed.
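As a rough illustration of how such an orthogonal array might be indexed, the sketch below quantizes continuous position, attitude, range and magnification readings into the pointer variables Pn, An, Rn and Mn. The bin sizes and the dictionary-backed store are assumptions chosen for the example, not values taken from the text.

```python
def quantize(value, bin_size):
    """Collapse a continuous measurement into a discrete index variable."""
    return round(value / bin_size)

# Hypothetical store: (Pn, An, Rn, Mn) -> pre-programmed scene information.
scene_store = {
    (1204, 36, 2, 1): "buried utility conduit, otherwise invisible",
}

def recall_scene(position_m, attitude_deg, range_m, magnification):
    key = (
        quantize(position_m, 10.0),    # Pn: assumed 10 m position bins
        quantize(attitude_deg, 5.0),   # An: assumed 5 degree attitude bins
        quantize(range_m, 100.0),      # Rn: plane normal to the optic axis
        quantize(magnification, 2.0),  # Mn: zoom setting
    )
    return scene_store.get(key)        # None when no data covers this scene

print(recall_scene(12040.0, 180.0, 200.0, 2.0))
```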
display
In preferred embodiments of the invention it is anticipated that the display 13 means of the device is a common active matrix LCD 32 device. These are currently the brightest, fastest displays available and are quite commonly used in conjunction with video applications. Alternatively, it is possible to use a plasma display, an electroluminescent display, or any of several possible displays. The particular type of display is not important and may be application specific without deviating from the gist of the invention. An important feature of the display is that an electronic signal generated from a computer video card 28 can be converted into an optically viewable image that represents an augmented real image of the scene that the camera is addressing. It is also a major feature of the invention to have the display oriented with reality and therefore aligned with respect to the optical axis of the camera, as is shown in figure 1.
other
In preferred embodiments of the invention it is anticipated that the device can also include such apparatuses as user input control keys 29 to interact with the computer routines and to specify information that drives further computer routines, and audio transducers 31, both input types such as microphones and output types such as speakers, also for control of computer routines and for presentation of information to a user. Standard equipment known to couple vision devices to human users and physical conditions, such as tripods, lens shades, eyepiece fixtures 30, et cetera, is all considered to be compatible with the device of the invention.
The invention also provides for methods of producing an augmented image.
In preferred embodiments of the invention, a vision system method is provided comprising an acquire step, a conversion step, a position determining step, an attitude determining step, a data recall step, a processing step, and a display step; wherein in the acquire step the objects of a scene are imaged onto a two dimensional plane as the light emanating from the objects propagates through at least one lens and forms the image at the plane; wherein in the conversion step the two dimensional optical input is converted into an electrical signal; wherein in the position determining step the position of the camera is determined with respect to some arbitrary reference point; wherein in the attitude determining step the attitude of the camera is determined with respect to some arbitrary direction; wherein in the data recall step data from a data base is recalled in accordance with the measurements of the previous two steps; wherein in the processing step the image of the conversion step and the data of the data recall step are combined to form a final image; and wherein in the display step the final electronic image is converted to an optical image and displayed such that it is aligned to and corresponds with the real scene. A detailed description of each of the main method steps follows.
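The ordering of the steps can be pictured as a simple loop. In the sketch below only the sequence and the data flow follow the text; every function body is a hypothetical placeholder for the hardware or routine it names.

```python
def acquire():            return [[0] * 640 for _ in range(480)]  # photon image on the CCD
def convert(image):       return image                            # digitize to an electronic frame
def determine_position(): return (48.85, 2.29)                    # e.g. a GPS fix
def determine_attitude(): return (92.0, 1.5, 0.0)                 # heading, tilt, roll
def recall_data(p, a):    return {"label": "object of interest"}  # data base lookup
def process(frame, data): return (frame, data)                    # combine into the final image
def display(final):       print("augmented frame:", final[1])     # aligned, viewable image

for _ in range(1):  # in practice this loop runs continuously at video rate
    frame = convert(acquire())
    p, a = determine_position(), determine_attitude()
    display(process(frame, recall_data(p, a)))
```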
the acquire step
In the acquire step, a camera of the device is pointed in the direction of a scene and a photon image is formed by a lens. The photon image is formed onto an electronic device that has been placed into the image plane of the camera lens. The desired field of view can be changed with the zoom function of the lens to selectively choose portions of a scene. The acquire step is likely to be continuous in preferred embodiments, and pointing at various objects will result in a continuously updated display. Real-time functions such as "panning", in which many acquire steps occur in rapid succession, are completely anticipated. The acquire step can be user influenced. If a pre-selected range is chosen by way of a user input, the computer can accommodate and adjust the lens in accordance with that input. This becomes more clear in the example where the user wishes to see the features of a distant object but the automatic ranging is keying on some object in the foreground. A selected range can eliminate acquisition of undesirable foreground objects.
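A one-function sketch of that user influence, with illustrative names only:

```python
def effective_range(auto_range_m, user_range_m=None):
    """A user pre-selected range overrides automatic ranging, so the lens
    is not kept keyed on an undesired foreground object."""
    return user_range_m if user_range_m is not None else auto_range_m

print(effective_range(auto_range_m=4.0))                      # ranging on foreground: 4.0
print(effective_range(auto_range_m=4.0, user_range_m=800.0))  # user selects distant object: 800.0
```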

the converting step
The method of the invention distinguishes itself from methods such as HUDs and HMDs in that a photon representation of a scene is converted into an electronic signal that can be processed by a digital computer. The image of the real scene can be converted into image frames at standard video rates. In this way, changes are not required to existing video equipment, and that equipment can be easily integrated into systems of the invention. The digital signal of a CCD is particularly desirable because a great deal of processing hardware compatible with CCDs already exists in the art. The pixel representation of images of CCD devices is also compatible with the pixel representation of images of LCD displays. The digital image of the CCD can be directly transmitted to the computer processor routines.
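The point that a CCD frame arrives as a pixel array ready for processor routines can be shown with any modern video-rate camera; the sketch below uses OpenCV purely as a convenient stand-in for the frame grabber, which is an assumption, not something named in the text.

```python
import cv2  # assumed available; any frame-grabber interface would do

cap = cv2.VideoCapture(0)   # open the default camera as a stand-in for the CCD
ok, frame = cap.read()      # one digitized image frame at video rate
if ok:
    # The frame is already a pixel array, directly usable by processor
    # routines and dimensionally compatible with an LCD's pixel layout.
    print(frame.shape, frame.dtype)   # e.g. (480, 640, 3) uint8
cap.release()
```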
the determine position step
The position of the camera can be determined in global (macro) applications with GPS methods and in microscopic applications with a laser interferometer. The positioning step is used to locate the camera with respect to an arbitrary point and to relay the result of that measurement to the computer such that the computer can identify where in the data base the information that corresponds to the scene that the camera is addressing is located.
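For the global (GPS) case, expressing a fix as an offset from the arbitrary reference point might look like the sketch below; the equirectangular approximation and the particular coordinates are illustrative assumptions.

```python
import math

EARTH_RADIUS_M = 6371000.0

def offset_from_reference(lat_deg, lon_deg, ref_lat_deg, ref_lon_deg):
    """Camera position as east/north metres from an arbitrary reference
    point, adequate over short distances for indexing the data base."""
    north_m = math.radians(lat_deg - ref_lat_deg) * EARTH_RADIUS_M
    east_m = (math.radians(lon_deg - ref_lon_deg)
              * EARTH_RADIUS_M * math.cos(math.radians(ref_lat_deg)))
    return east_m, north_m

print(offset_from_reference(37.7793, -122.4193, 37.7790, -122.4200))
```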
the determine attitude step
The attitude of the camera optical axis with respect to all three axes of rotation is measured to identify the direction the camera is pointed, thereby further identifying what scene the camera is addressing and what appears in the camera field of view.
The attitude figure typically drives one of the data store pointer variables and thereby, when in combination with a position figure, uniquely selects data stored therein.
Attitude can be realized differently in various applications. In marine applications, "heading" and an implied horizon can uniquely define a scene. This is considered a special case of attitude on all three axes of rotation. In many applications, altitude will need to be considered for a scene to be properly identified. The primary function of the position and attitude steps is to give the computer enough information to unambiguously identify a scene.
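A small sketch of how an attitude figure could drive the pointer variable, including the marine special case where heading alone (with an implied horizon) identifies the scene; the 5 degree bin size is an assumption for the example.

```python
def attitude_pointer(heading_deg, tilt_deg=0.0, roll_deg=0.0, bin_deg=5.0):
    """Quantize heading, tilt and roll into the data store pointer
    variable An. Defaults of zero tilt and roll express the marine
    special case of an implied horizon."""
    q = lambda v: round(v / bin_deg)
    return (q(heading_deg), q(tilt_deg), q(roll_deg))

print(attitude_pointer(92.0))             # marine case: heading plus implied horizon
print(attitude_pointer(92.0, 12.0, 1.0))  # general case: all three axes of rotation
```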
the recall recorded data step
Having made a position and attitude determination, the minimum amount of information needed to specify a location in the data base is realized. For some simple applications, this is enough information to recall stored data that can be combined with a real image produced by a camera looking into a scene located by the position and attitude measurements. In advanced applications, other factors such as range to object and magnification may also need to be specified in order to select the data to be recalled. The recalled data is transmitted to the processor to be combined with real image data.
the processing step
Real image data are received into the processor from the converting step, and augmentation data is received into the processor from the data recall step. Feature recognition techniques and image manipulation routines are applied to the two forms of information to finally yield an image that is a desired combination of the two. Certain undesirable information can be removed from the image of the real scene. Important information not appearing in the image of the real scene, but found in the data, can be enhanced in the final image.
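As a bare-bones illustration of combining the two forms of information, the sketch below overlays recalled augmentation onto a real frame; a tiny nested-list grayscale image stands in for the video frame, and a real system would apply the feature recognition and image manipulation routines the text describes.

```python
def combine(real_frame, augmentation):
    """Final image = real image with augmentation pixels enhanced into it;
    the same loop could equally remove (blank out) undesirable pixels."""
    final = [row[:] for row in real_frame]   # copy the real image
    for (x, y), value in augmentation.items():
        final[y][x] = value                  # draw recalled information
    return final

real = [[10, 10, 10],
        [10, 10, 10],
        [10, 10, 10]]
augmentation = {(1, 1): 255}   # one enhanced feature, invisible in the real scene
print(combine(real, augmentation))
```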
display augmented image
An augmented real image comprised of real image information and data from the data store is assembled in the processor and transmitted in standard video format to a display device. The final image is then displayed in accurate alignment with the real scene in a way that allows the viewer to "see" objects and features, otherwise invisible in other vision systems, in perfect orientation with the real scene. Augmented real images can be generated at video rates, giving a display that provides a real-time view of reality with computer generated augmentation combined therewith.
Although the present invention has been described in considerable detail with reference to certain preferred versions thereof, other versions are possible. Therefore, the spirit and scope of the appended claims should not be limited to the description of the versions contained herein.
In accordance with each of the preferred embodiments of the invention, there is provided an apparatus for and method of realizing a vision system. It will be appreciated that each of the embodiments described includes both an apparatus and method, and that the apparatus and method of one preferred embodiment may be different than the apparatus and method of another embodiment.


Claims (5)

What is claimed is:
1) An electro-optic apparatus for producing an augmented image from an image of a real scene and a computer generated image, the augmented image being aligned to the scene such that the directions up, down, left and right of the augmented image correspond directly with up, down, left and right of the real scene and the normal direction of the image plane points in a direction away from the user and towards the scene, said augmented image being comprised of information generated by an imaging means and information recalled from a data store, the information being particular to the position as determined by the apparatus and attitude of the apparatus, which comprises:
an imaging means;
a position determining means;
an attitude determining means;
a data store;
a computer; and a display, said imaging means having an optic axis which is a symmetry axis and defines the direction of viewing, a lens symmetric about that axis which defines an image field, and a charge coupled device in the image field;
said position determining means having a reference point corresponding to said imaging means;
said attitude determining means having a reference direction parallel to the direction of viewing of said imaging means;
said data store having memory locations wherein prerecorded data are stored, a plurality of orthogonal variables which identify those memory locations, and a pointer responsive to values from the position and attitude determining means which selects particular memory locations containing particular prerecorded data corresponding to the real scene;
said computer being in electronic communication with said imaging means, position and attitude determining means, data base, and display, having graphics and image processing capabilities; and said display being an electro-optic emissive display having a normal direction orthogonal to the display plane, the normal direction being colinear with the symmetry axis of the imaging means and being in communication with said computer.
2) The apparatus in claim 1 where the display is integrated into a binocular type viewing device having an optical path which corresponds to each of two eyes.
3) The apparatus of claim 1 further comprising an optics system of lenses to image the display at optical infinity to give the user the appearance and feeling of actually looking at the real scene as opposed to looking at a display device in the near field thereby allowing the user to relax the eye muscles by focusing at infinity.
4) The apparatus of claim 1, said computer being responsive to image features of the real scene as indicated by some predetermined condition.
5) An electro-optic apparatus for producing an augmented image from an image of a real scene and a computer generated image, the augmented image being aligned to the scene such that the directions up, down, left and right of the augmented image correspond directly with up, down, left and right of the real scene and the normal direction of the image plane points in a direction away from the user and towards the scene, said augmented image being comprised of information generated by an imaging means and information recalled from a data store, the information being particular to the position as determined by the apparatus and attitude of the apparatus, which comprises:
an imaging means;
a position determining means;
an attitude determining means;
a data store;
a computer; and a display, said imaging means being sensitive to photon input and operable for converting light into an electronic signal processable by said computer;
said position determining means being operable for determining the location of the imaging means and presents a value which represents that location to the computer;
said attitude determining means being operable for determining the attitude of the imaging means as defined by the axis of the imaging means and presents a value which represents that attitude to the computer;
said data store being operable for storing prerecorded data corresponding to and representing objects known to be in the present field of view of the imaging means as defined by the present position and attitude thereof, recalling that data and transmitting it to the computer;
said computer being operable for receiving signals from imaging means and data store and combining those signals such that a composite image signal is formed and further operable for transmitting the composite image to said display;
said display being an electro-optic emissive display operable for converting the composite image signal of the computer to a physical signal viewable by a user that is aligned to the viewing axis of said imaging means, the electro-optic vision apparatus allows the user to see his environment with computer generated augmented images aligned to the scene as the user would view the scene normally.
CA002171314A 1993-09-10 1994-06-16 Electro-optic vision systems which exploit position and attitude Expired - Lifetime CA2171314C (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US08/119,360 US5815411A (en) 1993-09-10 1993-09-10 Electro-optic vision system which exploits position and attitude
US08/119,360 1993-09-10
PCT/US1994/006844 WO1995007526A1 (en) 1993-09-10 1994-06-16 Electro-optic vision systems which exploit position and attitude

Publications (2)

Publication Number Publication Date
CA2171314A1 CA2171314A1 (en) 1995-03-16
CA2171314C true CA2171314C (en) 2001-06-12

Family

ID=22383980

Family Applications (1)

Application Number Title Priority Date Filing Date
CA002171314A Expired - Lifetime CA2171314C (en) 1993-09-10 1994-06-16 Electro-optic vision systems which exploit position and attitude

Country Status (11)

Country Link
US (5) US5815411A (en)
EP (1) EP0722601B1 (en)
JP (1) JP3700021B2 (en)
KR (1) KR960705297A (en)
AT (1) ATE256327T1 (en)
AU (1) AU691508B2 (en)
CA (1) CA2171314C (en)
CH (1) CH689904A5 (en)
DE (1) DE69433405D1 (en)
NZ (1) NZ269318A (en)
WO (1) WO1995007526A1 (en)

Families Citing this family (464)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020091850A1 (en) 1992-10-23 2002-07-11 Cybex Corporation System and method for remote monitoring and operation of personal computers
GB2278196A (en) * 1993-05-18 1994-11-23 William Michael Frederi Taylor Information system using GPS
US7301536B2 (en) * 1993-09-10 2007-11-27 Geovector Corporation Electro-optic vision systems
US6037936A (en) * 1993-09-10 2000-03-14 Criticom Corp. Computer vision system with a graphic user interface and remote camera control
US6064398A (en) * 1993-09-10 2000-05-16 Geovector Corporation Electro-optic vision systems
EP0807352A1 (en) 1995-01-31 1997-11-19 Transcenic, Inc Spatial referenced photography
US6535210B1 (en) 1995-06-07 2003-03-18 Geovector Corp. Vision system computer modeling apparatus including interaction with real scenes with respect to perspective and spatial relationship as measured in real-time
US5721842A (en) 1995-08-25 1998-02-24 Apex Pc Solutions, Inc. Interconnection system for viewing and controlling remotely connected computers with on-screen video overlay for controlling of the interconnection switch
JP3599208B2 (en) * 1995-10-20 2004-12-08 フジノン株式会社 Special glasses
US20070001875A1 (en) * 1995-11-14 2007-01-04 Taylor William M F GPS explorer
JP3658659B2 (en) * 1995-11-15 2005-06-08 カシオ計算機株式会社 Image processing device
TW395121B (en) 1996-02-26 2000-06-21 Seiko Epson Corp Personal wearing information display device and the display method using such device
US6181302B1 (en) * 1996-04-24 2001-01-30 C. Macgill Lynde Marine navigation binoculars with virtual display superimposing real world image
US6167562A (en) * 1996-05-08 2000-12-26 Kaneko Co., Ltd. Apparatus for creating an animation program and method for creating the same
WO1997044737A1 (en) 1996-05-22 1997-11-27 Geovector Corporation Mehtod and apparatus for controlling electrical devices in response to sensed conditions
US6804726B1 (en) 1996-05-22 2004-10-12 Geovector Corporation Method and apparatus for controlling electrical devices in response to sensed conditions
JP3338618B2 (en) * 1996-10-07 2002-10-28 ミノルタ株式会社 Display method and display device for real space image and virtual space image
US6758755B2 (en) 1996-11-14 2004-07-06 Arcade Planet, Inc. Prize redemption system for games executed over a wide area network
JP3745475B2 (en) * 1996-12-06 2006-02-15 株式会社セガ GAME DEVICE AND IMAGE PROCESSING DEVICE
US6080063A (en) 1997-01-06 2000-06-27 Khosla; Vinod Simulated real time game play with live event
US6654060B1 (en) * 1997-01-07 2003-11-25 Canon Kabushiki Kaisha Video-image control apparatus and method and storage medium
JP3569409B2 (en) * 1997-01-17 2004-09-22 ペンタックス株式会社 Telescope using GPS
JPH10327433A (en) * 1997-05-23 1998-12-08 Minolta Co Ltd Display device for composted image
US5940172A (en) * 1998-06-03 1999-08-17 Measurement Devices Limited Surveying apparatus
US6034739A (en) * 1997-06-09 2000-03-07 Evans & Sutherland Computer Corporation System for establishing a three-dimensional garbage matte which enables simplified adjusting of spatial relationships between physical and virtual scene elements
JP4251673B2 (en) * 1997-06-24 2009-04-08 富士通株式会社 Image presentation device
DE19728890A1 (en) * 1997-07-07 1999-02-04 Daimler Benz Ag Process to improve optical perception by modifying the retinal image
JP3426919B2 (en) * 1997-07-28 2003-07-14 シャープ株式会社 Figure creation device
US6256043B1 (en) * 1997-09-26 2001-07-03 Lucent Technologies Inc. Three dimensional virtual reality enhancement techniques
DE19734409C2 (en) * 1997-08-08 1999-11-25 Jeffrey Shaw Orientation device
JP3052286B2 (en) * 1997-08-28 2000-06-12 防衛庁技術研究本部長 Flight system and pseudo visual field forming device for aircraft
US8432414B2 (en) 1997-09-05 2013-04-30 Ecole Polytechnique Federale De Lausanne Automated annotation of a view
JPH11144040A (en) * 1997-11-05 1999-05-28 Nintendo Co Ltd Portable game machine and cartridge for portable game machine
US6681398B1 (en) * 1998-01-12 2004-01-20 Scanz Communications, Inc. Systems, devices and methods for reviewing selected signal segments
US6631522B1 (en) * 1998-01-20 2003-10-07 David Erdelyi Method and system for indexing, sorting, and displaying a video database
JP3840898B2 (en) * 1998-02-03 2006-11-01 セイコーエプソン株式会社 Projection display device, display method therefor, and image display device
FR2775814B1 (en) * 1998-03-06 2001-01-19 Rasterland Sa SYSTEM FOR VIEWING REALISTIC VIRTUAL THREE-DIMENSIONAL IMAGES IN REAL TIME
US6625299B1 (en) 1998-04-08 2003-09-23 Jeffrey Meisner Augmented reality technology
FR2778248A1 (en) * 1998-04-30 1999-11-05 Pierre Emmanuel Bathie Broadcast of images synthesizing a virtual environment relative to the user's actual environment
US6285317B1 (en) 1998-05-01 2001-09-04 Lucent Technologies Inc. Navigation system with three-dimensional display
US6141440A (en) * 1998-06-04 2000-10-31 Canon Kabushiki Kaisha Disparity measurement with variably sized interrogation regions
JP4296451B2 (en) * 1998-06-22 2009-07-15 株式会社日立製作所 Image recording device
US6266100B1 (en) 1998-09-04 2001-07-24 Sportvision, Inc. System for enhancing a video presentation of a live event
CA2345084C (en) 1998-09-22 2004-11-02 Cybex Computer Products Corporation System for accessing personal computers remotely
US6173239B1 (en) * 1998-09-30 2001-01-09 Geo Vector Corporation Apparatus and methods for presentation of information relating to objects being addressed
JP2000194726A (en) * 1998-10-19 2000-07-14 Sony Corp Device, method and system for processing information and providing medium
US20050060739A1 (en) * 1999-01-12 2005-03-17 Tony Verna Systems, devices and methods for reviewing selected signal segments
US6422942B1 (en) * 1999-01-29 2002-07-23 Robert W. Jeffway, Jr. Virtual game board and tracking device therefor
US7324081B2 (en) * 1999-03-02 2008-01-29 Siemens Aktiengesellschaft Augmented-reality system for situation-related support of the interaction between a user and an engineering apparatus
DE50003377D1 (en) * 1999-03-02 2003-09-25 Siemens Ag AUGMENTED REALITY SYSTEM FOR SITUATIONALLY SUPPORTING INTERACTION BETWEEN A USER AND A TECHNICAL DEVICE
US7471211B2 (en) * 1999-03-03 2008-12-30 Yamcon, Inc. Celestial object location device
US6844822B2 (en) 1999-03-03 2005-01-18 Yamcon, Inc. Celestial object location device
US6366212B1 (en) * 1999-03-03 2002-04-02 Michael Lemp Celestial object location device
US7068180B2 (en) * 1999-03-03 2006-06-27 Yamcon, Inc. Celestial object location device
US7124425B1 (en) * 1999-03-08 2006-10-17 Immersion Entertainment, L.L.C. Audio/video system and method utilizing a head mounted apparatus with noise attenuation
US6578203B1 (en) 1999-03-08 2003-06-10 Tazwell L. Anderson, Jr. Audio/video signal distribution system for head mounted displays
JP3902904B2 (en) 1999-03-23 2007-04-11 キヤノン株式会社 Information presenting apparatus, method, camera control apparatus, method, and computer-readable storage medium
JP4878409B2 (en) * 1999-03-23 2012-02-15 キヤノン株式会社 Information control apparatus, information control method, and storage medium
US6815687B1 (en) * 1999-04-16 2004-11-09 The Regents Of The University Of Michigan Method and system for high-speed, 3D imaging of optically-invisible radiation
WO2000065514A2 (en) * 1999-04-27 2000-11-02 I3E Holdings, Llc Remote ordering system
JP2000333139A (en) * 1999-05-18 2000-11-30 Sony Corp Information service unit, its method, information receiver, its method, lottery system, its method and medium
US7210160B2 (en) * 1999-05-28 2007-04-24 Immersion Entertainment, L.L.C. Audio/video programming and charging system and method
US20020057364A1 (en) 1999-05-28 2002-05-16 Anderson Tazwell L. Electronic handheld audio/video receiver and listening/viewing device
US20060174297A1 (en) * 1999-05-28 2006-08-03 Anderson Tazwell L Jr Electronic handheld audio/video receiver and listening/viewing device
JP2000346634A (en) * 1999-06-09 2000-12-15 Minolta Co Ltd Three-dimensionally inputting device
EP1059510A1 (en) * 1999-06-10 2000-12-13 Texas Instruments Incorporated Wireless location
JP2000350865A (en) 1999-06-11 2000-12-19 Mr System Kenkyusho:Kk Game device for composite real space, image processing method therefor and program storage medium
US7084887B1 (en) * 1999-06-11 2006-08-01 Canon Kabushiki Kaisha Marker layout method, mixed reality apparatus, and mixed reality space image generation method
EP1206680B1 (en) * 1999-06-26 2006-07-26 Otto K. Dufek Optical device
US8160994B2 (en) * 1999-07-21 2012-04-17 Iopener Media Gmbh System for simulating events in a real environment
WO2001009812A1 (en) * 1999-07-30 2001-02-08 David Rollo Personal tour guide system
EP1074943A3 (en) * 1999-08-06 2004-03-24 Canon Kabushiki Kaisha Image processing method and apparatus
US6396475B1 (en) * 1999-08-27 2002-05-28 Geo Vector Corp. Apparatus and methods of the remote address of objects
CN1284381C (en) * 1999-09-17 2006-11-08 自然工艺株式会社 Image pickup system, image processor, and camera
US6671390B1 (en) 1999-10-18 2003-12-30 Sport-X Inc. Automated collection, processing and use of sports movement information via information extraction from electromagnetic energy based upon multi-characteristic spatial phase processing
US6965397B1 (en) 1999-11-22 2005-11-15 Sportvision, Inc. Measuring camera attitude
US6909381B2 (en) * 2000-02-12 2005-06-21 Leonard Richard Kahn Aircraft collision avoidance system
US6522292B1 (en) * 2000-02-23 2003-02-18 Geovector Corp. Information systems having position measuring capacity
US20030210228A1 (en) * 2000-02-25 2003-11-13 Ebersole John Franklin Augmented reality situational awareness system and method
WO2001071282A1 (en) * 2000-03-16 2001-09-27 Geovector Corporation Information systems having directional interference facility
US7213048B1 (en) 2000-04-05 2007-05-01 Microsoft Corporation Context aware computing devices and methods
US6674448B1 (en) * 2000-04-05 2004-01-06 Ods Properties, Inc. Interactive wagering system with controllable graphic displays
US7076255B2 (en) 2000-04-05 2006-07-11 Microsoft Corporation Context-aware and location-aware cellular phones and methods
US7421486B1 (en) 2000-04-05 2008-09-02 Microsoft Corporation Context translation methods and systems
US7096029B1 (en) * 2000-04-05 2006-08-22 Microsoft Corporation Context aware computing devices having a common interface and related methods
US7743074B1 (en) 2000-04-05 2010-06-22 Microsoft Corporation Context aware systems and methods utilizing hierarchical tree structures
JP3833486B2 (en) * 2000-04-19 2006-10-11 富士写真フイルム株式会社 Imaging device
DE10027136C2 (en) * 2000-05-31 2002-11-21 Luigi Grasso Mobile system for creating a virtual display
US6970189B1 (en) 2000-05-31 2005-11-29 Ipac Acquisition Subsidiary I, Llc Method and system for automatically configuring a hand-held camera using wireless communication to improve image quality
DE10028713B4 (en) * 2000-06-08 2010-11-25 Art+Com Ag Visualization device and method
US7812856B2 (en) 2000-10-26 2010-10-12 Front Row Technologies, Llc Providing multiple perspectives of a venue activity to electronic wireless hand held devices
US7630721B2 (en) 2000-06-27 2009-12-08 Ortiz & Associates Consulting, Llc Systems, methods and apparatuses for brokering data between wireless devices and data rendering devices
JP2002014683A (en) * 2000-06-28 2002-01-18 Pioneer Electronic Corp Recording medium and recording data creating device therefor, and data restoration device
US6903707B2 (en) * 2000-08-09 2005-06-07 Information Decision Technologies, Llc Method for using a motorized camera mount for tracking in augmented reality
EP1311918B1 (en) * 2000-08-22 2004-03-17 Siemens Aktiengesellschaft System and method for automatically processing data especially in the field of production, assembly, service or maintenance
EP1311916B1 (en) * 2000-08-22 2005-05-11 Siemens Aktiengesellschaft Context-dependent protocol creation system and method, especially for use in the field of production, assembly, service or maintenance
US20020080279A1 (en) * 2000-08-29 2002-06-27 Sidney Wang Enhancing live sports broadcasting with synthetic camera views
US6470251B1 (en) 2000-08-31 2002-10-22 Trimble Navigation Limited Light detector for multi-axis position control
JP2002092012A (en) * 2000-09-19 2002-03-29 Olympus Optical Co Ltd Particular area information display system
US6895126B2 (en) 2000-10-06 2005-05-17 Enrico Di Bernardo System and method for creating, storing, and utilizing composite images of a geographic location
US7101988B2 (en) * 2000-10-12 2006-09-05 Marical, Inc. Polyvalent cation-sensing receptor in Atlantic salmon
JP2002132806A (en) * 2000-10-18 2002-05-10 Fujitsu Ltd Server system, and information providing service system and method
US9310892B2 (en) 2000-11-06 2016-04-12 Nant Holdings Ip, Llc Object information derived from object images
US8224078B2 (en) 2000-11-06 2012-07-17 Nant Holdings Ip, Llc Image capture and identification system and process
US8817045B2 (en) 2000-11-06 2014-08-26 Nant Holdings Ip, Llc Interactivity via mobile image recognition
US7565008B2 (en) 2000-11-06 2009-07-21 Evryx Technologies, Inc. Data capture and identification system and process
US20120154438A1 (en) * 2000-11-06 2012-06-21 Nant Holdings Ip, Llc Interactivity Via Mobile Image Recognition
US7680324B2 (en) 2000-11-06 2010-03-16 Evryx Technologies, Inc. Use of image-derived information as search criteria for internet and other search engines
US7899243B2 (en) 2000-11-06 2011-03-01 Evryx Technologies, Inc. Image capture and identification system and process
US8218873B2 (en) * 2000-11-06 2012-07-10 Nant Holdings Ip, Llc Object information derived from object images
US6358164B1 (en) * 2000-11-08 2002-03-19 Joseph S. Bracewell Strike zone indicator measurement device
US7072956B2 (en) * 2000-12-22 2006-07-04 Microsoft Corporation Methods and systems for context-aware policy determination and enforcement
US7493565B2 (en) 2000-12-22 2009-02-17 Microsoft Corporation Environment-interactive context-aware devices and methods
US7148815B2 (en) * 2000-12-22 2006-12-12 Byron Scott Derringer Apparatus and method for detecting objects located on an airport runway
US6944679B2 (en) * 2000-12-22 2005-09-13 Microsoft Corp. Context-aware systems and methods, location-aware systems and methods, context-aware vehicles and methods of operating the same, and location-aware vehicles and methods of operating the same
US7174305B2 (en) 2001-01-23 2007-02-06 Opentv, Inc. Method and system for scheduling online targeted content delivery
US7031875B2 (en) 2001-01-24 2006-04-18 Geo Vector Corporation Pointing systems for addressing objects
EP1227301A1 (en) * 2001-01-29 2002-07-31 Siemens Aktiengesellschaft Position determination with the assistance of a mobile telephon
JP2002229992A (en) * 2001-01-31 2002-08-16 Fujitsu Ltd Server device for space information service and method for providing the same, and accounting device and method for space information service
JP2002229991A (en) 2001-01-31 2002-08-16 Fujitsu Ltd Server, user terminal, system and method for providing information
US6765569B2 (en) 2001-03-07 2004-07-20 University Of Southern California Augmented-reality tool employing scene-feature autocalibration during camera motion
JP4649050B2 (en) 2001-03-13 2011-03-09 キヤノン株式会社 Image processing apparatus, image processing method, and control program
JP4878083B2 (en) * 2001-03-13 2012-02-15 キヤノン株式会社 Image composition apparatus and method, and program
US8117313B2 (en) * 2001-03-19 2012-02-14 International Business Machines Corporation System and method for adaptive formatting of image information for efficient delivery and presentation
US6834249B2 (en) 2001-03-29 2004-12-21 Arraycomm, Inc. Method and apparatus for controlling a computing system
US20020143790A1 (en) * 2001-03-29 2002-10-03 Qian Richard J. Method and apparatus for automatic generation and management of sporting statistics
JP2004534963A (en) * 2001-03-30 2004-11-18 コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ Methods, systems and devices for augmented reality
US20020152127A1 (en) * 2001-04-12 2002-10-17 International Business Machines Corporation Tightly-coupled online representations for geographically-centered shopping complexes
JP4672190B2 (en) * 2001-04-26 2011-04-20 三菱電機株式会社 Video navigation device
US6810141B2 (en) * 2001-05-04 2004-10-26 Photon-X, Inc. Method for processing spatial-phase characteristics of electromagnetic energy and information conveyed therein
US20020180866A1 (en) * 2001-05-29 2002-12-05 Monroe David A. Modular sensor array
GB2377147A (en) * 2001-06-27 2002-12-31 Nokia Corp A virtual reality user interface
WO2003003194A1 (en) 2001-06-27 2003-01-09 Sony Corporation Integrated circuit device, information processing device, information recording device memory management method, mobile terminal device, semiconductor integrated circuit device, and communication method using mobile terminal device
US6903752B2 (en) * 2001-07-16 2005-06-07 Information Decision Technologies, Llc Method to view unseen atmospheric phenomenon using augmented reality
US7213046B2 (en) * 2001-07-31 2007-05-01 International Business Machines Corporation System and method for providing efficient and secure data exchange using strip information elements
US7143129B2 (en) * 2001-07-31 2006-11-28 International Business Machines Corporation System and method for distributing proximity information using a two-tiered bootstrap process
US8002623B2 (en) * 2001-08-09 2011-08-23 Igt Methods and devices for displaying multiple game elements
US7901289B2 (en) * 2001-08-09 2011-03-08 Igt Transparent objects on a gaming machine
US7367885B2 (en) * 2001-08-09 2008-05-06 Igt 3-D text in a gaming machine
US8267767B2 (en) 2001-08-09 2012-09-18 Igt 3-D reels and 3-D wheels in a gaming machine
US6990681B2 (en) * 2001-08-09 2006-01-24 Sony Corporation Enhancing broadcast of an event with synthetic scene using a depth map
US6887157B2 (en) * 2001-08-09 2005-05-03 Igt Virtual cameras and 3-D gaming environments in a gaming machine
US7909696B2 (en) * 2001-08-09 2011-03-22 Igt Game interaction in 3-D gaming environments
US7173672B2 (en) * 2001-08-10 2007-02-06 Sony Corporation System and method for transitioning between real images and virtual images
US7339609B2 (en) * 2001-08-10 2008-03-04 Sony Corporation System and method for enhancing real-time data feeds
US7091989B2 (en) 2001-08-10 2006-08-15 Sony Corporation System and method for data assisted chroma-keying
US20030030658A1 (en) * 2001-08-10 2003-02-13 Simon Gibbs System and method for mixed reality broadcast
US20030040943A1 (en) * 2001-08-22 2003-02-27 International Business Machines Corporation System and method for selecting arena seat locations for display
US6940538B2 (en) * 2001-08-29 2005-09-06 Sony Corporation Extracting a depth map from known camera and model tracking data
US7194148B2 (en) * 2001-09-07 2007-03-20 Yavitz Edward Q Technique for providing simulated vision
JP3805231B2 (en) * 2001-10-26 2006-08-02 キヤノン株式会社 Image display apparatus and method, and storage medium
US6987877B2 (en) * 2001-10-30 2006-01-17 Itt Manufacturing Enterprises, Inc. Superimposing graphic representations of ground locations onto ground location images after detection of failures
DE10156832B4 (en) * 2001-11-20 2004-01-29 Siemens Ag Method and device for displaying information
US20030095681A1 (en) * 2001-11-21 2003-05-22 Bernard Burg Context-aware imaging device
GB0129793D0 (en) * 2001-12-13 2002-01-30 Koninkl Philips Electronics Nv Real time authoring
US7341530B2 (en) * 2002-01-09 2008-03-11 Sportvision, Inc. Virtual strike zone
JP2003215494A (en) * 2002-01-22 2003-07-30 Canon Inc Composite reality feeling presenting device and image processing method
US6759979B2 (en) 2002-01-22 2004-07-06 E-Businesscontrols Corp. GPS-enhanced system and method for automatically capturing and co-registering virtual models of a site
JP2003241112A (en) * 2002-02-14 2003-08-27 Pentax Corp Binoculars with photographing function
FR2836215B1 (en) * 2002-02-21 2004-11-05 Yodea SYSTEM AND METHOD FOR THREE-DIMENSIONAL MODELING AND RENDERING OF AN OBJECT
US20030187820A1 (en) 2002-03-29 2003-10-02 Michael Kohut Media management system and process
US6917370B2 (en) * 2002-05-13 2005-07-12 Charles Benton Interacting augmented reality and virtual reality
US7245315B2 (en) * 2002-05-20 2007-07-17 Simmonds Precision Products, Inc. Distinguishing between fire and non-fire conditions using cameras
US7280696B2 (en) * 2002-05-20 2007-10-09 Simmonds Precision Products, Inc. Video detection/verification system
US7256818B2 (en) * 2002-05-20 2007-08-14 Simmonds Precision Products, Inc. Detecting fire using cameras
EP1514417A4 (en) * 2002-05-31 2005-06-15 Predictive Media Corp Method and system for the storage, viewing management, and delivery of targeted advertising
US20040073557A1 (en) * 2002-06-14 2004-04-15 Piccionelli Gregory A. Method for obtaining information associated with item at specific location
US7918730B2 (en) * 2002-06-27 2011-04-05 Igt Trajectory-based 3-D games of chance for video gaming machines
US7138963B2 (en) * 2002-07-18 2006-11-21 Metamersion, Llc Method for automatically tracking objects in augmented reality
US7042421B2 (en) * 2002-07-18 2006-05-09 Information Decision Technologies, Llc. Method for advanced imaging in augmented reality
US8797260B2 (en) * 2002-07-27 2014-08-05 Sony Computer Entertainment Inc. Inertially trackable hand-held controller
US7102615B2 (en) * 2002-07-27 2006-09-05 Sony Computer Entertainment Inc. Man-machine interface using a deformable device
US7883415B2 (en) 2003-09-15 2011-02-08 Sony Computer Entertainment Inc. Method and apparatus for adjusting a view of a scene being displayed according to tracked head motion
US9474968B2 (en) * 2002-07-27 2016-10-25 Sony Interactive Entertainment America Llc Method and system for applying gearing effects to visual tracking
US7391409B2 (en) * 2002-07-27 2008-06-24 Sony Computer Entertainment America Inc. Method and system for applying gearing effects to multi-channel mixed input
US8313380B2 (en) 2002-07-27 2012-11-20 Sony Computer Entertainment America Llc Scheme for translating movements of a hand-held controller into inputs for a system
US8686939B2 (en) * 2002-07-27 2014-04-01 Sony Computer Entertainment Inc. System, method, and apparatus for three-dimensional input control
US9393487B2 (en) 2002-07-27 2016-07-19 Sony Interactive Entertainment Inc. Method for mapping movements of a hand-held controller to game commands
US7760248B2 (en) 2002-07-27 2010-07-20 Sony Computer Entertainment Inc. Selective sound source listening in conjunction with computer interactive processing
US8570378B2 (en) 2002-07-27 2013-10-29 Sony Computer Entertainment Inc. Method and apparatus for tracking three-dimensional movements of an object using a depth sensing camera
US9682319B2 (en) * 2002-07-31 2017-06-20 Sony Interactive Entertainment Inc. Combiner method for altering game gearing
US6782238B2 (en) * 2002-08-06 2004-08-24 Hewlett-Packard Development Company, L.P. Method for presenting media on an electronic device
US20040027365A1 (en) * 2002-08-07 2004-02-12 Sayers Craig P. Controlling playback of a temporal stream with a user interface device
DE10237952B4 (en) * 2002-08-20 2005-12-15 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Device for viewing the environment
US20070035562A1 (en) * 2002-09-25 2007-02-15 Azuma Ronald T Method and apparatus for image enhancement
US7002551B2 (en) * 2002-09-25 2006-02-21 Hrl Laboratories, Llc Optical see-through augmented reality modified-scale display
US20040066391A1 (en) * 2002-10-02 2004-04-08 Mike Daily Method and apparatus for static image enhancement
US20040068758A1 (en) * 2002-10-02 2004-04-08 Mike Daily Dynamic video annotation
WO2004034617A1 (en) * 2002-10-07 2004-04-22 Immersion Entertainment, Llc System and method for providing event spectators with audio/video signals pertaining to remote events
US8458028B2 (en) * 2002-10-16 2013-06-04 Barbaro Technologies System and method for integrating business-related content into an electronic game
US7643054B2 (en) * 2002-12-09 2010-01-05 Hewlett-Packard Development Company, L.P. Directed guidance of viewing devices
DE10259123B3 (en) * 2002-12-18 2004-09-09 Dräger Safety AG & Co. KGaA Device and method for monitoring the use of respiratory protective device wearers
SE0203908D0 (en) * 2002-12-30 2002-12-30 Abb Research Ltd An augmented reality system and method
US7515156B2 (en) * 2003-01-08 2009-04-07 Hrl Laboratories, Llc Method and apparatus for parallel speculative rendering of synthetic images
FR2851678B1 (en) * 2003-02-26 2005-08-19 Ludovic Oger METHOD OF MAKING SYNTHESIS IMAGES TO FACILITATE THE EXAMINATION OF SEVERAL PROJECTS OF ARCHITECTS IN COMPETITIONS.
US20040219961A1 (en) * 2003-04-08 2004-11-04 Ellenby Thomas William Computer games having variable execution dependence with respect to spatial properties of a mobile unit.
US8072470B2 (en) * 2003-05-29 2011-12-06 Sony Computer Entertainment Inc. System and method for providing a real-time three-dimensional interactive environment
US7391424B2 (en) * 2003-08-15 2008-06-24 Werner Gerhard Lonsing Method and apparatus for producing composite images which contain virtual objects
US20050046706A1 (en) * 2003-08-28 2005-03-03 Robert Sesek Image data capture method and apparatus
US8323106B2 (en) * 2008-05-30 2012-12-04 Sony Computer Entertainment America Llc Determination of controller three-dimensional location using image analysis and ultrasonic communication
US9573056B2 (en) * 2005-10-26 2017-02-21 Sony Interactive Entertainment Inc. Expandable control device via hardware attachment
US10279254B2 (en) * 2005-10-26 2019-05-07 Sony Interactive Entertainment Inc. Controller having visually trackable object for interfacing with a gaming system
US8287373B2 (en) 2008-12-05 2012-10-16 Sony Computer Entertainment Inc. Control device for communicating visual information
US7874917B2 (en) 2003-09-15 2011-01-25 Sony Computer Entertainment Inc. Methods and systems for enabling depth and direction detection when interfacing with a computer program
US8294712B2 (en) * 2003-09-19 2012-10-23 The Boeing Company Scalable method for rapidly detecting potential ground vehicle under cover using visualization of total occlusion footprint in point cloud population
JP4401728B2 (en) * 2003-09-30 2010-01-20 キヤノン株式会社 Mixed reality space image generation method and mixed reality system
DE10345743A1 (en) * 2003-10-01 2005-05-04 Kuka Roboter Gmbh Method and device for determining the position and orientation of an image receiving device
US7593687B2 (en) * 2003-10-07 2009-09-22 Immersion Entertainment, Llc System and method for providing event spectators with audio/video signals pertaining to remote events
EP1683063A1 (en) * 2003-11-10 2006-07-26 Siemens Aktiengesellschaft System and method for carrying out and visually displaying simulations in an augmented reality
US7755608B2 (en) * 2004-01-23 2010-07-13 Hewlett-Packard Development Company, L.P. Systems and methods of interfacing with a machine
JP4364002B2 (en) * 2004-02-06 2009-11-11 オリンパス株式会社 Head-mounted camera and photographing method using head-mounted camera
US8159337B2 (en) * 2004-02-23 2012-04-17 At&T Intellectual Property I, L.P. Systems and methods for identification of locations
US7567256B2 (en) * 2004-03-31 2009-07-28 Harris Corporation Method and apparatus for analyzing digital video using multi-format display
US7197829B2 (en) * 2004-05-04 2007-04-03 Acres John F Laser guided celestial identification device
US20050250085A1 (en) * 2004-05-07 2005-11-10 Yamcon, Inc. Viewing and display apparatus
US20050280705A1 (en) * 2004-05-20 2005-12-22 Immersion Entertainment Portable receiver device
CA2567280A1 (en) * 2004-05-21 2005-12-01 Pressco Technology Inc. Graphical re-inspection user setup interface
EP1759173A4 (en) * 2004-06-02 2012-02-22 Rockwell Collins Control Technologies Inc Image-augmented inertial navigation system (iains) and method
US7010862B2 (en) * 2004-06-04 2006-03-14 Yamcon, Inc. Viewing and display apparatus position determination algorithms
US8751156B2 (en) 2004-06-30 2014-06-10 HERE North America LLC Method of operating a navigation system using images
US7460953B2 (en) * 2004-06-30 2008-12-02 Navteq North America, Llc Method of operating a navigation system using images
US7456847B2 (en) * 2004-08-12 2008-11-25 Russell Steven Krajec Video with map overlay
US8547401B2 (en) * 2004-08-19 2013-10-01 Sony Computer Entertainment Inc. Portable augmented reality device and method
US8312132B2 (en) * 2004-08-20 2012-11-13 Core Wireless Licensing S.A.R.L. Context data in UPNP service information
JP4417386B2 (en) * 2004-09-15 2010-02-17 パイオニア株式会社 Video display system and video display method
US7557775B2 (en) * 2004-09-30 2009-07-07 The Boeing Company Method and apparatus for evoking perceptions of affordances in virtual environments
WO2006041864A2 (en) * 2004-10-07 2006-04-20 Melvin Bacharach Handheld roster device and method
IL164650A0 (en) * 2004-10-18 2005-12-18 Odf Optronics Ltd An application for the extraction and modeling of skyline for and stabilization the purpese of orientation
WO2006045101A2 (en) * 2004-10-20 2006-04-27 Talo, Inc. Apparatus for making celestial observations
US8585476B2 (en) 2004-11-16 2013-11-19 Jeffrey D Mullen Location-based games and augmented reality systems
EP1814101A1 (en) * 2004-11-19 2007-08-01 Daem Interactive, Sl Personal device with image-acquisition functions for the application of augmented reality resources and corresponding method
FR2879300A1 (en) * 2004-12-10 2006-06-16 Vincent Piraud Electronic binocular apparatus for e.g. tripper, has GPS generating latitude and altitude data, compass and inclinometer modules giving orientation and inclination data, and LCD screen to display digitized 3D geographical cartography data
US20060170760A1 (en) * 2005-01-31 2006-08-03 Collegiate Systems, Llc Method and apparatus for managing and distributing audio/video content
WO2006086223A2 (en) * 2005-02-08 2006-08-17 Blue Belt Technologies, Inc. Augmented reality device and method
US20060190812A1 (en) * 2005-02-22 2006-08-24 Geovector Corporation Imaging systems including hyperlink associations
US7451041B2 (en) 2005-05-06 2008-11-11 Facet Technology Corporation Network-based navigation system having virtual drive-thru advertisements integrated with actual imagery from along a physical route
HUP0500483A2 (en) * 2005-05-12 2006-12-28 Mta Sztaki Szamitastechnikai E Method and equipment for matching virtual images to the real environment
US20060262140A1 (en) * 2005-05-18 2006-11-23 Kujawa Gregory A Method and apparatus to facilitate visual augmentation of perceived reality
US20060293837A1 (en) * 2005-06-02 2006-12-28 Bennett James R Photograph with map
NZ564319A (en) * 2005-06-06 2009-05-31 Tomtom Int Bv Navigation device with camera-info
US8423292B2 (en) 2008-08-19 2013-04-16 Tomtom International B.V. Navigation device with camera-info
US7737965B2 (en) * 2005-06-09 2010-06-15 Honeywell International Inc. Handheld synthetic vision device
US7728869B2 (en) * 2005-06-14 2010-06-01 Lg Electronics Inc. Matching camera-photographed image with map data in portable terminal and travel route guidance method
DE102005029230A1 (en) * 2005-06-23 2007-01-04 Carl Zeiss Ag Display device and method
GB2428102A (en) * 2005-07-06 2007-01-17 David Smith Viewing device
IL169934A (en) * 2005-07-27 2013-02-28 Rafael Advanced Defense Sys Real-time geographic information system and method
CN101273368A (en) * 2005-08-29 2008-09-24 埃韦里克斯技术股份有限公司 Interactivity via mobile image recognition
JP4785662B2 (en) * 2005-10-03 2011-10-05 キヤノン株式会社 Information processing apparatus and information processing method
US7840032B2 (en) * 2005-10-04 2010-11-23 Microsoft Corporation Street-side maps and paths
US20090153468A1 (en) * 2005-10-31 2009-06-18 National University Of Singapore Virtual Interface System
US20070117078A1 (en) * 2005-11-23 2007-05-24 Trex Enterprises Corp Celestial compass
US20070136026A1 (en) * 2005-11-28 2007-06-14 Farheap Solutions Inc. GPS-Guided Visual Display
US9697644B2 (en) 2005-12-28 2017-07-04 Solmetric Corporation Methods for solar access measurement
US7873490B2 (en) * 2005-12-28 2011-01-18 Solmetric Corporation Solar access measurement device
KR101309176B1 (en) * 2006-01-18 2013-09-23 삼성전자주식회사 Apparatus and method for augmented reality
DE102006007283B4 (en) * 2006-02-16 2018-05-03 Airbus Operations Gmbh Landmark information system in an airplane
JP4847184B2 (en) * 2006-04-06 2011-12-28 キヤノン株式会社 Image processing apparatus, control method therefor, and program
US20070265075A1 (en) * 2006-05-10 2007-11-15 Sony Computer Entertainment America Inc. Attachable structure for use with hand-held controller having tracking ability
FI20060470A0 (en) * 2006-05-12 2006-05-12 Nokia Corp Orientation-based retrieval of messages
IL175835A0 (en) * 2006-05-22 2007-07-04 Rafael Armament Dev Authority Methods and systems for communicating and displaying points-of-interest
JP5521226B2 (en) * 2006-05-25 2014-06-11 富士フイルム株式会社 Display system, display method, and display program
US7477367B2 (en) * 2006-06-07 2009-01-13 Yamcon, Inc. Celestial object identification device
DE102006033147A1 (en) * 2006-07-18 2008-01-24 Robert Bosch Gmbh Surveillance camera, procedure for calibration of the security camera and use of the security camera
US8085990B2 (en) * 2006-07-28 2011-12-27 Microsoft Corporation Hybrid maps with embedded street-side images
US20080043020A1 (en) * 2006-08-18 2008-02-21 Microsoft Corporation User interface for viewing street side imagery
US8310656B2 (en) 2006-09-28 2012-11-13 Sony Computer Entertainment America Llc Mapping movements of a hand-held controller to the two-dimensional image plane of a display screen
USRE48417E1 (en) 2006-09-28 2021-02-02 Sony Interactive Entertainment Inc. Object direction using video input combined with tilt angle information
US8781151B2 (en) 2006-09-28 2014-07-15 Sony Computer Entertainment Inc. Object detection using video input combined with tilt angle information
US8471906B2 (en) * 2006-11-24 2013-06-25 Trex Enterprises Corp Miniature celestial direction detection system
US20120116711A1 (en) * 2007-09-13 2012-05-10 Trex Enterprises Corp. Portable celestial compass
CN101583842B (en) 2006-12-05 2011-11-16 株式会社纳维泰 Navigation system, portable terminal device, and peripheral-image display method
US20080147325A1 (en) * 2006-12-18 2008-06-19 Maassel Paul W Method and system for providing augmented reality
US20080168492A1 (en) * 2007-01-05 2008-07-10 Meade Instruments Corp. Celestial Viewing System With Video Display
FR2911463B1 (en) * 2007-01-12 2009-10-30 Total Immersion Sa REAL-TIME REALITY REALITY OBSERVATION DEVICE AND METHOD FOR IMPLEMENTING A DEVICE
DE102007001949A1 (en) * 2007-01-13 2008-07-17 Schuster, Werner, Dr. Dipl.-Ing. Camera e.g. video camera, for use with computer, is connected with global positioning system receiver for impression of position and height of image recording, and is connected with directional device e.g. digital compass
US8335345B2 (en) 2007-03-05 2012-12-18 Sportvision, Inc. Tracking an object with multiple asynchronous cameras
JP5162928B2 (en) * 2007-03-12 2013-03-13 ソニー株式会社 Image processing apparatus, image processing method, and image processing system
US7640105B2 (en) 2007-03-13 2009-12-29 Certus View Technologies, LLC Marking system and method with location and/or time tracking
US8060304B2 (en) 2007-04-04 2011-11-15 Certusview Technologies, Llc Marking system and method
US20080231763A1 (en) * 2007-03-21 2008-09-25 Texas Instruments Incorporated System and method for displaying and capturing images
US8102420B2 (en) * 2007-03-24 2012-01-24 Yan Yuejun Portable digital photographing system combining position navigation information and image information
US8384710B2 (en) * 2007-06-07 2013-02-26 Igt Displaying and using 3D graphics on multiple displays provided for gaming environments
US9329052B2 (en) * 2007-08-07 2016-05-03 Qualcomm Incorporated Displaying image data and geographic element data
US8994851B2 (en) * 2007-08-07 2015-03-31 Qualcomm Incorporated Displaying image data and geographic element data
US20090094188A1 (en) * 2007-10-03 2009-04-09 Edward Covannon Facilitating identification of an object recorded in digital content records
US8073198B2 (en) * 2007-10-26 2011-12-06 Samsung Electronics Co., Ltd. System and method for selection of an object of interest during physical browsing by finger framing
JP4989417B2 (en) 2007-10-26 2012-08-01 キヤノン株式会社 Image display system, image display apparatus, control method therefor, and computer program
US7885145B2 (en) * 2007-10-26 2011-02-08 Samsung Electronics Co. Ltd. System and method for selection of an object of interest during physical browsing by finger pointing and snapping
JP5044817B2 (en) 2007-11-22 2012-10-10 インターナショナル・ビジネス・マシーンズ・コーポレーション Image processing method and apparatus for constructing virtual space
US9013505B1 (en) 2007-11-27 2015-04-21 Sprint Communications Company L.P. Mobile system representing virtual objects on live camera image
US8542907B2 (en) * 2007-12-17 2013-09-24 Sony Computer Entertainment America Llc Dynamic three-dimensional object mapping for user-defined control device
US9582937B2 (en) * 2008-01-02 2017-02-28 Nokia Technologies Oy Method, apparatus and computer program product for displaying an indication of an object within a current field of view
US8239132B2 (en) * 2008-01-22 2012-08-07 Maran Ma Systems, apparatus and methods for delivery of location-oriented information
DE102008008502B4 (en) * 2008-02-11 2014-08-14 Siemens Aktiengesellschaft Arrangement for controlling an antenna arrangement in a magnetic resonance apparatus
CA2707246C (en) 2009-07-07 2015-12-29 Certusview Technologies, Llc Automatic assessment of a productivity and/or a competence of a locate technician with respect to a locate and marking operation
US8280117B2 (en) * 2008-03-18 2012-10-02 Certusview Technologies, Llc Virtual white lines for indicating planned excavation sites on electronic images
US8290204B2 (en) 2008-02-12 2012-10-16 Certusview Technologies, Llc Searchable electronic records of underground facility locate marking operations
US8672225B2 (en) 2012-01-31 2014-03-18 Ncr Corporation Convertible barcode reader
US8249306B2 (en) 2008-03-18 2012-08-21 Certusview Technologies, Llc Virtual white lines for delimiting planned excavation sites
US8532342B2 (en) 2008-02-12 2013-09-10 Certusview Technologies, Llc Electronic manifest of underground facility locate marks
CN102016877B (en) * 2008-02-27 2014-12-10 索尼计算机娱乐美国有限责任公司 Methods for capturing depth data of a scene and applying computer actions
US9576330B2 (en) 2008-03-07 2017-02-21 Virtually Live (Switzerland) Gmbh Media system and method
GB0804274D0 (en) 2008-03-07 2008-04-16 Virtually Live Ltd A media sysyem and method
US8368753B2 (en) 2008-03-17 2013-02-05 Sony Computer Entertainment America Llc Controller with an integrated depth camera
DE102008023439B4 (en) 2008-05-14 2011-02-17 Christian-Albrechts-Universität Zu Kiel Augmented reality binoculars for navigation support
US8711176B2 (en) 2008-05-22 2014-04-29 Yahoo! Inc. Virtual billboards
US20090289955A1 (en) * 2008-05-22 2009-11-26 Yahoo! Inc. Reality overlay device
US8531462B1 (en) * 2008-06-09 2013-09-10 EyezOnBaseball, LLC Interactive scorekeeping and animation generation
US8620587B2 (en) 2008-10-02 2013-12-31 Certusview Technologies, Llc Methods, apparatus, and systems for generating electronic records of locate and marking operations, and combined locate and marking apparatus for same
US8280631B2 (en) 2008-10-02 2012-10-02 Certusview Technologies, Llc Methods and apparatus for generating an electronic record of a marking operation based on marking device actuations
US8265817B2 (en) * 2008-07-10 2012-09-11 Lockheed Martin Corporation Inertial measurement with an imaging sensor and a digitized map
US20100045701A1 (en) * 2008-08-22 2010-02-25 Cybernet Systems Corporation Automatic mapping of augmented reality fiducials
US8476906B2 (en) 2008-10-02 2013-07-02 Certusview Technologies, Llc Methods and apparatus for generating electronic records of locate operations
US8442766B2 (en) 2008-10-02 2013-05-14 Certusview Technologies, Llc Marking apparatus having enhanced features for underground facility marking operations, and associated methods and systems
US8065082B2 (en) * 2008-11-14 2011-11-22 Honeywell International Inc. Display systems with enhanced symbology
US20100287500A1 (en) * 2008-11-18 2010-11-11 Honeywell International Inc. Method and system for displaying conformal symbology on a see-through display
US8817092B2 (en) * 2008-11-25 2014-08-26 Stuart Leslie Wilkinson Method and apparatus for generating and viewing combined images
US8961313B2 (en) * 2009-05-29 2015-02-24 Sony Computer Entertainment America Llc Multi-positional three-dimensional controller
JP5328810B2 (en) * 2008-12-25 2013-10-30 パナソニック株式会社 Information display device and information display method
WO2010088290A1 (en) * 2009-01-27 2010-08-05 Arthur Thomas D Tight optical intergation (toi) of images with gps range measurements
US8572193B2 (en) 2009-02-10 2013-10-29 Certusview Technologies, Llc Methods, apparatus, and systems for providing an enhanced positive response in underground facility locate and marking operations
US8902251B2 (en) 2009-02-10 2014-12-02 Certusview Technologies, Llc Methods, apparatus and systems for generating limited access files for searchable electronic records of underground facility locate and/or marking operations
CA2897462A1 (en) 2009-02-11 2010-05-04 Certusview Technologies, Llc Management system, and associated methods and apparatus, for providing automatic assessment of a locate operation
US8384742B2 (en) 2009-02-11 2013-02-26 Certusview Technologies, Llc Virtual white lines (VWL) for delimiting planned excavation sites of staged excavation projects
US20100201690A1 (en) * 2009-02-11 2010-08-12 Certusview Technologies, Llc Virtual white lines (vwl) application for indicating a planned excavation or locate path
US20100208033A1 (en) * 2009-02-13 2010-08-19 Microsoft Corporation Personal Media Landscapes in Mixed Reality
US8379056B2 (en) * 2009-02-27 2013-02-19 Fraunhofer-Gesellschaft Zur Foerderung Der Angewandten Forschung E.V. Device and method for providing a video signal of a virtual image
US8264377B2 (en) 2009-03-02 2012-09-11 Griffith Gregory M Aircraft collision avoidance system
US8264379B2 (en) * 2009-03-10 2012-09-11 Honeywell International Inc. Methods and systems for correlating data sources for vehicle displays
US20140240313A1 (en) * 2009-03-19 2014-08-28 Real Time Companies Computer-aided system for 360° heads up display of safety/mission critical data
US8527657B2 (en) * 2009-03-20 2013-09-03 Sony Computer Entertainment America Llc Methods and systems for dynamically adjusting update rates in multi-player network gaming
US8342963B2 (en) * 2009-04-10 2013-01-01 Sony Computer Entertainment America Inc. Methods and systems for enabling control of artificial intelligence game characters
US8839121B2 (en) * 2009-05-06 2014-09-16 Joseph Bertolami Systems and methods for unifying coordinate systems in augmented reality applications
US8393964B2 (en) 2009-05-08 2013-03-12 Sony Computer Entertainment America Llc Base station for position location
US8142288B2 (en) 2009-05-08 2012-03-27 Sony Computer Entertainment America Llc Base station movement detection and compensation
US20100309290A1 (en) * 2009-06-08 2010-12-09 Stephen Brooks Myers System for capture and display of stereoscopic content
US8427508B2 (en) * 2009-06-25 2013-04-23 Nokia Corporation Method and apparatus for an augmented reality user interface
CA2771286C (en) 2009-08-11 2016-08-30 Certusview Technologies, Llc Locating equipment communicatively coupled to or equipped with a mobile/portable device
EP2467674A1 (en) 2009-08-20 2012-06-27 Certusview Technologies, LLC Methods and marking devices with mechanisms for indicating and/or detecting marking material color
CA2710189C (en) 2009-08-20 2012-05-08 Certusview Technologies, Llc Methods and apparatus for assessing marking operations based on acceleration information
CA2713282C (en) 2009-08-20 2013-03-19 Certusview Technologies, Llc Marking device with transmitter for triangulating location during marking operations
US8395547B2 (en) 2009-08-27 2013-03-12 Hewlett-Packard Development Company, L.P. Location tracking for mobile computing device
US8755815B2 (en) 2010-08-31 2014-06-17 Qualcomm Incorporated Use of wireless access point ID for position determination
US9001252B2 (en) * 2009-11-02 2015-04-07 Empire Technology Development Llc Image matching to augment reality
US20110102460A1 (en) * 2009-11-04 2011-05-05 Parker Jordan Platform for widespread augmented reality and 3d mapping
US20110115671A1 (en) * 2009-11-17 2011-05-19 Qualcomm Incorporated Determination of elevation of mobile station
US8494544B2 (en) * 2009-12-03 2013-07-23 Osocad Remote Limited Liability Company Method, apparatus and computer program to perform location specific information retrieval using a gesture-controlled handheld mobile device
US8583372B2 (en) 2009-12-07 2013-11-12 Certusview Technologies, Llc Methods, apparatus, and systems for facilitating compliance with marking specifications for dispensing marking material
TW201122711A (en) * 2009-12-23 2011-07-01 Altek Corp System and method for generating an image appended with landscape information
US8400548B2 (en) * 2010-01-05 2013-03-19 Apple Inc. Synchronized, interactive augmented reality displays for multifunction devices
WO2011093430A1 (en) * 2010-01-29 2011-08-04 Tokai Rubber Industries, Ltd. Vehicle stabilizer bushing
KR101082285B1 (en) * 2010-01-29 2011-11-09 Pantech Co., Ltd. Terminal and method for providing augmented reality
WO2011106520A1 (en) * 2010-02-24 2011-09-01 Ipplex Holdings Corporation Augmented reality panorama supporting visually impaired individuals
US8488246B2 (en) 2010-02-28 2013-07-16 Osterhout Group, Inc. See-through near-eye display glasses including a curved polarizing film in the image source, a partially reflective, partially transmitting optical element and an optically flat film
US8477425B2 (en) 2010-02-28 2013-07-02 Osterhout Group, Inc. See-through near-eye display glasses including a partially reflective, partially transmitting optical element
US9759917B2 (en) 2010-02-28 2017-09-12 Microsoft Technology Licensing, Llc AR glasses with event and sensor triggered AR eyepiece interface to external devices
US9097890B2 (en) 2010-02-28 2015-08-04 Microsoft Technology Licensing, Llc Grating in a light transmissive illumination system for see-through near-eye display glasses
US9182596B2 (en) 2010-02-28 2015-11-10 Microsoft Technology Licensing, Llc See-through near-eye display glasses with the optical assembly including absorptive polarizers or anti-reflective coatings to reduce stray light
US9134534B2 (en) 2010-02-28 2015-09-15 Microsoft Technology Licensing, Llc See-through near-eye display glasses including a modular image source
US8482859B2 (en) 2010-02-28 2013-07-09 Osterhout Group, Inc. See-through near-eye display glasses wherein image light is transmitted to and reflected from an optically flat film
US9097891B2 (en) 2010-02-28 2015-08-04 Microsoft Technology Licensing, Llc See-through near-eye display glasses including an auto-brightness control for the display brightness based on the brightness in the environment
US9366862B2 (en) 2010-02-28 2016-06-14 Microsoft Technology Licensing, Llc System and method for delivering content to a group of see-through near eye display eyepieces
US8472120B2 (en) 2010-02-28 2013-06-25 Osterhout Group, Inc. See-through near-eye display glasses with a small scale image source
US20120249797A1 (en) 2010-02-28 2012-10-04 Osterhout Group, Inc. Head-worn adaptive display
US9229227B2 (en) 2010-02-28 2016-01-05 Microsoft Technology Licensing, Llc See-through near-eye display glasses with a light transmissive wedge shaped illumination system
US9128281B2 (en) 2010-09-14 2015-09-08 Microsoft Technology Licensing, Llc Eyepiece with uniformly illuminated reflective display
US9091851B2 (en) 2010-02-28 2015-07-28 Microsoft Technology Licensing, Llc Light control in head mounted displays
US9341843B2 (en) 2010-02-28 2016-05-17 Microsoft Technology Licensing, Llc See-through near-eye display glasses with a small scale image source
US9129295B2 (en) 2010-02-28 2015-09-08 Microsoft Technology Licensing, Llc See-through near-eye display glasses with a fast response photochromic film system for quick transition from dark to clear
US9285589B2 (en) 2010-02-28 2016-03-15 Microsoft Technology Licensing, Llc AR glasses with event and sensor triggered control of AR eyepiece applications
US9223134B2 (en) 2010-02-28 2015-12-29 Microsoft Technology Licensing, Llc Optical imperfections in a light transmissive illumination system for see-through near-eye display glasses
EP2539759A1 (en) * 2010-02-28 2013-01-02 Osterhout Group, Inc. Local advertising content on an interactive head-mounted eyepiece
US10180572B2 (en) 2010-02-28 2019-01-15 Microsoft Technology Licensing, Llc AR glasses with event and user action control of external applications
US20150309316A1 (en) 2011-04-06 2015-10-29 Microsoft Technology Licensing, Llc Ar glasses with predictive control of external device based on event input
US8467133B2 (en) 2010-02-28 2013-06-18 Osterhout Group, Inc. See-through display with an optical assembly including a wedge-shaped illumination system
US20120212484A1 (en) * 2010-02-28 2012-08-23 Osterhout Group, Inc. System and method for display content placement using distance and location information
DE102010010951A1 (en) * 2010-03-10 2011-09-15 Astrium Gmbh Information reproducing apparatus
JP4922436B2 (en) * 2010-06-07 2012-04-25 NTT Docomo, Inc. Object display device and object display method
US10096161B2 (en) 2010-06-15 2018-10-09 Live Nation Entertainment, Inc. Generating augmented reality images using sensor and location data
CA2802686C (en) 2010-06-15 2019-10-01 Ticketmaster, Llc Methods and systems for computer aided event and venue setup and modeling and interactive maps
US9781170B2 (en) 2010-06-15 2017-10-03 Live Nation Entertainment, Inc. Establishing communication links using routing protocols
US8762041B2 (en) 2010-06-21 2014-06-24 Blackberry Limited Method, device and system for presenting navigational information
KR20120004669A (en) * 2010-07-07 2012-01-13 Samsung Electronics Co., Ltd. Apparatus and method for displaying world clock in portable terminal
EP2622920B1 (en) 2010-09-29 2024-01-17 QUALCOMM Incorporated Non-transient computer readable storage medium and mobile computing device employing matching of access point identifiers
IL208600A (en) 2010-10-10 2016-07-31 Rafael Advanced Defense Systems Ltd Network-based real time registered augmented reality for mobile devices
JP5870929B2 (en) 2010-11-02 2016-03-01 NEC Corporation Information processing system, information processing apparatus, and information processing method
WO2012081319A1 (en) 2010-12-15 2012-06-21 Hitachi, Ltd. Video monitoring apparatus
US8659663B2 (en) * 2010-12-22 2014-02-25 Sportvision, Inc. Video tracking of baseball players to determine the start and end of a half-inning
US9007463B2 (en) * 2010-12-22 2015-04-14 Sportvision, Inc. Video tracking of baseball players which identifies merged participants based on participant roles
JP5838560B2 (en) * 2011-02-14 2016-01-06 Sony Corporation Image processing apparatus, information processing apparatus, and imaging region sharing determination method
US9507416B2 (en) * 2011-02-22 2016-11-29 Robert Howard Kimball Providing a corrected view based on the position of a user with respect to a mobile platform
US8188880B1 (en) 2011-03-14 2012-05-29 Google Inc. Methods and devices for augmenting a field of view
EP2689393A1 (en) 2011-03-23 2014-01-29 Metaio GmbH Method for registering at least one part of a first and second image using a collineation warping function
US8810598B2 (en) 2011-04-08 2014-08-19 Nant Holdings Ip, Llc Interference based augmented reality hosting platforms
DE102011075372A1 (en) * 2011-05-05 2012-11-08 BSH Bosch und Siemens Hausgeräte GmbH System for the extended provision of information to customers in a sales room for home appliances and associated method and computer program product
US9215383B2 (en) 2011-08-05 2015-12-15 Sportvision, Inc. System for enhancing video from a mobile camera
US9817017B2 (en) * 2011-10-17 2017-11-14 Atlas5D, Inc. Method and apparatus for monitoring individuals while protecting their privacy
US9341464B2 (en) 2011-10-17 2016-05-17 Atlas5D, Inc. Method and apparatus for sizing and fitting an individual for apparel, accessories, or prosthetics
US9974466B2 (en) 2011-10-17 2018-05-22 Atlas5D, Inc. Method and apparatus for detecting change in health status
US9026896B2 (en) 2011-12-26 2015-05-05 TrackThings LLC Method and apparatus of physically moving a portable unit to view composite webpages of different websites
US9965140B2 (en) 2011-12-26 2018-05-08 TrackThings LLC Method and apparatus of marking objects in images displayed on a portable unit
US8532919B2 (en) * 2011-12-26 2013-09-10 TrackThings LLC Method and apparatus of physically moving a portable unit to view an image of a stationary map
US9573039B2 (en) * 2011-12-30 2017-02-21 Nike, Inc. Golf aid including heads up display
US9597574B2 (en) 2011-12-30 2017-03-21 Nike, Inc. Golf aid including heads up display
US8957916B1 (en) * 2012-03-23 2015-02-17 Google Inc. Display method
EP2672458A3 (en) * 2012-06-06 2017-04-26 Samsung Electronics Co., Ltd Terminal, apparatus and method for providing an augmented reality service
US20180048750A1 (en) * 2012-06-15 2018-02-15 Muzik, Llc Audio/video wearable computer system with integrated projector
CN104756063A (en) 2012-10-31 2015-07-01 Expansion World Co., Ltd. Image display system, electronic device, program, and image display method
JP6007377B2 (en) * 2012-11-14 2016-10-12 P&W Solutions Co., Ltd. Seat layout display device, method and program
KR101981964B1 (en) 2012-12-03 2019-05-24 Samsung Electronics Co., Ltd. Terminal and method for realizing virtual reality
US8918246B2 (en) 2012-12-27 2014-12-23 Caterpillar Inc. Augmented reality implement control
US20140184643A1 (en) * 2012-12-27 2014-07-03 Caterpillar Inc. Augmented Reality Worksite
US20140225917A1 (en) * 2013-02-14 2014-08-14 Geovector Corporation Dynamic Augmented Reality Vision Systems
US20140247281A1 (en) * 2013-03-03 2014-09-04 Geovector Corp. Dynamic Augmented Reality Vision Systems
US20150287224A1 (en) * 2013-10-01 2015-10-08 Technology Service Corporation Virtual tracer methods and systems
US9256072B2 (en) * 2013-10-02 2016-02-09 Philip Scott Lyren Wearable electronic glasses that detect movement of a real object copies movement of a virtual object
US9582516B2 (en) 2013-10-17 2017-02-28 Nant Holdings Ip, Llc Wide area augmented reality location-based services
US20160179842A1 (en) * 2014-12-22 2016-06-23 Geovector Corp. Computerized Search Dependent on Spatial States
US9600993B2 (en) 2014-01-27 2017-03-21 Atlas5D, Inc. Method and system for behavior detection
DE102014003221A1 (en) * 2014-03-05 2015-09-10 Audi Ag Method for scale correct scaling of a recording of a camera sensor
US9677840B2 (en) 2014-03-14 2017-06-13 Lineweight Llc Augmented reality simulator
US20150312264A1 (en) * 2014-04-23 2015-10-29 Front Row Technologies, Llc Method, system and server for authorizing computing devices for receipt of venue-based data based on the geographic location of a user
US9610489B2 (en) 2014-05-30 2017-04-04 Nike, Inc. Golf aid including virtual caddy
US9694269B2 (en) 2014-05-30 2017-07-04 Nike, Inc. Golf aid including heads up display for green reading
US9798299B2 (en) 2014-06-20 2017-10-24 International Business Machines Corporation Preventing substrate penetrating devices from damaging obscured objects
US10127601B2 (en) 2014-07-16 2018-11-13 Sony Corporation Mesh network applied to fixed establishment with movable items therein
US9906897B2 (en) 2014-07-16 2018-02-27 Sony Corporation Applying mesh network to pet carriers
US9900748B2 (en) * 2014-07-16 2018-02-20 Sony Corporation Consumer electronics (CE) device and related method for providing stadium services
US9361802B2 (en) 2014-07-16 2016-06-07 Sony Corporation Vehicle ad hoc network (VANET)
CN112862775A (en) * 2014-07-25 2021-05-28 Covidien LP Augmenting surgical reality environment
US9435635B1 (en) * 2015-02-27 2016-09-06 Ge Aviation Systems Llc System and methods of detecting an intruding object in a relative navigation system
US10013756B2 (en) 2015-03-13 2018-07-03 Atlas5D, Inc. Methods and systems for measuring use of an assistive device for ambulation
EP3113470B1 (en) * 2015-07-02 2020-12-16 Nokia Technologies Oy Geographical location visual information overlay
IL241446B (en) * 2015-09-10 2018-05-31 Elbit Systems Ltd Adjusting displays on user monitors and guiding users' attention
USD797790S1 (en) * 2016-03-22 2017-09-19 Teletracking Technologies, Inc. Display screen with set of graphical user interface icons
US11017901B2 (en) 2016-08-02 2021-05-25 Atlas5D, Inc. Systems and methods to identify persons and/or identify and quantify pain, fatigue, mood, and intent with protection of privacy
US10354439B2 (en) 2016-10-24 2019-07-16 Charles C. Carrington System for generating virtual building plan data based upon stored and scanned building data and related methods
JP6874448B2 (en) * 2017-03-17 2021-05-19 Denso Wave Incorporated Information display system
US20190004544A1 (en) * 2017-06-29 2019-01-03 Ge Aviation Systems, Llc Method for flying at least two aircraft
EP3634586A1 (en) 2017-07-28 2020-04-15 Idex Europe GmbH Control device for operating a fire extinguisher system
WO2019020192A1 (en) 2017-07-28 2019-01-31 Idex Europe Gmbh Control device for operating a fire extinguisher system and extinguisher nozzle
KR102496683B1 (en) 2017-10-11 2023-02-07 Samsung Display Co., Ltd. Display panel and display device comprising the display panel
US10803610B2 (en) 2018-03-06 2020-10-13 At&T Intellectual Property I, L.P. Collaborative visual enhancement devices
US11526211B2 (en) * 2018-05-08 2022-12-13 Nevermind Capital Llc Methods and apparatus for controlling, implementing and supporting trick play in an augmented reality device
US10497161B1 (en) 2018-06-08 2019-12-03 Curious Company, LLC Information display by overlay on an object
US10650600B2 (en) 2018-07-10 2020-05-12 Curious Company, LLC Virtual path display
US10818088B2 (en) * 2018-07-10 2020-10-27 Curious Company, LLC Virtual barrier objects
US10902678B2 (en) 2018-09-06 2021-01-26 Curious Company, LLC Display of hidden information
US11055913B2 (en) 2018-12-04 2021-07-06 Curious Company, LLC Directional instructions in a hybrid reality system
US10970935B2 (en) 2018-12-21 2021-04-06 Curious Company, LLC Body pose message system
US10872584B2 (en) 2019-03-14 2020-12-22 Curious Company, LLC Providing positional information using beacon devices
DE202019103022U1 (en) 2019-05-28 2019-06-07 Martina Lammel Device for virtual representation of image information
CN111060918B (en) * 2019-12-27 2021-10-26 China Nonferrous Metal Changsha Survey and Design Institute Co., Ltd. Dry beach monitoring method based on photogrammetry and laser ranging
US20210241580A1 (en) * 2020-02-05 2021-08-05 Adrenalineip Play by play wagering through wearable device
US11763624B2 (en) * 2020-09-22 2023-09-19 Adrenalineip AR VR in-play wagering system
US11657578B2 (en) 2021-03-11 2023-05-23 Quintar, Inc. Registration for augmented reality system for viewing an event
US20220295139A1 (en) * 2021-03-11 2022-09-15 Quintar, Inc. Augmented reality system for viewing an event with multiple coordinate systems and automatically generated model
US11645819B2 (en) 2021-03-11 2023-05-09 Quintar, Inc. Augmented reality system for viewing an event with mode based on crowd sourced images
US11527047B2 (en) 2021-03-11 2022-12-13 Quintar, Inc. Augmented reality system for viewing an event with distributed computing
US20220295040A1 (en) * 2021-03-11 2022-09-15 Quintar, Inc. Augmented reality system with remote presentation including 3d graphics extending beyond frame
US11682313B2 (en) 2021-03-17 2023-06-20 Gregory M. Griffith Sensor assembly for use in association with aircraft collision avoidance system and method of using the same
CN114252075A (en) * 2021-12-09 2022-03-29 Electric Power Research Institute of State Grid Anhui Electric Power Co., Ltd. Path tracking method and system for a cable trench inspection robot
US11586286B1 (en) 2022-05-18 2023-02-21 Bank Of America Corporation System and method for navigating on an augmented reality display
US11720380B1 (en) 2022-05-18 2023-08-08 Bank Of America Corporation System and method for updating augmented reality navigation instructions based on a detected error

Family Cites Families (39)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
SE429690B (en) * 1979-11-19 1983-09-19 Hans Erik Ove Olofsson Aircraft-referenced registration of the surroundings by means of a camera permitting electronic recording, and associated procedure
US4322726A (en) * 1979-12-19 1982-03-30 The Singer Company Apparatus for providing a simulated view to hand held binoculars
US4489389A (en) * 1981-10-02 1984-12-18 Harris Corporation Real time video perspective digital map display
US4545576A (en) * 1982-01-15 1985-10-08 Harris Thomas M Baseball-strike indicator and trajectory analyzer and method of using same
US4667190A (en) * 1982-07-30 1987-05-19 Honeywell Inc. Two axis fast access memory
US4645459A (en) * 1982-07-30 1987-02-24 Honeywell Inc. Computer generated synthesized imagery
US4661849A (en) * 1985-06-03 1987-04-28 Pictel Corporation Method and apparatus for providing motion estimation signals for communicating image sequences
US4682225A (en) * 1985-09-13 1987-07-21 The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration Method and apparatus for telemetry adaptive bandwidth compression
US4688092A (en) * 1986-05-06 1987-08-18 Ford Aerospace & Communications Corporation Satellite camera image navigation
US4807158A (en) * 1986-09-30 1989-02-21 Daleco/Ivex Partners, Ltd. Method and apparatus for sampling images to simulate movement within a multidimensional space
FR2610752B1 (en) * 1987-02-10 1989-07-21 Sagem Method for representing the perspective image of a terrain and system for implementing same
US4937570A (en) * 1987-02-26 1990-06-26 Mitsubishi Denki Kabushiki Kaisha Route guidance display device
US4872051A (en) * 1987-10-01 1989-10-03 Environmental Research Institute Of Michigan Collision avoidance alarm system
DE3737972A1 (en) * 1987-11-07 1989-05-24 Messerschmitt Boelkow Blohm Helmet location device
JPH0695008B2 (en) * 1987-12-11 1994-11-24 Toshiba Corporation Monitoring device
US4855822A (en) * 1988-01-26 1989-08-08 Honeywell, Inc. Human engineered remote driving system
US5072218A (en) * 1988-02-24 1991-12-10 Spero Robert E Contact-analog headup display method and apparatus
US4970666A (en) * 1988-03-30 1990-11-13 Land Development Laboratory, Inc. Computerized video imaging system for creating a realistic depiction of a simulated object in an actual environment
CA2001070A1 (en) * 1988-10-24 1990-04-24 Douglas B. George Telescope operating system
GB8826550D0 (en) * 1988-11-14 1989-05-17 Smiths Industries Plc Image processing apparatus and methods
NL8901695A (en) * 1989-07-04 1991-02-01 Koninkl Philips Electronics Nv Method for displaying navigation data for a vehicle in an image of the vehicle's surroundings, navigation system for carrying out the method, and vehicle fitted with a navigation system
US5045937A (en) * 1989-08-25 1991-09-03 Space Island Products & Services, Inc. Geographical surveying using multiple cameras to obtain split-screen images with overlaid geographical coordinates
US5124915A (en) * 1990-05-29 1992-06-23 Arthur Krenzel Computer-aided data collection system for assisting in analyzing critical situations
JP3247126B2 (en) * 1990-10-05 2002-01-15 Texas Instruments Incorporated Method and apparatus for providing a portable visual display
CA2060406C (en) * 1991-04-22 1998-12-01 Bruce Edward Hamilton Helicopter virtual image display system incorporating structural outlines
FR2675977B1 (en) * 1991-04-26 1997-09-12 Inst Nat Audiovisuel Method for modeling an image-capture system, and method and system for producing combinations of real images and synthetic images
US5182641A (en) * 1991-06-17 1993-01-26 The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration Composite video and graphics display for camera viewing systems in robotics and teleoperation
US5333257A (en) * 1991-08-09 1994-07-26 C/A Architects, Inc. System for displaying selected assembly-facility seating views
GB9121707D0 (en) * 1991-10-12 1991-11-27 British Aerospace Improvements in computer-generated imagery
JPH05113751A (en) * 1991-10-22 1993-05-07 Pioneer Electron Corp Navigation device
FR2683330B1 (en) * 1991-10-31 1994-11-25 Thomson Csf Computer binoculars
FR2683918B1 (en) * 1991-11-19 1994-09-09 Thomson Csf Equipment constituting a rifle scope and weapon using the same
JPH05167894A (en) * 1991-12-10 1993-07-02 Sony Corp VTR-integrated video camera
US5252950A (en) * 1991-12-20 1993-10-12 Apple Computer, Inc. Display with rangefinder
US5363297A (en) * 1992-06-05 1994-11-08 Larson Noble G Automated camera-based tracking system for sports contests
US5311203A (en) * 1993-01-29 1994-05-10 Norton M Kent Viewing and display apparatus
JP3045625B2 (en) * 1993-04-16 2000-05-29 Kawasaki Heavy Industries, Ltd. Ship navigation support device
GB9424273D0 (en) * 1994-12-01 1995-01-18 Wrigley Adrian M T Improvements in and relating to image construction
US5798761A (en) * 1996-01-26 1998-08-25 Silicon Graphics, Inc. Robust mapping of 2D cursor motion onto 3D lines and planes

Also Published As

Publication number Publication date
NZ269318A (en) 1997-12-19
EP0722601A4 (en) 1998-06-10
JP3700021B2 (en) 2005-09-28
DE69433405D1 (en) 2004-01-22
AU7314694A (en) 1995-03-27
EP0722601B1 (en) 2003-12-10
US6307556B1 (en) 2001-10-23
US6031545A (en) 2000-02-29
US5682332A (en) 1997-10-28
CA2171314A1 (en) 1995-03-16
WO1995007526A1 (en) 1995-03-16
US5815411A (en) 1998-09-29
AU691508B2 (en) 1998-05-21
US5742521A (en) 1998-04-21
JPH09505138A (en) 1997-05-20
KR960705297A (en) 1996-10-09
EP0722601A1 (en) 1996-07-24
ATE256327T1 (en) 2003-12-15
CH689904A5 (en) 2000-01-14

Similar Documents

Publication Publication Date Title
CA2171314C (en) Electro-optic vision systems which exploit position and attitude
US6208933B1 (en) Cartographic overlay on sensor video
US5825480A (en) Observing apparatus
EP3136964B1 (en) Improved registration for vehicular augmented reality using auto-harmonization
US5579165A (en) Computerized binoculars
US6101431A (en) Flight system and system for forming virtual images for aircraft
US5072218A (en) Contact-analog headup display method and apparatus
EP3596588B1 (en) Gradual transitioning between two-dimensional and three-dimensional augmented reality images
US20030210832A1 (en) Interacting augmented reality and virtual reality
EP2302531A1 (en) A method for providing an augmented reality display on a mobile device
CN107010239A (en) Flight deck display system and method for generating a cockpit display
US20230032219A1 (en) Display control method, display control apparatus, program, and recording medium
Foxlin et al. Improved registration for vehicular AR using auto-harmonization
US6002379A (en) Glass unit
Link et al. Hybrid enhanced and synthetic vision system architecture for rotorcraft operations
Walko et al. Integration and use of an augmented reality display in a maritime helicopter simulator
Burch et al. Enhanced head-up display for general aviation aircraft
RU2263881C1 (en) Sighting navigational complex for multi-mission aircraft
de Haag EE6900 Flight Management Systems
Thorndycraft et al. Flight assessment of an integrated DNAW helicopter pilotage display system: flight trial HAWKOWL
Shabaneh Probability Grid Mapping System for Aerial Search (PGM)
Mulligan et al. Pilot behavior and course deviations during precision flight
Edwards The development of a helicopter visually-coupled system research facility
Boehm et al. Visual aids for future helicopters
Wright Helicopter Urban Navigation Training Using Virtual Environments

Legal Events

Date Code Title Description
EEER Examination request
MKEX Expiry

Effective date: 20140616
