WO2002029700A2 - Intra-operative image-guided neurosurgery with augmented reality visualization - Google Patents

Intra-operative image-guided neurosurgery with augmented reality visualization

Info

Publication number
WO2002029700A2
Authority
WO
WIPO (PCT)
Application number
PCT/US2001/042506
Other languages
French (fr)
Other versions
WO2002029700A3 (en)
Inventor
Michael Wendt
Ali Bani-Hashemi
Frank Sauer
Original Assignee
Siemens Corporate Research, Inc.
Application filed by Siemens Corporate Research, Inc. filed Critical Siemens Corporate Research, Inc.
Priority to CA002425075A priority Critical patent/CA2425075A1/en
Priority to JP2002533197A priority patent/JP2004538538A/en
Priority to EP01977904A priority patent/EP1356413A2/en
Publication of WO2002029700A2 publication Critical patent/WO2002029700A2/en
Publication of WO2002029700A3 publication Critical patent/WO2002029700A3/en

Classifications

    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 40/00 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
    • G16H 40/60 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices
    • G16H 40/63 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices for local operation
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 20/00 ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
    • G16H 20/40 ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to mechanical, radiation or invasive therapies, e.g. surgery, laser therapy, dialysis or acupuncture
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 30/00 ICT specially adapted for the handling or processing of medical images
    • G16H 30/40 ICT specially adapted for the handling or processing of medical images for processing medical images, e.g. editing
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 40/00 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
    • G16H 40/60 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices
    • G16H 40/67 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices for remote operation
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 50/00 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H 50/50 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for simulation or modelling of medical disorders
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/20 Image signal generators
    • H04N 13/204 Image signal generators using stereoscopic image cameras
    • H04N 13/239 Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/20 Image signal generators
    • H04N 13/204 Image signal generators using stereoscopic image cameras
    • H04N 13/246 Calibration of cameras
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/20 Image signal generators
    • H04N 13/275 Image signal generators from 3D object models, e.g. computer-generated stereoscopic image signals
    • H04N 13/279 Image signal generators from 3D object models, e.g. computer-generated stereoscopic image signals the virtual viewpoint locations being selected by the viewers or determined by tracking
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/30 Image reproducers
    • H04N 13/327 Calibration thereof
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/30 Image reproducers
    • H04N 13/332 Displays for viewing with the aid of special glasses or head-mounted displays [HMD]
    • H04N 13/344 Displays for viewing with the aid of special glasses or head-mounted displays [HMD] with head-mounted left-right displays
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/30 Image reproducers
    • H04N 13/366 Image reproducers using viewer tracking
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 17/00 Surgical instruments, devices or methods, e.g. tourniquets
    • A61B 2017/00681 Aspects not otherwise provided for
    • A61B 2017/00707 Dummies, phantoms; Devices simulating patient or parts of patient
    • A61B 2017/00716 Dummies, phantoms; Devices simulating patient or parts of patient simulating physical properties
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 17/00 Surgical instruments, devices or methods, e.g. tourniquets
    • A61B 2017/00681 Aspects not otherwise provided for
    • A61B 2017/00725 Calibration or performance testing
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B 2034/2046 Tracking techniques
    • A61B 2034/2055 Optical tracking systems
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B 90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B 2090/364 Correlation of different images or relation of image positions in respect to the body
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B 90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B 2090/364 Correlation of different images or relation of image positions in respect to the body
    • A61B 2090/365 Correlation of different images or relation of image positions in respect to the body augmented reality, i.e. correlating a live optical image with another image
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B 90/50 Supports for surgical instruments, e.g. articulated arms
    • A61B 2090/502 Headgear, e.g. helmet, spectacles
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/70 Manipulators specially adapted for use in surgery
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/05 Detecting, measuring or recording for diagnosis by means of electric currents or magnetic fields; Measuring using microwaves or radio waves
    • A61B 5/055 Detecting, measuring or recording for diagnosis by means of electric currents or magnetic fields; Measuring using microwaves or radio waves involving electronic [EMR] or nuclear [NMR] magnetic resonance, e.g. magnetic resonance imaging
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B 90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B 90/361 Image-producing devices, e.g. surgical cameras
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/10 Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N 13/106 Processing image signals
    • H04N 13/156 Mixing image signals
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/10 Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N 13/194 Transmission of image signals
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/20 Image signal generators
    • H04N 13/286 Image signal generators having separate monoscopic and stereoscopic modes
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/20 Image signal generators
    • H04N 13/286 Image signal generators having separate monoscopic and stereoscopic modes
    • H04N 13/289 Switching between monoscopic and stereoscopic modes
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/20 Image signal generators
    • H04N 13/296 Synchronisation thereof; Control thereof
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/30 Image reproducers
    • H04N 13/398 Synchronisation thereof; Control thereof

Definitions

  • the present invention relates to the field of image-guided surgery, and more particularly to MR-guided neurosurgery wherein imaging scans, such as magnetic resonance (MR) scans, are taken intra-operatively or inter-operatively.
  • imaging scans such as magnetic resonance (MR) scans
  • 3-dimensional (3D) volume images taken with MR (magnetic resonance) and CT (computed tomography) scanners are used for diagnosis and for surgical planning.
  • After opening of the skull (craniotomy), the brain, being non-rigid in its physical nature, will typically further deform. This brain shift makes the pre-operative 3D imaging data fit the actual brain geometry less and less accurately, so that it is significantly out of correspondence with what confronts the surgeon during the operation.
  • Intra-operative MR imaging usually refers to MR scans that are being taken while the actual surgery is ongoing, whereas the term “inter-operative” MR imaging is used when the surgical procedure is halted for the acquisition of the scan and resumed afterwards.
  • Equipment has been developed by various companies for providing intra-/inter-operative MR imaging capabilities in the operating room. For example, General Electric has built an MR scanner with a double-doughnut-shaped magnet, where the surgeon has access to the patient inside the scanner.
  • a normal anatomical model is also displayed as a guide in reconstructive surgery.
  • Another embodiment employs three-dimensional viewing.
  • Siemens has built a combination of MR scanner and operating table where the operating table with the patient can be inserted into the scanner for MR image capture (imaging position) and be withdrawn into a position where the patient is accessible to the operating team, that is, into the operating position.
  • the MR data are displayed on a computer monitor.
  • a specialized neuroradiologist evaluates the images and discusses them with the neurosurgeon. The neurosurgeon has to understand the relevant image information and mentally map it onto the patient's brain. While such equipment provides a useful modality, this type of mental mapping is difficult and subjective and cannot preserve the complete accuracy of the information.
  • An object of the present invention is to generate an augmented view of the patient from the surgeon's own dynamic viewpoint and display the view to the surgeon.
  • Augmented Reality visualization for medical applications has been proposed as early as 1992; see, for example, M. Bajura, H. Fuchs, and R. Ohbuchi, "Merging Virtual Objects with the Real World: Seeing Ultrasound Imagery within the Patient," Proceedings of SIGGRAPH 92 (Chicago, IL, July 26-31, 1992), in Computer Graphics 26, #2 (July 1992): 203-210.
  • the "augmented view” generally comprises the “real” view overlaid with additional “virtual " graphics.
  • the real view is provided as video images.
  • the virtual graphics is derived from a 3D volume imaging system.
  • the virtual graphics also corresponds to real anatomical structures; however, views of these structures are available only as computer graphics renderings.
  • the real view of the external structures and the virtual view of the internal structures are blended with an appropriate degree of transparency, which may vary over the field of view. Registration between real and virtual views makes all structures in the augmented view appear in the correct location with respect to each other.
  • the MR data revealing internal anatomic structures are shown in situ, overlaid on the surgeon's view of the patient.
  • With this Augmented Reality type of visualization, the derived image of the internal anatomical structure is directly presented in the surgeon's workspace in a registered fashion.
  • the surgeon wears a head-mounted display and can examine the spatial relationship between the anatomical structures from varying positions in a natural way.
  • there is no need for the surgeon to look back and forth between monitor and patient, or to mentally map the image information to the real brain. As a consequence, the surgeon can better focus on the surgical task at hand and perform the operation more precisely and confidently.
  • FIG. 1 shows a system block diagram in accordance with the invention
  • FIG. 2 shows a flow diagram in accordance with the invention;
  • Figure 3 shows a headmounted display as may be used in an embodiment of the invention
  • Figure 4 shows a frame in accordance with the invention
  • Figure 5 shows a boom-mounted see-through display in accordance with the invention
  • Figure 6 shows a robotic arm in accordance with the invention
  • Figure 7 shows a 3D camera calibration object as may be used in an embodiment of the invention.
  • Figure 8 shows an MR calibration object as may be used in an embodiment of the invention. Ball-shaped MR markers and doughnut-shaped MR markers are shown.
  • the MR information is utilized in an effective and optimal manner.
  • the surgeon wears a stereo video-see-through head-mounted display.
  • a pair of video cameras attached to the head- mounted display captures a stereoscopic view of the real scene.
  • the video images are blended together with the computer images of the internal anatomical structures and displayed on the head-mounted stereo display in real time.
  • the internal structures appear directly superimposed on and in the patient's brain.
  • a computer provides the precise, objective 3D registration between the computer images of the internal structures and the video images of the real brain.
  • This in situ or "augmented reality” visualization gives the surgeon intuitively based, direct, and precise access to the image information in regard to the surgical task of removing the patient's tumor without hurting vital regions.
  • the stereoscopic video-see-through display may not be head-mounted but be attached to an articulated mechanical arm that is, e.g., suspended from the ceiling (reference to the "videoscope" provisional filing) (include in claims).
  • a video-see-through display is understood as a display with a video camera attachment, whereby the video camera looks into substantially the same direction as the user who views the display.
  • a " slereoscopic video-see-through display combines a stereoscopic display, e.g. a pair of miniature displays, and a stereoscopic camera system, e.g. a pair of cameras.
  • Figure 1 shows the building blocks of an exemplary system in accordance with the invention.
  • a 3D imaging apparatus 2, in the present example an MR scanner, is used to capture 3D volume data of the patient.
  • the volume data contain information about internal structures of the patient. A video-see-through head-mounted display 4 gives the surgeon a dynamic viewpoint. It comprises a pair of video cameras 6 to capture a stereoscopic view of the scene (external structures) and a pair of displays 8 to display the augmented view in a stereoscopic way.
  • a tracking device or apparatus 10 measures position and orientation (pose) of the pair of cameras with respect to the coordinate system in which the 3D data are described.
  • the computer 12 comprises a set of networked computers.
  • One of the computer tasks is to process, with possible user interaction, the volume data and provide one or more graphical representations of the imaged structures: volume representations and/or surface representations (based on segmentation of the volume data).
  • volume representations and/or surface representations based on segmentation of the volume data.
  • graphical representation to mean a data set that is in a "graphical" format (e.g. VRML format), ready to be efficiently visualized or rendered into an image.
  • the user can selectively enhance structures, color or annotate them, pick out relevant ones, include graphical objects as guides for the surgical procedure and so forth. This preprocessing can be done "off-line", in preparation of the actual image guidance.
  • Another computer task is to render, in real time, the augmented stereo view to provide the image guidance for the surgeon.
  • the computer receives the video images and the camera pose information, and makes use of the pre-processed 3D data, i.e. the stored graphical representations.
  • An optional recording means 14 allows one to record the augmented view for documentation and training.
  • The recording means can be a digital storage device, or it can be a video recorder, if necessary combined with a scan converter.
  • a general user interface 16 allows one to control the system in general, and in particular to interactively select the 3D data and pre-process them.
  • a realtime user interface 18 allows the user to control the system during its realtime operation, i.e. during the realtime display of the augmented view. It allows the user to interactively change the augmented view, e.g. invoke an optical or digital zoom, switch between different degrees of transparency for the blending of real and virtual graphics, show or turn off different graphical structures.
  • a possible hands-free embodiment would be a voice controlled user interface.
  • An optional remote user interface 20 allows an additional user to see and interact with the augmented view during the system's realtime operation as described later in this document.
  • a common frame of reference is defined, that is, a common coordinate system, to be able to relate the 3D data and the 2D video images, with the respective pose and pre-determined internal parameters of the video cameras, to this common coordinate system.
  • the common coordinate system is most conveniently one in regard to which the patient's head does not move.
  • the patient's head is fixed in a clamp during surgery and intermittent 3D imaging. Markers rigidly attached to this head clamp can serve as landmarks to define and locate the common coordinate system.
  • Figure 4 shows as an example a photo of a head clamp 4-2 with an attached frame of markers 4-4.
  • the individual markers are retro-reflective discs 4-6, made from 3M's Scotchlite 8710 Silver Transfer Film.
  • a preferred embodiment of the marker set is in the form of a bridge as seen in the photo. See Figure 7.
  • the markers should be visible in the volume data or should have at least a known geometric relationship to other markers that are visible in the volume data. If necessary, this relationship can be determined in an initial calibration step. Then the volume data can be measured with regard to the common coordinate system, or the volume data can be transformed into this common coordinate system.
  • FIG. 7 shows a photo of an example of a calibration object that has been used for the calibration of a camera triplet consisting of a stereo pair of video cameras and an attached tracker camera.
  • the markers 7-2 are retro-reflective discs.
  • the 3D coordinates of the markers were measured with a commercial Optotrak® system. Then one can measure the 2D coordinates of the markers in the images, and calibrate the cameras based on 3D-2D point correspondences, for example with Tsai's algorithm as described in Roger Y. Tsai, "A Versatile Camera Calibration Technique for High-Accuracy 3D Machine Vision Metrology Using Off-the-Shelf TV Cameras and Lenses," IEEE Journal of Robotics and Automation, Vol. RA-3, No. 4, August 1987, pages 323-344.
  • MR data - patient transformation for the example of the Siemens inter-operative MR imaging arrangement.
  • the patient's bed can be placed in the magnet's fringe field for the surgical procedure or swiveled into the magnet for MR scanning.
  • the bed with the head clamp, and therefore also the patient's head, are reproducibly positioned in the magnet with a specified accuracy of ±1 mm.
  • Fig. 8 shows an example of a phantom that can be used for pre-determining the transformation. It consists of two sets of markers visible in the MR data set and a set of optical markers visible to the tracker camera.
  • One type of MR markers is ball-shaped 8-2 and can, e.g., be obtained from Brainlab, Inc.
  • the other type of MR markers 8-4 is doughnut-shaped, e.g. Multi-Modality Radiographics Markers from IZI Medical Products, Inc. In principle, only a single set of at least three MR markers is necessary.
  • the disc-shaped retro- reflective optical markers 8-6 can be punched out from 3M's Scotchlite 8710 Silver Transfer Film.
  • optical tracking is used due to its superior accuracy.
  • a preferred implementation of optical tracking comprises rigidly attaching an additional video camera to the stereo pair of video cameras that provide the stereo view of the scene. This tracker video camera points in substantially the same direction as the other two video cameras.
  • Figure 2 shows a flow diagram of the system when it operates in real-time mode, i.e. when it is displaying the augmented view in real time.
  • the computing means 2-2 receives input from tracking systems, which are here separated into tracker camera (understood to be a head- mounted tracker camera) 2-4 and external tracking systems 2-6.
  • the computing means perform pose calculations 2-8, based on this input and prior calibration data.
  • the computing means also receives as input the real-time video of the scene cameras 2-10 and has available the stored data for the 3D graphics 2-12.
  • the computing means renders graphics and video into a composite augmented view, according to the pose information. Via the user interface 2-16, the user can select between different augmentation modes (e.g. the user can vary the transparency of the virtual structures or select a digital zoom for the rendering process).
  • the display 2-18 displays the rendered augmented view to the user.
  • the two video cameras that provide the stereo view of the scene point downward at an angle, whereby the surgeon can work on the patient without having to bend the head down into an uncomfortable position.
  • Figure 3 shows a photo of a stereoscopic video-see-through head-mounted display. It includes the stereoscopic display 3-2 and a pair of downward tilted video cameras 3-4 for capturing the scene (scene cameras). Furthermore, it includes a tracker camera 3-6 and an infrared illuminator in form of a ring of infrared LEDs 3-8. In another embodiment, the augmented view is recorded for documentation and/or for subsequent use in applications such as training.
  • the augmented view can be provided for pre-operative planning for surgery.
  • interactive annotation of the augmented view is provided to permit communication between a user of the head-mounted display and an observer or associate who watches the augmented view on a monitor, stereo monitor, or another head-mounted display, so that the augmented view provided to the surgeon can be shared; for example, it can be observed by a neuroradiologist.
  • the neuroradiologist can then point out, such as by way of an interface to the computer (mouse, 3D mouse, trackball, etc.), certain features to the surgeon by adding extra graphics to the augmented view or highlighting existing graphics being displayed as part of the augmented view.
  • FIG. 5 shows a diagram of a boom-mounted video-see-through display.
  • the video-see- through display comprises a display and a video camera, respectively a stereo display and a stereo pair of video cameras.
  • the video-see-through display 52 is suspended from a ceiling 50 by a boom 54.
  • tracking means 56 are attached to the video-see-through display, more specifically to the video cameras, as it is their pose that needs to be determined for rendering a correctly registered augmented view.
  • Tracking means can include a tracking camera that works in conjunction with active or passive optical markers that are placed in the scene.
  • tracking means can include passive or active optical markers that work in conjunction with an external tracker camera.
  • different kinds of tracking systems can be employed, such as magnetic tracking, inertial tracking, ultrasonic tracking, etc. Mechanical tracking is possible by fitting the joints of the boom with encoders. However, optical tracking is preferred because of its accuracy.
  • Figure 6 shows elements of a system that employs a robotic arm 62, attached to a ceiling 60.
  • the system includes a video camera respectively a stereo pair of video cameras 64.
  • On a remote display and control station 66 the user sees an augmented video and controls the robot.
  • the robot includes tools, e.g. a drill, that the user can position and activate remotely.
  • Tracking means 68 enable the system to render an accurately augmented video view and to position the instruments correctly.
  • Embodiments of the tracking means are the same as in the description of Figure 5.
  • a robot carries scene cameras. The tracking camera may then no longer be required, as the robot arm can be mechanically tracked. However, in order to establish the relationship between the robot and patient coordinate systems, the tracking camera can still be useful.
  • the user, situated in a remote location, can move the robot "head" around by remote control to gain appropriate views, and look at the augmented views on a head-mounted display or other stereo viewing display or external monitor, preferably stereo, to diagnose and consult.
  • the remote user may also be able to perform actual surgery via remote control of the robot, with or without help of personnel present at the patient site.
  • a video-see-through head-mounted display has downward-looking scene camera/cameras.
  • the scene cameras are video cameras that provide a view of the scene, mono or stereo, allowing a comfortable work position.
  • the downward angle of the camera/cameras is such that - in the preferred work posture - the head does not have to be tilted up or down to any substantial degree.
  • a video-see-through display comprises an integrated tracker camera, whereby the tracker camera is forward looking or is looking into substantially the same direction as the scene cameras, tracking landmarks that are positioned on or around the object of interest.
  • the tracker camera can have a larger field of view than the scene cameras, and can work in a limited wavelength range (for example, the infrared wavelength range). See the afore-mentioned pending patent application Ser. No. entitled AUGMENTED REALITY VISUALIZATION DEVICE, filed September 17, 2001, Express Mail Label No. EL727968622US, in the names of Sauer and Bani-Hashemi, Attorney Docket No. 2001P14757US, hereby incorporated herein by reference.
  • a light source for illumination is placed close to or around the tracker camera lens.
  • the wavelength of the light source is adapted to the wavelength range for which the tracker camera is sensitive.
  • active markers, for example small light sources such as LEDs, can be utilized as markers.
  • a video-see-through display includes a digital zoom feature. The user can zoom in to see a magnified augmented view, interacting with the computer by voice or other interface, or telling an assistant to interact with the computer via keyboard or mouse or other interface.
  • the present invention makes it unnecessary for the surgeon to look at an augmented view, then determine the relative positions of external and internal structures and thereafter orient himself based on the external structures, drawing upon his memory of the relative position of the internal structures.
  • a "video-see-through" head mounted display in accordance with the present invention provides an augmented view in a more direct and intuitive way without the need for -the-user-to-look-baek-and- forth betwee moni-tor-and patient.- This- also results in better spatial perception because o_f kinetic (parallax) depth cues and.there is no need for the physician to orient himself with respect to surface landmarks, since he is directly guided by the augmented view.
  • in a prior art system, mixing is performed in the video domain, wherein the graphics is converted into video format and then mixed with the live video such that the mixer arrangement creates a composite image with a movable window, i.e. a region in the composite image that shows predominantly the video image or the computer image.
  • an embodiment in accordance with the present invention does not require a movable window; however, such a movable window may be helpful in certain kinds of augmented views.
  • a composite image is created in the computer graphics domain whereby the live video is converted into a digital representation in the computer and therein blended together with the graphics.
  • internal structures are segmented and visualized as surface models; in accordance with the present invention, 3D images can be shown in surface or in volume representations.

Abstract

Apparatus for image-guided surgery includes medical imaging apparatus. The imaging apparatus is utilized for capturing 3-dimensional (3D) volume data of patient portions in reference to a coordinate system. A computer processes the volume data so as to provide a graphical representation of the data. A stereo camera assembly captures a stereoscopic video view of a scene including at least portions of the patient. A tracking system measures pose data of the stereoscopic video view in reference to the coordinate system. The computer is utilized for rendering the graphical representation and the stereoscopic video view in a blended way in conjunction with the pose data so as to provide a stereoscopic augmented image. A head-mounted video-see-through display displays the stereoscopic augmented image.

Description

INTRA-OPERATIVE IMAGE-GUIDED NEUROSURGERY WITH AUGMENTED REALITY VISUALIZATION
Reference is hereby made to Provisional Patent Application No. 60/238,253 entitled INTRA-OPERATIVE MR-GUIDED NEUROSURGERY WITH AUGMENTED REALITY VISUALIZATION, filed October 10, 2000 in the names of Wendt et al.; and to Provisional Patent Application No. 60/279,931 entitled METHOD AND APPARATUS FOR AUGMENTED REALITY VISUALIZATION, filed March 29, 2001 in the name of Sauer, whereof the disclosures are hereby incorporated herein by reference.
The present invention relates to the field of image-guided surgery, and more particularly to MR-guided neurosurgery wherein imaging scans, such as magnetic resonance (MR) scans, are taken intra-operatively or inter-operatively.
In the practice of neurosurgery, an operating surgeon is generally required to look back and forth between the patient and a monitor displaying patient anatomical information for guidance in the operation. In this manner, a form of "mental mapping" occurs between the image information observed on the monitor and the brain.
Typically, in the case of surgery of a brain tumor, 3-dimensional (3D) volume images taken with MR (magnetic resonance) and CT (computed tomography) scanners are used for diagnosis and for surgical planning.
After opening of the skull (craniotomy), the brain, being non-rigid in its physical nature, will typically further deform. This brain shift makes the pre-operative 3D imaging data fit the actual brain geometry less and less accurately, so that it is significantly out of correspondence with what confronts the surgeon during the operation.
However, there are tumors that look like and are textured like normal healthy brain matter, so that they are visually indistinguishable. Such tumors can be distinguished only by MR data, and reliable resection is generally only possible with MR data that are updated during the course of the surgery. The term "intra-operative" MR imaging usually refers to MR scans that are being taken while the actual surgery is ongoing, whereas the term "inter-operative" MR imaging is used when the surgical procedure is halted for the acquisition of the scan and resumed afterwards. Equipment has been developed by various companies for providing intra-/inter-operative MR imaging capabilities in the operating room. For example, General Electric has built an MR scanner with a double-doughnut-shaped magnet, where the surgeon has access to the patient inside the scanner.
U.S. Patent No. 5,740,802 entitled COMPUTER GRAPHIC AND LIVE VIDEO SYSTEM FOR ENHANCING VISUALIZATION OF BODY STRUCTURES DURING SURGERY, assigned to General Electric Company, issued April 21, 1998 in the names of Nafis et al., is directed to an interactive surgery planning and display system which mixes live video of external surfaces of the patient with interactive computer-generated models of internal anatomy obtained from medical diagnostic imaging data of the patient. The computer images and the live video are coordinated and displayed to a surgeon in real time during surgery, allowing the surgeon to view internal and external structures and the relation between them simultaneously, and adjust the surgery accordingly. In an alternative embodiment, a normal anatomical model is also displayed as a guide in reconstructive surgery. Another embodiment employs three-dimensional viewing.
Work relating to ultrasound imaging is disclosed by Andrei State, Mark A. Livingston, Gentaro Hirota, William F. Garrett, Mary C. Whitton, Henry Fuchs, and Etta D. Pisano, "Technologies for Augmented Reality Systems: Realizing Ultrasound-Guided Needle Biopsies," Proceedings of SIGGRAPH (New Orleans, LA, August 4-9, 1996), in Computer Graphics Proceedings, Annual Conference Series 1996, ACM SIGGRAPH, 439-446.
For inter-operative imaging, Siemens has built a combination of MR scanner and operating table where the operating table with the patient can be inserted into the scanner for MR image capture (imaging position) and be withdrawn into a position where the patient is accessible to the operating team, that is, into the operating position.
In the case of the Siemens equipment, the MR data are displayed on a computer monitor. A specialized neuroradiologist evaluates the images and discusses them with the neurosurgeon. The neurosurgeon has to understand the relevant image information and mentally map it onto the patient's brain. While such equipment provides a useful modality, this type of mental mapping is difficult and subjective and cannot preserve the complete accuracy of the information.
An object of the present invention is to generate an augmented view of the patient from the surgeon's own dynamic viewpoint and display the view to the surgeon. The use of Augmented Reality visualization for medical applications has been proposed as early as 1992; see, for example, M. Bajura, H. Fuchs, and R. Ohbuchi, "Merging Virtual Objects with the Real World: Seeing Ultrasound Imagery within the Patient," Proceedings of SIGGRAPH 92 (Chicago, IL, July 26-31, 1992), in Computer Graphics 26, #2 (July 1992): 203-210.
As herein used, the "augmented view" generally comprises the "real" view overlaid with additional "virtual" graphics. The real view is provided as video images. The virtual graphics is derived from a 3D volume imaging system. Hence, the virtual graphics also corresponds to real anatomical structures; however, views of these structures are available only as computer graphics renderings.
The real view of the external structures and the virtual view of the internal structures are blended with an appropriate degree of transparency, which may vary over the field of view. Registration between real and virtual views makes all structures in the augmented view appear in the correct location with respect to each other.
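The blending described above can be expressed as a simple per-pixel linear mix of the registered video frame and the rendered graphics. The following sketch (Python/NumPy; not part of the patent, and the spatially varying alpha map is a hypothetical input) illustrates the idea:

```python
import numpy as np

def blend_augmented_view(real_rgb, virtual_rgb, alpha):
    """Per-pixel linear blend of a real video frame with rendered graphics.

    real_rgb, virtual_rgb -- HxWx3 uint8 images, already registered, i.e.
        the graphics were rendered from the same camera pose as the video.
    alpha -- HxW float array in [0, 1]; the degree of transparency of the
        virtual overlay, allowed to vary over the field of view.
    """
    a = alpha.astype(np.float32)[..., None]   # HxWx1 for broadcasting
    out = (1.0 - a) * real_rgb.astype(np.float32) \
        + a * virtual_rgb.astype(np.float32)  # linear mix per pixel
    return np.clip(out, 0, 255).astype(np.uint8)
```

Because alpha is an array rather than a scalar, the virtual structures can, for example, be fully opaque near the surgical target and fade out toward the periphery of the view.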
In accordance with an aspect of the invention, the MR data revealing internal anatomic structures are shown in situ, overlaid on the surgeon's view of the patient. With this Augmented Reality type of visualization, the derived image of the internal anatomical structure is directly presented in the surgeon's workspace in a registered fashion.
In accordance with an aspect of the invention, the surgeon wears a head-mounted display and can examine the spatial relationship between the anatomical structures from varying positions in a natural way.
This visualization eliminates the need for the surgeon to look back and forth between monitor and patient, and to mentally map the image information to the real brain. As a consequence, the surgeon can better focus on the surgical task at hand and perform the operation more precisely and confidently.
The invention will be more fully understood from the following detailed description of preferred embodiments, in conjunction with the Drawings, in which
Figure 1 shows a system block diagram in accordance with the invention;
Figure 2 shows a flow diagram in accordance with the invention;
Figure 3 shows a headmounted display as may be used in an embodiment of the invention;
Figure 4 shows a frame in accordance with the invention; Figure 5 shows a boom-mounted see-through display in accordance with the invention;
Figure 6 shows a robotic arm in accordance with the invention;
Figure 7 shows a 3D camera calibration object as may be used in an embodiment of the invention; and
Figure 8 shows an MR calibration object as may be used in an embodiment of the invention. Ball-shaped MR markers and doughnut-shaped MR markers are shown.
In accordance with the principles of the present invention, the MR information is utilized in an effective and optimal manner. In an exemplary embodiment, the surgeon wears a stereo video-see-through head-mounted display. A pair of video cameras attached to the head-mounted display captures a stereoscopic view of the real scene. The video images are blended together with the computer images of the internal anatomical structures and displayed on the head-mounted stereo display in real time. To the surgeon, the internal structures appear directly superimposed on and in the patient's brain. The surgeon is free to move his or her head around to view the spatial relationship of the structures from varying positions, whereupon a computer provides the precise, objective 3D registration between the computer images of the internal structures and the video images of the real brain. This in situ or "augmented reality" visualization gives the surgeon intuitively based, direct, and precise access to the image information in regard to the surgical task of removing the patient's tumor without hurting vital regions.
In an alternate embodiment, the stereoscopic video-see-through display may not be head-mounted but be attached to an articulated mechanical arm that is, e.g., suspended from the ceiling (reference to the "videoscope" provisional filing) (include in claims). For our purpose, a video-see-through display is understood as a display with a video camera attachment, whereby the video camera looks into substantially the same direction as the user who views the display. A stereoscopic video-see-through display combines a stereoscopic display, e.g. a pair of miniature displays, and a stereoscopic camera system, e.g. a pair of cameras.
Figure 1 shows the building blocks of an exemplary system in accordance with the invention.
A 3D imaging apparatus 2, in the present example an MR scanner, is used to capture 3D volume data of the patient. The volume data contain information about internal structures of the patient. A video-see-through head-mounted display 4 gives the surgeon a dynamic viewpoint. It comprises a pair of video cameras 6 to capture a stereoscopic view of the scene (external structures) and a pair of displays 8 to display the augmented view in a stereoscopic way.
A tracking device or apparatus 10 measures position and orientation (pose) of the pair of cameras with respect to the coordinate system in which the 3D data are described.
The computer 12 comprises a set of networked computers. One of the computer tasks is to process, with possible user interaction, the volume data and provide one or more graphical representations of the imaged structures: volume representations and/or surface representations (based on segmentation of the volume data). In this context, we understand the term graphical representation to mean a data set that is in a "graphical" format (e.g. VRML format), ready to be efficiently visualized or rendered into an image. The user can selectively enhance structures, color or annotate them, pick out relevant ones, include graphical objects as guides for the surgical procedure, and so forth. This preprocessing can be done "off-line", in preparation of the actual image guidance.
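As one concrete, non-authoritative illustration of producing a surface representation from segmented volume data, a marching-cubes step such as the following could be used. The patent does not prescribe a particular meshing algorithm; the binary mask and voxel spacing here are assumed inputs:

```python
import numpy as np
from skimage import measure  # scikit-image

def surface_from_segmentation(mask, spacing=(1.0, 1.0, 1.0)):
    """Extract a triangle mesh from a binary segmentation of the MR volume.

    mask -- 3D array (e.g. a segmented tumor); segmentation itself is
        assumed to have been done elsewhere, possibly with user interaction.
    spacing -- voxel size in mm, so the mesh is in patient-scaled units.
    Returns vertices (Nx3) and faces (Mx3); these can then be written to a
    graphical format such as VRML or OBJ for efficient rendering.
    """
    verts, faces, normals, _ = measure.marching_cubes(
        mask.astype(np.float32), level=0.5, spacing=spacing)
    return verts, faces
```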
Another computer task is to render, in real time, the augmented stereo view to provide the image guidance for the surgeon. For that purpose, the computer receives the video images and the camera pose information, and makes use of the pre-processed 3D data, i.e. the stored graphical representations. If the video images are not already in digital form, the computer digitizes them. Views of the 3D data are rendered according to the camera pose and blended with the corresponding video images. The augmented images are then output to the stereo display.
An optional recording means 14 allows one to record the augmented view for documentation and training. The recording means can be a digital storage device, or it can be a video recorder, if necessary combined with a scan converter.
A general user interface 16 allows one to control the system in general, and in particular to interactively select the 3D data and pre-process them.
A realtime user interface 18 allows the user to control the system during its realtime operation, i.e. during the realtime display of the augmented view. It allows the user to interactively change the augmented view, e.g. invoke an optical or digital zoom, switch between different degrees of transparency for the blending of real and virtual graphics, show or turn off different graphical structures. A possible hands-free embodiment would be a voice controlled user interface. An optional remote user interface 20 allows an additional user to see and interact with the augmented view during the system's realtime operation as described later in this document.
For registration, a common frame of reference is defined, that is, a common coordinate system, to be able to relate the 3D data and the 2D video images, with the respective pose and pre-determined internal parameters of the video cameras, to this common coordinate system.
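Relating the 3D data to the 2D video images then amounts to the standard pinhole projection with the camera's pose and pre-determined internal parameters. A minimal sketch (the function and argument names are illustrative, not from the patent):

```python
import numpy as np

def project_points(points_common, R, t, K):
    """Project 3D points from the common coordinate system into an image.

    R (3x3), t (3,) -- camera pose: the rigid transform taking common
        (patient) coordinates into the camera frame.
    K (3x3) -- internal camera parameters (focal lengths, principal point).
    Returns Nx2 pixel coordinates.
    """
    cam = points_common @ R.T + t     # common frame -> camera frame
    hom = cam @ K.T                   # apply the internal parameters
    return hom[:, :2] / hom[:, 2:3]   # perspective division
```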
The common coordinate system is most conveniently one in regard to which the patient's head does not move. The patient's head is fixed in a clamp during surgery and intermittent 3D imaging. Markers rigidly attached to this head clamp can serve as landmarks to define and locate the common coordinate system.
Figure 4 shows as an example a photo of a head clamp 4-2 with an attached frame of markers 4-4. The individual markers are retro-reflective discs 4-6, made from 3M's Scotchlite 8710 Silver Transfer Film. A preferred embodiment of the marker set is in form of a bridge as seen in the photo. See Figure 7.
The markers should be visible in the volume data or should have at least a known geometric relationship to other markers that are visible in the volume data. If necessary, this relationship can be determined in an initial calibration step. Then the volume data can be measured with regard to the common coordinate system, or the volume data can be transformed into this common coordinate system.
The calibration procedures follow in more detail. For correct registration between graphics and patient, the system needs to be calibrated. One needs to determine the transformation that maps the medical data onto the patient, and one needs to determine the internal parameters and relative poses of the video cameras to show the mapping correctly in the augmented view.
Camera calibration and camera-patient transformation. Fig. 7 shows a photo of an example of a calibration object that has been used for the calibration of a camera triplet consisting of a stereo pair of video cameras and an attached tracker camera. The markers 7-2 are retro-reflective discs. The 3D coordinates of the markers were measured with a commercial Optotrak® system. Then one can measure the 2D coordinates of the markers in the images, and calibrate the cameras based on 3D-2D point correspondences, for example with Tsai's algorithm as described in Roger Y. Tsai, "A Versatile Camera Calibration Technique for High-Accuracy 3D Machine Vision Metrology Using Off-the-Shelf TV Cameras and Lenses," IEEE Journal of Robotics and Automation, Vol. RA-3, No. 4, August 1987, pages 323-344. For realtime tracking, one rigidly attaches a set of markers with known 3D coordinates to the patient (respectively a head clamp), defining the patient coordinate system. For more detailed information, refer to F. Sauer et al., "Augmented Workspace: Designing an AR Testbed," IEEE and ACM Int. Symp. on Augmented Reality - ISAR 2000 (Munich, Germany, October 5-6, 2000), pages 47-53.
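Tsai's algorithm itself is described in the cited paper; as a hedged, modern stand-in, the same 3D-2D calibration can be carried out with OpenCV. The marker arrays, image size, and initial focal length below are hypothetical placeholders; for a non-planar 3D calibration rig, OpenCV additionally requires an initial guess of the intrinsic matrix:

```python
import numpy as np
import cv2  # OpenCV

def calibrate_from_marker_views(views_xyz, views_uv, image_size=(640, 480)):
    """Calibrate a camera from 3D-2D marker correspondences.

    views_xyz -- list of Nx3 arrays: known 3D marker coordinates on the
        calibration object (e.g. measured with an Optotrak), one per view.
    views_uv -- list of Nx2 arrays: the markers' detected 2D image positions.
    image_size -- assumed sensor resolution (width, height) in pixels.
    Returns the intrinsic matrix K and distortion coefficients.
    """
    object_points = [v.astype(np.float32) for v in views_xyz]
    image_points = [v.astype(np.float32) for v in views_uv]
    f0 = 800.0  # rough initial focal length in pixels (assumption)
    k_init = np.array([[f0, 0.0, image_size[0] / 2],
                       [0.0, f0, image_size[1] / 2],
                       [0.0, 0.0, 1.0]])
    rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
        object_points, image_points, image_size, k_init, None,
        flags=cv2.CALIB_USE_INTRINSIC_GUESS)
    return K, dist
```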
MR data - patient transformation for the example of the Siemens inter-operative MR imaging arrangement. The patient's bed can be placed in the magnet's fringe field for the surgical procedure or swiveled into the magnet for MR scanning. The bed with the head clamp, and therefore also the patient's head, are reproducibly positioned in the magnet with a specified accuracy of ±1 mm. One can pre-determine the transformation between the MR volume set and the head clamp with a phantom and then re-apply the same transformation when mapping the MR data to the patient's head, with the head clamp still in the same position.
Fig. 8 shows an example of a phantom that can be used for pre-determining the transformation. It consists of two sets of markers visible in the MR data set and a set of optical markers visible to the tracker camera. One type of MR markers is ball-shaped 8-2 and can, e.g., be obtained from Brainlab, Inc. The other type of MR markers 8-4 is doughnut-shaped, e.g. Multi-Modality Radiographics Markers from IZI Medical Products, Inc. In principle, only a single set of at least three MR markers is necessary. The disc-shaped retro-reflective optical markers 8-6 can be punched out from 3M's Scotchlite 8710 Silver Transfer Film. One tracks the optical markers and, with the knowledge of the phantom's geometry, determines the 3D locations of the MR markers in the patient coordinate system. One also determines the 3D locations of the MR markers in the MR data set, and calculates the transformation between the two coordinate systems based on the 3D-3D point correspondences.
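The 3D-3D step at the end of this paragraph is a standard least-squares rigid alignment. A self-contained sketch of the SVD-based solution (the Horn/Kabsch method; the patent does not name a specific algorithm):

```python
import numpy as np

def rigid_transform_3d(src, dst):
    """Find R (3x3) and t (3,) minimizing ||(R @ src.T).T + t - dst||.

    src -- Nx3 MR-marker locations in the MR data set.
    dst -- Nx3 locations of the same markers in the patient coordinate
        system (derived from the tracked optical markers and the known
        phantom geometry). Needs N >= 3 non-collinear points.
    """
    src_mean, dst_mean = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_mean).T @ (dst - dst_mean)   # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    sign = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, sign]) @ U.T  # enforce a proper rotation
    t = dst_mean - R @ src_mean
    return R, t
```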
The pose (position and orientation) of the video cameras is then measured in reference to the common coordinate system. This is the task of the tracking means. In a preferred implementation, optical tracking is used due to its superior accuracy. A preferred implementation of optical tracking comprises rigidly attaching an additional video camera to the stereo pair of video cameras that provide the stereo view of the scene. This tracker video
"canTera" pornls" rn subs tarTtiallylhe πϊme dιrectidrras"th'e~ Btlie wo vtdeo~cameras. When the surgeon looks at the patient, the tracker video camera can see the aforementioned markers that locate the common coordinate system, and from the 2D locations of the markers in the tracker camera's image one can calculate the tracker camera's pose. As the video cameras are rigidly attached to each other, the poses of the other two cameras can be calculated from the tracker camera's pose, the relative camera poses having been determined in a prior calibration step. Such camera calibration is preferably based on 3D-2D point correspondences and is described, for example, in Roger Y. Tsai, "A versatile Camera Calibration Technique for High-Accuracy 3D Machine Vision Metrology Using Off-the- Shelf TV Cameras and Lenses". IEEE Journal of Robotics and Automation, Vol. RA-3, No. 4, August 1987, pages 323-344.
Figure 2 shows a flow diagram of the system when it operates in real-time mode, i.e. when it is displaying the augmented view in real time. The computing means 2-2 receives input from tracking systems, which are here separated into tracker camera (understood to be a head-mounted tracker camera) 2-4 and external tracking systems 2-6. The computing means performs pose calculations 2-8, based on this input and prior calibration data. The computing means also receives as input the real-time video of the scene cameras 2-10 and has available the stored data for the 3D graphics 2-12. In its graphics subsystem 2-14, the computing means renders graphics and video into a composite augmented view, according to the pose information. Via the user interface 2-16, the user can select between different augmentation modes (e.g. the user can vary the transparency of the virtual structures or select a digital zoom for the rendering process). The display 2-18 displays the rendered augmented view to the user.
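Put together, the data flow of Figure 2 maps onto a simple per-frame loop. The sketch below is purely illustrative: every collaborator (cameras, pose computation, renderer, user interface, display) is a hypothetical interface, since the patent does not prescribe an API; `blend` could be the blending function sketched earlier:

```python
def run_augmented_view(scene_cams, tracker_cam, graphics_3d, calib,
                       compute_poses, render, blend, ui, display):
    """One rendering cycle per video frame, following the Figure 2 flow."""
    while ui.running():
        left, right = scene_cams.capture()           # 2-10: live stereo video
        tracker_img = tracker_cam.capture()          # 2-4: tracking input
        pose_l, pose_r = compute_poses(tracker_img, calib)  # 2-8: pose calc
        overlay_l = render(graphics_3d, pose_l)      # 2-12/2-14: graphics
        overlay_r = render(graphics_3d, pose_r)
        a = ui.transparency()                        # 2-16: augmentation mode
        display.show(blend(left, overlay_l, a),      # 2-18: augmented view
                     blend(right, overlay_r, a))
```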
To allow for a comfortable and relaxed posture of the surgeon during the use of the system, the two video cameras that provide the stereo view of the scene point downward at an angle, whereby the surgeon can work on the patient without having to bend the head down into an uncomfortable position. See the pending patent application Ser. No. entitled
AUGMENTED REALITY VISUALIZATION DEVICE, filed September 17, 2001, Express Mail Label No. EL727968622US, in the names of Sauer and Bani-Hashemi, Attorney Docket No. 2001P14757US.
Figure 3 shows a photo of a stereoscopic video-see-through head-mounted display. It includes the stereoscopic display 3-2 and a pair of downward tilted video cameras 3-4 for capturing the scene (scene cameras). Furthermore, it includes a tracker camera 3-6 and an infrared illuminator in form of a ring of infrared LEDs 3-8. In another embodiment, the augmented view is recorded for documentation and/or for subsequent use in applications such as training.
It is contemplated that the augmented view can be provided for pre-operative planning for surgery.
In another embodiment, interactive annotation of the augmented view is provided to permit communication between a user of the head-mounted display and an observer or associate who watches the augmented view on a monitor, stereo monitor, or another head-mounted display, so that the augmented view provided to the surgeon can be shared; for example, it can be observed by a neuroradiologist. The neuroradiologist can then point out, such as by way of an interface to the computer (mouse, 3D mouse, trackball, etc.), certain features to the surgeon by adding extra graphics to the augmented view or highlighting existing graphics being displayed as part of the augmented view.
Figure 5 shows a diagram of a boom-mounted video-see-through display. The video-see-through display comprises a display and a video camera, respectively a stereo display and a stereo pair of video cameras. In the example, the video-see-through display 52 is suspended from a ceiling 50 by a boom 54. For tracking, tracking means 56 are attached to the video-see-through display, more specifically to the video cameras, as it is their pose that needs to be determined for rendering a correctly registered augmented view. Tracking means can include a tracking camera that works in conjunction with active or passive optical markers that are placed in the scene. Alternatively, tracking means can include passive or active optical markers that work in conjunction with an external tracker camera. Also, different kinds of tracking systems can be employed, such as magnetic tracking, inertial tracking, ultrasonic tracking, etc. Mechanical tracking is possible by fitting the joints of the boom with encoders. However, optical tracking is preferred because of its accuracy.
Figure 6 shows elements of a system that employs a robotic arm 62 attached to a ceiling 60. The system includes a video camera, respectively a stereo pair of video cameras 64. On a remote display and control station 66, the user sees an augmented video and controls the robot. The robot includes tools, e.g. a drill, that the user can position and activate remotely. Tracking means 68 enable the system to render an accurately augmented video view and to position the instruments correctly. Embodiments of the tracking means are the same as in the description of Figure 5. In an embodiment exhibiting remote use capability, a robot carries the scene cameras. The tracking camera may then no longer be required, as the robot arm can be mechanically tracked. However, in order to establish the relationship between the robot and patient coordinate systems, the tracking camera can still be useful.
The user, situated in a remote location, can move the robot "head" around by remote control to gain appropriate views, and can look at the augmented views on a head-mounted display, another stereo viewing display, or an external monitor, preferably stereo, to diagnose and consult. The remote user may also be able to perform actual surgery via remote control of the robot, with or without the help of personnel present at the patient site.
In another embodiment in accordance with the invention, a video-see-through head-mounted display has downward-looking scene camera/cameras. The scene cameras are video cameras that provide a view of the scene, mono or stereo, allowing a comfortable work position. The downward angle of the camera/cameras is such that, in the preferred work posture, the head does not have to be tilted up or down to any substantial degree.
In another embodiment in accordance with the invention, a video-see-through display comprises an integrated tracker camera, whereby the tracker camera is forward looking or is looking in substantially the same direction as the scene cameras, tracking landmarks that are positioned on or around the object of interest. The tracker camera can have a larger field of view than the scene cameras, and can work in a limited wavelength range (for example, the infrared wavelength range). See the afore-mentioned pending patent application Ser. No. entitled AUGMENTED REALITY VISUALIZATION DEVICE, filed September 17, 2001, Express Mail Label No. EL727968622US, in the names of Sauer and Bani-Hashemi, Attorney Docket No. 2001P14757US, hereby incorporated herein by reference.
In accordance with another embodiment of the invention wherein retroreflective markers are used, a light source for illumination is placed close to or around the tracker camera lens. The wavelength of the light source is adapted to the wavelength range for which the tracker camera is sensitive. Alternatively, active markers, for example small light sources such as LEDs, can be utilized as markers.
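A minimal sketch of how such bright retroreflective markers might be extracted from a tracker-camera image, assuming an 8-bit infrared image and using SciPy's connected-component labeling; the threshold value is an illustrative assumption, not a disclosed parameter.

```python
import numpy as np
from scipy import ndimage

def find_marker_centroids(ir_image, threshold=200):
    """Retroreflective markers appear as bright blobs under the ring
    illuminator: threshold the image, label connected components, and
    return the centroid of each blob as a marker candidate."""
    mask = ir_image > threshold
    labels, count = ndimage.label(mask)
    return ndimage.center_of_mass(mask, labels, range(1, count + 1))
```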
Tracking systems with large cameras that work with retroreflective markers or active markers are commercially available. In accordance with another embodiment of the invention, a video-see-through display includes a digital zoom feature. The user can zoom in to see a magnified augmented view, interacting with the computer by voice or another interface, or by telling an assistant to interact with the computer via keyboard, mouse, or another interface.
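Digital zoom of this kind can be realized, for example, by cropping the central region of the augmented frame and rescaling it to the full display size; the nearest-neighbour resampling below is a simplifying assumption made for brevity.

```python
import numpy as np

def digital_zoom(frame, factor=2.0):
    """Magnify the augmented view: crop the central 1/factor of the frame
    and stretch it back to the original resolution."""
    h, w = frame.shape[:2]
    ch, cw = max(1, int(h / factor)), max(1, int(w / factor))
    y0, x0 = (h - ch) // 2, (w - cw) // 2
    crop = frame[y0:y0 + ch, x0:x0 + cw]
    rows = np.arange(h) * ch // h        # nearest-neighbour row indices
    cols = np.arange(w) * cw // w        # nearest-neighbour column indices
    return crop[rows][:, cols]
```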
It will be apparent that the present invention provides certain useful characteristics and features in comparison with prior systems. For example, in reference to the system disclosed in the afore-mentioned U.S. Patent No. 5,740,802, video cameras are attached to the head-mounted display in accordance with the present invention, thereby exhibiting a dynamic viewpoint, in contrast with prior systems which provide a viewpoint, implicitly static or quasi-static, that is only "substantially" the same as the surgeon's viewpoint.
In contrast with a system which merely displays a live video of external surfaces of a patient and an augmented view to allow a surgeon to locate internal structures relative to visible external surfaces, the present invention makes it unnecessary for the surgeon to look at an augmented view, then determine the relative positions of external and internal structures, and thereafter orient himself based on the external structures, drawing upon his memory of the relative position of the internal structures.
The use of a "video-see-through" head-mounted display in accordance with the present invention provides an augmented view in a more direct and intuitive way, without the need for the user to look back and forth between monitor and patient. This also results in better spatial perception because of kinetic (parallax) depth cues, and there is no need for the physician to orient himself with respect to surface landmarks, since he is directly guided by the augmented view.
In such a prior art system, mixing is performed in the video domain: the graphics is converted into video format and then mixed with the live video, such that the mixer arrangement creates a composite image with a movable window, that is, a region in the composite image that shows predominantly the video image or the computer image. In contrast, an embodiment in accordance with the present invention does not require a movable window; however, such a movable window may be helpful in certain kinds of augmented views. In accordance with a principle of the present invention, a composite image is created in the computer graphics domain, whereby the live video is converted into a digital representation in the computer and therein blended together with the graphics. Furthermore, in such a prior art system, internal structures are segmented and visualized as surface models; in accordance with the present invention, 3D images can be shown in surface or in volume representations.
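By way of illustration only, blending in the computer graphics domain amounts to digitizing each video frame and alpha-compositing the rendered graphics over it. A minimal sketch, assuming 8-bit RGB video frames and an RGBA graphics layer; the function and array layout are assumptions, not the disclosed implementation.

```python
import numpy as np

def compose_augmented_view(video_rgb, graphics_rgba):
    """Alpha-blend the rendered graphics layer over the digitized live
    video frame, yielding the composite augmented image."""
    alpha = graphics_rgba[..., 3:4].astype(np.float32) / 255.0
    blended = alpha * graphics_rgba[..., :3] + (1.0 - alpha) * video_rgb
    return blended.astype(np.uint8)
```

Varying the alpha channel of the graphics layer corresponds to the transparency control offered through the user interface 2-16.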
The present invention has been described by way of exemplary embodiments. It will be understood by one of skill in the art to which it pertains that various changes, substitutions and the like may be made without departing from the spirit of the invention. Such changes are contemplated to be within the scope of the claims following.

Claims

What is claimed is:
1. A method for image-guided surgery comprising: capturing 3-dimensional (3D) volume data of at least a portion of a patient; processing said volume data so as to provide a graphical representation of said data; capturing a stereoscopic video view of a scene including said at least a portion of said patient; rendering said graphical representation and said stereoscopic video view in a blended manner so as to provide a stereoscopic augmented image; and displaying said stereoscopic augmented image in a video-see-through display.
2. A method for image-guided surgery comprising: capturing 3-dimensional (3D) volume data of at least a portion of a patient in reference to a coordinate system; processing said volume data so as to provide a graphical representation of said data; capturing a stereoscopic video view of a scene including said at least a portion of said patient; measuring pose data of said stereoscopic video view in reference to said coordinate system; rendering said graphical representation and said stereoscopic video view in a blended manner in conjunction with said pose data so as to provide a stereoscopic augmented image; and displaying said stereoscopic augmented image in a video-see-through display.
3. A method for image-guided surgery in accordance with claim 1, wherein said step of capturing 3-dimensional (3D) volume data comprises obtaining magnetic-resonance imaging data.
4. A method for image-guided surgery in accordance with claim 1, wherein said step of processing said volume data comprises processing said data in a programmable computer.
5. A method for image-guided surgery in accordance with claim 1, wherein said step of capturing a stereoscopic video view comprises capturing a stereoscopic view by a pair of stereo cameras.
6. A method for image-guided surgery in accordance with claim 2, wherein said step of measuring pose data comprises measuring position and orientation of said pair of stereo cameras by way of a tracking device.
7. A method for image-guided surgery in accordance with claim 1, wherein said step of rendering said graphical representation and said stereoscopic video view in a blended manner in conjunction with said pose data comprises utilizing video images, and where necessary, digitizing said video images, said camera pose information, and stored volume data captured in a previous step for providing said stereoscopic augmented image.
8. A method for image-guided surgery in accordance with claim 1, wherein said step of displaying said stereoscopic augmented image in a video-see-through display comprises displaying said stereoscopic augmented image in a head-mounted video-see-through display.
9. Apparatus for image-guided surgery comprising: means for capturing 3-dimensional (3D) volume data of at least a portion of a patient; means for processing said volume data so as to provide a graphical representation of said data; means for capturing a stereoscopic video view of a scene including said at least a portion of said patient; means for rendering said graphical representation and said stereoscopic video view in a blended manner so as to provide a stereoscopic augmented image; and means for displaying said stereoscopic augmented image in a video-see-through display.
10. Apparatus for image-guided surgery comprising: means for capturing 3-dimensional (3D) volume data of at least a portion of a patient in reference to a coordinate system; means for processing said volume data so as to provide a graphical representation of said data; means for capturing a stereoscopic video view of a scene including said at least a portion of said patient; means for measuring pose data of said stereoscopic video view in reference to said coordinate system; means for rendering said graphical representation and said stereoscopic video view in a blended manner in conjunction with said pose data so as to provide a stereoscopic augmented image; and means for displaying said stereoscopic augmented image in a video-see-through display.
11. Apparatus for image-guided surgery in accordance with claim 9, wherein said means for capturing 3-dimensional (3D) volume data comprises means for obtaining magnetic-resonance imaging data.
12. Apparatus for image-guided surgery in accordance with claim 9, wherein said means for processing said volume data comprises means for processing said data in a programmable computer.
13. Apparatus for image-guided surgery in accordance with claim 9, wherein said means for capturing a stereoscopic video view comprises means for capturing a stereoscopic view by a pair of stereo cameras.
14. Apparatus for image-guided surgery in accordance with claim 9, wherein said means for measuring pose data comprises means for measuring position and orientation of said pair of stereo cameras by way of a tracking device.
15. Apparatus for image-guided surgery in accordance with claim 9, wherein said means for rendering said graphical representation and said stereoscopic video view in a blended manner in conjunction with said pose data comprises means for utilizing video images, and where necessary, digitizing said video images, said camera pose information, and stored previously captured volume data for providing said stereoscopic augmented image.
16. Apparatus for image-guided surgery in accordance with claim 9, wherein said means for displaying said stereoscopic augmented image in a video-see-through display comprises a head-mounted video-see-through display.
17. Apparatus for image-guided surgery in accordance with claim 9, including a set of markers in predetermined relationship to said patient for defining said coordinate system.
18. Apparatus for image-guided surgery in accordance with claim 17, wherein said markers are identifiable in said volume data.
19. Apparatus for image-guided surgery in accordance with claim 18, wherein said means for displaying said stereoscopic augmented image in a video-see-through display comprises a boom-mounted video-see-through display.
20. Apparatus for image-guided surgery comprising: medical imaging apparatus, said imaging apparatus being utilized for capturing 3-dimensional (3D) volume data of at least patient portions in reference to a coordinate system; a computer for processing said volume data so as to provide a graphical representation of said data; a stereo camera assembly for capturing a stereoscopic video view of a scene including said at least patient portions; a tracking system for measuring pose data of said stereoscopic video view in reference to said coordinate system; said computer being utilized for rendering said graphical representation and said stereoscopic video view in a blended way in conjunction with said pose data so as to provide a stereoscopic augmented image; and a head-mounted video-see-through display for displaying said stereoscopic augmented image.
21. Apparatus for image-guided surgery in accordance with claim 20, wherein said medical imaging apparatus is one of X-ray computed tomography apparatus, magnetic resonance imaging apparatus, and 3D ultrasound imaging apparatus.
22. Apparatus for image-guided surgery in accordance with claim 20, wherein said coordinate system is defined in relation to said patient.
23. Apparatus for image-guided surgery in accordance with claim 22, including markers in predetermined relationship to said patient.
24. Apparatus for image-guided surgery in accordance with claim 23, wherein said markers are identifiable in said volume data.
25. Apparatus for image-guided surgery in accordance with claim 20, wherein said computer comprises a set of networked computers.
26. Apparatus for image-guided surgery in accordance with claim 25, wherein said computer processes said volume data with optional user interaction, and provides at least one graphical representation of said patient portions, said graphical representation comprising at least one of volume representations and surface representations based on segmentation of said volume data.
27. Apparatus for image-guided surgery in accordance with claim 26, wherein said optional user interaction allows a user to, in any desired combination, selectively enhance, color, annotate, single out, and identify for guidance in surgical procedures, at least a portion of said patient portions.
28. Apparatus for image-guided surgery in accordance with claim 20, wherein said tracking system comprises an optical tracker.
29. Apparatus for image-guided surgery in accordance with claim 20, wherein said stereo camera assembly is adapted for operating in an angled, swiveled orientation, including a downward-looking orientation, for allowing a user to operate without having to tilt the head downward.
30. Apparatus for image-guided surgery in accordance with claim 28, wherein said optical tracker comprises a tracker video camera in predetermined coupled relationship with said stereo camera assembly.
31. Apparatus for image-guided surgery in accordance with claim 28, wherein said optical tracker comprises a tracker video camera facing in substantially the same direction as said stereo camera assembly for tracking landmarks around the center area of view of said stereo camera assembly.
32. Apparatus for image-guided surgery in accordance with claim 31, wherein said tracker video camera exhibits a larger field of view than said stereo camera assembly.
33. Apparatus for image-guided surgery in accordance with claim 31, wherein said landmarks comprise optical markers.
34. Apparatus for image-guided surgery in accordance with claim 31, wherein said landmarks comprise reflective markers.
35. Apparatus for image-guided surgery in accordance with claim 34, wherein said reflective markers are illuminated by light of a wavelength suitable for said tracker video camera.
36. Apparatus for image-guided surgery in accordance with claim 20, wherein said video-see-through display comprises a zoom feature.
37. Apparatus for image-guided surgery in accordance with claim 31, wherein said landmarks comprise light-emitting markers.
38. Apparatus for image-guided surgery in accordance with claim 20, wherein said augmented view can be, in any combination, stored, replayed, remotely viewed, and simultaneously replicated for at least one additional user.
39. Apparatus for image-guided surgery comprising: medical imaging apparatus, said imaging apparatus being utilized for capturing 3-dimensional (3D) volume data of at least patient portions in reference to a coordinate system; a computer for processing said volume data so as to provide a graphical representation of said data; a robot arm manipulator operable by a user from a remote location; a stereo camera assembly mounted on said robot arm manipulator for capturing a stereoscopic video view of a scene including said patient; a tracking system for measuring pose data of said stereoscopic video view in reference to said coordinate system; said computer being utilized for rendering said graphical representation and said stereoscopic video view in a blended way in conjunction with said pose data so as to provide a stereoscopic augmented image; and a head-mounted video-see-through display for displaying said stereoscopic augmented image at said remote location.
40. Apparatus for image-guided surgery in accordance with claim 39, wherein said optical tracker comprises a tracker video camera in predetermined coupled relationship with said robot arm manipulator.
41. A method for image-guided surgery utilizing captured 3-dimensional (3D) volume data of at least a portion of a patient, said method comprising: processing said volume data so as to provide a graphical representation of said data; capturing a stereoscopic video view of a scene including said at least a portion of said patient; rendering said graphical representation and said stereoscopic video view in a blended manner so as to provide a stereoscopic augmented image; and displaying said stereoscopic augmented image in a video-see-through display.
42. A method for image-guided surgery utilizing 3-dimensional (3D) volume data of at least a portion of a patient, said data having been captured in reference to a coordinate system, said method comprising: processing said volume data so as to provide a graphical representation of said data; capturing a stereoscopic video view of a scene including said at least a portion of said patient; measuring pose data of said stereoscopic video view in reference to said coordinate system; rendering said graphical representation and said stereoscopic video view in a blended manner in conjunction with said pose data so as to provide a stereoscopic augmented image; and displaying said stereoscopic augmented image in a video-see-through display.
43. A method for image-guided surgery in accordance with claim 42, wherein said 3-dimensional (3D) volume data comprises magnetic-resonance imaging data.
44. A method for image-guided surgery in accordance with claim 42, wherein said step of processing said volume data comprises processing said data in a programmable computer.
45. A method for image-guided surgery in accordance with claim 42, wherein said step of capturing a stereoscopic video view comprises capturing a stereoscopic view by a pair of stereo cameras.
46. A method for image-guided surgery in accordance with claim 42, wherein said step of measuring pose data comprises measuring position and orientation of said pair of stereo cameras by way of a tracking device.
47. A method for image-guided surgery in accordance with claim 42, wherein said step of rendering said graphical representation and said stereoscopic video view in a blended way in conjunction with said pose data comprises utilizing video images, and where necessary, digitizing said video images, said camera pose information, and stored volume data captured in a previous step for providing said stereoscopic augmented image.
48. A method for image-guided surgery in accordance with claim 42, wherein said step of displaying said stereoscopic augmented image in a video-see-through display comprises displaying said stereoscopic augmented image in a head-mounted video-see-through display.
49. Apparatus for image-guided surgery utilizing captured 3-dimensional (3D) volume data of at least a portion of a patient, said apparatus comprising: means for processing said volume data so as to provide a graphical representation of said data; means for capturing a stereoscopic video view of a scene including said at least a portion of said patient; means for rendering said graphical representation and said stereoscopic video view in a blended manner so as to provide a stereoscopic augmented image; and means for displaying said stereoscopic augmented image in a video-see-through display.
50. Apparatus for image-guided surgery utilizing 3-dimensional (3D) volume data of at least a portion of a patient, said data having been captured in reference to a coordinate system, said apparatus comprising: means for processing said volume data so as to provide a graphical representation of said data; means for capturing a stereoscopic video view of a scene including said at least a portion of said patient; means for measuring pose data of said stereoscopic video view in reference to said coordinate system; means for rendering said graphical representation and said stereoscopic video view in a blended manner in conjunction with said pose data so as to provide a stereoscopic augmented image; and means for displaying said stereoscopic augmented image in a video-see-through display.
PCT/US2001/042506 2000-10-05 2001-10-05 Intra-operative image-guided neurosurgery with augmented reality visualization WO2002029700A2 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
CA002425075A CA2425075A1 (en) 2000-10-05 2001-10-05 Intra-operative image-guided neurosurgery with augmented reality visualization
JP2002533197A JP2004538538A (en) 2000-10-05 2001-10-05 Intraoperative image-guided neurosurgery and surgical devices with augmented reality visualization
EP01977904A EP1356413A2 (en) 2000-10-05 2001-10-05 Intra-operative image-guided neurosurgery with augmented reality visualization

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US23825300P 2000-10-05 2000-10-05
US60/238,253 2000-10-05
US27993101P 2001-03-29 2001-03-29
US09/971,554 US20020082498A1 (en) 2000-10-05 2001-10-05 Intra-operative image-guided neurosurgery with augmented reality visualization

Publications (2)

Publication Number Publication Date
WO2002029700A2 true WO2002029700A2 (en) 2002-04-11
WO2002029700A3 WO2002029700A3 (en) 2003-08-14

Family

ID=27737127

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2001/042506 WO2002029700A2 (en) 2000-10-05 2001-10-05 Intra-operative image-guided neurosurgery with augmented reality visualization

Country Status (4)

Country Link
US (1) US20020082498A1 (en)
EP (1) EP1356413A2 (en)
JP (1) JP2004538538A (en)
WO (1) WO2002029700A2 (en)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102004046430A1 (en) * 2004-09-24 2006-04-06 Siemens Ag System for visual situation-based real-time based surgeon support and real-time documentation and archiving of the surgeon's visually perceived support-based impressions during surgery
WO2006043238A1 (en) * 2004-10-22 2006-04-27 Koninklijke Philips Electronics N.V. Real time stereoscopic imaging apparatus and method
JP2007518521A (en) * 2004-01-20 2007-07-12 スミス アンド ネフュー インコーポレーテッド System and method for minimally invasive incision
DE102009018633A1 (en) 2009-04-17 2010-10-21 Technische Universität Dresden Method and device for intraoperative imaging of brain areas
US8743109B2 (en) 2006-08-31 2014-06-03 Kent State University System and methods for multi-dimensional rendering and display of full volumetric data sets
US11439469B2 (en) 2018-06-19 2022-09-13 Howmedica Osteonics Corp. Virtual guidance for orthopedic surgical procedures

Families Citing this family (203)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002526188A (en) * 1998-09-24 2002-08-20 スーパー ディメンション リミテッド System and method for determining the position of a catheter during a medical procedure inside the body
US7526112B2 (en) 2001-04-30 2009-04-28 Chase Medical, L.P. System and method for facilitating cardiac intervention
US7327862B2 (en) * 2001-04-30 2008-02-05 Chase Medical, L.P. System and method for facilitating cardiac intervention
US7215322B2 (en) * 2001-05-31 2007-05-08 Siemens Corporate Research, Inc. Input devices for augmented reality applications
US7198630B2 (en) * 2002-12-17 2007-04-03 Kenneth I. Lipow Method and apparatus for controlling a surgical robot to mimic, harmonize and enhance the natural neurophysiological behavior of a surgeon
US20040243147A1 (en) * 2001-07-03 2004-12-02 Lipow Kenneth I. Surgical robot and robotic controller
FI111755B (en) * 2001-11-23 2003-09-15 Mapvision Oy Ltd Method and system for calibrating an artificial vision system
ATE261273T1 (en) * 2001-12-18 2004-03-15 Brainlab Ag PROJECTION OF PATIENT IMAGE DATA FROM TRANSLOX OR LAYER IMAGE CAPTURE METHOD ON VIDEO IMAGES
ATE275881T1 (en) * 2002-03-01 2004-10-15 Brainlab Ag OPERATIONAL LAMP WITH CAMERA SYSTEM FOR 3D REFERENCE
US11202676B2 (en) 2002-03-06 2021-12-21 Mako Surgical Corp. Neural monitor-based dynamic haptics
US8010180B2 (en) * 2002-03-06 2011-08-30 Mako Surgical Corp. Haptic guidance system and method
US7206626B2 (en) 2002-03-06 2007-04-17 Z-Kat, Inc. System and method for haptic sculpting of physical objects
US8996169B2 (en) 2011-12-29 2015-03-31 Mako Surgical Corp. Neural monitor-based dynamic haptics
JP3735086B2 (en) * 2002-06-20 2006-01-11 ウエストユニティス株式会社 Work guidance system
US6925357B2 (en) 2002-07-25 2005-08-02 Intouch Health, Inc. Medical tele-robotic system
US20040162637A1 (en) 2002-07-25 2004-08-19 Yulun Wang Medical tele-robotic system with a master remote station with an arbitrator
DE10238011A1 (en) * 2002-08-20 2004-03-11 GfM Gesellschaft für Medizintechnik mbH Semi transparent augmented reality projection screen has pivoted arm to place image over hidden object and integral lighting
SE0203908D0 (en) * 2002-12-30 2002-12-30 Abb Research Ltd An augmented reality system and method
US7693563B2 (en) 2003-01-30 2010-04-06 Chase Medical, LLP Method for image processing and contour assessment of the heart
US20050043609A1 (en) * 2003-01-30 2005-02-24 Gregory Murphy System and method for facilitating cardiac intervention
DE10305384A1 (en) 2003-02-11 2004-08-26 Kuka Roboter Gmbh Method and device for visualizing computer-aided information
US7063256B2 (en) * 2003-03-04 2006-06-20 United Parcel Service Of America Item tracking and processing systems and methods
DE20305278U1 (en) * 2003-04-02 2003-06-12 Daimler Chrysler Ag Device for taking into account the viewer's position when displaying 3D image content on 2D display devices
US7203277B2 (en) * 2003-04-25 2007-04-10 Brainlab Ag Visualization device and method for combined patient and object image data
US20050054910A1 (en) * 2003-07-14 2005-03-10 Sunnybrook And Women's College Health Sciences Centre Optical image-based position tracking for magnetic resonance imaging applications
US7463823B2 (en) * 2003-07-24 2008-12-09 Brainlab Ag Stereoscopic visualization device for patient image data and video images
DE102004011888A1 (en) * 2003-09-29 2005-05-04 Fraunhofer Ges Forschung Device for the virtual situation analysis of at least one intracorporeally introduced into a body medical instrument
DE102004011959A1 (en) * 2003-09-29 2005-05-12 Fraunhofer Ges Forschung Apparatus and method for repositionable positioning of an object relative to an intracorporeal body region
DE10345743A1 (en) * 2003-10-01 2005-05-04 Kuka Roboter Gmbh Method and device for determining the position and orientation of an image receiving device
DE10346615B4 (en) * 2003-10-08 2006-06-14 Aesculap Ag & Co. Kg Device for determining the position of a body part
US20070014452A1 (en) * 2003-12-01 2007-01-18 Mitta Suresh Method and system for image processing and assessment of a state of a heart
US7813836B2 (en) 2003-12-09 2010-10-12 Intouch Technologies, Inc. Protocol for a remotely controlled videoconferencing robot
US7333643B2 (en) * 2004-01-30 2008-02-19 Chase Medical, L.P. System and method for facilitating cardiac intervention
US7561717B2 (en) * 2004-07-09 2009-07-14 United Parcel Service Of America, Inc. System and method for displaying item information
US8077963B2 (en) 2004-07-13 2011-12-13 Yulun Wang Mobile robot with a head-based movement mapping scheme
EP1621153B1 (en) * 2004-07-28 2007-08-15 BrainLAB AG Stereoscopic visualisation apparatus for the combination of scanned and video images
DE102005005242A1 (en) * 2005-02-01 2006-08-10 Volkswagen Ag Camera offset determining method for motor vehicle`s augmented reality system, involves determining offset of camera position and orientation of camera marker in framework from camera table-position and orientation in framework
US20060184003A1 (en) * 2005-02-03 2006-08-17 Lewin Jonathan S Intra-procedurally determining the position of an internal anatomical target location using an externally measurable parameter
WO2006086223A2 (en) * 2005-02-08 2006-08-17 Blue Belt Technologies, Inc. Augmented reality device and method
DE102005009437A1 (en) * 2005-03-02 2006-09-07 Kuka Roboter Gmbh Method and device for fading AR objects
FR2889761A1 (en) * 2005-08-09 2007-02-16 Total Immersion Sa SYSTEM FOR USER TO LOCATE A CAMERA FOR QUICKLY ADJUSTED INSERTION OF VIRTUAL IMAGE IMAGES IN VIDEO IMAGES OF CAMERA-CAPTURED ACTUAL ELEMENTS
US9198728B2 (en) 2005-09-30 2015-12-01 Intouch Technologies, Inc. Multi-camera mobile teleconferencing platform
KR100726028B1 (en) * 2005-12-14 2007-06-08 한양대학교 산학협력단 Augmented reality projection system of affected parts and method therefor
US9636188B2 (en) * 2006-03-24 2017-05-02 Stryker Corporation System and method for 3-D tracking of surgical instrument in relation to patient body
US20070236514A1 (en) * 2006-03-29 2007-10-11 Bracco Imaging Spa Methods and Apparatuses for Stereoscopic Image Guided Surgical Navigation
US8060181B2 (en) * 2006-04-07 2011-11-15 Brainlab Ag Risk assessment for planned trajectories
US9492237B2 (en) 2006-05-19 2016-11-15 Mako Surgical Corp. Method and apparatus for controlling a haptic device
US9323055B2 (en) * 2006-05-26 2016-04-26 Exelis, Inc. System and method to display maintenance and operational instructions of an apparatus using augmented reality
US8849679B2 (en) 2006-06-15 2014-09-30 Intouch Technologies, Inc. Remote controlled robot system that provides medical images
ES2300204B1 (en) * 2006-11-16 2009-05-01 The Movie Virtual, S.L. SYSTEM AND METHOD FOR THE DISPLAY OF AN INCREASED IMAGE APPLYING INCREASED REALITY TECHNIQUES.
FR2911463B1 (en) * 2007-01-12 2009-10-30 Total Immersion Sa REAL-TIME REALITY REALITY OBSERVATION DEVICE AND METHOD FOR IMPLEMENTING A DEVICE
US20080218331A1 (en) 2007-03-08 2008-09-11 Itt Manufacturing Enterprises, Inc. Augmented reality-based system and method to show the location of personnel and sensors inside occluded structures and provide increased situation awareness
US8265793B2 (en) 2007-03-20 2012-09-11 Irobot Corporation Mobile robot for telecommunication
KR100877114B1 (en) 2007-04-20 2009-01-09 한양대학교 산학협력단 Medical image providing system and method of providing medical image using the same
JP5335201B2 (en) * 2007-05-08 2013-11-06 キヤノン株式会社 Diagnostic imaging equipment
US9160783B2 (en) 2007-05-09 2015-10-13 Intouch Technologies, Inc. Robot system that operates through a network firewall
US10875182B2 (en) * 2008-03-20 2020-12-29 Teladoc Health, Inc. Remote presence system mounted to operating room hardware
WO2009122273A2 (en) 2008-04-03 2009-10-08 Superdimension, Ltd. Magnetic interference detection system and method
US8179418B2 (en) 2008-04-14 2012-05-15 Intouch Technologies, Inc. Robotic based health care system
US8170241B2 (en) 2008-04-17 2012-05-01 Intouch Technologies, Inc. Mobile tele-presence system with a microphone system
WO2009147671A1 (en) 2008-06-03 2009-12-10 Superdimension Ltd. Feature-based registration method
US8218847B2 (en) 2008-06-06 2012-07-10 Superdimension, Ltd. Hybrid registration method
US9193065B2 (en) 2008-07-10 2015-11-24 Intouch Technologies, Inc. Docking system for a tele-presence robot
US9842192B2 (en) 2008-07-11 2017-12-12 Intouch Technologies, Inc. Tele-presence robot system with multi-cast features
US8340819B2 (en) 2008-09-18 2012-12-25 Intouch Technologies, Inc. Mobile videoconferencing robot system with network adaptive driving
US8996165B2 (en) 2008-10-21 2015-03-31 Intouch Technologies, Inc. Telepresence robot with a camera boom
US9138891B2 (en) 2008-11-25 2015-09-22 Intouch Technologies, Inc. Server connectivity control for tele-presence robot
US8463435B2 (en) 2008-11-25 2013-06-11 Intouch Technologies, Inc. Server connectivity control for tele-presence robot
WO2010067267A1 (en) * 2008-12-09 2010-06-17 Philips Intellectual Property & Standards Gmbh Head-mounted wireless camera and display unit
US8849680B2 (en) 2009-01-29 2014-09-30 Intouch Technologies, Inc. Documentation through a remote presence robot
EP2236104B1 (en) * 2009-03-31 2013-06-19 BrainLAB AG Medicinal navigation image output with virtual primary images and real secondary images
US8897920B2 (en) 2009-04-17 2014-11-25 Intouch Technologies, Inc. Tele-presence robot system with software modularity, projector and laser pointer
US11399153B2 (en) * 2009-08-26 2022-07-26 Teladoc Health, Inc. Portable telepresence apparatus
US8384755B2 (en) 2009-08-26 2013-02-26 Intouch Technologies, Inc. Portable remote presence robot
JP5650248B2 (en) * 2010-02-01 2015-01-07 コビディエン エルピー Region expansion algorithm
US11154981B2 (en) 2010-02-04 2021-10-26 Teladoc Health, Inc. Robot user interface for telepresence robot system
US8670017B2 (en) 2010-03-04 2014-03-11 Intouch Technologies, Inc. Remote presence system including a cart that supports a robot face and an overhead camera
US9014848B2 (en) 2010-05-20 2015-04-21 Irobot Corporation Mobile robot system
US8918213B2 (en) 2010-05-20 2014-12-23 Irobot Corporation Mobile human interface robot
US8935005B2 (en) 2010-05-20 2015-01-13 Irobot Corporation Operating a mobile robot
US10343283B2 (en) 2010-05-24 2019-07-09 Intouch Technologies, Inc. Telepresence robot system that can be accessed by a cellular phone
US10808882B2 (en) 2010-05-26 2020-10-20 Intouch Technologies, Inc. Tele-robotic system with a robot face placed on a chair
US20120019511A1 (en) * 2010-07-21 2012-01-26 Chandrasekhar Bala S System and method for real-time surgery visualization
US9486189B2 (en) 2010-12-02 2016-11-08 Hitachi Aloka Medical, Ltd. Assembly for use with surgery system
US9264664B2 (en) 2010-12-03 2016-02-16 Intouch Technologies, Inc. Systems and methods for dynamic bandwidth allocation
US8930019B2 (en) 2010-12-30 2015-01-06 Irobot Corporation Mobile human interface robot
US9323250B2 (en) 2011-01-28 2016-04-26 Intouch Technologies, Inc. Time-dependent navigation of telepresence robots
CN104898652B (en) 2011-01-28 2018-03-13 英塔茨科技公司 Mutually exchanged with a moveable tele-robotic
EP2500816B1 (en) 2011-03-13 2018-05-16 LG Electronics Inc. Transparent display apparatus and method for operating the same
US10769739B2 (en) 2011-04-25 2020-09-08 Intouch Technologies, Inc. Systems and methods for management of information among medical providers and facilities
US20140139616A1 (en) 2012-01-27 2014-05-22 Intouch Technologies, Inc. Enhanced Diagnostics for a Telepresence Robot
US9098611B2 (en) 2012-11-26 2015-08-04 Intouch Technologies, Inc. Enhanced video interaction for a user interface of a telepresence network
US9288468B2 (en) 2011-06-29 2016-03-15 Microsoft Technology Licensing, Llc Viewing windows for video streams
US8836751B2 (en) 2011-11-08 2014-09-16 Intouch Technologies, Inc. Tele-presence system with a user interface that displays different communication links
US8902278B2 (en) 2012-04-11 2014-12-02 Intouch Technologies, Inc. Systems and methods for visualizing and managing telepresence devices in healthcare networks
US9251313B2 (en) 2012-04-11 2016-02-02 Intouch Technologies, Inc. Systems and methods for visualizing and managing telepresence devices in healthcare networks
WO2013176758A1 (en) 2012-05-22 2013-11-28 Intouch Technologies, Inc. Clinical workflows utilizing autonomous and semi-autonomous telemedicine devices
US9361021B2 (en) 2012-05-22 2016-06-07 Irobot Corporation Graphical user interfaces including touchpad driving interfaces for telemedicine devices
US8996175B2 (en) 2012-06-21 2015-03-31 Rethink Robotics, Inc. Training and operating industrial robots
US9642606B2 (en) 2012-06-27 2017-05-09 Camplex, Inc. Surgical visualization system
US9615728B2 (en) 2012-06-27 2017-04-11 Camplex, Inc. Surgical visualization system with camera tracking
US10176635B2 (en) 2012-06-28 2019-01-08 Microsoft Technology Licensing, Llc Saving augmented realities
WO2014032041A1 (en) * 2012-08-24 2014-02-27 Old Dominion University Research Foundation Method and system for image registration
CA2882388C (en) 2012-08-31 2020-10-20 Sloan-Kettering Institute For Cancer Research Particles, methods and uses thereof
IL221863A (en) * 2012-09-10 2014-01-30 Elbit Systems Ltd Digital system for surgical video capturing and display
US20140081659A1 (en) 2012-09-17 2014-03-20 Depuy Orthopaedics, Inc. Systems and methods for surgical and interventional planning, support, post-operative follow-up, and functional recovery tracking
EP2958481A4 (en) 2013-02-20 2017-03-08 Sloan-Kettering Institute for Cancer Research Wide field raman imaging apparatus and associated methods
US9782159B2 (en) 2013-03-13 2017-10-10 Camplex, Inc. Surgical visualization systems
US10288881B2 (en) * 2013-03-14 2019-05-14 Fresenius Medical Care Holdings, Inc. Wearable interface for remote monitoring and control of a medical device
US9483917B2 (en) 2013-03-15 2016-11-01 Segars California Partners, Lp Non-contact alarm volume reduction
WO2015008470A2 (en) * 2013-07-16 2015-01-22 Seiko Epson Corporation Information processing apparatus, information processing method, and information processing system
KR101536115B1 (en) * 2013-08-26 2015-07-14 재단법인대구경북과학기술원 Method for operating surgical navigational system and surgical navigational system
US10028651B2 (en) 2013-09-20 2018-07-24 Camplex, Inc. Surgical visualization systems and displays
WO2015042483A2 (en) 2013-09-20 2015-03-26 Camplex, Inc. Surgical visualization systems
US11103122B2 (en) * 2014-07-15 2021-08-31 Mentor Acquisition One, Llc Content presentation in head worn computing
US10912947B2 (en) 2014-03-04 2021-02-09 Memorial Sloan Kettering Cancer Center Systems and methods for treatment of disease via application of mechanical force by controlled rotation of nanoparticles inside cells
DE102014206004A1 (en) * 2014-03-31 2015-10-01 Siemens Aktiengesellschaft Triangulation-based depth and surface visualization
WO2015179446A1 (en) * 2014-05-20 2015-11-26 BROWND, Samuel, R. Systems and methods for mediated-reality surgical visualization
CN106687146A (en) 2014-07-28 2017-05-17 纪念斯隆-凯特琳癌症中心 Metal(loid) chalcogen nanoparticles as universal binders for medical isotopes
IL235073A (en) * 2014-10-07 2016-02-29 Elbit Systems Ltd Head-mounted displaying of magnified images locked on an object of interest
WO2016090336A1 (en) 2014-12-05 2016-06-09 Camplex, Inc. Surgical visualization systems and displays
JP2016115965A (en) * 2014-12-11 2016-06-23 ソニー株式会社 Medical spectacle type display device, information processing device, and information processing method
US10154239B2 (en) 2014-12-30 2018-12-11 Onpoint Medical, Inc. Image-guided surgery with surface reconstruction and augmented reality visualization
JP6709796B2 (en) * 2015-02-20 2020-06-17 コヴィディエン リミテッド パートナーシップ Operating room and surgical site recognition
KR101734094B1 (en) 2015-03-09 2017-05-11 국립암센터 Augmented Reality Image Projection System
US11819273B2 (en) 2015-03-17 2023-11-21 Raytrx, Llc Augmented and extended reality glasses for use in surgery visualization and telesurgery
GB2536650A (en) 2015-03-24 2016-09-28 Augmedics Ltd Method and system for combining video-based and optic-based augmented reality in a near eye display
US11154378B2 (en) 2015-03-25 2021-10-26 Camplex, Inc. Surgical visualization systems and displays
EP3274912B1 (en) * 2015-03-26 2022-05-11 Biomet Manufacturing, LLC System for planning and performing arthroplasty procedures using motion-capture data
WO2016205915A1 (en) * 2015-06-22 2016-12-29 Synaptive Medical (Barbados) Inc. System and method for mapping navigation space to patient space in a medical procedure
EP3317035A1 (en) 2015-07-01 2018-05-09 Memorial Sloan Kettering Cancer Center Anisotropic particles, methods and uses thereof
US10105187B2 (en) 2015-08-27 2018-10-23 Medtronic, Inc. Systems, apparatus, methods and computer-readable storage media facilitating surgical procedures utilizing augmented reality
JP6641122B2 (en) * 2015-08-27 2020-02-05 キヤノン株式会社 Display device, information processing device, and control method therefor
DE102015216917A1 (en) 2015-09-03 2017-03-09 Siemens Healthcare Gmbh System for presenting an augmented reality about an operator
ITUB20155830A1 (en) 2015-11-23 2017-05-23 R A W Srl "NAVIGATION, TRACKING AND GUIDE SYSTEM FOR THE POSITIONING OF OPERATOR INSTRUMENTS"
US10966798B2 (en) 2015-11-25 2021-04-06 Camplex, Inc. Surgical visualization systems and displays
DE102015226669B4 (en) * 2015-12-23 2022-07-28 Siemens Healthcare Gmbh Method and system for outputting augmented reality information
EP4327769A2 (en) 2016-03-12 2024-02-28 Philipp K. Lang Devices and methods for surgery
US11058495B2 (en) 2016-04-27 2021-07-13 Biomet Manufacturing, Llc Surgical system having assisted optical navigation with dual projection system
US10531926B2 (en) 2016-05-23 2020-01-14 Mako Surgical Corp. Systems and methods for identifying and tracking physical objects during a robotic surgical procedure
JP2019534717A (en) * 2016-08-16 2019-12-05 インサイト メディカル システムズ インコーポレイテッド System for sensory enhancement in medical procedures
EP3512452A1 (en) * 2016-09-16 2019-07-24 Zimmer, Inc. Augmented reality surgical technique guidance
CN106297471A (en) * 2016-10-25 2017-01-04 深圳市科创数字显示技术有限公司 The removable cornea intelligent operation training system that AR and VR combines
WO2018078470A1 (en) * 2016-10-25 2018-05-03 Novartis Ag Medical spatial orientation system
US10792106B2 (en) 2016-10-28 2020-10-06 Covidien Lp System for calibrating an electromagnetic navigation system
US10418705B2 (en) 2016-10-28 2019-09-17 Covidien Lp Electromagnetic navigation antenna assembly and electromagnetic navigation system including the same
US10517505B2 (en) 2016-10-28 2019-12-31 Covidien Lp Systems, methods, and computer-readable media for optimizing an electromagnetic navigation system
US10615500B2 (en) 2016-10-28 2020-04-07 Covidien Lp System and method for designing electromagnetic navigation antenna assemblies
US10638952B2 (en) 2016-10-28 2020-05-05 Covidien Lp Methods, systems, and computer-readable media for calibrating an electromagnetic navigation system
US10722311B2 (en) 2016-10-28 2020-07-28 Covidien Lp System and method for identifying a location and/or an orientation of an electromagnetic sensor based on a map
US10446931B2 (en) 2016-10-28 2019-10-15 Covidien Lp Electromagnetic navigation antenna assembly and electromagnetic navigation system including the same
US10751126B2 (en) 2016-10-28 2020-08-25 Covidien Lp System and method for generating a map for electromagnetic navigation
US11751944B2 (en) 2017-01-16 2023-09-12 Philipp K. Lang Optical guidance for surgical, medical, and dental procedures
US9892564B1 (en) * 2017-03-30 2018-02-13 Novarad Corporation Augmenting real-time views of a patient with three-dimensional data
JP2020512116A (en) * 2017-03-31 2020-04-23 コーニンクレッカ フィリップス エヌ ヴェKoninklijke Philips N.V. Markerless robot tracking system, control device, and method
US11862302B2 (en) 2017-04-24 2024-01-02 Teladoc Health, Inc. Automated transcription and documentation of tele-health encounters
US10471478B2 (en) 2017-04-28 2019-11-12 United Parcel Service Of America, Inc. Conveyor belt assembly for identifying an asset sort location and methods of utilizing the same
WO2018203304A1 (en) * 2017-05-05 2018-11-08 Scopis Gmbh Surgical navigation system
US10918455B2 (en) 2017-05-08 2021-02-16 Camplex, Inc. Variable light source
JP6909632B2 (en) * 2017-05-16 2021-07-28 タクボエンジニアリング株式会社 Teaching method for painting robots
US10483007B2 (en) 2017-07-25 2019-11-19 Intouch Technologies, Inc. Modular telehealth cart with thermal imaging and touch screen user interface
EP3470006B1 (en) 2017-10-10 2020-06-10 Holo Surgical Inc. Automated segmentation of three dimensional bony structure images
EP3445048A1 (en) 2017-08-15 2019-02-20 Holo Surgical Inc. A graphical user interface for a surgical navigation system for providing an augmented reality image during operation
US10987016B2 (en) * 2017-08-23 2021-04-27 The Boeing Company Visualization system for deep brain stimulation
US11636944B2 (en) 2017-08-25 2023-04-25 Teladoc Health, Inc. Connectivity infrastructure for a telehealth platform
WO2019051464A1 (en) 2017-09-11 2019-03-14 Lang Philipp K Augmented reality display for vascular and other interventions, compensation for cardiac and respiratory motion
US11058497B2 (en) 2017-12-26 2021-07-13 Biosense Webster (Israel) Ltd. Use of augmented reality to assist navigation during medical procedures
US11114199B2 (en) 2018-01-25 2021-09-07 Mako Surgical Corp. Workflow systems and methods for enhancing collaboration between participants in a surgical procedure
US11348257B2 (en) 2018-01-29 2022-05-31 Philipp K. Lang Augmented reality guidance for orthopedic and other surgical procedures
WO2019152617A1 (en) * 2018-02-03 2019-08-08 The Johns Hopkins University Calibration system and method to align a 3d virtual scene and 3d real world for a stereoscopic head-mounted display
PL233986B1 (en) * 2018-02-13 2019-12-31 Uniwersytet Warminsko Mazurski W Olsztynie Device for interaction with spatial objects
US10617299B2 (en) 2018-04-27 2020-04-14 Intouch Technologies, Inc. Telehealth cart that supports a removable tablet with seamless audio/video switching
WO2019211741A1 (en) * 2018-05-02 2019-11-07 Augmedics Ltd. Registration of a fiducial marker for an augmented reality system
US11357576B2 (en) 2018-07-05 2022-06-14 Dentsply Sirona Inc. Method and system for augmented reality guided surgery
EP3608870A1 (en) 2018-08-10 2020-02-12 Holo Surgical Inc. Computer assisted identification of appropriate anatomical structure for medical device placement during a surgical procedure
WO2020041615A1 (en) * 2018-08-22 2020-02-27 Magic Leap, Inc. Patient viewing system
US11191609B2 (en) 2018-10-08 2021-12-07 The University Of Wyoming Augmented reality based real-time ultrasonography image rendering for surgical assistance
US10939977B2 (en) 2018-11-26 2021-03-09 Augmedics Ltd. Positioning marker
US11766296B2 (en) 2018-11-26 2023-09-26 Augmedics Ltd. Tracking system for image-guided surgery
EP3899642A1 (en) 2018-12-20 2021-10-27 Snap Inc. Flexible eyewear device with dual cameras for generating stereoscopic images
EP3689229A1 (en) 2019-01-30 2020-08-05 DENTSPLY SIRONA Inc. Method and system for visualizing patient stress
EP3690609B1 (en) 2019-01-30 2021-09-22 DENTSPLY SIRONA Inc. Method and system for controlling dental machines
EP3689287B1 (en) 2019-01-30 2022-07-27 DENTSPLY SIRONA Inc. System for proposing and visualizing dental treatments
EP3689218B1 (en) 2019-01-30 2023-10-18 DENTSPLY SIRONA Inc. Method and system for guiding an intra-oral scan
US11553969B1 (en) 2019-02-14 2023-01-17 Onpoint Medical, Inc. System for computation of object coordinates accounting for movement of a surgical site for spinal and other procedures
US11857378B1 (en) 2019-02-14 2024-01-02 Onpoint Medical, Inc. Systems for adjusting and tracking head mounted displays during surgery including with surgical helmets
EP3696650A1 (en) 2019-02-18 2020-08-19 Siemens Healthcare GmbH Direct volume haptic rendering
CN113597362A (en) * 2019-03-25 2021-11-02 Abb瑞士股份有限公司 Method and control device for determining a relation between a robot coordinate system and a movable device coordinate system
US10910096B1 (en) 2019-07-31 2021-02-02 Allscripts Software, Llc Augmented reality computing system for displaying patient data
WO2021062375A1 (en) * 2019-09-27 2021-04-01 Raytrx, Llc Augmented and extended reality glasses for use in surgery visualization and telesurgery
US11210865B2 (en) 2019-10-03 2021-12-28 International Business Machines Corporation Visually interacting with three dimensional data in augmented or virtual reality
US10965931B1 (en) 2019-12-06 2021-03-30 Snap Inc. Sensor misalignment compensation
US11382712B2 (en) 2019-12-22 2022-07-12 Augmedics Ltd. Mirroring in image guided surgery
WO2021168449A1 (en) 2020-02-21 2021-08-26 Raytrx, Llc All-digital multi-option 3d surgery visualization system and control
US11389252B2 (en) 2020-06-15 2022-07-19 Augmedics Ltd. Rotating marker for image guided surgery
US11449137B2 (en) * 2021-02-12 2022-09-20 Rockwell Collins, Inc. Soldier and surface vehicle heads-up display imagery compensation system to align imagery with surroundings
US11445165B1 (en) 2021-02-19 2022-09-13 Dentsply Sirona Inc. Method, system and computer readable storage media for visualizing a magnified dental treatment site
US11786206B2 (en) 2021-03-10 2023-10-17 Onpoint Medical, Inc. Augmented reality guidance for imaging systems
CN113133828B (en) * 2021-04-01 2023-12-01 上海复拓知达医疗科技有限公司 Interactive registration system, method, electronic device and readable storage medium for surgical navigation
US11896445B2 (en) 2021-07-07 2024-02-13 Augmedics Ltd. Iliac pin and adapter
DE102022118714A1 (en) 2022-07-26 2024-02-01 B. Braun New Ventures GmbH Tracking operating frame, navigation system and navigation method
DE102022118990A1 (en) 2022-07-28 2024-02-08 B. Braun New Ventures GmbH Navigation system and navigation method with annotation function
CN115619790B (en) * 2022-12-20 2023-05-02 北京维卓致远医疗科技发展有限责任公司 Hybrid perspective method, system and equipment based on binocular positioning

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1998038908A1 (en) * 1997-03-03 1998-09-11 Schneider Medical Technologies, Inc. Imaging device and method
WO2000055676A1 (en) * 1999-03-17 2000-09-21 The Microoptical Corporation Compact image display system for eyeglasses or other head-borne frames

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0646263B1 (en) * 1993-04-20 2000-05-31 General Electric Company Computer graphic and live video system for enhancing visualisation of body structures during surgery
US5531227A (en) * 1994-01-28 1996-07-02 Schneider Medical Technologies, Inc. Imaging device and method
WO1995020343A1 (en) * 1994-01-28 1995-08-03 Schneider Medical Technologies, Inc. Imaging device and method
US6235038B1 (en) * 1999-10-28 2001-05-22 Medtronic Surgical Navigation Technologies System for translation of electromagnetic and optical localization systems

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1998038908A1 (en) * 1997-03-03 1998-09-11 Schneider Medical Technologies, Inc. Imaging device and method
WO2000055676A1 (en) * 1999-03-17 2000-09-21 The Microoptical Corporation Compact image display system for eyeglasses or other head-borne frames

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
PETERS T M ET AL: "Integration of stereoscopic DSA and 3D MRI for image-guided neurosurgery." COMPUTERIZED MEDICAL IMAGING AND GRAPHICS: THE OFFICIAL JOURNAL OF THE COMPUTERIZED MEDICAL IMAGING SOCIETY. UNITED STATES 1994 JUL-AUG, vol. 18, no. 4, July 1994 (1994-07), pages 289-299, XP002238675 ISSN: 0895-6111 *
STATE A ET AL: "TECHNOLOGIES FOR AUGMENTED REALITY SYSTEMS: REALIZING ULTRASOUND-GUIDED NEEDLE BIOPSIES" COMPUTER GRAPHICS PROCEEDINGS. ANNUAL CONFERENCE SERIES. SIGGRAPH, XX, XX, 1996, pages 439-446, XP001109775 cited in the application *

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007518521A (en) * 2004-01-20 2007-07-12 スミス アンド ネフュー インコーポレーテッド System and method for minimally invasive incision
DE102004046430A1 (en) * 2004-09-24 2006-04-06 Siemens Ag System for visual situation-based real-time based surgeon support and real-time documentation and archiving of the surgeon's visually perceived support-based impressions during surgery
WO2006043238A1 (en) * 2004-10-22 2006-04-27 Koninklijke Philips Electronics N.V. Real time stereoscopic imaging apparatus and method
US8743109B2 (en) 2006-08-31 2014-06-03 Kent State University System and methods for multi-dimensional rendering and display of full volumetric data sets
DE102009018633A1 (en) 2009-04-17 2010-10-21 Technische Universität Dresden Method and device for intraoperative imaging of brain areas
US11439469B2 (en) 2018-06-19 2022-09-13 Howmedica Osteonics Corp. Virtual guidance for orthopedic surgical procedures
US11478310B2 (en) 2018-06-19 2022-10-25 Howmedica Osteonics Corp. Virtual guidance for ankle surgery procedures
US11571263B2 (en) 2018-06-19 2023-02-07 Howmedica Osteonics Corp. Mixed-reality surgical system with physical markers for registration of virtual models
US11645531B2 (en) 2018-06-19 2023-05-09 Howmedica Osteonics Corp. Mixed-reality surgical system with physical markers for registration of virtual models
US11657287B2 (en) 2018-06-19 2023-05-23 Howmedica Osteonics Corp. Virtual guidance for ankle surgery procedures

Also Published As

Publication number Publication date
JP2004538538A (en) 2004-12-24
WO2002029700A3 (en) 2003-08-14
EP1356413A2 (en) 2003-10-29
US20020082498A1 (en) 2002-06-27

Similar Documents

Publication Publication Date Title
US20020082498A1 (en) Intra-operative image-guided neurosurgery with augmented reality visualization
CN109758230B (en) Neurosurgery navigation method and system based on augmented reality technology
US5526812A (en) Display system for enhancing visualization of body structures during medical procedures
EP1395194B1 (en) A guide system
US7774044B2 (en) System and method for augmented reality navigation in a medical intervention procedure
CA2486525C (en) A guide system and a probe therefor
Bichlmeier et al. The virtual mirror: a new interaction paradigm for augmented reality environments
Sielhorst et al. Advanced medical displays: A literature review of augmented reality
US6919867B2 (en) Method and apparatus for augmented reality visualization
Fischer et al. Medical Augmented Reality based on Commercial Image Guided Surgery.
WO1998038908A1 (en) Imaging device and method
Navab et al. Laparoscopic virtual mirror new interaction paradigm for monitor based augmented reality
CA2523727A1 (en) Surgical navigation imaging system
WO2002080773A1 (en) Augmented reality apparatus and CT method
Sauer et al. A head-mounted display system for augmented reality image guidance: towards clinical evaluation for iMRI-guided neurosurgery
Philip et al. Stereo augmented reality in the surgical microscope
CN114727848A (en) Visualization system and method for ENT procedures
JP2023526716A (en) Surgical navigation system and its application
EP0629963A2 (en) A display system for visualization of body structures during medical procedures
Suthau et al. A concept work for Augmented Reality visualisation based on a medical application in liver surgery
Vogt Real-Time Augmented Reality for Image-Guided Interventions
Bichlmeier et al. Evaluation of the virtual mirror as a navigational aid for augmented reality driven minimally invasive procedures
CA2425075A1 (en) Intra-operative image-guided neurosurgery with augmented reality visualization
Paloc et al. Computer-aided surgery based on auto-stereoscopic augmented reality
Bichlmeier et al. The tangible virtual mirror: New visualization paradigm for navigated surgery

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A2

Designated state(s): CA JP

AL Designated countries for regional patents

Kind code of ref document: A2

Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR

121 Ep: the epo has been informed by wipo that ep was designated in this application
WWE Wipo information: entry into national phase

Ref document number: 2425075

Country of ref document: CA

WWE Wipo information: entry into national phase

Ref document number: 2001977904

Country of ref document: EP

Ref document number: 2002533197

Country of ref document: JP

WWP Wipo information: published in national office

Ref document number: 2001977904

Country of ref document: EP

WWW Wipo information: withdrawn in national office

Ref document number: 2001977904

Country of ref document: EP