CA2118532A1 - Automatic ultrasonic localization of targets implanted in a portion of the anatomy - Google Patents

Automatic ultrasonic localization of targets implanted in a portion of the anatomy

Info

Publication number
CA2118532A1
Authority
CA
Canada
Prior art keywords
fiducial marker
transducer
implanted
acoustic impedance
marker
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
CA002118532A
Other languages
French (fr)
Inventor
Judith T. Lewis
Robert L. Galloway, Jr.
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Publication of CA2118532A1 publication Critical patent/CA2118532A1/en
Abandoned legal-status Critical Current

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/08 Detecting organic movements or changes, e.g. tumours, cysts, swellings
    • A61B8/0833 Detecting organic movements or changes, e.g. tumours, cysts, swellings involving detecting or locating foreign bodies or organic structures
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/42 Details of probe positioning or probe attachment to the patient
    • A61B8/4209 Details of probe positioning or probe attachment to the patient by using holders, e.g. positioning frames
    • A61B8/4218 Details of probe positioning or probe attachment to the patient by using holders, e.g. positioning frames characterised by articulated arms
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/42 Details of probe positioning or probe attachment to the patient
    • A61B8/4245 Details of probe positioning or probe attachment to the patient involving determining the position of the probe, e.g. with respect to an external reference frame or to the patient
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/39 Markers, e.g. radio-opaque or breast lesions markers
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B17/00 Surgical instruments, devices or methods, e.g. tourniquets
    • A61B2017/00017 Electrical control of surgical instruments
    • A61B2017/00199 Electrical control of surgical instruments with a console, e.g. a control panel with a display
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2046 Tracking techniques
    • A61B2034/2055 Optical tracking systems
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2046 Tracking techniques
    • A61B2034/2063 Acoustic tracking systems, e.g. using ultrasound
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2072 Reference field transducer attached to an instrument or patient
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/06 Measuring instruments not otherwise provided for
    • A61B2090/067 Measuring instruments not otherwise provided for for measuring angles
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B90/37 Surgical systems with images on a monitor during operation
    • A61B2090/378 Surgical systems with images on a monitor during operation using ultrasound
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/39 Markers, e.g. radio-opaque or breast lesions markers
    • A61B2090/3925 Markers, e.g. radio-opaque or breast lesions markers ultrasonic
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/39 Markers, e.g. radio-opaque or breast lesions markers
    • A61B2090/3937 Visible markers
    • A61B2090/3945 Active visible markers, e.g. light emitting diodes
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/50 Supports for surgical instruments, e.g. articulated arms
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61N ELECTROTHERAPY; MAGNETOTHERAPY; RADIATION THERAPY; ULTRASOUND THERAPY
    • A61N5/00 Radiation therapy
    • A61N5/10 X-ray therapy; Gamma-ray therapy; Particle-irradiation therapy
    • A61N5/1048 Monitoring, verifying, controlling systems and methods
    • A61N5/1049 Monitoring, verifying, controlling systems and methods for verifying the position of the patient with respect to the radiation beam
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y10 TECHNICAL SUBJECTS COVERED BY FORMER USPC
    • Y10S TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y10S128/00 Surgery
    • Y10S128/916 Ultrasound 3-D imaging

Abstract

ABSTRACT OF THE DISCLOSURE
In a coordinate system defined by a coordinate space digitizer, the location of an implanted symmetric object may automatically be determined based on differences between the characteristic acoustic impedances of at least a portion of an implanted object and one or more materials surrounding that object. An A-mode ultrasound transducer may be attached to a coordinate space digitizer or pointing device such that an attached computer tracks the position and orientation of the ultrasound transducer. The reflected radio-frequency (rf) ultrasound signals may automatically be analyzed along with the transducer position and orientation corresponding to each received rf signal to detect the position of the implanted object. The time delay between received ultrasound echoes is used to compute the depth of the object from the transducer. This allows for an accurate determination of the three-dimensional coordinates of the implanted object.

Description


AUTOMATIC ULTRASONIC LOCALIZATION OF TARGETS
IMPLANTED IN A PORTION OF THE ANATOMY

BACKGROUND OF THE INVENTION
The present invention relates to locating a position of an object implanted in a portion of the anatomy, the object having at least a portion which is in a vicinity of a material having a different acoustic impedance than an acoustic impedance of that portion of the object. More particularly, the present invention relates to the localization of implanted targets using amplitude-mode (A-mode) ultrasound techniques and a coordinate space digitizer. Extendable to other applications, the automatic implanted target localization approach may be used to locate small fluid-filled polymer cylinders implanted in the human skull, which are preferably flush with the surface of the skull, beneath the scalp and subcutaneous fat. These permanently-implanted cylinders are intended to serve as fiducial markers for the registration of tomographic image volumes with physical space in neurosurgical cases, as well as for tracking a patient over time.


Tomographic imaging modalities, such as computed tomography (CT), magnetic resonance imaging (MRI), and positron emission tomography (PET), produce a three-dimensional image volume as a set of two-dimensional "slices". These image volumes contain the information necessary for surgical or radiotherapeutic planning. Such information may include the three-dimensional distances between sites of interest such as tumors, the location of blood vessels which must be avoided, and lesion margin delineation often superior to that discernible by visual tissue inspection.
However, these three-dimensional volumes are at an arbitrary orientation to each other as well as to the physical patient anatomy. As a result, correlating and comparing the same location in the images is difficult.
Also, the surgeon is unable to accurately correlate the detailed image information with the physical anatomy during surgery as a guidance tool.
The correlation of multiple three-dimensional volumes or spaces is referred to as space registration. Space registration establishes a one-to-one mapping between points in the image sets or between points in one or more image sets and physical space. The transformation between coordinate systems is calculated based on the location of a set of at least three common landmarks, or fiducial markers, in each representation. The actual fiducial point is the geometric center of the fiducial marker. The accuracy of the mapping between coordinate systems depends on the accuracy with which the coordinates of the fiducial markers' centers are known in each three-dimensional space.
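The transformation described above can be illustrated with a minimal sketch. The construction below builds an orthonormal frame from three non-collinear fiducial centers in each space and maps a point through the common frame; this is an illustrative simplification (a production system would use a least-squares fit over all markers), and all function names are the editor's, not the patent's.

```python
import math

def frame_from_markers(p0, p1, p2):
    """Build an orthonormal frame (origin + three axes) from three
    non-collinear fiducial marker centers."""
    def sub(a, b): return [a[i] - b[i] for i in range(3)]
    def cross(a, b):
        return [a[1]*b[2] - a[2]*b[1],
                a[2]*b[0] - a[0]*b[2],
                a[0]*b[1] - a[1]*b[0]]
    def unit(a):
        n = math.sqrt(sum(x * x for x in a))
        return [x / n for x in a]
    x = unit(sub(p1, p0))                 # first axis along marker 0 -> 1
    z = unit(cross(x, sub(p2, p0)))       # normal to the marker plane
    y = cross(z, x)                       # completes a right-handed frame
    return p0, [x, y, z]

def to_frame(point, origin, axes):
    """Express `point` in the marker-defined local frame."""
    d = [point[i] - origin[i] for i in range(3)]
    return [sum(axes[r][i] * d[i] for i in range(3)) for r in range(3)]

def map_point(point, image_markers, physical_markers):
    """Map an image-space point into physical space via the frames
    defined by three corresponding fiducial centers."""
    o_i, ax_i = frame_from_markers(*image_markers)
    o_p, ax_p = frame_from_markers(*physical_markers)
    local = to_frame(point, o_i, ax_i)
    return [o_p[i] + sum(local[r] * ax_p[r][i] for r in range(3))
            for i in range(3)]
```

Because the construction is exact for rigid motions, a point halfway between two markers in image space lands halfway between the corresponding markers in physical space.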

The fiducial markers provide a frame of reference to make image-to-image or image-to-physical space registration possible. A general technique for using fiducial markers to obtain registration of image data across time is set forth in U.S. Patent No. 4,991,579 to Allen et al., which is incorporated herein by reference.
Fiducial markers for accurate image-to-physical space registration must be rigidly located and must be composed of materials making them visible in the imaging modalities of interest. U.S. Patent Application Serial No. 08/017,167 to McCrory et al., which is incorporated herein by reference, describes the design and composition of fiducial markers for neurosurgical image registration and image-physical space registration, as well as a method for localizing the center of the imaged markers in the image volumes. This patent application describes two types of markers including temporary markers anchored in the skull for rigidity but with the image-visible portion protruding above the scalp, and permanent markers implanted into the skull beneath the scalp and subcutaneous fat.
Internal permanent fiducial markers allow the comparison of images over time for follow-up therapy. Permanent markers also allow the delivery of fractionated radiotherapy, in which small doses of radiation are administered in multiple sessions to maximize the dose delivered to the target lesion while minimizing the damage to surrounding tissue. Fractionated radiotherapy requires the same fiducial framework to be present, in an unchanged position, for each treatment so that the radiation beam may be properly directed relative to the fiducial markers as determined from the pre-treatment images. Temporary markers and stereotactic frames can neither remain in position long enough nor be re-affixed accurately to satisfy this requirement.
In addition to a fiducial marking approach, image-to-physical space registration requires a method for establishing the coordinate system in physical space. Several coordinate space digitizers including specific pointing devices have been developed which define a coordinate space and pass the three-dimensional coordinates of the endpoint to an attached computer. As an example, an articulated arm has joints that track the angular position of each link, allowing the attached computer to calculate the endpoint coordinates. A less cumbersome alternative is a wand with infrared light emitting diodes (LEDs) along its shaft in which the LEDs are strobed in sequence and an infrared camera attached to a computer notes the location of the LEDs relative to a set of reference LEDs in a fixed position.
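The articulated-arm endpoint calculation can be sketched for the simplest case, a planar two-link arm (an illustrative reduction; a real surgical arm has more joints and operates in three dimensions, and this function is the editor's, not the patent's):

```python
import math

def endpoint_2link(l1, l2, theta1, theta2):
    """Endpoint of a planar two-link arm: `theta1` is the base joint
    angle and `theta2` the relative elbow angle, both in radians,
    as would be reported by joint encoders."""
    x1 = l1 * math.cos(theta1)
    y1 = l1 * math.sin(theta1)
    x2 = x1 + l2 * math.cos(theta1 + theta2)
    y2 = y1 + l2 * math.sin(theta1 + theta2)
    return x2, y2
```

With both joints at zero the arm lies along the x-axis and the endpoint is simply the sum of the link lengths; bending the elbow by 90 degrees folds the second link upward.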
Also required of a system which correlates neurological image space and physical space is a means for locating the center of the fiducial markers in the coordinate system defined by the pointing device. For markers visible outside the scalp, the fiducial position can be recorded simply by touching the fiducial marker with the pointing device. For permanent, subcutaneously implanted markers, however, locating the precise three-dimensional position of the marker is much more challenging. A target localization method to address this task has become necessary in the field of registration of image volumes with physical space in neurosurgical cases.


Once the markers have been located in the preoperative image sets stored on a computer as well as in the physical space defined by a pointing device attached to the same computer, the system display can provide interactive surgical guidance. The location of the endpoint of the pointing device is indicated on a display of the appropriate slice through the image volumes. Such an interactive, image-guided surgery system using an articulated arm as a pointing device is described in U.S. Patent No. 5,142,930 to Allen et al., which is incorporated herein by reference.
U.S. Patent No. 5,197,476 to Nowacki et al., which is also incorporated herein by reference, discloses an ultrasound probe coupled with an infrared LED-based pointing device to locate a target in a living body. A three-dimensional frame containing a plurality of infrared lights is placed on a table. A computer strobes the infrared lights and the position of the infrared lights is monitored by a pair of infrared sensitive cameras and stored in the computer. A hand held ultrasonic probe is provided with a plurality of infrared lights so that the probe can be monitored by the cameras. The computer compares the positions of the probe lights with the initial position of the frame infrared lights to accurately determine the position of the probe so that the position of the target in the body may be displayed on a computer monitor. The approach disclosed in the Nowacki et al. patent employs brightness-mode (B-mode) ultrasound imaging and requires a trained expert to visually recognize when the desired target is displayed.

SUMMARY OF THE INVENTION
The present invention relates to a method and apparatus for automatically localizing the three-dimensional coordinates of a symmetrically-shaped implanted target. The ultrasonic method is based on differences in the characteristic acoustic impedance (Z0) of the target and that of the surrounding material or surrounding materials as well as on the time delays between ultrasonic echoes.
The present invention uses A-mode ultrasound which produces one-dimensional, time domain "signals". Echoes received by the transducer are indicated as deflections in the signal and the amplitude of these deflections is proportional to the difference in acoustic impedances of the materials which form the interface from which the echo arose. In contrast, B-mode ultrasound imaging produces a two-dimensional image such as that provided on a video display, where each displayed line of the image corresponds to a single A-mode signal acquired from adjacent positions. The brightness or intensity of each pixel or dot along that line corresponds to the amplitude of received echoes.
According to an embodiment of the present invention, an A-mode ultrasonic transducer is attached to a pointing device which digitizes the coordinates of physical space. The automatic target detection algorithm is implemented on a computer that receives the reflected ultrasound signals along with the transducer position and orientation information. A position of the target or fiducial marker in a coordinate system of the pointing device is determined in response to the received echoes and the position and orientation of the transducer.
The present invention is based on the following ultrasonic principles.
When an ultrasonic pulse is emitted from a transducer, it propagates through a material until it reaches an interface with a material of different characteristic acoustic impedance. At this interface, a portion of the power of the pulse is reflected back to be received by the transducer and the remainder propagates on to deeper interfaces. The ratio of reflected to transmitted power is proportional to the square of the difference in the characteristic acoustic impedances of the two materials. The time delay between the echoes received by the transducer may be multiplied by the speed of sound in the intervening material to obtain the distance traveled by the ultrasonic pulse.
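These two relations can be written out numerically. The sketch below uses the standard normal-incidence power reflection formula (which normalizes the impedance difference by the impedance sum, an assumption beyond the proportionality stated above) and halves the round-trip distance to get a one-way depth; function names are illustrative.

```python
def power_reflection(z1, z2):
    """Fraction of incident power reflected at an interface between
    materials with characteristic acoustic impedances z1 and z2
    (standard normal-incidence formula, assumed here)."""
    return ((z2 - z1) / (z2 + z1)) ** 2

def echo_depth(delay_s, speed_m_per_s):
    """One-way depth of a reflector from the echo time delay.  The
    pulse travels down and back, so depth is half the product of
    delay and the speed of sound in the intervening material."""
    return speed_m_per_s * delay_s / 2.0
```

For example, a 20 microsecond delay at a soft-tissue sound speed of roughly 1540 m/s corresponds to a reflector about 15.4 mm deep, and matched impedances reflect no power at all.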
The present invention provides accurate localization, appropriate for localizing the center of small targets such as those a few millimeters in diameter. Accuracy is critical for the described application of the invention to image-to-physical space registration. An example of a fiducial marker design for that application is a cylinder, 3 mm in diameter, 4 mm in height, consisting of a polymer shell and an aqueous solution. Such a marker would be implanted into the skull flush with the surface of the skull, beneath the skin and subcutaneous fat.

BRIEF DESCRIPTION OF THE DRAWINGS
Other features and advantages of the present invention will become apparent from the following description taken in conjunction with the attached drawings.

FIG. 1 illustrates an A-mode ultrasound transducer arranged with associated hardware according to an embodiment of the present invention.

FIG. 2 is a mechanical articulated arm which may be used as a pointing device in an image-guided neurosurgery system.

FIG. 3 is a schematic of the components of an optical pointing device system.

FIG. 4 illustrates the acquisition of signal and position information from an ultrasound transducer coupled to a pointing device according to an embodiment of the present invention.

FIG. 5 illustrates a position of a transducer piston near a target and the use of an ultrasound standoff pad in placing the target in the optimal range of the ultrasound beam.


FIG. 6, which includes FIG. 6(a), FIG. 6(b), and FIG. 6(c), illustrates different relative positions of a transducer and a target during an ultrasound location method according to an embodiment of the present invention.

FIG. 7(a), FIG. 7(b) and FIG. 7(c), which together comprise FIG. 7, illustrate a correspondence between the positions illustrated in FIG. 6(a), FIG. 6(b) and FIG. 6(c) and corresponding received reflections including a target interface echo signal on which the automatic localization technique according to the present invention is based.
FIG. 8, which includes FIG. 8(a) and FIG. 8(b), illustrates a low-impedance implanted fiducial marker for a neurosurgical space registration application of the present invention.

FIG. 9, which includes FIG. 9(a) and FIG. 9(b), illustrates expected received signals for the relative positions of the transducer piston and the fiducial marker illustrated in FIG. 8(a) and 8(b), respectively.

FIG. 10 illustrates an embodiment of the present invention in which a low impedance fiducial marker with a high impedance layer is used.

FIG. 11 illustrates expected received signals for the relative positions of the transducer piston and the fiducial marker illustrated in FIG. 10.

FIG. 12 illustrates an embodiment of the present invention relating to surgical applications thereof.

FIG. 13 illustrates a flow chart of an automatic ultrasonic localization method of implanted targets according to an embodiment of the present invention.

FIG. 14 illustrates one embodiment of an automatic signal analysis determination which may be used in implementing an embodiment of the present invention.
FIG. 15 illustrates a fiducial marker which may be used in implementing an embodiment of the present invention.

DETAILED DESCRIPTION
According to an embodiment of the present invention, an A-mode ultrasound transducer possessing a rotationally-symmetric beam (for example, a circular, single-crystal piston transducer) is attached to a pointing device such that the connected computer may track the coordinates and orientation of the transducer as the transducer is moved through space.
FIG. 1 illustrates an example of an A-mode ultrasound transducer 20 with a beam (not illustrated) having a circular cross-section. In response to a trigger pulse, a pulser/receiver 22 provides an electrical trigger signal to the transducer 20 via a cable 24. In response to the electrical trigger signal, transducer 20 emits an ultrasonic pulse. The transducer 20 then receives reflected acoustic waves and converts them to electrical time-domain signals, passing them to the pulser/receiver 22 via cable 24. These signals may be transferred to a computer 26 via an analog-to-digital (A/D) converter 28. The trigger pulse signal input to pulser/receiver 22 may be provided by computer 26.
According to an embodiment of the present invention, an operator translates the transducer 20 along a surface of interest perpendicular to the surface and with the aid of a coupling medium, such as water, acoustic scanning gel, or mineral oil. The computer 26 pulses the transducer 20 via pulser/receiver 22 and receives the reflected echoes along with the corresponding transducer location and orientation as determined by the pointing device system. As the emitted acoustic waves travel through layers of material with similar characteristic acoustic impedances (Z0), little power is reflected back to the transducer 20. As a result those received echoes have a small amplitude.
When the ultrasound beam reaches an interface with an object whose acoustic impedance Z0 is much different, a larger-amplitude echo results. Therefore, if an implanted target is composed of a material with an acoustic impedance Z0 that is much greater or much less than the layers above or below it, a "target interface echo" will result. If enough prior knowledge exists to ensure that no other similar reflectors can be located in the expected depth range of the embedded target, then the target can be automatically detected according to the present invention based on the presence of such a target interface echo.
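The detection step just described amounts to a threshold search over the rectified rf samples inside the expected depth window. The sketch below is an illustrative simplification (a plain amplitude threshold; function and parameter names are the editor's, not the patent's):

```python
def detect_target_echo(signal, window, threshold):
    """Scan rectified A-mode rf samples inside an expected depth
    window (given as a (start, stop) pair of sample indices) for a
    deflection exceeding `threshold`.  Returns (sample_index,
    amplitude) of the strongest qualifying echo, or None."""
    start, stop = window
    best = None
    for i in range(start, min(stop, len(signal))):
        a = abs(signal[i])           # rectify the rf deflection
        if a >= threshold and (best is None or a > best[1]):
            best = (i, a)
    return best
```

Restricting the search to the expected depth window is what encodes the "prior knowledge" assumption above: reflectors outside that range are simply never examined.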
The amplitude of the target interface echo will be a maximum when the largest portion of the ultrasound beam reaches and is reflected from that interface. If the ultrasound beam has a diameter greater than or approximately the same as the diameter of the target, the maximum echo amplitude occurs when the transducer is centered over the target. This theory, coupled with the position of the transducer corresponding to each received signal, may be used to map out the position of the target in the lateral plane, the plane perpendicular to the ultrasound beam. To then determine the depth of the target from the transducer face, the present invention detects the time delay between received echoes. Based on knowledge of the speed of sound in the intervening materials, this depth may be extrapolated along the direction of the transducer's orientation to estimate the three-dimensional location of the target.
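Combining the lateral maximum with the depth extrapolation gives the three-dimensional estimate. The sketch below assumes each sweep record carries the transducer position, beam direction, target-echo amplitude, and echo delay (a hypothetical record layout; the patent does not specify one):

```python
def localize_target(records, speed_m_per_s):
    """Pick the pose with the strongest target interface echo and
    extrapolate along the beam direction by the depth computed from
    the round-trip echo delay.  Each record is a tuple
    (position_xyz, direction_xyz, echo_amplitude, delay_s)."""
    pos, direc, amp, delay = max(records, key=lambda r: r[2])
    depth = speed_m_per_s * delay / 2.0          # round trip -> one way
    n = sum(c * c for c in direc) ** 0.5         # normalize direction
    return [pos[i] + depth * direc[i] / n for i in range(3)]
```

Picking the single strongest echo is the simplest choice; a centroid of amplitudes over neighboring poses would localize a beam wider than the target more robustly.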
One example of a pointing device which may be used to implement an embodiment of the present invention is an articulated arm employing encoders at each joint which track the angles between the arm's segments.
FIG. 2 illustrates an example of such a pointing device system to which an ultrasound transducer may be attached according to an embodiment of the present invention. FIG. 2 illustrates an articulated arm 34 such as that used in the interactive, image-guided neurosurgery system described in U.S. Patent No. 5,142,930 to Allen et al. As illustrated in FIG. 2, an external arm 34 is fixed to a base 36, which is movably fixed to some location. The arm 34 carries a tool 38 including an end tip 40 which is changeable and, for purposes of the present invention, is a pointing device which may include an attached ultrasound unit. A sensor (not illustrated in FIG. 2) which comprises an ultrasonic detector may be attached in place of the tool 38 and the end tip 40 of the tool 38 in order to practice the present invention. The arm 34 has two arm lengths 40A, 40B. The first arm length 40A is coupled to the base by two gimbal joints 42. The first arm length 40A therefore has two degrees of motion, as provided by the two gimbal joints 42.
A second arm length 40B is coupled to the first arm length 40A by a second pair of gimbal joints 42. The second pair of gimbal joints 42 provides the second arm length 40B with two additional degrees of motion. The second arm length 40B therefore has four degrees of motion relative to the base 36 of the arm 34.
A tool holder 44 is coupled to the second arm length 40B through a pair of gimbal joints 42. The tool holder 44 can hold any of a number of different tools, including a pointer, an ultrasound unit, a surgical laser, a biopsy probe, a radiation beam collimator, etc. In an embodiment of the present invention, the tool held by the tool holder 44 is an ultrasound unit (not illustrated in FIG. 2). A third pair of gimbal joints 42 provides the tool 38 with two additional degrees of motion, so that the tool 38 has 6 degrees of motion relative to the base 36.
The exact positioning of the tool 38 relative to the base 36 may be monitored by optical encoders 46. One optical encoder 46 is assigned to each gimbal joint 42. Each gimbal joint 42 is individually rotated around its pivot and the optical encoder 46 determines the precise amount of rotation of the gimbal joint 42 around its pivot. The information from each of the six optical encoders 46 is provided to a programmable computer (not illustrated in FIG. 2) via wires 48. The programmable computer can therefore precisely track the movement of the tool 38 relative to the base 36 by keeping track of the individual rotations of the gimbal joints 42 around their pivots.
Another example of a pointing device which may be used to implement the present invention uses infrared light emitting diodes (LEDs) along its shaft. The LEDs are strobed in sequence and an infrared camera attached to a computer notes the location of the LEDs relative to a set of reference LEDs in a fixed position. Such a system which digitizes the coordinates of physical space is the Optotrak/3020 (Northern Digital, Inc., Waterloo, Ontario), illustrated in FIG. 3. One implementation of this system is described in U.S. Patent No. 5,197,476 issued to Nowacki et al.
The optical system illustrated in FIG. 3 tracks the coordinates and orientation of an object 50 which is illustrated as a pointer, but may be replaced by an A-mode ultrasound transducer to implement an embodiment of the present invention. A shaft 52 extends from object 50 with a plurality of infrared light emitting diodes (LEDs) on its surface. A cable connects the probe 54 (consisting of shaft 52 and end-effector 50) to a strober 56. The probe 54 may be easily and freely manipulated by a user. Mounted in a fixed position near the region of interest is a set of reference infrared LEDs 58 referred to as a rigid body. This set of reference LEDs 58 establishes the origin of a three-dimensional coordinate system in physical space.

The infrared LEDs 52 and 58 are strobed in sequence by strober 56 and are visible to a position sensor 60. Position sensor 60 includes three linear charge coupled device (CCD) cameras 60A, 60B, 60C with cylindrical optics placed in front of the CCDs to compress the field of view into a single line. The output of each CCD 60A, 60B, 60C is captured by a separate processor (not illustrated) which extracts the object's position in that view. A fourth processor (not illustrated) receives the output of each of the three separate individual processors and triangulates the location of an infrared LED and passes it to a computer 62. Since the infrared LEDs on the shaft 52 are pulsed in a known sequence and configuration, and the length of the end-effector 50 is known, the computer 62 is able to calculate the time-distinct location of the end-effector 50 with respect to the rigid body of reference infrared LEDs 58.
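The triangulation step can be sketched geometrically: each linear CCD with cylindrical optics constrains the LED to a plane, and three such planes intersect in a single point. The sketch below solves the three plane equations by Cramer's rule (an illustrative geometry only; real calibrated cameras yield the plane parameters, which are simply given here):

```python
def intersect_planes(planes):
    """Triangulate a point from three plane constraints, each given
    as (normal_xyz, offset) meaning normal . x = offset.  Solves the
    3x3 linear system by Cramer's rule."""
    (a, d1), (b, d2), (c, d3) = planes
    def det3(r0, r1, r2):
        return (r0[0] * (r1[1]*r2[2] - r1[2]*r2[1])
              - r0[1] * (r1[0]*r2[2] - r1[2]*r2[0])
              + r0[2] * (r1[0]*r2[1] - r1[1]*r2[0]))
    D = det3(a, b, c)
    x = det3([d1, a[1], a[2]], [d2, b[1], b[2]], [d3, c[1], c[2]]) / D
    y = det3([a[0], d1, a[2]], [b[0], d2, b[2]], [c[0], d3, c[2]]) / D
    z = det3([a[0], a[1], d1], [b[0], b[1], d2], [c[0], c[1], d3]) / D
    return [x, y, z]
```

The three cameras are mounted so that their constraint planes are never parallel, which keeps the determinant D nonzero and the intersection unique.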
The present invention is not limited to embodiments of the present invention using the pointing devices illustrated in FIG. 2 and FIG. 3. In the present invention, an ultrasound transducer may be attached to any pointing device such that the coordinates and orientation of the transducer are continuously passed to the connected computer, as illustrated, for example, in FIG. 4.
In FIG. 4, each time a transducer 72 is pulsed by a pulser/receiver 74 in response to a trigger signal 76 from computer 78, an ultrasound signal 80 of specific time duration is acquired from pulser/receiver 74, digitized by an A/D converter 82 which may be included in computer 78, and stored as a signal ~i) in a memory 88 of the computer 78. The rate at which the A/D

converter 82 samples the analog ultrasound signal 80 must be sufficient to adequately represent the signal. The sampling frequency should be at least twice the frequency bandwidth of the ultrasound signal. Simultaneously, the position and orientation of the transducer 72 corresponding to the current signal are acquired from a pointing device 84. A transducer position calculator 86 calculates a transducer position (i) in response to the data provided from pointing device 84 which is stored in memory 88 along with the signal (i). In this manner, signal/position pairs are acquired while a region of an object or anatomy is scanned with the transducer assembly.
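As a numeric illustration of the sampling requirement just stated: for a hypothetical 10 MHz transducer, the minimum A/D rate follows from the signal bandwidth. The 4x-center-frequency "practical" headroom factor below is a common engineering rule of thumb assumed for illustration, not a figure from this disclosure.

```python
def min_sampling_rate(center_freq_hz, fractional_bandwidth):
    """Minimum A/D rate per the rule stated above: at least twice the
    signal bandwidth.  In practice RF ultrasound is usually sampled at
    several times the transducer center frequency; the 4x headroom
    factor is an illustrative assumption."""
    bandwidth = center_freq_hz * fractional_bandwidth
    nyquist = 2.0 * bandwidth          # minimum rate from the bandwidth rule
    practical = 4.0 * center_freq_hz   # assumed engineering headroom
    return nyquist, practical

# A 10 MHz transducer with an assumed 60% fractional bandwidth:
nyq, practical = min_sampling_rate(10e6, 0.6)
print(nyq / 1e6, practical / 1e6)  # -> 12.0 40.0 (MHz)
```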
To gather signal/position pairs from an object or patient once that object or patient is secured in a fixed position, the transducer assembly is held by a user and moved across a surface of interest, keeping the transducer perpendicular to the surface. FIG. 5 illustrates a position of a transducer piston 92 near a target 94. An ultrasound stand-off pad 96 is used in placing the target 94 in an optimal range of an ultrasound beam 98 (illustrated by a dotted line) of the transducer piston 92. Contact between a transducer face 100 and the surface 102 is maintained with the aid of a coupling medium, such as acoustic gel or water. The ultrasonic stand-off pad 96 may be used to place the expected target depth in an optimal range of the ultrasound beam 98 depending on the expected depth of the implanted target 94 and the type of transducer used. Stand-off 96 may be a layer or "pad" of a gelatinous material or a water path.
In the present invention as described above in reference to FIG. 4, as the signal/position pairs are acquired, the signals are automatically analyzed.

The signals are analyzed to detect echoes arising from interfaces between materials of different characteristic acoustic impedance (Z0) as described above. FIG. 6, which includes FIG. 6(a), FIG. 6(b) and FIG. 6(c), illustrates this concept as applied to a low-Z0 target 110 (e.g., a cylindrical fiducial marker) implanted in a high-Z0 material 112 (e.g., the human skull) covered by two layers 114, 116 of low-Z0 material (e.g., scalp and fat). FIG. 6(a), FIG. 6(b) and FIG. 6(c) illustrate the relative position of the transducer piston 118 and the target 110. FIG. 7(a), FIG. 7(b) and FIG. 7(c) illustrate corresponding received reflections of the ultrasound signals for the different relative positions of the transducer piston 118 and target 110 of FIG. 6(a), FIG. 6(b) and FIG. 6(c), respectively. As illustrated, for this case the target interface echo is the echo arising from the interface between the distal surface of the target and the high-Z0 divot in which the target sits. In FIG.
7(a), FIG. 7(b) and FIG. 7(c), the time interval (from t0 to t1) corresponds to the depth of the target interface echo. This time interval is the time interval in which the target interface echo is expected in this example. The reflection signal illustrated in FIG. 7(c) is the signal used to determine the depth of the target from the transducer since it corresponds to the accurate relative positioning of the transducer piston 118 and target 110 illustrated in FIG.
6(c). The computer 26, 54, or 78, for example, extracts the portion of each signal between t0 and t1 and the result is referred to as the windowed signal.
Since the amplitude of the extracted windowed signal is proportional to the proximity of the transducer beam 120 to the center of the target 110, a measure of each windowed signal's amplitude is calculated. For signals

whose measured amplitude exceeds a given threshold, the transducer coordinates corresponding to that signal are weighted with that measured amplitude. A centroid is then calculated which estimates the coordinates of the transducer face when it was centered over the target. The signal or signals acquired nearest that centroid are examined and the time index of the leading edge of the target interface echo is detected. Based on the knowledge of the speed of sound in the materials and the geometry of the
target, the depth of the target from the transducer face is determined. By adding this depth to the transducer position corresponding to that signal along the orientation of the transducer (also stored along with the signal), the estimated three-dimensional coordinates of the target may be calculated.
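The depth-projection step just described amounts to a single vector operation: adding the measured depth along the transducer's beam axis. A minimal sketch, with hypothetical names and units assumed consistent:

```python
import numpy as np

def target_coordinates(face_position, orientation, depth):
    """Project the measured echo depth along the transducer axis to get
    the target's 3-D coordinates, as described above.  `orientation` is
    a vector along the beam (normalized here); all quantities are in
    the pointing device's coordinate system."""
    u = np.asarray(orientation, dtype=float)
    u = u / np.linalg.norm(u)  # ensure the beam axis is unit length
    return np.asarray(face_position, dtype=float) + depth * u

# Transducer face at (10, 20, 5) aiming along +z, target 7.5 units deep:
print(target_coordinates([10.0, 20.0, 5.0], [0.0, 0.0, 1.0], 7.5))
# -> [10.  20.  12.5]
```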
As an example, the present invention may be used in image-to-physical space registration for neurosurgical guidance. Once the fiducial markers are implanted in the patient, the preoperative image sets may be acquired.
These fiducial markers may be between 1 mm and 10 mm in diameter and are composed of biocompatible materials which are visible in the imaging modalities of interest. The markers are implanted into the skull, flush with the surface of the skull.
Possible marker scenarios include a marker 130, illustrated in FIG. 8(a) and FIG. 8(b), comprising a small cylinder with a thin shell of low-Z0 polymer material filled with a water-based solution, and a marker 140, illustrated in FIG. 10, which is identical to marker 130 except that a high-Z0 layer 142, such as a glass marker or "cap", is included on the end of the cylinder which is to be implanted first (i.e., the end of the marker distal to the surface of the skull).

FIG. 9(a) and FIG. 9(b) illustrate expected received echo signals in response to the respective positions of the transducer piston and fiducial marker 130 illustrated in FIG. 8(a) and FIG. 8(b), respectively. The received signal in FIG. 9(b) includes a marker/bone divot echo portion which is not included in the received signal illustrated in FIG. 9(a). This marker/bone divot echo portion is the portion of the signal which is extracted as the windowed signal as discussed above.
As mentioned above, FIG. 10 illustrates a fiducial marker 140 identical to marker 130 except that a high-Z0 layer 142 such as a glass marker "cap" is included on a distal surface of marker 140. FIG. 11 illustrates the expected received signal in response to the transducer piston and fiducial marker 140 arrangement of FIG. 10. The signal illustrated in FIG. 11 includes a marker contents/marker "cap" echo portion which is the portion of the signal extracted as the windowed signal.
In the method of the present invention, each image volume is analyzed to determine the location of the center of each fiducial marker within the three-dimensional coordinate system of the image volume (three or more markers are preferably used in order to more accurately determine coordinate positions). The image volumes and the fiducial locations in each volume are stored on a computer.
In the operating room, the patient 150 illustrated in FIG. 12 is fixed in position for surgery, usually with a head clamp. A pointing device such as interactive, image-guided (IIG) arm 152 is connected to a computer 154 holding the image sets and image fiducial locations. An A-mode ultrasound transducer 156 is attached to the pointing device 152. A fiducial marker 158 is implanted below the scalp 150A and subcutaneous fat 150B into the skull 150C of the patient 150. An example of a transducer 156 which may be used in this arrangement is a 10 MHz short-focus piston with a 3 mm-diameter crystal. The pointing device 152 is set up such that the computer 154 continuously calculates the three-dimensional coordinates of the center of the transducer face and the orientation of the transducer. The computer 154 receives the signals from transducer 156 and performs a feature extraction on the received signals in which, for example, the marker/bone divot echo portion or marker contents/marker "cap" echo portion of the signal is extracted. Then computer 154 determines a center position of the marker 158 in response to arm position data received from the pointing device 152 and in response to the feature extraction signal.
In order to implement the present invention, an operator holds the transducer assembly 156 and places it on the scalp 150A of the patient 150 while maintaining contact between the transducer and scalp with acoustic gel and an ultrasonic standoff 160. The operator starts a program within computer 154 that pulses the transducer 156 and receives the reflected signals along with the transducer position information. The operator slowly moves the transducer 156 while signal/position information is collected. The received signals are analyzed to isolate the transducer positions for which the ultrasound beam passes through the marker 158. For space registration purposes, it is the coordinates of the center of the marker 158 which are of interest. As illustrated in FIG. 12, an estimate of the center of the marker 158 may be determined using a centroid/extrapolation process and stored in the computer 154.
This process is repeated until all of the markers (preferably three or more) have been localized. To confirm accuracy of the marker center estimates, the distances between the marker centers are compared to the distances between marker centers as determined from, for example, preoperative images. If the error is within an acceptable range, a transformation is calculated mapping the surgical space coordinates of the pointing device into equivalent locations in the image sets.
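The disclosure does not specify how the space-mapping transformation is computed. One standard choice for fitting a rigid transform to three or more corresponding marker centers is the SVD-based least-squares (Procrustes) solution, sketched below under that assumption; function and variable names are illustrative.

```python
import numpy as np

def rigid_transform(physical_pts, image_pts):
    """Least-squares rigid transform (R, t) mapping physical-space
    marker centers onto image-space centers, so that
    image ~ R @ physical + t.  SVD-based Procrustes solution; one
    standard method, not necessarily the one used in the patent."""
    P = np.asarray(physical_pts, dtype=float)
    Q = np.asarray(image_pts, dtype=float)
    Pc, Qc = P - P.mean(axis=0), Q - Q.mean(axis=0)  # center both sets
    U, _, Vt = np.linalg.svd(Pc.T @ Qc)              # cross-covariance SVD
    # Guard against a reflection in the fitted transform:
    D = np.diag([1.0, 1.0, np.linalg.det(Vt.T @ U.T)])
    R = Vt.T @ D @ U.T
    t = Q.mean(axis=0) - R @ P.mean(axis=0)
    return R, t

# Recover a known 90-degree rotation about z plus a translation:
P = np.array([[0.0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]])
R0 = np.array([[0.0, -1, 0], [1, 0, 0], [0, 0, 1]])
t0 = np.array([1.0, 2.0, 3.0])
R, t = rigid_transform(P, P @ R0.T + t0)
print(np.allclose(R, R0) and np.allclose(t, t0))  # -> True
```

Once (R, t) is found, each pointer position in surgical space can be mapped into the image sets with `R @ p + t`.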
The transducer 156 is removed from the pointing device 152, and associated hardware and software are updated via computer 154 to reflect the resulting change in length of the pointing device 152. The surgeon may then point at locations of interest on the patient and look at a display 162 of the computer 154 to see the corresponding location indicated on the preoperative image sets. In this way, accurate, interactive surgical guidance is provided to the surgeon.
FIG. 13 illustrates a flowchart of an ultrasound method of locating implanted targets according to an embodiment of the present invention. In step 172 the ultrasound transducer is coupled to the pointing device system.
The patient or object to be scanned is placed in a fixed position, for example, by clamping the patient or object to a table (step 174). In step 176, the transducer is coupled to the surface (for example, the scalp of the patient) with gel or water. The user then holds the transducer assembly against the surface in a manner perpendicular to the surface (step 178). The computer is used to pulse the transducer and store the received reflected signal along with a transducer position and orientation in step 180. In step 182, the user slowly moves the transducer along the surface. A determination is made in step 184 as to whether the user is finished. If the user is not finished, the computer again pulses the transducer and stores the reflected signals in step 180. Once all pulsing and storing of signal information has been finished as determined in step 184, the computer provides an automatic signal analysis in step 186 which is based on the amplitude and depth of received echo signals.
As a result, the center position of the target may be accurately determined.
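The pulse/store/move loop of steps 180 through 184 can be sketched as below. The three callables are hypothetical placeholders for the pulser/receiver, the pointing device, and the operator's "finished" input; the toy run substitutes canned data for hardware.

```python
def acquire_signal_position_pairs(pulse_and_receive, get_pose, user_done):
    """Acquisition loop corresponding to steps 180-184 of FIG. 13:
    repeatedly pulse, digitize, and store each reflected signal
    together with the transducer position and orientation, until the
    operator signals completion."""
    pairs = []
    while not user_done():                   # step 184: finished?
        signal = pulse_and_receive()         # step 180: pulse and digitize
        position, orientation = get_pose()   # pose stored with the signal
        pairs.append((signal, position, orientation))
    return pairs

# Toy run with canned data standing in for hardware interfaces:
sigs = iter([[0.1, 0.9, 0.2], [0.0, 0.8, 0.1]])
poses = iter([((0, 0, 0), (0, 0, 1)), ((1, 0, 0), (0, 0, 1))])
done = iter([False, False, True])
pairs = acquire_signal_position_pairs(lambda: next(sigs),
                                      lambda: next(poses),
                                      lambda: next(done))
print(len(pairs))  # -> 2
```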
FIG. 14 illustrates one embodiment in which the automatic signal analysis determination of step 186 in FIG. 13 may be implemented. It is noted that other methods of automatic signal analysis may be implemented within the scope and spirit of the present invention and FIG. 14 merely illustrates an embodiment thereof.
FIG. 14 includes a step 192 of collecting signals and transducer coordinates in the marker region, including the reflected ultrasound signals and the ultrasound transducer position data. In order to perform this step, the reflected signals are analyzed to detect echoes arising from interfaces between materials of different acoustic impedance. Such reflected ultrasound signals include, for example, those signals illustrated in FIG. 7(a), FIG. 7(b), FIG. 7(c), FIG. 9(a), FIG. 9(b) and FIG. 11. Step 194 relates to applying a window to the collected signal after a surface echo occurs therein. This window corresponds to the portion between t0 and t1 in FIG. 7(a), FIG. 7(b) and FIG. 7(c). The windowed portion of the signal is extracted from the reflected signal. The amplitude of the extracted windowed signal corresponds to the proximity of the transducer beam to the center of the target. The standard deviation σi of the windowed signal is calculated in step 196 as an indication of the proximity of the transducer to the center of the target. The weighted centroid of the coordinates of the signals is determined in step 198, for example, according to the following formulas.

xc = Σi xi(σi - threshold) / Σi (σi - threshold)

yc = Σi yi(σi - threshold) / Σi (σi - threshold)

zc = Σi zi(σi - threshold) / Σi (σi - threshold)

in which xi, yi and zi correspond to the ultrasound transducer position data received from the pointing device, and σi is the standard deviation calculated in step 196. The "threshold" value is calculated based on the maximum standard deviation σi of a signal which did not pass through the marker. By subtracting this threshold value from each standard deviation value σi, the effect on the centroid calculation of collecting signal/position data outside the marker region is minimized. The coordinates xc, yc and zc are the coordinates of the transducer face when it was centered over the

target. Together, steps 194, 196 and 198 perform the localization used to determine the lateral position of the center of the marker.
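Steps 196 and 198 can be sketched as follows. The clip to zero reflects the text's restriction to signals whose measure exceeds the threshold (sub-threshold signals contribute nothing); variable names are illustrative.

```python
import numpy as np

def weighted_centroid(positions, sigmas, threshold):
    """Weighted centroid of step 198: each transducer position is
    weighted by how far its windowed-signal standard deviation exceeds
    the threshold, per the formulas above.  Positions whose sigma falls
    at or below the threshold are excluded (weight zero)."""
    positions = np.asarray(positions, dtype=float)
    w = np.clip(np.asarray(sigmas, dtype=float) - threshold, 0.0, None)
    return (w[:, None] * positions).sum(axis=0) / w.sum()

# Signals nearer x = 1.0 have a larger amplitude spread, so the
# centroid lands there:
pos = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (2.0, 0.0, 0.0)]
print(weighted_centroid(pos, sigmas=[0.2, 0.8, 0.2], threshold=0.1))
# -> [1. 0. 0.]
```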
In step 200, a time delay of the marker echo is located nearest the position of the centroid coordinates (xc, yc, zc) to detect the leading edge of the target interface echo. Step 202 determines the depth of the marker center, for example, according to the formula:

dc = dd - mh/2

where dc is the depth from the marker center to the face of the transducer, dd is the distance from the transducer face to the target interface (e.g., the interface between the marker and the divot in which the marker sits), and mh is the marker height. The depth of the target from the transducer face may be determined based on the speed of sound in the materials and the geometry of the target. By adding this depth to the transducer position corresponding to that signal along the orientation of the transducer (also stored along with the signal), the estimated three-dimensional coordinates of the target may be calculated. Step 204 calculates the three-dimensional coordinates for the marker center based on the depth of the marker, the determined transducer position, and the ultrasound transducer position data. Steps 200, 202 and 204 together perform depth localization of the marker for determining the depth of the marker from the position of the transducer.
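Step 202 reduces to a time-of-flight conversion followed by the half-height correction. A minimal sketch, assuming a single speed of sound for the intervening material (the 1540 m/s soft-tissue value in the example is a common assumption, not a figure from this disclosure):

```python
def marker_center_depth(echo_delay_s, speed_of_sound_m_s, marker_height_m):
    """Depth of the marker center from the transducer face (step 202):
    convert the round-trip echo delay to the one-way distance dd to the
    target interface, then back off half the marker height
    (dc = dd - mh/2)."""
    dd = speed_of_sound_m_s * echo_delay_s / 2.0  # round trip -> one way
    return dd - marker_height_m / 2.0

# A 13 microsecond round trip at an assumed 1540 m/s, 3 mm tall marker:
dc = marker_center_depth(13e-6, 1540.0, 0.003)
print(round(dc * 1000, 3))  # marker center depth in mm
```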

While the signal analysis determination steps illustrated in FIG. 14 have been described as set forth above, it is pointed out that the present invention is not specifically limited to this marker localization method. The present invention could be practiced by locating the position of the target or fiducial marker using another method which accurately determines the position of the marker.
FIG. 15 illustrates a fiducial marker 300 which may be used in implementing the present invention. The fiducial marker 300 may be left implanted, for example, entirely beneath the skin for extended periods of time. The marker 300 comprises a cylinder 302 defining a space 304 into which may be placed, for example, one or more imaging agents. A cylindrical shape is preferred for marker 300, because this shape minimizes the size of the incision that must be made for the marker's insertion. It is also the shape that best corresponds to the hole that may be drilled in a bone to accommodate the marker. In any case, it is preferred that the marker at least be symmetrical in shape. The body of the cylinder is sealed off with a cap 306 or is otherwise sealed. The body is preferably constructed of an organic polymer known to be well tolerated by the body for extended periods of time, such as polymethyl methacrylate (PMMA), high density polyethylene, or ceramics such as zirconium oxide and aluminum oxide. The entire marker assembly is small enough for long-term implantation into bone without causing distortion of the bone over time. One exemplary size provides for the marker to be 4 mm in diameter and 3 mm in height. The marker may be implanted in a human skull, flush with the surface of the skull, beneath the

scalp and subcutaneous fat. Additionally, the cap 306 of the marker may include a high acoustic impedance layer such as a glass marker or "cap"
similar to the marker illustrated in FIG. 10.
The cylindrical marker defined above is only one example of possible target composition, geometry and implantation amenable to localization using the present invention. Possible embodiments of implanted targets which may be used to implement the present invention include the following: A low-Z0
target implanted under one or more low-Z0 layers but into a high-Z0 layer, flush with the surface of that layer. The target echo in that case arises from the interface between the target and the high-Z0 divot in which the target sits.
Additional possibilities include a low-Z0 target implanted in high-Z0 materials, and a high-Z0 target implanted in a low-Z0 material.
The present invention may be embodied in other specific forms than that specifically disclosed in this application without departing from its spirit or essential characteristics. For example, while a specific fiducial marker has been described in reference to FIG. 15, the present invention is not limited to localization of this particularly described implanted marker and may be implemented to detect, for example, any implanted target.

Claims (50)

1. A method for automatically locating a position of a first object, comprising steps of:
attaching a transducer to a coordinate space digitizer having a defined coordinate system;
pulsing said transducer while translating it along a surface of interest near said first object;
receiving echoes from said pulsed transducer arising from a difference in an acoustic impedance of a first portion of the first object and an acoustic impedance of a material located near said first portion of the first object;
and determining a position in said defined coordinate system of said first object in response to said received echoes and a position and orientation of said transducer.
2. A method according to Claim 1, wherein said first object is a fiducial marker.
3. A method according to Claim 2, wherein said fiducial marker is implanted in a bone.
4. A method according to Claim 3, wherein said bone is a skull of a patient and said surface of interest is a scalp of said patient.
5. A method according to Claim 1, wherein said material located near said first portion of the first object is a second object having an acoustic impedance different from the acoustic impedance of the first portion of the first object.
6. A method according to Claim 1, wherein said material located near said first portion of the first object is a second portion of the first object having an acoustic impedance different from the acoustic impedance of the first portion of the first object.
7. A method according to Claim 1, wherein said pulsing and receiving steps are repeated for a plurality of positions and orientations of said transducer and said determining step determines a position of said first object in response to said plurality of receiving steps and said plurality of positions and orientations of said transducer.
8. A method according to Claim 1, wherein said determining step further comprises steps of:
determining a time delay between said received echoes; and determining a physical depth of said first object from said transducer.
9. A method according to Claim 1, wherein said method is used for registering physical space with image volumes.
10. A method according to Claim 9, wherein said method is used in interactive, image-guided surgery or in fractionated radiotherapy.
11. A method according to Claim 1, wherein said transducer is an ultrasound transducer.
12. A method according to Claim 1, wherein said determining step determines a position of said first object in response to said received echoes, a correspondence between a time delay between received echoes and a physical depth between said first object and said transducer, and a position and orientation of said transducer.
13. A method according to Claim 1, further comprising a step of determining a position of said first object in response to said received echoes and a corresponding position and orientation of said transducer which corresponds to a position of said transducer during said pulsing step.
14. A method according to Claim 1, wherein said determining step comprises steps of:
extracting a windowed surface echo signal from said received echoes;
calculating a standard deviation of said extracted windowed surface echo signal;
calculating a weighted centroid of positions of said transducer;

locating a time delay of said windowed surface echo signal nearest the weighted centroid;
calculating a depth of a center of the first object in response to a depth of an interface in which said first object is located and a height of said first object; and calculating coordinates in said coordinate system of the center of the first object in response to said calculated depth and said time delay.
15. A method according to Claim 1, wherein said receiving step receives echoes from said pulsed transducer arising from differences in acoustic impedances of said first portion and different portions within said first object.
16. A method according to Claim 1, wherein said acoustic impedance of said first portion of said object is lower than said acoustic impedance of said material located near said first portion of said first object.
17. A method according to Claim 1, wherein said first object is a symmetric object.
18. A method according to Claim 17, wherein said first object is cylindrical in shape.
19. A method for automatic amplitude-mode ultrasound location of an implanted fiducial marker, comprising steps of:
attaching an ultrasound transducer to a coordinate space digitizer having a defined coordinate system;
pulsing said ultrasound transducer while translating it along a surface of interest near said implanted fiducial marker;
receiving echoes from said pulsed ultrasound transducer arising from a difference in an acoustic impedance of a first portion of the implanted fiducial marker and an acoustic impedance of a material located near said first portion of the implanted fiducial marker; and determining a position in said defined coordinate system of said implanted fiducial marker in response to said received echoes and a position and orientation of said ultrasound transducer.
20. A method according to Claim 19, wherein said fiducial marker is implanted in a bone.
21. A method according to Claim 20, wherein said bone is a skull of a patient and said surface of interest is a scalp of said patient.
22. A method according to Claim 19, wherein said material located near said first portion of the implanted fiducial marker is a material in which said implanted fiducial marker is implanted.
23. A method according to Claim 19, wherein said material located near said first portion of the implanted fiducial marker is an object separate from said implanted fiducial marker having an acoustic impedance different from the acoustic impedance of the first portion of the implanted fiducial marker.
24. A method according to Claim 19, wherein said material located near said first portion of the implanted fiducial marker is a second portion of the implanted fiducial marker having an acoustic impedance different from the acoustic impedance of the first portion of the implanted fiducial marker.
25. A method according to Claim 24, wherein the second portion of the implanted fiducial marker is a cap of said implanted fiducial marker.
26. A method according to Claim 25, wherein the acoustic impedance of the cap is higher than the acoustic impedance of the first portion of the implanted fiducial marker.
27. A method according to Claim 19, wherein said pulsing and receiving steps are repeated for a plurality of positions and orientations of said transducer and said determining step determines a position of said implanted fiducial marker in response to said plurality of receiving steps and said plurality of positions and orientations of said transducer.
28. A method according to Claim 19, wherein said determining step further comprises steps of:
determining a time delay between said received echoes; and determining a physical depth of said implanted fiducial marker from said transducer.
29. A method according to Claim 19, wherein said method is used for registering physical space with image volumes.
30. A method according to Claim 29, wherein said method is used in interactive, image-guided surgery or in fractionated radiotherapy.
31. A method according to Claim 19, wherein said determining step determines a position of said implanted fiducial marker in response to said received echoes, a correspondence between a time delay between received echoes and a physical depth between said implanted fiducial marker and said transducer, and a position and orientation of said ultrasound transducer.
32. A method according to Claim 19, further comprising a step of determining a position of said implanted fiducial marker in response to said received echoes and a corresponding position and orientation of said transducer which corresponds to a position of said ultrasound transducer during said pulsing step.
33. A method according to Claim 19, wherein said determining step comprises steps of:
extracting a windowed surface echo signal from said received echoes;
calculating a standard deviation of said extracted windowed surface echo signal;
calculating a weighted centroid of positions of said ultrasound transducer;
locating a time delay of said windowed surface echo signal nearest the weighted centroid;
calculating a depth of a center of the implanted fiducial marker in response to a depth of an interface in which said implanted fiducial marker is located and a height of said implanted fiducial marker; and calculating coordinates in said coordinate system of the center of the implanted fiducial marker in response to said calculated depth and said time delay.
34. A method according to Claim 19, wherein said receiving step receives echoes from said pulsed ultrasound transducer arising from differences in acoustic impedances of said first portion and different portions within said implanted fiducial marker.
35. A method according to Claim 19, wherein said acoustic impedance of said first portion of said implanted fiducial marker is lower than said acoustic impedance of said material located near said implanted fiducial marker.
36. A method according to Claim 19, wherein said implanted fiducial marker is a symmetric object.
37. A method according to Claim 36, wherein said implanted fiducial marker is cylindrical in shape.
38. A method according to Claim 19, wherein said coordinate space digitizer includes a pointing device.
39. A system for locating a position of a first object, comprising:
a coordinate space digitizer having a defined coordinate system;
an ultrasound transducer connected with said coordinate space digitizer;
a pulser pulsing said ultrasound transducer while translating it along a surface of interest located near said first object;
a receiver receiving echoes from said pulsed ultrasound transducer arising from a difference in an acoustic impedance of a first portion of the first object and an acoustic impedance of a material located near said first portion of the first object; and means for determining a position in said defined coordinate system of said first object in response to said received echoes and a position and orientation of said ultrasound transducer.
40. A system according to Claim 39, wherein said first object is an implanted fiducial marker.
41. A system according to Claim 40, wherein said material is a second object in which said implanted fiducial marker is implanted.
42. A system according to Claim 40, said implanted fiducial marker having a housing, said housing containing a cavity, said cavity comprising:
said first portion; and a second portion of the implanted fiducial marker, said second portion including said material located near said first portion.
43. A system according to Claim 42, wherein said second portion is a cap of the implanted fiducial marker.
44. A fiducial marker assembly comprising an imaging marker assembly having a housing, said housing containing a cavity, said cavity comprising:
a first portion including a first material having a first acoustic impedance; and a second portion including a second material having a second acoustic impedance which is different than said first acoustic impedance.
45. A fiducial marker assembly according to Claim 44, wherein said second portion comprises a cap of said imaging marker assembly.
46. A fiducial marker assembly according to Claim 45, wherein said cap is a glass marker cap.
47. A fiducial marker assembly according to Claim 44, wherein said second portion comprises a layer near an end of said fiducial marker.
48. A fiducial marker assembly according to Claim 47, wherein said second acoustic impedance is higher than said first acoustic impedance.
49. A fiducial marker assembly according to Claim 44, wherein said fiducial marker is a permanent implantable fiducial marker.
50. A fiducial marker assembly according to Claim 44, wherein said fiducial marker assembly is for use in an ultrasound detection technique.
CA002118532A 1993-10-21 1994-10-20 Automatic ultrasonic localization of targets implanted in a portion of the anatomy Abandoned CA2118532A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US139,139 1993-10-21
US08/139,139 US5394875A (en) 1993-10-21 1993-10-21 Automatic ultrasonic localization of targets implanted in a portion of the anatomy

Publications (1)

Publication Number Publication Date
CA2118532A1 true CA2118532A1 (en) 1995-04-22

Family

ID=22485288

Family Applications (1)

Application Number Title Priority Date Filing Date
CA002118532A Abandoned CA2118532A1 (en) 1993-10-21 1994-10-20 Automatic ultrasonic localization of targets implanted in a portion of the anatomy

Country Status (4)

Country Link
US (1) US5394875A (en)
EP (1) EP0650075A3 (en)
JP (2) JPH07255723A (en)
CA (1) CA2118532A1 (en)

Families Citing this family (265)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR2652928B1 (en) 1989-10-05 1994-07-29 Diadix Sa INTERACTIVE LOCAL INTERVENTION SYSTEM WITHIN A AREA OF A NON-HOMOGENEOUS STRUCTURE.
ES2115776T3 (en) 1992-08-14 1998-07-01 British Telecomm POSITION LOCATION SYSTEM.
US5829444A (en) 1994-09-15 1998-11-03 Visualization Technology, Inc. Position tracking and imaging system for use in medical applications
US5803089A (en) 1994-09-15 1998-09-08 Visualization Technology, Inc. Position tracking and imaging system for use in medical applications
US7500952B1 (en) * 1995-06-29 2009-03-10 Teratech Corporation Portable ultrasound imaging system
US8241217B2 (en) * 1995-06-29 2012-08-14 Teratech Corporation Portable ultrasound imaging data
US5590658A (en) * 1995-06-29 1997-01-07 Teratech Corporation Portable ultrasound imaging system
US6351659B1 (en) * 1995-09-28 2002-02-26 Brainlab Med. Computersysteme Gmbh Neuro-navigation system
US5636255A (en) * 1996-03-05 1997-06-03 Queen's University At Kingston Method and apparatus for CT image registration
US5690113A (en) * 1996-06-14 1997-11-25 Acuson Corporation Method and apparatus for two dimensional ultrasonic imaging
US6135961A (en) * 1996-06-28 2000-10-24 Sonosite, Inc. Ultrasonic signal processor for a hand held ultrasonic diagnostic instrument
US7819807B2 (en) * 1996-06-28 2010-10-26 Sonosite, Inc. Balance body ultrasound system
US6575908B2 (en) * 1996-06-28 2003-06-10 Sonosite, Inc. Balance body ultrasound system
US6416475B1 (en) * 1996-06-28 2002-07-09 Sonosite, Inc. Ultrasonic signal processor for a hand held ultrasonic diagnostic instrument
US6962566B2 (en) 2001-04-19 2005-11-08 Sonosite, Inc. Medical diagnostic ultrasound instrument with ECG module, authorization mechanism and methods of use
FR2751109B1 (en) * 1996-07-09 1998-10-09 Ge Medical Syst Sa METHOD FOR LOCATING AN ELEMENT OF INTEREST CONTAINED IN A THREE-DIMENSIONAL OBJECT, IN PARTICULAR DURING A STEREOTACTIC EXAMINATION IN MAMMOGRAPHY
EP0832610A3 (en) * 1996-09-30 1999-06-16 Picker International, Inc. Trackable guide for surgical tool
EP0937263B1 (en) 1996-11-07 2003-05-07 TomTec Imaging Systems GmbH Method and apparatus for ultrasound image reconstruction
US5886775A (en) * 1997-03-12 1999-03-23 M+Ind Noncontact digitizing imaging system
WO1999002098A1 (en) * 1997-07-07 1999-01-21 Takeshi Ohdaira Lesioned site detector for celiotomy and laparoscopic surgery
US6226548B1 (en) 1997-09-24 2001-05-01 Surgical Navigation Technologies, Inc. Percutaneous registration apparatus and method for use in computer-assisted surgical navigation
US8668737B2 (en) 1997-10-10 2014-03-11 Senorx, Inc. Tissue marking implant
US7637948B2 (en) 1997-10-10 2009-12-29 Senorx, Inc. Tissue marking implant
US6325758B1 (en) * 1997-10-27 2001-12-04 Nomos Corporation Method and apparatus for target position verification
US6021343A (en) 1997-11-20 2000-02-01 Surgical Navigation Technologies Image guided awl/tap/screwdriver
US6348058B1 (en) 1997-12-12 2002-02-19 Surgical Navigation Technologies, Inc. Image guided spinal surgery guide, system, and method for use thereof
US6347241B2 (en) 1999-02-02 2002-02-12 Senorx, Inc. Ultrasonic and x-ray detectable biopsy site marker and apparatus for applying it
US6161034A (en) * 1999-02-02 2000-12-12 Senorx, Inc. Methods and chemical preparations for time-limited marking of biopsy sites
US6529765B1 (en) 1998-04-21 2003-03-04 Neutar L.L.C. Instrumented and actuated guidance fixture for stereotactic surgery
US6298262B1 (en) 1998-04-21 2001-10-02 Neutar, Llc Instrument guidance for stereotactic surgery
US6546277B1 (en) 1998-04-21 2003-04-08 Neutar L.L.C. Instrument guidance system for spinal and other surgery
US6135958A (en) * 1998-08-06 2000-10-24 Acuson Corporation Ultrasound imaging system with touch-pad pointing device
US6351662B1 (en) 1998-08-12 2002-02-26 Neutar L.L.C. Movable arm locator for stereotactic surgery
US6282437B1 (en) 1998-08-12 2001-08-28 Neutar, Llc Body-mounted sensing system for stereotactic surgery
US6477400B1 (en) 1998-08-20 2002-11-05 Sofamor Danek Holdings, Inc. Fluoroscopic image guided orthopaedic surgery system with intraoperative registration
US6033415A (en) * 1998-09-14 2000-03-07 Integrated Surgical Systems System and method for performing image directed robotic orthopaedic procedures without a fiducial reference system
DE19841859A1 (en) * 1998-09-14 2000-04-06 Deutsches Krebsforsch Method of positioning a body part for treatment on a medical device
DE19848765C2 (en) 1998-10-22 2000-12-21 Brainlab Med Computersyst Gmbh Position verification in camera images
US6973202B2 (en) * 1998-10-23 2005-12-06 Varian Medical Systems Technologies, Inc. Single-camera tracking of an object
US7158610B2 (en) * 2003-09-05 2007-01-02 Varian Medical Systems Technologies, Inc. Systems and methods for processing x-ray images
US6279579B1 (en) 1998-10-23 2001-08-28 Varian Medical Systems, Inc. Method and system for positioning patients for medical treatment procedures
US6621889B1 (en) 1998-10-23 2003-09-16 Varian Medical Systems, Inc. Method and system for predictive physiological gating of radiation therapy
US6980679B2 (en) * 1998-10-23 2005-12-27 Varian Medical Systems Technologies, Inc. Method and system for monitoring breathing activity of a subject
EP1123138B1 (en) 1998-10-23 2004-04-28 Varian Medical Systems, Inc. Method and system for physiological gating of radiation therapy
US6937696B1 (en) * 1998-10-23 2005-08-30 Varian Medical Systems Technologies, Inc. Method and system for predictive physiological gating
US6322567B1 (en) 1998-12-14 2001-11-27 Integrated Surgical Systems, Inc. Bone motion tracking system
US6430434B1 (en) 1998-12-14 2002-08-06 Integrated Surgical Systems, Inc. Method for determining the location and orientation of a bone for computer-assisted orthopedic procedures using intraoperatively attached markers
CA2356271A1 (en) 1998-12-23 2000-07-06 Image Guided Technologies, Inc. A hybrid 3-d probe tracked by multiple sensors
US9669113B1 (en) 1998-12-24 2017-06-06 Devicor Medical Products, Inc. Device and method for safe location and marking of a biopsy cavity
US6356782B1 (en) 1998-12-24 2002-03-12 Vivant Medical, Inc. Subcutaneous cavity marking device and method
US6371904B1 (en) * 1998-12-24 2002-04-16 Vivant Medical, Inc. Subcutaneous cavity marking device and method
US6862470B2 (en) 1999-02-02 2005-03-01 Senorx, Inc. Cavity-filling biopsy site markers
US20080039819A1 (en) * 2006-08-04 2008-02-14 Senorx, Inc. Marker formed of starch or other suitable polysaccharide
US8498693B2 (en) 1999-02-02 2013-07-30 Senorx, Inc. Intracorporeal marker and marker delivery device
US7983734B2 (en) 2003-05-23 2011-07-19 Senorx, Inc. Fibrous marker and intracorporeal delivery thereof
US20090030309A1 (en) * 2007-07-26 2009-01-29 Senorx, Inc. Deployment of polysaccharide markers
US6725083B1 (en) 1999-02-02 2004-04-20 Senorx, Inc. Tissue site markers for in VIVO imaging
US8361082B2 (en) 1999-02-02 2013-01-29 Senorx, Inc. Marker delivery device with releasable plug
US9820824B2 (en) 1999-02-02 2017-11-21 Senorx, Inc. Deployment of polysaccharide markers for treating a site within a patient
US7651505B2 (en) 2002-06-17 2010-01-26 Senorx, Inc. Plugged tip delivery for marker placement
US6106464A (en) * 1999-02-22 2000-08-22 Vanderbilt University Apparatus and method for bone surface-based registration of physical space with tomographic images and for guiding an instrument relative to anatomical sites in the image
US6501981B1 (en) * 1999-03-16 2002-12-31 Accuray, Inc. Apparatus and method for compensating for respiratory and patient motions during treatment
US6778850B1 (en) * 1999-03-16 2004-08-17 Accuray, Inc. Frameless radiosurgery treatment system and method
US6470207B1 (en) 1999-03-23 2002-10-22 Surgical Navigation Technologies, Inc. Navigational guidance via computer-assisted fluoroscopic imaging
US6491699B1 (en) * 1999-04-20 2002-12-10 Surgical Navigation Technologies, Inc. Instrument guidance method and system for image guided surgery
DE19917867B4 (en) 1999-04-20 2005-04-21 Brainlab Ag Method and device for image support in the treatment of treatment objectives with integration of X-ray detection and navigation system
US6575991B1 (en) 1999-06-17 2003-06-10 Inrad, Inc. Apparatus for the percutaneous marking of a lesion
US6626899B2 (en) 1999-06-25 2003-09-30 Nidus Medical, Llc Apparatus and methods for treating tissue
US11331150B2 (en) 1999-10-28 2022-05-17 Medtronic Navigation, Inc. Method and apparatus for surgical navigation
US6474341B1 (en) 1999-10-28 2002-11-05 Surgical Navigation Technologies, Inc. Surgical communication and power system
US6379302B1 (en) 1999-10-28 2002-04-30 Surgical Navigation Technologies Inc. Navigation information overlay onto ultrasound imagery
US8239001B2 (en) 2003-10-17 2012-08-07 Medtronic Navigation, Inc. Method and apparatus for surgical navigation
US6499488B1 (en) 1999-10-28 2002-12-31 Winchester Development Associates Surgical sensor
US6381485B1 (en) 1999-10-28 2002-04-30 Surgical Navigation Technologies, Inc. Registration of human anatomy integrated for electromagnetic localization
US8644907B2 (en) 1999-10-28 2014-02-04 Medtronic Navigation, Inc. Method and apparatus for surgical navigation
US7366562B2 (en) 2003-10-17 2008-04-29 Medtronic Navigation, Inc. Method and apparatus for surgical navigation
US6493573B1 (en) * 1999-10-28 2002-12-10 Winchester Development Associates Method and system for navigating a catheter probe in the presence of field-influencing objects
US6500119B1 (en) * 1999-12-01 2002-12-31 Medical Tactile, Inc. Obtaining images of structures in bodily tissue
US20010034530A1 (en) 2000-01-27 2001-10-25 Malackowski Donald W. Surgery system
WO2001064124A1 (en) 2000-03-01 2001-09-07 Surgical Navigation Technologies, Inc. Multiple cannula image guided tool for image guided procedures
US6497134B1 (en) 2000-03-15 2002-12-24 Image Guided Technologies, Inc. Calibration of an instrument
US6447438B1 (en) 2000-04-05 2002-09-10 Spectrasonics Imaging, Inc. Apparatus and method for locating therapeutic seeds implanted in a human body
US6535756B1 (en) 2000-04-07 2003-03-18 Surgical Navigation Technologies, Inc. Trajectory storage apparatus and method for surgical navigation system
US7085400B1 (en) 2000-06-14 2006-08-01 Surgical Navigation Technologies, Inc. System and method for image based sensor calibration
CA2659518A1 (en) * 2000-11-20 2002-05-30 Senorx, Inc. Tissue site markers for in vivo imaging
US6657160B2 (en) * 2001-01-25 2003-12-02 The Regents Of The University Of California Laser peening of components of thin cross-section
US6636757B1 (en) 2001-06-04 2003-10-21 Surgical Navigation Technologies, Inc. Method and apparatus for electromagnetic navigation of a surgical probe near a metal object
US7769430B2 (en) * 2001-06-26 2010-08-03 Varian Medical Systems, Inc. Patient visual instruction techniques for synchronizing breathing with a medical procedure
US6733458B1 (en) 2001-09-25 2004-05-11 Acuson Corporation Diagnostic medical ultrasound systems and methods using image based freehand needle guidance
US6687530B2 (en) * 2001-12-21 2004-02-03 General Electric Company Method and system for tracking small coils using magnetic resonance
US6947786B2 (en) 2002-02-28 2005-09-20 Surgical Navigation Technologies, Inc. Method and apparatus for perspective inversion
US6990368B2 (en) 2002-04-04 2006-01-24 Surgical Navigation Technologies, Inc. Method and apparatus for virtual digital subtraction angiography
US7998062B2 (en) 2004-03-29 2011-08-16 Superdimension, Ltd. Endoscope structures and techniques for navigating to a target in branched structure
AR039475A1 (en) * 2002-05-01 2005-02-23 Wyeth Corp TRICYCLIC 6-ALKYLIDENE-PENEMS AS BETA-LACTAMASE INHIBITORS
US9314228B2 (en) * 2002-05-31 2016-04-19 Vidacare LLC Apparatus and method for accessing the bone marrow
US7720522B2 (en) * 2003-02-25 2010-05-18 Medtronic, Inc. Fiducial marker devices, tools, and methods
US20040019265A1 (en) * 2002-07-29 2004-01-29 Mazzocchi Rudy A. Fiducial marker devices, tools, and methods
US7787934B2 (en) * 2002-07-29 2010-08-31 Medtronic, Inc. Fiducial marker devices, tools, and methods
ATE433304T1 (en) * 2002-08-08 2009-06-15 Brainlab Ag PATIENT POSITIONING SYSTEM FOR RADIATION THERAPY/RADIO SURGERY BASED ON MAGNETIC TRACKING OF AN IMPLANT
US7166114B2 (en) 2002-09-18 2007-01-23 Stryker Leibinger Gmbh & Co Kg Method and system for calibrating a surgical tool and adapter thereof
JP3902109B2 (en) * 2002-10-02 2007-04-04 本田技研工業株式会社 Infrared camera characteristics confirmation jig
US7620444B2 (en) * 2002-10-05 2009-11-17 General Electric Company Systems and methods for improving usability of images for medical applications
US20060036158A1 (en) 2003-11-17 2006-02-16 Inrad, Inc. Self-contained, self-piercing, side-expelling marking apparatus
US7697972B2 (en) 2002-11-19 2010-04-13 Medtronic Navigation, Inc. Navigation system for cardiac therapies
US7599730B2 (en) 2002-11-19 2009-10-06 Medtronic Navigation, Inc. Navigation system for cardiac therapies
US20040106869A1 (en) * 2002-11-29 2004-06-03 Ron-Tech Medical Ltd. Ultrasound tracking device, system and method for intrabody guiding procedures
US6837854B2 (en) * 2002-12-18 2005-01-04 Barbara Ann Karmanos Cancer Institute Methods and systems for using reference images in acoustic image processing
US7660623B2 (en) * 2003-01-30 2010-02-09 Medtronic Navigation, Inc. Six degree of freedom alignment display for medical procedures
US7542791B2 (en) 2003-01-30 2009-06-02 Medtronic Navigation, Inc. Method and apparatus for preplanning a surgical procedure
US20060281991A1 (en) * 2003-05-09 2006-12-14 Fitzpatrick J M Fiducial marker holder system for surgery
US20050119562A1 (en) * 2003-05-23 2005-06-02 Senorx, Inc. Fibrous marker formed of synthetic polymer strands
US7877133B2 (en) * 2003-05-23 2011-01-25 Senorx, Inc. Marker or filler forming fluid
US9504477B2 (en) 2003-05-30 2016-11-29 Vidacare LLC Powered driver
US20050033157A1 (en) * 2003-07-25 2005-02-10 Klein Dean A. Multi-modality marking material and method
US7662157B2 (en) * 2003-08-21 2010-02-16 Osteomed L.P. Bone anchor system
US7313430B2 (en) 2003-08-28 2007-12-25 Medtronic Navigation, Inc. Method and apparatus for performing stereotactic surgery
US20050053267A1 (en) * 2003-09-05 2005-03-10 Varian Medical Systems Technologies, Inc. Systems and methods for tracking moving targets and monitoring object positions
US8571639B2 (en) * 2003-09-05 2013-10-29 Varian Medical Systems, Inc. Systems and methods for gating medical procedures
EP2113189B1 (en) 2003-09-15 2013-09-04 Covidien LP System of accessories for use with bronchoscopes
EP2316328B1 (en) 2003-09-15 2012-05-09 Super Dimension Ltd. Wrap-around holding device for use with bronchoscopes
DE10346615B4 (en) * 2003-10-08 2006-06-14 Aesculap Ag & Co. Kg Device for determining the position of a body part
US7835778B2 (en) * 2003-10-16 2010-11-16 Medtronic Navigation, Inc. Method and apparatus for surgical navigation of a multiple piece construct for implantation
US7840253B2 (en) 2003-10-17 2010-11-23 Medtronic Navigation, Inc. Method and apparatus for surgical navigation
US20050273002A1 (en) 2004-06-04 2005-12-08 Goosen Ryan L Multi-mode imaging marker
US7771436B2 (en) * 2003-12-10 2010-08-10 Stryker Leibinger Gmbh & Co. Kg. Surgical navigation tracker, system and method
US7873400B2 (en) * 2003-12-10 2011-01-18 Stryker Leibinger Gmbh & Co. Kg. Adapter for surgical navigation trackers
US7532201B2 (en) * 2003-12-30 2009-05-12 Liposonix, Inc. Position tracking device
US8764725B2 (en) 2004-02-09 2014-07-01 Covidien Lp Directional anchoring mechanism, method and applications thereof
DE102004011744A1 (en) * 2004-03-03 2005-09-22 Aesculap Ag & Co. Kg A surgical or medical device and method for calibrating an ultrasonic sensor
US7770453B2 (en) * 2004-03-04 2010-08-10 Ludwiczak Damian R Vibrating debris remover
EP1720480A1 (en) 2004-03-05 2006-11-15 Hansen Medical, Inc. Robotic catheter system
US7976539B2 (en) * 2004-03-05 2011-07-12 Hansen Medical, Inc. System and method for denaturing and fixing collagenous tissue
US20070073306A1 (en) * 2004-03-08 2007-03-29 Ryan Lakin Cutting block for surgical navigation
US7641660B2 (en) * 2004-03-08 2010-01-05 Biomet Manufacturing Corporation Method, apparatus, and system for image guided bone cutting
US7166852B2 (en) * 2004-04-06 2007-01-23 Accuray, Inc. Treatment target positioning system
US7567834B2 (en) * 2004-05-03 2009-07-28 Medtronic Navigation, Inc. Method and apparatus for implantation between two vertebral bodies
DE102004026525A1 (en) * 2004-05-25 2005-12-22 Aesculap Ag & Co. Kg Method and device for the non-invasive determination of prominent structures of the human or animal body
US8290570B2 (en) * 2004-09-10 2012-10-16 Stryker Leibinger Gmbh & Co., Kg System for ad hoc tracking of an object
US20060074305A1 (en) * 2004-09-30 2006-04-06 Varian Medical Systems Technologies, Inc. Patient multimedia display
US8007448B2 (en) * 2004-10-08 2011-08-30 Stryker Leibinger Gmbh & Co. Kg. System and method for performing arthroplasty of a joint and tracking a plumb line plane
US7632235B1 (en) 2004-11-22 2009-12-15 Pacesetter, Inc. System and method for measuring cardiac output via thermal dilution using an implantable medical device with an external ultrasound power delivery system
IL166115A (en) * 2005-01-03 2012-06-28 Dan Adam Interactive ultrasound-based depth measurement for medical applications
US7623250B2 (en) * 2005-02-04 2009-11-24 Stryker Leibinger Gmbh & Co. Kg. Enhanced shape characterization device and method
US10357328B2 (en) 2005-04-20 2019-07-23 Bard Peripheral Vascular, Inc. and Bard Shannon Limited Marking device with retractable cannula
WO2006118915A2 (en) * 2005-04-29 2006-11-09 Vanderbilt University System and methods of using image-guidance for providing an access to a cochlear of a living subject
US8066642B1 (en) 2005-05-03 2011-11-29 Sonosite, Inc. Systems and methods for ultrasound beam forming data control
EP1906858B1 (en) 2005-07-01 2016-11-16 Hansen Medical, Inc. Robotic catheter system
US9119541B2 (en) * 2005-08-30 2015-09-01 Varian Medical Systems, Inc. Eyewear for patient prompting
DE102005043828A1 (en) * 2005-09-13 2007-03-22 H.C. Starck Gmbh Process for the preparation of electrolytic capacitors
US20070073136A1 (en) * 2005-09-15 2007-03-29 Robert Metzger Bone milling with image guided surgery
US7835784B2 (en) 2005-09-21 2010-11-16 Medtronic Navigation, Inc. Method and apparatus for positioning a reference frame
US8052658B2 (en) 2005-10-07 2011-11-08 Bard Peripheral Vascular, Inc. Drug-eluting tissue marker
US9168102B2 (en) 2006-01-18 2015-10-27 Medtronic Navigation, Inc. Method and apparatus for providing a container to a sterile environment
US8095198B2 (en) 2006-01-31 2012-01-10 Warsaw Orthopedic, Inc. Methods for detecting osteolytic conditions in the body
US8078282B2 (en) * 2006-02-01 2011-12-13 Warsaw Orthopedic, Inc Implantable tissue growth stimulator
US20070238992A1 (en) * 2006-02-01 2007-10-11 Sdgi Holdings, Inc. Implantable sensor
US7993269B2 (en) * 2006-02-17 2011-08-09 Medtronic, Inc. Sensor and method for spinal monitoring
US20070197895A1 (en) * 2006-02-17 2007-08-23 Sdgi Holdings, Inc. Surgical instrument to assess tissue characteristics
US7918796B2 (en) * 2006-04-11 2011-04-05 Warsaw Orthopedic, Inc. Volumetric measurement and visual feedback of tissues
US8112292B2 (en) 2006-04-21 2012-02-07 Medtronic Navigation, Inc. Method and apparatus for optimizing a therapy
EP1854425A1 (en) * 2006-05-11 2007-11-14 BrainLAB AG Position determination for medical devices with redundant position measurement and weighting to prioritise measurements
JP4868959B2 (en) * 2006-06-29 2012-02-01 オリンパスメディカルシステムズ株式会社 Body cavity probe device
US7728868B2 (en) 2006-08-02 2010-06-01 Inneroptic Technology, Inc. System and method of providing real-time dynamic imagery of a medical procedure site using multiple modalities
US8660635B2 (en) 2006-09-29 2014-02-25 Medtronic, Inc. Method and apparatus for optimizing a computer assisted surgical procedure
US8064987B2 (en) 2006-10-23 2011-11-22 C. R. Bard, Inc. Breast marker
US20100031014A1 (en) * 2006-12-06 2010-02-04 Shuji Senda Information concealing device, method, and program
EP3542748B1 (en) 2006-12-12 2023-08-16 C. R. Bard, Inc. Multiple imaging mode tissue marker
ES2432572T3 (en) 2006-12-18 2013-12-04 C.R. Bard, Inc. Biopsy marker with imaging properties generated in situ
EP2129316A2 (en) * 2007-02-26 2009-12-09 Koninklijke Philips Electronics N.V. Pointing device for medical imaging
US20080228072A1 (en) * 2007-03-16 2008-09-18 Warsaw Orthopedic, Inc. Foreign Body Identifier
EP1987774A1 (en) * 2007-05-03 2008-11-05 BrainLAB AG Measurement of sonographic acoustic velocity using a marker device
US10201324B2 (en) 2007-05-04 2019-02-12 Delphinus Medical Technologies, Inc. Patient interface system
US7953247B2 (en) * 2007-05-21 2011-05-31 Snap-On Incorporated Method and apparatus for wheel alignment
US9743906B2 (en) * 2007-05-31 2017-08-29 University Of Windsor Ultrasonic device for cosmetological human nail applications
JP2009056299A (en) 2007-08-07 2009-03-19 Stryker Leibinger Gmbh & Co Kg Method of and system for planning surgery
US8905920B2 (en) 2007-09-27 2014-12-09 Covidien Lp Bronchoscope adapter and method
WO2009094646A2 (en) 2008-01-24 2009-07-30 The University Of North Carolina At Chapel Hill Methods, systems, and computer readable media for image guided ablation
US8311610B2 (en) 2008-01-31 2012-11-13 C. R. Bard, Inc. Biopsy tissue marker
US8340379B2 (en) 2008-03-07 2012-12-25 Inneroptic Technology, Inc. Systems and methods for displaying guidance data based on updated deformable imaging data
WO2009122273A2 (en) 2008-04-03 2009-10-08 Superdimension, Ltd. Magnetic interference detection system and method
EP3155995B1 (en) * 2008-04-03 2019-08-07 Visualase, Inc. Systems for thermal therapy
DE102008023218A1 (en) * 2008-05-10 2009-11-12 Aesculap Ag Method and device for examining a body with an ultrasound head
EP2297673B1 (en) 2008-06-03 2020-04-22 Covidien LP Feature-based registration method
US8218847B2 (en) 2008-06-06 2012-07-10 Superdimension, Ltd. Hybrid registration method
US8932207B2 (en) 2008-07-10 2015-01-13 Covidien Lp Integrated multi-functional endoscopic tool
JP4582218B2 (en) * 2008-07-28 2010-11-17 ソニー株式会社 Stereoscopic image display device and manufacturing method thereof
JP4582219B2 (en) * 2008-07-28 2010-11-17 ソニー株式会社 Stereoscopic image display device and manufacturing method thereof
US20100033557A1 (en) * 2008-07-28 2010-02-11 Sony Corporation Stereoscopic image display and method for producing the same
JP2010032675A (en) * 2008-07-28 2010-02-12 Sony Corp Method for manufacturing stereoscopic image display, and stereoscopic image display
JP4525808B2 (en) * 2008-07-28 2010-08-18 ソニー株式会社 Stereoscopic image display device and manufacturing method thereof
US10667727B2 (en) * 2008-09-05 2020-06-02 Varian Medical Systems, Inc. Systems and methods for determining a state of a patient
US20100061596A1 (en) * 2008-09-05 2010-03-11 Varian Medical Systems Technologies, Inc. Video-Based Breathing Monitoring Without Fiducial Tracking
US9327061B2 (en) 2008-09-23 2016-05-03 Senorx, Inc. Porous bioabsorbable implant
US8165658B2 (en) 2008-09-26 2012-04-24 Medtronic, Inc. Method and apparatus for positioning a guide relative to a base
US8175681B2 (en) 2008-12-16 2012-05-08 Medtronic Navigation Inc. Combination of electromagnetic and electropotential localization
US8670818B2 (en) 2008-12-30 2014-03-11 C. R. Bard, Inc. Marker delivery device for tissue marker placement
US8444564B2 (en) 2009-02-02 2013-05-21 Jointvue, Llc Noninvasive diagnostic system
US8690776B2 (en) 2009-02-17 2014-04-08 Inneroptic Technology, Inc. Systems, methods, apparatuses, and computer-readable media for image guided surgery
US8554307B2 (en) 2010-04-12 2013-10-08 Inneroptic Technology, Inc. Image annotation in image-guided medical procedures
US11464578B2 (en) 2009-02-17 2022-10-11 Inneroptic Technology, Inc. Systems, methods, apparatuses, and computer-readable media for image management in image-guided medical procedures
US8641621B2 (en) * 2009-02-17 2014-02-04 Inneroptic Technology, Inc. Systems, methods, apparatuses, and computer-readable media for image management in image-guided medical procedures
US8611984B2 (en) 2009-04-08 2013-12-17 Covidien Lp Locatable catheter
US9254123B2 (en) 2009-04-29 2016-02-09 Hansen Medical, Inc. Flexible and steerable elongate instruments with shape control and support elements
US7898353B2 (en) 2009-05-15 2011-03-01 Freescale Semiconductor, Inc. Clock conditioning circuit
US8494614B2 (en) 2009-08-31 2013-07-23 Regents Of The University Of Minnesota Combination localization system
US8494613B2 (en) 2009-08-31 2013-07-23 Medtronic, Inc. Combination localization system
US8758263B1 (en) 2009-10-31 2014-06-24 Voxel Rad, Ltd. Systems and methods for frameless image-guided biopsy and therapeutic intervention
WO2011159834A1 (en) 2010-06-15 2011-12-22 Superdimension, Ltd. Locatable expandable working channel and method
US10512451B2 (en) 2010-08-02 2019-12-24 Jointvue, Llc Method and apparatus for three dimensional reconstruction of a joint using ultrasound
US20120191079A1 (en) 2011-01-20 2012-07-26 Hansen Medical, Inc. System and method for endoluminal and translumenal therapy
US9693752B2 (en) * 2011-06-21 2017-07-04 Rescon Ltd Non-resistive contact electrosonic sensor systems
US20130030363A1 (en) 2011-07-29 2013-01-31 Hansen Medical, Inc. Systems and methods utilizing shape sensing fibers
WO2013025613A1 (en) 2011-08-12 2013-02-21 Jointvue, Llc 3-d ultrasound imaging device and methods
CA2852233C (en) 2011-10-14 2023-08-08 Jointvue, Llc Real-time 3-d ultrasound reconstruction of knee and its implications for patient specific implants and 3-d joint injections
WO2013116240A1 (en) 2012-01-30 2013-08-08 Inneroptic Technology, Inc. Multiple medical device guidance
WO2013124735A1 (en) 2012-02-22 2013-08-29 Rescon Ltd Non-resistive contact electrical systems and methods for visualizing the structure and function of objects or systems
WO2013165380A2 (en) * 2012-04-30 2013-11-07 Empire Technology Development, Llc Infrared guide stars for endoscopic orienteering
US20140148673A1 (en) 2012-11-28 2014-05-29 Hansen Medical, Inc. Method of anchoring pullwire directly articulatable region in catheter
KR101371384B1 (en) 2013-01-10 2014-03-07 경북대학교 산학협력단 Tracking system and method for tracking using the same
US10123770B2 (en) 2013-03-13 2018-11-13 Delphinus Medical Technologies, Inc. Patient support system
US10314559B2 (en) 2013-03-14 2019-06-11 Inneroptic Technology, Inc. Medical device guidance
US20140277334A1 (en) 2013-03-14 2014-09-18 Hansen Medical, Inc. Active drives for robotic catheter manipulators
US9326822B2 (en) 2013-03-14 2016-05-03 Hansen Medical, Inc. Active drives for robotic catheter manipulators
US9408669B2 (en) 2013-03-15 2016-08-09 Hansen Medical, Inc. Active drive mechanism with finite range of motion
US20140276936A1 (en) 2013-03-15 2014-09-18 Hansen Medical, Inc. Active drive mechanism for simultaneous rotation and translation
USD716451S1 (en) 2013-09-24 2014-10-28 C. R. Bard, Inc. Tissue marker for intracorporeal site identification
USD715442S1 (en) 2013-09-24 2014-10-14 C. R. Bard, Inc. Tissue marker for intracorporeal site identification
USD715942S1 (en) 2013-09-24 2014-10-21 C. R. Bard, Inc. Tissue marker for intracorporeal site identification
USD716450S1 (en) 2013-09-24 2014-10-28 C. R. Bard, Inc. Tissue marker for intracorporeal site identification
US10046140B2 (en) 2014-04-21 2018-08-14 Hansen Medical, Inc. Devices, systems, and methods for controlling active drive systems
CN104013423B (en) * 2014-05-09 2016-07-13 杨松 B-mode ultrasound scan head, B-mode ultrasound scanning system, and B-mode ultrasound scanning method
US10952593B2 (en) 2014-06-10 2021-03-23 Covidien Lp Bronchoscope adapter
US10285667B2 (en) 2014-08-05 2019-05-14 Delphinus Medical Technologies, Inc. Method for generating an enhanced image of a volume of tissue
US9901406B2 (en) 2014-10-02 2018-02-27 Inneroptic Technology, Inc. Affected region display associated with a medical device
US10843012B2 (en) * 2014-10-22 2020-11-24 Otsuka Medical Devices Co., Ltd. Optimized therapeutic energy delivery
US10539675B2 (en) 2014-10-30 2020-01-21 Seno Medical Instruments, Inc. Opto-acoustic imaging system with detection of relative orientation of light source and acoustic receiver using acoustic waves
US10925579B2 (en) 2014-11-05 2021-02-23 Otsuka Medical Devices Co., Ltd. Systems and methods for real-time tracking of a target tissue using imaging before and during therapy delivery
WO2016071325A1 (en) * 2014-11-06 2016-05-12 Koninklijke Philips N.V. Skin treatment system
RU2578298C1 (en) * 2014-11-24 2016-03-27 Самсунг Электроникс Ко., Лтд. Ultra-wideband device for determining the profile of living-organism tissue layers and corresponding method
US10188467B2 (en) 2014-12-12 2019-01-29 Inneroptic Technology, Inc. Surgical guidance intersection display
US10426555B2 (en) 2015-06-03 2019-10-01 Covidien Lp Medical instrument with sensor for use in a system and method for electromagnetic navigation
US9949700B2 (en) 2015-07-22 2018-04-24 Inneroptic Technology, Inc. Medical device approaches
US9962134B2 (en) 2015-10-28 2018-05-08 Medtronic Navigation, Inc. Apparatus and method for maintaining image quality while minimizing X-ray dosage of a patient
US9675319B1 (en) 2016-02-17 2017-06-13 Inneroptic Technology, Inc. Loupe display
US10478254B2 (en) 2016-05-16 2019-11-19 Covidien Lp System and method to access lung tissue
US10463439B2 (en) 2016-08-26 2019-11-05 Auris Health, Inc. Steerable catheter with shaft load distributions
US11241559B2 (en) 2016-08-29 2022-02-08 Auris Health, Inc. Active drive for guidewire manipulation
US10278778B2 (en) 2016-10-27 2019-05-07 Inneroptic Technology, Inc. Medical device navigation using a virtual 3D space
US10517505B2 (en) 2016-10-28 2019-12-31 Covidien Lp Systems, methods, and computer-readable media for optimizing an electromagnetic navigation system
US10615500B2 (en) 2016-10-28 2020-04-07 Covidien Lp System and method for designing electromagnetic navigation antenna assemblies
US10751126B2 (en) 2016-10-28 2020-08-25 Covidien Lp System and method for generating a map for electromagnetic navigation
US10638952B2 (en) 2016-10-28 2020-05-05 Covidien Lp Methods, systems, and computer-readable media for calibrating an electromagnetic navigation system
US10792106B2 (en) 2016-10-28 2020-10-06 Covidien Lp System for calibrating an electromagnetic navigation system
US10446931B2 (en) 2016-10-28 2019-10-15 Covidien Lp Electromagnetic navigation antenna assembly and electromagnetic navigation system including the same
US10418705B2 (en) 2016-10-28 2019-09-17 Covidien Lp Electromagnetic navigation antenna assembly and electromagnetic navigation system including the same
US10722311B2 (en) 2016-10-28 2020-07-28 Covidien Lp System and method for identifying a location and/or an orientation of an electromagnetic sensor based on a map
CN106556837A (en) * 2016-11-09 2017-04-05 哈尔滨工程大学 Ultra-short baseline localization method for a quaternary spatial array
US11478662B2 (en) 2017-04-05 2022-10-25 Accuray Incorporated Sequential monoscopic tracking
US11259879B2 (en) 2017-08-01 2022-03-01 Inneroptic Technology, Inc. Selective transparency to assist medical device navigation
US11219489B2 (en) 2017-10-31 2022-01-11 Covidien Lp Devices and systems for providing sensors in parallel with medical tools
US11484365B2 (en) 2018-01-23 2022-11-01 Inneroptic Technology, Inc. Medical image guidance
CA3031832A1 (en) * 2018-01-29 2019-07-29 Queen's University At Kingston Apparatus and method for tracking a volume in a three-dimensional space
CN108507736B (en) * 2018-04-04 2019-07-26 北京理工大学 Hydro-pneumatic spring accumulator condition detection system and method based on ultrasound
US10806339B2 (en) 2018-12-12 2020-10-20 Voxel Rad, Ltd. Systems and methods for treating cancer using brachytherapy
TWI708591B (en) * 2019-12-06 2020-11-01 財團法人金屬工業研究發展中心 Three-dimensional real-time positioning method for orthopedic surgery

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4316391A (en) * 1979-11-13 1982-02-23 Ultra Med, Inc. Flow rate measurement
US4373532A (en) * 1980-07-07 1983-02-15 Palo Alto Medical Research Foundation Ultrasonic marker for physiologic diagnosis and method of using same
CH661199A5 (en) * 1983-12-22 1987-07-15 Sulzer Ag MARKING IMPLANT.
US4763652A (en) * 1986-04-16 1988-08-16 Northgate Research, Inc. Aiming system for kidney stone disintegrator
US4932414A (en) * 1987-11-02 1990-06-12 Cornell Research Foundation, Inc. System of therapeutic ultrasound and real-time ultrasonic scanning
US4951653A (en) * 1988-03-02 1990-08-28 Laboratory Equipment, Corp. Ultrasound brain lesioning system
CA2028065C (en) * 1989-11-15 1996-06-11 George S. Allen Method and apparatus for imaging the anatomy
DE4040307C2 (en) * 1990-12-17 1996-03-07 Eden Medizinische Elektronik G Device for positioning a Doppler signal generator and / or sensor
US5161536A (en) * 1991-03-22 1992-11-10 Catheter Technology Ultrasonic position indicating apparatus and methods
US5201715A (en) * 1991-11-19 1993-04-13 Mcghan Medical Corporation Implantable devices having ultrasonic echographic signature

Also Published As

Publication number Publication date
EP0650075A3 (en) 1999-01-07
EP0650075A2 (en) 1995-04-26
JP2004243140A (en) 2004-09-02
US5394875A (en) 1995-03-07
JPH07255723A (en) 1995-10-09

Similar Documents

Publication Publication Date Title
US5394875A (en) Automatic ultrasonic localization of targets implanted in a portion of the anatomy
US6106464A (en) Apparatus and method for bone surface-based registration of physical space with tomographic images and for guiding an instrument relative to anatomical sites in the image
US6678545B2 (en) System for determining the position in a scan image corresponding to the position of an imaging probe
US6678546B2 (en) Medical instrument guidance using stereo radiolocation
CA2161126C (en) System for locating relative positions of objects
US6585651B2 (en) Method and device for percutaneous determination of points associated with the surface of an organ
EP0699050B1 (en) Indicating the position of a probe
EP1306060B1 (en) System for carrying out surgery biopsy and ablation of a tumor or other physical anomaly
US20020183608A1 (en) Method and device for instrument, bone segment, tissue and organ navigation
JPH06254172A (en) Method to determine position of organ of patient at least about two image pickup devices
Maurer et al. AcouStick: a tracked A-mode ultrasonography system for registration in image-guided surgery
Schreiner et al. An ultrasonic approach to localization of fiducial markers for interactive, image-guided neurosurgery. II. Implementation and automation
Schreiner et al. Technique for the three-dimensional localization of implanted fiducial markers
Hsu First Year Progress Report
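
The documents above all build on the same pulse-echo principle that underlies ultrasonic localization of an implanted fiducial marker: a transducer emits a pulse, the marker's acoustic impedance mismatch reflects an echo, and the round-trip time gives the marker's depth along the beam axis. The sketch below illustrates only that general principle, not the patent's actual algorithm; all names and values are illustrative, and 1540 m/s is the conventional nominal speed of sound in soft tissue.

```python
# Illustrative sketch of pulse-echo depth estimation for an implanted
# fiducial marker. This is NOT the patented method; it shows only the
# general round-trip time-of-flight relation.

SPEED_OF_SOUND_TISSUE_M_S = 1540.0  # nominal speed of sound in soft tissue


def echo_depth_m(round_trip_time_s: float,
                 speed_m_s: float = SPEED_OF_SOUND_TISSUE_M_S) -> float:
    """Depth of a reflector from its pulse-echo round-trip time.

    The pulse travels to the reflector and back, so the one-way
    distance is (speed * time) / 2.
    """
    if round_trip_time_s < 0:
        raise ValueError("round-trip time must be non-negative")
    return speed_m_s * round_trip_time_s / 2.0


def marker_position_m(transducer_xyz, beam_unit_vector, round_trip_time_s):
    """3-D marker position: transducer origin plus depth along the beam axis.

    Assumes the transducer pose (origin and unit beam direction) is known,
    e.g. from an external tracking system.
    """
    d = echo_depth_m(round_trip_time_s)
    return tuple(p + d * u for p, u in zip(transducer_xyz, beam_unit_vector))


# Example: an echo returning after 65 microseconds places the marker
# at roughly 5 cm depth along the beam.
depth = echo_depth_m(65e-6)
```

In practice the interesting part, which the A-mode localization papers cited above address, is identifying which echo in the received signal belongs to the marker rather than to surrounding tissue interfaces.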

Legal Events

Date Code Title Description
FZDE Discontinued