
Publication number: US 20050143651 A1
Publication type: Application
Application number: US 11/068,342
Publication date: Jun 30, 2005
Filing date: Feb 28, 2005
Priority date: Aug 19, 2002
Also published as: EP1391181A1, EP1913893A2, EP1913893A3, EP1913893B1, US6892090, US20040034300
Inventors: Laurent Verard, Paul Kessman, Mark Hunter
Original Assignee: Laurent Verard, Paul Kessman, Mark Hunter
Method and apparatus for virtual endoscopy
US 20050143651 A1
Abstract
A surgical instrument navigation system is provided that visually simulates a virtual volumetric scene of a body cavity of a patient from a point of view of a surgical instrument residing in the cavity of the patient. The surgical instrument navigation system includes: a surgical instrument; an imaging device which is operable to capture scan data representative of an internal region of interest within a given patient; a tracking subsystem that employs electromagnetic sensing to capture in real-time position data indicative of the position of the surgical instrument; a data processor which is operable to render a volumetric perspective image of the internal region of interest from a point of view of the surgical instrument; and a display which is operable to display the volumetric perspective image of the patient.
Images (4)
Claims (51)
1. A navigation system to track a surgical instrument relative to a patient, comprising:
a tracking subsystem operable to capture in real-time position data indicative of the position of the surgical instrument;
a data processor adapted to receive scan data representative of a region of interest of a given patient and the position data from the tracking subsystem, the data processor being operable to render an image of the region of interest from a point of view which relates to position of the surgical instrument, the image being derived from the scan data; and
a display in data communication with the data processor, the display being operable to display the image of the patient.
2. The navigation system of claim 1, further comprising:
a timing signal generator operable to generate and transmit a timing signal that correlates to at least one anatomical function of the patient;
wherein the tracking subsystem is operable to receive the timing signal from the timing signal generator, the tracking subsystem operable to capture position data indicative of the position of the surgical instrument and to report the position data in response to the timing signal received from the timing signal generator;
wherein the data processor is adapted to receive scan image data representative of an internal region of interest within a given patient and the position data from the tracking subsystem, the data processor being operable to render a volumetric perspective image of the internal region of interest from the scan image data and to superimpose an indicia of the surgical instrument onto the volumetric perspective image based on the position data received from the tracking subsystem.
3. The navigation system of claim 2 wherein the timing signal is generated at a repetitive point within each cycle of either a cardiac cycle or a respiratory cycle of the patient, thereby minimizing any jitter of the surgical instrument in the volumetric perspective image which may be caused by the cardiac cycle or the respiratory cycle of the patient.
4. The navigation system of claim 2 wherein the data processor is further operable to track the position of the surgical instrument as it is moved within the region of interest and to update the corresponding position of the indicia of the surgical instrument in the volumetric perspective image of the patient.
5. The navigation system of claim 1 wherein the data processor is further operable to track in real-time the location and orientation of the surgical instrument as it is moved within the region of interest and the display is further operable to display the location and orientation of the surgical instrument.
6. The navigation system of claim 1, wherein the scan data includes two dimensional scan data, three dimensional scan data, four dimensional scan data, or combinations thereof.
7. The navigation system of claim 1, wherein the scan data can be obtained preoperatively, intra-operatively, from an atlas map, or combinations thereof.
8. The navigation system of claim 1, wherein the tracking subsystem is operable to capture real time position data indicative of a position of the surgical instrument in the patient, an orientation of the surgical instrument in the patient, or combinations thereof.
9. The navigation system of claim 1, wherein the tracking subsystem is an electromagnetic tracking subsystem including a tracking sensor interconnected with the surgical instrument to assist in determining a position and orientation of the surgical instrument relative to the patient.
10. The navigation system of claim 1, wherein the rendered image displayed on the display is operable to be from the point of view of the surgical instrument, a point of view at an angle relative to the surgical instrument, or combinations thereof.
11. The navigation system of claim 5, wherein the data processor is operable to create a map of the area through which the surgical instrument is moved by tracking the real time location and orientation of the surgical instrument over time.
12. The navigation system of claim 11, wherein the map is displayed on the display.
13. The navigation system of claim 1, wherein the data processor is operable to compensate for error in the tracking subsystem and/or inaccuracies caused by anatomical shift occurring during acquisition of the scan data.
14. The navigation system of claim 1, further comprising:
an imaging device operable to create the scan data representative of a region of interest of a given patient.
15. The navigation system of claim 14, wherein said imaging device includes a magnetic resonance imaging scanner, a computed tomography scanner, an ultrasound system, a positron emission tomography scanner, or combinations thereof.
16. The navigation system of claim 1, further comprising:
a disposable surgical instrument.
17. The navigation system of claim 1, further comprising:
a surgical instrument selected from a group consisting of a guide wire, a pointer probe, a stent, a seed, an implant, an endoscope, a catheter, or combinations thereof.
18. The navigation system of claim 1, wherein said tracking subsystem includes wireless communication with the surgical instrument.
19. The navigation system of claim 1, wherein said tracking subsystem is an electromagnetic tracking subsystem, an optical tracking subsystem, a sonic tracking subsystem, an infrared tracking subsystem, a radiation tracking subsystem, or combinations thereof.
20. The navigation system of claim 1, further comprising:
an accuracy enhancing subsystem operable to enhance visualization and/or refine accuracy of the displayed image data;
wherein the accuracy enhancing subsystem is operable to compensate for an error in the tracking subsystem when tracking the surgical instrument through a selected vessel.
21. A surgical instrument navigation system to display a virtual image from the point of view of a surgical instrument within a patient, comprising:
a tracking subsystem operable to capture position data indicative of the position of the surgical instrument;
a data processor adapted to receive scan image data representative of an internal region of interest within a given patient and the position data from the tracking subsystem, the data processor being operable to render an image of the internal region of interest from the scan image data and to superimpose an indicia of the surgical instrument onto the rendered image based on the position data received from the tracking subsystem; and
a display in data communication with the data processor, the display being operable to display the rendered image of the patient.
22. The surgical instrument navigation system of claim 21, wherein the rendered volumetric perspective image of the internal region of interest of the patient is from a point of view of the surgical instrument and is displayed on the display.
23. The surgical instrument navigation system of claim 22, wherein the tracking subsystem is operable to track both a position and an orientation of the surgical instrument to allow said data processor to render a volumetric perspective image of the internal region of interest from a point of view of the surgical instrument.
24. The surgical instrument navigation system of claim 21, wherein the scan image data is at least one of two dimensional, three dimensional, four dimensional, or combinations thereof.
25. The surgical instrument navigation system of claim 24, wherein the data processor is operable to render a volumetric perspective image of the internal region of interest based upon at least one scanned image data set of the patient.
26. The surgical instrument navigation system of claim 21, wherein the tracking subsystem includes a tracking sensor interconnected with the surgical instrument and a localizing device operable to determine a position of the tracking sensor;
wherein said tracking subsystem is operable to determine at least a position, an orientation, or combinations thereof of the tracking sensor.
27. The surgical instrument navigation system of claim 21, wherein the data processor is operable to render at least one of a volumetric perspective image, a surface rendered image, or combinations thereof.
28. The surgical instrument navigation system of claim 21, wherein the rendered image includes an image from a point of view other than a point of view of the surgical instrument.
29. The surgical instrument navigation system of claim 21, further comprising:
a secondary image;
wherein the data processor is operable to render a secondary image of the area of interest; and
wherein the display is operable to display the rendered secondary image.
30. The surgical instrument navigation system of claim 21, wherein the data processor is operable to render a four dimensional image of the patient;
wherein a change of the patient over time is illustrated on the rendered image.
31. The surgical instrument navigation system of claim 21, further comprising:
a disposable surgical instrument.
32. The surgical instrument navigation system of claim 21, wherein said tracking subsystem is an electromagnetic tracking subsystem, an optical tracking subsystem, an infrared tracking subsystem, a sonic tracking subsystem, a radiation tracking subsystem, or combinations thereof.
33. The surgical instrument navigation system of claim 21, further comprising:
at least one surgical instrument selected from a group consisting of a guide wire, a pointer probe, a stent, a seed, an implant, an endoscope, a catheter, or combinations thereof.
34. The surgical instrument navigation system of claim 21, further comprising:
an imaging device operable to create the scan image data.
35. A method of creating image data representative of a point of view of a surgical instrument relative to a patient for display on a display, comprising:
obtaining image data of a patient;
tracking the surgical instrument;
determining a position of the surgical instrument;
determining an orientation of the surgical instrument;
creating image data relating to a point of view of the surgical instrument within the patient; and
displaying the created image data illustrating the point of view of the surgical instrument within the patient.
36. The method of claim 35, wherein obtaining image data of the patient includes obtaining two dimensional image data, three dimensional image data, four dimensional image data, or combinations thereof.
37. The method of claim 36, wherein displaying the created image data includes displaying a change over time of the image data from the four dimensional image data.
38. The method of claim 35, wherein tracking a surgical instrument, includes determining a position of a tracking sensor interconnected with the surgical instrument with a localizing device.
39. The method of claim 35, further comprising:
creating secondary image data relating to a point of view external to an area of interest of the patient and a surgical instrument.
40. The method of claim 35, wherein creating an image data relating to a point of view of the surgical instrument within the patient includes correcting for errors in at least one of determining the position of the surgical instrument, determining an orientation of the surgical instrument, a change in the position of the patient from a present time to the time of obtaining the image data of a patient, or combinations thereof.
41. The method of claim 40, wherein correcting for errors includes:
using statistical methods to determine if a surgical instrument has potentially punctured a vessel.
42. The method of claim 41, further comprising:
transmitting an indication of the error to a user.
43. The method of claim 40, wherein correcting for errors includes determining a likely position of the surgical instrument within a vessel based upon image data of the patient and a probable location of the surgical instrument.
44. The method of claim 35, wherein obtaining image data of a patient includes obtaining an atlas map image data.
45. The method of claim 35, further comprising:
displaying an indicia of the position of a portion of the surgical instrument relative to the obtained image data of the patient.
46. The method of claim 35, further comprising:
rendering a volumetric perspective image of the region of interest, rendering a surface rendered image of the region of interest, or combinations thereof.
47. The method of claim 35, further comprising:
displaying a secondary view relative to the displayed created image data illustrating the point of view of the surgical instrument within the patient.
48. The method of claim 35, further comprising:
interconnecting a timing signal generator with the patient to track a change in the patient over time;
wherein determining a position of the surgical instrument and determining orientation of the surgical instrument includes accounting for the change in the patient determined by the timing signal generator.
49. The method of claim 48, wherein the timing signal generator measures a respiratory cycle, a cardiac cycle, or combinations thereof to allow for a reduction of jitter and/or flutter in displaying the created image data.
50. The method of claim 35 further comprising:
obtaining the image data using at least one of a magnetic resonance imaging system, a computed tomography imaging system, a positron emission tomography imaging system, an ultrasound imaging system, or combinations thereof.
51. The method of claim 35, further comprising:
selecting a disposable surgical instrument.
Description
    CROSS-REFERENCE TO RELATED APPLICATIONS
  • [0001]
    This application is a continuation of U.S. patent application Ser. No. 10/223,847 filed on Aug. 19, 2002. The disclosure of the above application is incorporated herein by reference.
  • FIELD
  • [0002]
    The present teachings relate generally to surgical instrument navigation systems and, more particularly, to a system that visually simulates a virtual volumetric scene of a body cavity from a point of view of a surgical instrument residing in a patient.
  • BACKGROUND
  • [0003]
    Precise imaging of portions of the anatomy is an increasingly important technique in the medical and surgical fields. To lessen the trauma caused to a patient by invasive surgery, techniques have been developed for performing surgical procedures within the body through small incisions with minimal invasion. These procedures generally require the surgeon to operate on portions of the anatomy that are not directly visible, or that can be seen only with difficulty. Furthermore, some parts of the body contain extremely complex or small structures, and the visibility of these structures must be enhanced to enable the surgeon to perform more delicate procedures. In addition, planning such procedures requires evaluating the location and orientation of these structures within the body in order to determine the optimal surgical trajectory.
  • [0004]
    Endoscopy is one commonly employed technique for visualizing internal regions of interest within a patient. Flexible endoscopes enable surgeons to visually inspect a region prior to or during surgery. However, flexible endoscopes are relatively expensive, limited in flexibility by their construction, and easily obscured by blood and other biological materials.
  • [0005]
    Therefore, it is desirable to provide a cost effective alternative technique for visualizing internal regions of interest within a patient.
  • SUMMARY
  • [0006]
    A surgical instrument navigation system is provided that visually simulates a virtual volumetric scene of a body cavity of a patient from a point of view of a surgical instrument residing in the patient. The surgical instrument navigation system generally includes: a surgical instrument, such as a guide wire or catheter; a tracking subsystem that captures real-time position data indicative of the position (location and/or orientation) of the surgical instrument; a data processor which is operable to render a volumetric image of the internal region of interest from a point of view of the surgical instrument; and a display which is operable to display the volumetric image of the patient. The surgical instrument navigation system may also include an imaging device which is operable to capture 2D and/or 3D volumetric scan data representative of an internal region of interest within a given patient.
  • [0007]
    For a more complete understanding of the present teachings, reference may be made to the following specification and to the accompanying drawings.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • [0008]
    FIG. 1 is a diagram of an exemplary surgical instrument navigation system according to various embodiments;
  • [0009]
    FIG. 2 is a flowchart that depicts a technique for simulating a virtual volumetric scene of a body cavity from a point of view of a surgical instrument positioned within the patient according to various embodiments;
  • [0010]
    FIG. 3 is an exemplary display from the surgical instrument navigation system according to various embodiments;
  • [0011]
    FIG. 4 is a flowchart that depicts a technique for synchronizing the display of an indicia or graphical representation of the surgical instrument with cardiac or respiratory cycle of the patient according to various embodiments; and
  • [0012]
    FIG. 5 is a flowchart that depicts a technique for generating four-dimensional image data that is synchronized with the patient according to various embodiments.
  • DETAILED DESCRIPTION OF VARIOUS EMBODIMENTS
  • [0013]
    FIG. 1 is a diagram of an exemplary surgical instrument navigation system 10. According to various embodiments, the surgical instrument navigation system 10 is operable to visually simulate a virtual volumetric scene within the body of a patient, such as an internal body cavity, from a point of view of a surgical instrument 12 residing in the cavity of a patient 13. To do so, the surgical instrument navigation system 10 is primarily comprised of a surgical instrument 12, a data processor 16 having a display 18, and a tracking subsystem 20. The surgical instrument navigation system 10 may further include (or be accompanied by) an imaging device 14 that is operable to provide image data to the system.
  • [0014]
    The surgical instrument 12 is preferably a relatively inexpensive, flexible and/or steerable catheter that may be of a disposable type. The surgical instrument 12 is modified to include one or more tracking sensors that are detectable by the tracking subsystem 20. It is readily understood that other types of surgical instruments (e.g., a guide wire, a pointer probe, a stent, a seed, an implant, an endoscope, etc.) are also within the scope of the present teachings. It is also envisioned that at least some of these surgical instruments may be wireless or have wireless communications links. It is also envisioned that the surgical instruments may encompass medical devices which are used for exploratory purposes, testing purposes or other types of medical procedures.
  • [0015]
    Referring to FIG. 2, the imaging device 14 is used to capture volumetric scan data 32 representative of an internal region of interest within the patient 13. The three-dimensional scan data is preferably obtained prior to surgery on the patient 13. In this case, the captured volumetric scan data may be stored in a data store associated with the data processor 16 for subsequent processing. However, one skilled in the art will readily recognize that the principles of the present teachings may also extend to scan data acquired during surgery. It is readily understood that volumetric scan data may be acquired using various known medical imaging devices 14, including but not limited to a magnetic resonance imaging (MRI) device, a computed tomography (CT) imaging device, a positron emission tomography (PET) imaging device, a 2D or 3D fluoroscopic imaging device, and 2D, 3D or 4D ultrasound imaging devices. In the case of a two-dimensional ultrasound imaging device or other two-dimensional image acquisition device, a series of two-dimensional data sets may be acquired and then assembled into volumetric data as is well known in the art using a two-dimensional to three-dimensional conversion.
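The two-dimensional to three-dimensional conversion mentioned above can be sketched as a simple slice-stacking operation. This is an illustrative reconstruction, not the patent's implementation; the function name and spacing parameters are assumptions, and real freehand ultrasound reconstruction would also use per-slice pose from the tracker rather than assuming parallel slices:

```python
import numpy as np

def assemble_volume(slices, slice_spacing_mm, pixel_spacing_mm):
    """Stack a series of parallel, equally sized 2D slices into a 3D
    volume, returning the volume and its (z, y, x) voxel spacing."""
    volume = np.stack(slices, axis=0)   # shape: (num_slices, rows, cols)
    spacing = (slice_spacing_mm, pixel_spacing_mm, pixel_spacing_mm)
    return volume, spacing

# Example: 20 ultrasound frames of 64x64 pixels, 1.5 mm apart,
# with 0.5 mm in-plane pixel spacing
frames = [np.zeros((64, 64)) for _ in range(20)]
vol, spacing = assemble_volume(frames, 1.5, 0.5)
```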
  • [0016]
    A dynamic reference frame 19 is attached to the patient proximate to the region of interest within the patient 13. To the extent that the region of interest is a vessel or a cavity within the patient, it is readily understood that the dynamic reference frame 19 may be placed within the patient 13. To determine its location, the dynamic reference frame 19 is also modified to include tracking sensors detectable by the tracking subsystem 20. The tracking subsystem 20 is operable to determine position data for the dynamic reference frame 19 as further described below.
  • [0017]
    The volumetric scan data is then registered as shown at 34. Registration of the dynamic reference frame 19 generally relates information in the volumetric scan data to the region of interest associated with the patient. This process is referred to as registering image space to patient space. Often, the image space must also be registered to another image space. Registration is accomplished through knowledge of the coordinate vectors of at least three non-collinear points in the image space and the patient space.
  • [0018]
    Registration for image guided surgery can be completed by different known techniques. First, point-to-point registration is accomplished by identifying points in an image space and then touching the same points in patient space. These points are generally anatomical landmarks that are easily identifiable on the patient. Second, surface registration involves the user's generation of a surface in patient space, by either selecting multiple points or scanning, and then accepting the best fit to that surface in image space by iteratively calculating with the data processor until a surface match is identified. Third, repeat fixation devices entail the user repeatedly removing and replacing a device (e.g., a dynamic reference frame) in known relation to the patient or to image fiducials of the patient. Fourth, automatic registration is accomplished by attaching the dynamic reference frame to the patient prior to acquiring image data. It is envisioned that other known registration procedures are also within the scope of the present teachings, such as that disclosed in U.S. Ser. No. 09/274,972, filed on Mar. 23, 1999, entitled “NAVIGATIONAL GUIDANCE VIA COMPUTER-ASSISTED FLUOROSCOPIC IMAGING”, which is hereby incorporated by reference.
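Point-to-point registration of the kind described above reduces to estimating the rigid transform that maps image space onto patient space from at least three paired, non-collinear fiducial points. The sketch below uses the standard Kabsch/SVD method; it is not drawn from the patent itself, and the function name is illustrative:

```python
import numpy as np

def register_points(image_pts, patient_pts):
    """Estimate the rigid transform (R, t) mapping image space to
    patient space from >= 3 paired, non-collinear points (Kabsch)."""
    P = np.asarray(image_pts, float)
    Q = np.asarray(patient_pts, float)
    Pc, Qc = P - P.mean(axis=0), Q - Q.mean(axis=0)
    U, _, Vt = np.linalg.svd(Pc.T @ Qc)
    # Guard against a reflection in the least-squares solution
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = Q.mean(axis=0) - R @ P.mean(axis=0)
    return R, t

# Demo: recover a known 90-degree rotation about z plus a translation
R_true = np.array([[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
t_true = np.array([1.0, 2.0, 3.0])
P = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]], float)
Q = P @ R_true.T + t_true
R_est, t_est = register_points(P, Q)
```

In practice the residual fiducial error after this fit is what clinical systems report as registration accuracy.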
  • [0019]
    During surgery, the surgical instrument 12 is directed by the surgeon to the region of interest within the patient 13. The tracking subsystem 20 preferably employs electromagnetic sensing to capture position data 37 indicative of the location and/or orientation of the surgical instrument 12 within the patient. The tracking subsystem 20 may be defined as a localizing device 22 and one or more electromagnetic sensors 24 integrated into the items of interest, such as the surgical instrument 12. In one embodiment, the localizing device 22 is comprised of three or more field generators (transmitters) mounted at known locations on a planar surface, and the electromagnetic sensor (receiver) 24 is further defined as a single coil of wire. The positioning of the field generators (transmitters) and the sensors (receivers) may also be reversed, such that the generators are associated with the surgical instrument 12 and the receivers are positioned elsewhere. Although not limited thereto, the localizing device 22 may be affixed to the underside of the operating table that supports the patient.
  • [0020]
    In operation, the field generators generate magnetic fields which are detected by the sensor. By measuring the magnetic fields generated by each field generator at the sensor, the location and orientation of the sensor may be computed, thereby determining position data for the surgical instrument 12. Although not limited thereto, exemplary electromagnetic tracking subsystems are further described in U.S. Pat. Nos. 5,913,820; 5,592,939; and 6,374,134, which are incorporated herein by reference. In addition, it is envisioned that other types of position tracking devices are also within the scope of the present teachings. For instance, a non-line-of-sight tracking subsystem 20 may be based on sonic emissions or radio frequency emissions. In another instance, a rigid surgical instrument, such as a rigid endoscope, may be tracked using a line-of-sight optical-based tracking subsystem (e.g., LEDs, passive markers, reflective markers).
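As a rough geometric illustration of localization from generator measurements, suppose each field measurement has already been converted into a distance from a generator. Three coplanar generators (e.g., under the operating table) then fix the sensor position up to a mirror ambiguity, resolved here by assuming the sensor lies above the table. This is a simplified sketch, not the method of the cited patents, and the function name is an assumption:

```python
import numpy as np

def locate_sensor(generators_xy, distances):
    """Estimate sensor position from distances to three field generators
    lying in the z = 0 plane.  The coplanar layout leaves a mirror
    ambiguity; we take z >= 0 (sensor above the table)."""
    G = np.asarray(generators_xy, float)   # shape (3, 2), all at z = 0
    d = np.asarray(distances, float)
    # Subtract sphere equation 0 from spheres 1..2 -> linear in (x, y)
    A = 2.0 * (G[1:] - G[0])
    b = (d[0]**2 - d[1:]**2) + np.sum(G[1:]**2, axis=1) - np.sum(G[0]**2)
    xy = np.linalg.solve(A, b)
    z = np.sqrt(max(d[0]**2 - np.sum((xy - G[0])**2), 0.0))
    return np.array([xy[0], xy[1], z])

# Demo: three generators and a sensor at a known point
gens = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
true_pos = np.array([0.3, 0.4, 0.5])
dists = [np.linalg.norm(true_pos - np.array([gx, gy, 0.0]))
         for gx, gy in gens]
est = locate_sensor(gens, dists)
```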
  • [0021]
    Position data, such as location and/or orientation data, from the tracking subsystem 20 is in turn relayed to the data processor 16. The data processor 16 is adapted to receive position/orientation data from the tracking subsystem 20 and operable to render a volumetric perspective image and/or a surface rendered image of the region of interest. The volumetric perspective and/or surface image is rendered 36 from the scan data 32 using rendering techniques well known in the art. The image data may be further manipulated 38 based on the position/orientation data for the surgical instrument 12 received from the tracking subsystem 20. Specifically, the volumetric perspective or surface rendered image is rendered from a point of view which relates to the position of the surgical instrument 12. For instance, at least one electromagnetic sensor 24 may be positioned at the tip of the surgical instrument 12, such that the image is rendered from a leading point on the surgical instrument. In this way, the surgical instrument navigation system 10 according to various embodiments is able, for example, to visually simulate a virtual volumetric scene of an internal cavity from the point of view of the surgical instrument 12 residing in the cavity without the use of an endoscope. It is readily understood that tracking two or more electromagnetic sensors 24 embedded in the surgical instrument 12 enables the orientation of the surgical instrument 12 to be determined by the system 10.
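Rendering from the instrument's point of view amounts to placing a virtual camera at the tracked tip and aiming it along the instrument axis, here derived from two sensors as the two-sensor orientation remark above suggests. Below is a hedged sketch using a standard look-at construction (names are illustrative; it assumes the instrument axis is not parallel to the chosen up vector):

```python
import numpy as np

def instrument_view_matrix(tip_pos, second_pos, up=(0.0, 0.0, 1.0)):
    """Build a 4x4 view matrix whose eye sits at the instrument tip and
    looks from the tip sensor toward a second sensor on the shaft
    (OpenGL convention: camera looks down -z)."""
    eye = np.asarray(tip_pos, float)
    forward = np.asarray(second_pos, float) - eye
    forward /= np.linalg.norm(forward)
    right = np.cross(forward, up)
    right /= np.linalg.norm(right)          # fails if axis parallel to up
    true_up = np.cross(right, forward)
    view = np.eye(4)
    view[0, :3], view[1, :3], view[2, :3] = right, true_up, -forward
    view[:3, 3] = -view[:3, :3] @ eye       # move world into camera frame
    return view
```

A volume renderer fed this matrix each frame would track the tip in real time, which is the virtual-endoscopy effect the paragraph describes.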
  • [0022]
    As the surgical instrument 12 is moved by the surgeon within the region of interest, its position and orientation are tracked and reported on a real-time basis by the tracking subsystem 20. The volumetric perspective image may then be updated by manipulating 38 the rendered image data 36 based on the position of the surgical instrument 12. The manipulated volumetric perspective image is displayed 40 on a display device 18 associated with the data processor 16. The display 18 is preferably located such that it can be easily viewed by the surgeon during the medical procedure. In one embodiment, the display 18 may be further defined as a heads-up display or any other appropriate display. The image may also be stored by data processor 16 for later playback, should this be desired.
  • [0023]
    It is envisioned that the primary perspective image 38 of the region of interest may be supplemented by other secondary images. For instance, known image processing techniques may be employed to generate various multi-planar images of the region of interest. Alternatively, images may be generated from different view points as specified by a user 39, including views from outside of the vessel or cavity or views that enable the user to see through the walls of the vessel using different shading or opacity. In another instance, the location data of the surgical instrument may be saved and played back in a movie format. It is envisioned that these various secondary images may be displayed simultaneously with or in place of the primary perspective image.
  • [0024]
    In addition, the surgical instrument 12 may be used to generate real-time maps corresponding to an internal path traveled by the surgical instrument or an external boundary of an internal cavity. A real-time map is generated by continuously recording the position of the instrument's localized tip and its full extent, and is formed from the outermost extent of the instrument's position and the minimum extrapolated curvature, as is known in the art. The map may be continuously updated as the instrument is moved within the patient, thereby creating a path or a volume representing the internal boundary of the cavity. It is envisioned that the map may be displayed in a wire frame form, as a shaded surface, or in another three-dimensional computer display modality, either independent from or superimposed on the volumetric perspective image 38 of the region of interest. It is further envisioned that the map may include data collected from a sensor embedded into the surgical instrument, such as pressure data, temperature data or electro-physiological data. In this case, the map may be color coded to represent the collected data.
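The map-building step can be sketched as a running accumulation of tracked tip positions, each optionally tagged with a sensor reading (pressure, temperature, etc.) for later color coding. A minimal illustrative structure, with class and method names that are assumptions rather than anything named in the patent:

```python
import numpy as np

class InstrumentPathMap:
    """Accumulate tracked tip positions into a real-time path map.
    Each point may carry an optional sensor value for color coding."""

    def __init__(self):
        self.points = []   # tip positions in patient space
        self.values = []   # per-point sensor readings (or None)

    def record(self, position, value=None):
        """Append one tracked sample to the map."""
        self.points.append(tuple(position))
        self.values.append(value)

    def path_length(self):
        """Total length of the recorded path, in the same units
        as the recorded positions."""
        pts = np.asarray(self.points, float)
        if len(pts) < 2:
            return 0.0
        return float(np.linalg.norm(np.diff(pts, axis=0), axis=1).sum())
```

A display loop would redraw the accumulated points (as a wire frame or shaded surface) after each `record` call, giving the continuously updated map the paragraph describes.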
  • [0025]
    FIG. 3 illustrates another type of secondary image 28 which may be displayed in conjunction with the primary perspective image 38. In this instance, the primary perspective image is an interior view of an air passage within the patient 13. The secondary image 28 is an exterior view of the air passage which includes an indicia or graphical representation 29 that corresponds to the location of the surgical instrument 12 within the air passage. In FIG. 3, the indicia 29 is shown as a crosshairs. It is envisioned that other indicia may be used to signify the location of the surgical instrument in the secondary image. As further described below, the secondary image 28 is constructed by superimposing the indicia 29 of the surgical instrument 12 onto the manipulated image data 38.
  • [0026]
    Referring to FIG. 4, the display of an indicia of the surgical instrument 12 on the secondary image may be synchronized with an anatomical function, such as the cardiac or respiratory cycle, of the patient. In certain instances, the cardiac or respiratory cycle of the patient may cause the surgical instrument 12 to flutter or jitter within the patient. For instance, a surgical instrument 12 positioned in or near a chamber of the heart will move in relation to the patient's heart beat. In these instances, the indicia of the surgical instrument 12 will likewise flutter or jitter on the displayed image 40. It is envisioned that other anatomical functions which may affect the position of the surgical instrument 12 within the patient are also within the scope of the present teachings.
  • [0027]
    To eliminate the flutter of the indicia on the displayed image 40, position data for the surgical instrument 12 is acquired at a repetitive point within each cycle of either the cardiac cycle or the respiratory cycle of the patient. As described above, the imaging device 14 is used to capture volumetric scan data 42 representative of an internal region of interest within a given patient. A secondary image may then be rendered 44 from the volumetric scan data by the data processor 16.
  • [0028]
    In order to synchronize the acquisition of position data for the surgical instrument 12, the surgical instrument navigation system 10 may further include a timing signal generator 26. The timing signal generator 26 is operable to generate and transmit a timing signal 46 that correlates to the cardiac cycle, the respiratory cycle, or both cycles of the patient 13. For a patient having a consistent rhythmic cycle, the timing signal might be in the form of a periodic clock signal. Alternatively, the timing signal may be derived from an electrocardiogram signal from the patient 13. One skilled in the art will readily recognize other techniques for deriving a timing signal that correlates to the cardiac cycle, the respiratory cycle, or another anatomical cycle of the patient.
  • [0029]
    As described above, the indicia of the surgical instrument 12 tracks the movement of the surgical instrument 12 as it is moved by the surgeon within the patient 13. Rather than display the indicia of the surgical instrument 12 on a real-time basis, the display of the indicia of the surgical instrument 12 is periodically updated 48 based on the timing signal from the timing signal generator 26. In one exemplary embodiment, the timing signal generator 26 is electrically connected to the tracking subsystem 20. The tracking subsystem 20 is in turn operable to report position data for the surgical instrument 12 in response to a timing signal received from the timing signal generator 26. The position of the indicia of the surgical instrument 12 is then updated 50 on the display of the image data. Updating the indicia in this manner eliminates any flutter or jitter which may otherwise appear on the displayed image 52. It is readily understood that other techniques for synchronizing the display of an indicia of the surgical instrument 12 based on the timing signal are within the scope of the present teachings. It is also envisioned that a path (or projected path) of the surgical instrument 12 may also be illustrated on the displayed image data 52.
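The gating scheme described above can be sketched as follows. This is a minimal illustration under assumed names (not the patented implementation): raw tracking samples arrive continuously, but the drawn indicia is refreshed only when the timing signal flags the chosen, repetitive point of the cycle.

```python
# Illustrative sketch only (assumed names): refresh the displayed indicia
# only when the timing signal generator ticks, so that cardiac- or
# respiratory-induced jitter in the raw tracking samples is never drawn.
class GatedIndicia:
    def __init__(self):
        self.displayed = None  # position currently drawn on the image

    def on_sample(self, position, timing_tick):
        """Called for every raw tracking sample; the indicia moves only at
        the repetitive point of the cycle flagged by timing_tick."""
        if timing_tick:
            self.displayed = position
        return self.displayed
```

Samples arriving between ticks leave the display unchanged, which is what suppresses the flutter.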
  • [0030]
    According to various embodiments, the surgical instrument navigation system 10 may be further adapted to display four-dimensional image data for a region of interest as shown in FIG. 5. In this case, the imaging device 14 is operable to capture volumetric scan data 62 for an internal region of interest over a period of time, such that the region of interest includes motion that is caused by either the cardiac cycle or the respiratory cycle of the patient 13. A volumetric perspective view of the region may be rendered 64 from the volumetric scan data 62 by the data processor 16 as described above. The four-dimensional image data may be further supplemented with other patient data, such as temperature or blood pressure, using color coding techniques.
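The color coding of supplemental patient data can be illustrated with a simple scalar-to-RGB ramp. The function name and the blue-to-red ramp are assumptions for exposition, not taken from the patent:

```python
def color_code(value, lo, hi):
    """Map a scalar patient measurement (e.g. temperature or blood pressure)
    onto a blue-to-red RGB ramp for overlay on the four-dimensional image.
    Illustrative sketch only; the ramp and its endpoints are assumptions."""
    t = min(max((value - lo) / (hi - lo), 0.0), 1.0)  # clamp to [0, 1]
    return (int(255 * t), 0, int(255 * (1 - t)))      # cold blue -> hot red
```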
  • [0031]
    In order to synchronize the display of the volumetric perspective view in real-time with the cardiac or respiratory cycle of the patient, the data processor 16 is adapted to receive a timing signal from the timing signal generator 26. As described above, the timing signal generator 26 is operable to generate and transmit a timing signal that correlates to either the cardiac cycle or the respiratory cycle of the patient 13. In this way, the volumetric perspective image may be synchronized 66 with the cardiac or respiratory cycle of the patient 13. The synchronized image 66 is then displayed 68 on the display 18 of the system. The four-dimensional synchronized image may be the primary image rendered from the point of view of the surgical instrument, the secondary image depicting the indicia of the position of the surgical instrument 12 within the patient 13, or both. It is readily understood that the synchronization process is also applicable to two-dimensional image data acquired over time.
  • [0032]
    To enhance visualization and refine accuracy of the displayed image data, the surgical navigation system can use prior knowledge, such as the segmented vessel structure, to compensate for error in the tracking subsystem or for inaccuracies caused by an anatomical shift occurring since acquisition of the scan data. For instance, it is known that the surgical instrument 12 being localized is located within a given vessel and, therefore, should be displayed within the vessel. Statistical methods can be used to determine the most likely location within the vessel with respect to the reported location and then compensate so that the display accurately represents the instrument 12 within the center of the vessel. The center of the vessel can be found by segmenting the vessels from the three-dimensional datasets and using commonly known imaging techniques to define the centerline of the vessel tree. Statistical methods may also be used to determine if the surgical instrument 12 has potentially punctured the vessel. This can be done by determining that the reported location is too far from the centerline, or that the trajectory of the path traveled forms too great an angle (worst case, 90 degrees) with respect to the vessel. Reporting this type of trajectory error is very important to clinicians. The tracking along the center of the vessel may also be further refined by correcting for motion of the respiratory or cardiac cycle, as described above.
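The centerline compensation and puncture check described above can be sketched as a nearest-point test. This is an illustrative simplification under assumed names; the patent describes statistical methods and a trajectory-angle test against the vessel direction, neither of which is implemented here:

```python
import math


def snap_to_centerline(reported, centerline, max_offset):
    """Snap a reported instrument position to the nearest sample of the
    segmented vessel centerline, and flag a possible puncture when the
    offset exceeds max_offset. Illustrative nearest-point sketch only; a
    full system would apply statistical methods and also test the path's
    trajectory angle against the vessel (worst case, 90 degrees)."""
    nearest = min(centerline, key=lambda p: math.dist(p, reported))
    offset = math.dist(nearest, reported)
    return nearest, offset > max_offset
```

Drawing the snapped position keeps the instrument rendered within the vessel, while the flag supports reporting a possible puncture to the clinician.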
  • [0033]
    The surgical instrument navigation system according to various embodiments may also incorporate atlas maps. It is envisioned that three-dimensional or four-dimensional atlas maps may be registered with patient-specific scan data or generic anatomical models. Atlas maps may contain kinematic information (e.g., heart models) that can be synchronized with four-dimensional image data, thereby supplementing the real-time information. In addition, the kinematic information may be combined with localization information from several instruments to provide a complete four-dimensional model of organ motion. The atlas maps may also be used to localize bones or soft tissue, which can assist in determining placement and location of implants.
  • [0034]
    While the teachings have been described in their presently preferred form, it will be understood that the teachings are capable of modification without departing from the spirit of the teachings as set forth in the appended claims.
Patent Citations
Cited Patent | Filing date | Publication date | Applicant | Title
US1576781 *Apr 22, 1924Mar 16, 1926Philips Herman BFluoroscopic fracture apparatus
US3016899 *Nov 3, 1958Jan 16, 1962Carl B StenvallSurgical instrument
US3017887 *Jan 19, 1960Jan 23, 1962William T HeyerStereotaxy device
US3073310 *Aug 5, 1957Jan 15, 1963Zenon R MocarskiSurgical instrument positioning device
US3367326 *Jun 15, 1965Feb 6, 1968Calvin H. FrazierIntra spinal fixation rod
US3644825 *Dec 31, 1969Feb 22, 1972Texas Instruments IncMagnetic detection system for detecting movement of an object utilizing signals derived from two orthogonal pickup coils
US3868565 *Jul 30, 1973Feb 25, 1975Jack KuipersObject tracking and orientation determination means, system and process
US3941127 *Oct 3, 1974Mar 2, 1976Froning Edward CApparatus and method for stereotaxic lateral extradural disc puncture
US4182312 *May 20, 1977Jan 8, 1980Mushabac David RDental probe
US4256112 *Feb 12, 1979Mar 17, 1981David Kopf InstrumentsHead positioner
US4314251 *Jul 30, 1979Feb 2, 1982The Austin CompanyRemote object position and orientation locater
US4317078 *Oct 15, 1979Feb 23, 1982Ohio State University Research FoundationRemote position and orientation detection employing magnetic flux linkage
US4319136 *Nov 9, 1979Mar 9, 1982Jinkins J RandolphComputerized tomography radiograph data transfer cap
US4368536 *Nov 19, 1980Jan 11, 1983Siemens AktiengesellschaftDiagnostic radiology apparatus for producing layer images
US4431005 *May 7, 1981Feb 14, 1984Mccormick Laboratories, Inc.Method of and apparatus for determining very accurately the position of a device inside biological tissue
US4506676 *Sep 10, 1982Mar 26, 1985Duska Alois ARadiographic localization technique
US4571834 *May 16, 1985Feb 25, 1986Orthotronics Limited PartnershipKnee laxity evaluator and motion module/digitizer arrangement
US4572198 *Jun 18, 1984Feb 25, 1986Varian Associates, Inc.Catheter for use with NMR imaging systems
US4638798 *Sep 10, 1980Jan 27, 1987Shelden C HunterStereotactic method and apparatus for locating and treating or removing lesions
US4642786 *May 25, 1984Feb 10, 1987Position Orientation Systems, Ltd.Method and apparatus for position and orientation measurement using a magnetic field and retransmission
US4645343 *Jun 4, 1984Feb 24, 1987U.S. Philips CorporationAtomic resonance line source lamps and spectrophotometers for use with such lamps
US4649504 *May 22, 1984Mar 10, 1987Cae Electronics, Ltd.Optical position and orientation measurement techniques
US4651732 *Apr 11, 1985Mar 24, 1987Frederick Philip RThree-dimensional light guidance system for invasive procedures
US4653509 *Jul 3, 1985Mar 31, 1987The United States Of America As Represented By The Secretary Of The Air ForceGuided trephine samples for skeletal bone studies
US4719419 *Jul 15, 1985Jan 12, 1988Harris Graphics CorporationApparatus for detecting a rotary position of a shaft
US4722056 *Feb 18, 1986Jan 26, 1988Trustees Of Dartmouth CollegeReference display systems for superimposing a tomagraphic image onto the focal plane of an operating microscope
US4722336 *Jan 25, 1985Feb 2, 1988Michael KimPlacement guide
US4727565 *Nov 13, 1984Feb 23, 1988Ericson Bjoern EMethod of localization
US4797907 *Aug 7, 1987Jan 10, 1989Diasonics Inc.Battery enhanced power generation for mobile X-ray machine
US4803976 *Apr 8, 1988Feb 14, 1989SynthesSighting instrument
US4821731 *Dec 8, 1987Apr 18, 1989Intra-Sonix, Inc.Acoustic image system and method
US4905698 *Sep 13, 1988Mar 6, 1990Pharmacia Deltec Inc.Method and apparatus for catheter location determination
US4989608 *Apr 28, 1989Feb 5, 1991Ratner Adam VDevice construction and method facilitating magnetic resonance imaging of foreign objects in a body
US4991579 *Nov 10, 1987Feb 12, 1991Allen George SMethod and apparatus for providing related images over time of a portion of the anatomy using fiducial implants
US5002058 *Apr 18, 1989Mar 26, 1991Intra-Sonix, Inc.Ultrasonic transducer
US5079699 *Aug 9, 1989Jan 7, 1992Picker International, Inc.Quick three-dimensional display
US5086401 *May 11, 1990Feb 4, 1992International Business Machines CorporationImage-directed robotic system for precise robotic surgery including redundant consistency checking
US5094241 *Jan 19, 1990Mar 10, 1992Allen George SApparatus for imaging the anatomy
US5097839 *Feb 13, 1990Mar 24, 1992Allen George SApparatus for imaging the anatomy
US5099845 *May 23, 1990Mar 31, 1992Micronix Pty Ltd.Medical instrument location means
US5178164 *Mar 29, 1991Jan 12, 1993Allen George SMethod for implanting a fiducial implant into a patient
US5178621 *Dec 10, 1991Jan 12, 1993Zimmer, Inc.Two-piece radio-transparent proximal targeting device for a locking intramedullary nail
US5186174 *Jan 29, 1990Feb 16, 1993G. M. PiaffProcess and device for the reproducible optical representation of a surgical operation
US5187475 *Jun 10, 1991Feb 16, 1993Honeywell Inc.Apparatus for determining the position of an object
US5188126 *Mar 25, 1992Feb 23, 1993Fabian Carl ESurgical implement detector utilizing capacitive coupling
US5190059 *Mar 25, 1992Mar 2, 1993Fabian Carl ESurgical implement detector utilizing a powered marker
US5197476 *Oct 11, 1991Mar 30, 1993Christopher NowackiLocating target in human body
US5279309 *Jul 27, 1992Jan 18, 1994International Business Machines CorporationSignaling device and method for monitoring positions in a surgical operation
US5345938 *Sep 28, 1992Sep 13, 1994Kabushiki Kaisha ToshibaDiagnostic apparatus for circulatory systems
US5377678 *Jul 14, 1993Jan 3, 1995General Electric CompanyTracking system to follow the position and orientation of a device with radiofrequency fields
US5383454 *Jul 2, 1992Jan 24, 1995St. Louis UniversitySystem for indicating the position of a surgical probe within a head on an image of the head
US5385146 *Jan 8, 1993Jan 31, 1995Goldreyer; Bruce N.Orthogonal sensing for use in clinical electrophysiology
US5385148 *Jul 30, 1993Jan 31, 1995The Regents Of The University Of CaliforniaCardiac imaging and ablation catheter
US5386828 *Aug 17, 1993Feb 7, 1995Sims Deltec, Inc.Guide wire apparatus with location sensing member
US5389101 *Apr 21, 1992Feb 14, 1995University Of UtahApparatus and method for photogrammetric surgical localization
US5391199 *Jul 20, 1993Feb 21, 1995Biosense, Inc.Apparatus and method for treating cardiac arrhythmias
US5394457 *Oct 7, 1993Feb 28, 1995Leibinger GmbhDevice for marking body sites for medical examinations
US5480422 *Sep 23, 1994Jan 2, 1996Biosense, Inc.Apparatus for treating cardiac arrhythmias
US5483961 *Aug 29, 1994Jan 16, 1996Kelly; Patrick J.Magnetic field digitizer for stereotactic surgery
US5485849 *Jan 31, 1994Jan 23, 1996Ep Technologies, Inc.System and methods for matching electrical characteristics and propagation velocities in cardiac tissue
US5487391 *Jan 28, 1994Jan 30, 1996Ep Technologies, Inc.Systems and methods for deriving and displaying the propagation velocities of electrical events in the heart
US5487729 *Mar 25, 1994Jan 30, 1996Cordis CorporationMagnetic guidewire coupling for catheter exchange
US5487757 *Feb 21, 1995Jan 30, 1996Medtronic CardiorhythmMulticurve deflectable catheter
US5490196 *Mar 18, 1994Feb 6, 1996Metorex International OyMulti energy system for x-ray imaging applications
US5494034 *Jun 15, 1994Feb 27, 1996Georg SchlondorffProcess and device for the reproducible optical representation of a surgical operation
US5592939 *Jun 14, 1995Jan 14, 1997Martinelli; Michael A.Method and system for navigating a catheter probe
US5595193 *Jun 6, 1995Jan 21, 1997Walus; Richard L.Tool for implanting a fiducial marker
US5596228 *Jul 28, 1995Jan 21, 1997Oec Medical Systems, Inc.Apparatus for cooling charge coupled device imaging systems
US5600330 *Jul 12, 1994Feb 4, 1997Ascension Technology CorporationDevice for measuring position and orientation using non-dipole magnet IC fields
US5603318 *Oct 29, 1993Feb 18, 1997University Of Utah Research FoundationApparatus and method for photogrammetric surgical localization
US5704897 *Aug 2, 1993Jan 6, 1998Truppe; Michael J.Apparatus and method for registration of points of a data field with respective points of an optical image
US5711299 *Jan 26, 1996Jan 27, 1998Manwaring; Kim H.Surgical guidance method and system for approaching a target within a body
US5713946 *Oct 28, 1996Feb 3, 1998Biosense, Inc.Apparatus and method for intrabody mapping
US5715822 *Sep 28, 1995Feb 10, 1998General Electric CompanyMagnetic resonance devices suitable for both tracking and imaging
US5715836 *Feb 15, 1994Feb 10, 1998Kliegis; UlrichMethod and apparatus for planning and monitoring a surgical operation
US5718241 *Jun 7, 1995Feb 17, 1998Biosense, Inc.Apparatus and method for treating cardiac arrhythmias with no discrete target
US5830145 *Dec 24, 1996Nov 3, 1998Cardiovascular Imaging Systems, Inc.Enhanced accuracy of three-dimensional intraluminal ultrasound (ILUS) image reconstruction
US5865846 *May 15, 1997Feb 2, 1999Bryan; VincentHuman spinal disc prosthesis
US5868674 *Nov 22, 1996Feb 9, 1999U.S. Philips CorporationMRI-system and catheter for interventional procedures
US5868675 *May 10, 1990Feb 9, 1999Elekta Igs S.A.Interactive system for local intervention inside a nonhumogeneous structure
US5871445 *Sep 7, 1995Feb 16, 1999St. Louis UniversitySystem for indicating the position of a surgical probe within a head on an image of the head
US5871455 *Apr 23, 1997Feb 16, 1999Nikon CorporationOphthalmic apparatus
US5871487 *Mar 10, 1997Feb 16, 1999Cytotherpeutics, Inc.Microdrive for use in stereotactic surgery
US5873822 *Apr 24, 1996Feb 23, 1999Visualization Technology, Inc.Automatic registration system for use with position tracking and imaging system for use in medical applications
US5885218 *Nov 7, 1997Mar 23, 1999Scimed Life Systems, Inc.Method and apparatus for spatial filtering in an intravascular ultrasound imaging system
US6016439 *Oct 9, 1997Jan 18, 2000Biosense, Inc.Method and apparatus for synthetic viewpoint imaging
US6019724 *Feb 8, 1996Feb 1, 2000Gronningsaeter; AageMethod for ultrasound guidance during clinical procedures
US6019725 *Mar 7, 1997Feb 1, 2000Sonometrics CorporationThree-dimensional tracking and imaging system
US6024695 *May 6, 1999Feb 15, 2000International Business Machines CorporationSystem and method for augmentation of surgery
US6038468 *Sep 28, 1998Mar 14, 2000Roke Manor Research Ltd.Catheter localisation system
US6172499 *Oct 29, 1999Jan 9, 2001Ascension Technology CorporationEddy current error-reduced AC magnetic position measurement system
US6175756 *Dec 15, 1998Jan 16, 2001Visualization Technology Inc.Position tracking and imaging system for use in medical applications
US6175757 *Feb 2, 1998Jan 16, 2001General Electric CompanyLuminal mapping
US6341231 *Oct 11, 2000Jan 22, 2002Visualization Technology, Inc.Position tracking and imaging system for use in medical applications
US6346940 *Feb 27, 1998Feb 12, 2002Kabushiki Kaisha ToshibaVirtualized endoscope system
US6351659 *Aug 28, 1997Feb 26, 2002Brainlab Med. Computersysteme GmbhNeuro-navigation system
US6436073 *Aug 9, 2000Aug 20, 2002Joseph M. Von TeichertButterfly anchor for an infusion set
US6511417 *Sep 1, 1999Jan 28, 2003Olympus Optical Co., Ltd.System for detecting the shape of an endoscope using source coils and sense coils
US6511418 *Mar 30, 2001Jan 28, 2003The Board Of Trustees Of The Leland Stanford Junior UniversityApparatus and method for calibrating and endoscope
US6517478 *Mar 30, 2001Feb 11, 2003Cbyon, Inc.Apparatus and method for calibrating an endoscope
US6522324 *Feb 17, 1999Feb 18, 2003Koninklijke Philips Electronics N.V.Deriving an iso-surface in a multi-dimensional data field
US6522907 *Jan 21, 2000Feb 18, 2003British Telecommunications Public Limited CompanySurgical navigation
US6694162 *Dec 4, 2001Feb 17, 2004Brainlab AgNavigated microprobe
US20020010384 *Mar 30, 2001Jan 24, 2002Ramin ShahidiApparatus and method for calibrating an endoscope
US20030032878 *Aug 27, 2002Feb 13, 2003The Board Of Trustees Of The Leland Stanford Junior UniversityMethod and apparatus for volumetric image navigation
Non-Patent Citations
Reference
1 *Lorigo et al., Codimension-two geodesic active contours for the segmentation of tubular structures, Computer Vision and Pattern Recognition, 2000. Proceedings. IEEE Conference, page(s): 444 - 451 vol.1
Referenced by
Citing Patent | Filing date | Publication date | Applicant | Title
US7623679Dec 13, 2006Nov 24, 2009Accuray IncorporatedTemporal smoothing of a deformation model
US7812290Apr 30, 2007Oct 12, 2010Boston Scientific Scimed, Inc.Resonator for medical device
US7822244 *Sep 25, 2006Oct 26, 2010Brainlab AgSegmenting medical image data sets
US7838806Nov 2, 2007Nov 23, 2010Boston Scientific Scimed, Inc.Resonator with adjustable capacitor for medical device
US7853307Jun 26, 2008Dec 14, 2010Veran Medical Technologies, Inc.Methods, apparatuses, and systems useful in conducting image guided interventions
US7871369Mar 31, 2009Jan 18, 2011Boston Scientific Scimed, Inc.Cardiac sleeve apparatus, system and method of use
US7920909Apr 25, 2006Apr 5, 2011Veran Medical Technologies, Inc.Apparatus and method for automatic image guided accuracy verification
US8046048Jul 29, 2008Oct 25, 2011Boston Scientific Scimed, Inc.Resonator with adjustable capacitance for medical device
US8046052Mar 24, 2010Oct 25, 2011Medtronic Navigation, Inc.Navigation system for cardiac therapies
US8058593Aug 31, 2009Nov 15, 2011Boston Scientific Scimed, Inc.Resonator for medical device
US8060185 *Oct 5, 2009Nov 15, 2011Medtronic Navigation, Inc.Navigation system for cardiac therapies
US8066759Aug 19, 2005Nov 29, 2011Boston Scientific Scimed, Inc.Resonator for medical device
US8150495Nov 7, 2008Apr 3, 2012Veran Medical Technologies, Inc.Bodily sealants and methods and apparatus for image-guided delivery of same
US8175681Dec 16, 2008May 8, 2012Medtronic Navigation Inc.Combination of electromagnetic and electropotential localization
US8369930Jun 16, 2010Feb 5, 2013MRI Interventions, Inc.MRI-guided devices and MRI-guided interventional systems that can track and generate dynamic visualizations of the devices in near real time
US8396532Jun 16, 2010Mar 12, 2013MRI Interventions, Inc.MRI-guided devices and MRI-guided interventional systems that can track and generate dynamic visualizations of the devices in near real time
US8401616Sep 23, 2011Mar 19, 2013Medtronic Navigation, Inc.Navigation system for cardiac therapies
US8467853Nov 14, 2011Jun 18, 2013Medtronic Navigation, Inc.Navigation system for cardiac therapies
US8483801Nov 8, 2010Jul 9, 2013Veran Medical Technologies, Inc.Methods, apparatuses, and systems useful in conducting image guided interventions
US8494613Jul 27, 2010Jul 23, 2013Medtronic, Inc.Combination localization system
US8494614Jul 27, 2010Jul 23, 2013Regents Of The University Of MinnesotaCombination localization system
US8696549Aug 22, 2011Apr 15, 2014Veran Medical Technologies, Inc.Apparatus and method for four dimensional soft tissue navigation in endoscopic applications
US8731641May 7, 2012May 20, 2014Medtronic Navigation, Inc.Combination of electromagnetic and electropotential localization
US8768433Dec 21, 2012Jul 1, 2014MRI Interventions, Inc.MRI-guided devices and MRI-guided interventional systems that can track and generate dynamic visualizations of the devices in near real time
US8781186May 5, 2011Jul 15, 2014Pathfinder Therapeutics, Inc.System and method for abdominal surface matching using pseudo-features
US8801601 *Nov 16, 2012Aug 12, 2014Intuitive Surgical Operations, Inc.Method and system for providing visual guidance to an operator for steering a tip of an endoscopic device toward one or more landmarks in a patient
US8825133Jan 24, 2013Sep 2, 2014MRI Interventions, Inc.MRI-guided catheters
US8855396Jan 28, 2011Oct 7, 2014Siemens Medical Solutions Usa, Inc.System for detecting an invasive anatomical instrument
US8886288Jan 10, 2013Nov 11, 2014MRI Interventions, Inc.MRI-guided devices and MRI-guided interventional systems that can track and generate dynamic visualizations of the devices in near real time
US8913084Dec 21, 2012Dec 16, 2014Volcano CorporationMethod and apparatus for performing virtual pullback of an intravascular imaging device
US9078685Jul 5, 2012Jul 14, 2015Globus Medical, Inc.Method and system for performing invasive medical procedures using a surgical robot
US9138165Feb 22, 2013Sep 22, 2015Veran Medical Technologies, Inc.Systems, methods and devices for forming respiratory-gated point cloud for four dimensional soft tissue navigation
US9218663Feb 26, 2011Dec 22, 2015Veran Medical Technologies, Inc.Apparatus and method for automatic image guided accuracy verification
US9218664May 9, 2011Dec 22, 2015Veran Medical Technologies, Inc.Apparatus and method for image guided accuracy verification
US9259290Jun 8, 2010Feb 16, 2016MRI Interventions, Inc.MRI-guided surgical systems with proximity alerts
US9439735Jun 8, 2010Sep 13, 2016MRI Interventions, Inc.MRI-guided interventional systems that can track and generate dynamic visualizations of flexible intrabody devices in near real time
US20060287705 *May 24, 2005Dec 21, 2006Boston Scientific Scimed, Inc.Resonator for medical device
US20070049789 *Aug 29, 2005Mar 1, 2007Boston Scientific Scimed, Inc.Cardiac sleeve apparatus, system and method of use
US20070076932 *Sep 25, 2006Apr 5, 2007Andreas BlumhoferSegmenting medical image data sets
US20070213809 *Apr 30, 2007Sep 13, 2007Jan WeberResonator for medical device
US20080061788 *Nov 2, 2007Mar 13, 2008Boston Scientific Scimed, Inc.Resonator with adjustable capacitor for medical device
US20080081991 *Sep 28, 2006Apr 3, 2008West Jay BRadiation treatment planning using four-dimensional imaging data
US20080144908 *Dec 13, 2006Jun 19, 2008West Jay BTemporal smoothing of a deformation model
US20080167545 *Jan 9, 2007Jul 10, 2008Oliver MeissnerClinical workflow for combined 2D/3D diagnostic and therapeutic phlebograph examinations using a robotic angiography system
US20080234576 *Mar 23, 2007Sep 25, 2008General Electric CompanySystem and method to track movement of a tool in percutaneous replacement of a heart valve
US20080290958 *Jul 29, 2008Nov 27, 2008Torsten ScheuermannResonator with adjustable capacitance for medical device
US20090187064 *Mar 31, 2009Jul 23, 2009Boston Scientific Scimed, Inc.Cardiac sleeve apparatus, system and method of use
US20090216645 *Feb 23, 2009Aug 27, 2009What's In It For Me.Com LlcSystem and method for generating leads for the sale of goods and services
US20090319025 *Aug 31, 2009Dec 24, 2009Boston Scientific Scimed, Inc.Resonator for medical device
US20100249506 *Mar 26, 2009Sep 30, 2010Intuitive Surgical, Inc.Method and system for assisting an operator in endoscopic navigation
US20100295931 *Mar 31, 2010Nov 25, 2010Robert SchmidtMedical navigation image output comprising virtual primary images and actual secondary images
US20110137152 *Dec 3, 2009Jun 9, 2011General Electric CompanySystem and method for cooling components of a surgical navigation system
US20120059248 *Aug 22, 2011Mar 8, 2012Troy HolsingApparatus and method for airway registration and navigation
US20130144124 *Nov 16, 2012Jun 6, 2013Intuitive Surgical Operations, Inc.Method and system for providing visual guidance to an operator for steering a tip of an endoscopic device toward one or more landmarks in a patient
WO2014099535A1 *Dec 11, 2013Jun 26, 2014Volcano CorporationMethod and apparatus for performing virtual pullback of an intravascular imaging device
WO2015123401A1 *Feb 12, 2015Aug 20, 2015Edda Technology, Inc.Method and system for displaying a timing signal for surgical instrument insertion in surgical procedures
Classifications
U.S. Classification 600/424
International Classification A61B17/00, A61B19/00
Cooperative Classification A61B90/36, A61B34/20, A61B2017/00703, A61B2034/2051, A61B2034/101, A61B2034/2072, A61B2090/367, A61B2090/365, A61B90/361, A61B2034/2055, A61B2034/256, A61B34/10, A61B2034/105, A61B2034/107
European Classification A61B19/52H12, A61B19/52
Legal Events
Date | Code | Event | Description
Jun 2, 2014 | AS | Assignment
Owner name: SURGICAL NAVIGATION TECHNOLOGIES, INC., COLORADO
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:VERARD, LAURENT;KESSMAN, PAUL;HUNTER, MARK;REEL/FRAME:033010/0373
Effective date: 20020814
Owner name: MEDTRONIC NAVIGATION, INC., COLORADO
Free format text: CHANGE OF NAME;ASSIGNOR:SURGICAL NAVIGATION TECHNOLOGIES, INC.;REEL/FRAME:033075/0772
Effective date: 20041220