|Publication number||US7050845 B2|
|Application number||US 10/134,977|
|Publication date||May 23, 2006|
|Filing date||Apr 29, 2002|
|Priority date||Dec 18, 2001|
|Also published as||DE50100535D1, DE50101677D1, EP1321105A1, EP1321105B1, EP1327421A1, EP1327421B1, US20030114741|
|Original Assignee||Brainlab Ag|
The present invention relates to a device for projecting patient image data from radioscopic imaging methods and/or tomographic imaging methods onto video images. The device comprises an image display device, at least one video camera and a computer-assisted navigation system which can detect the spatial position of the image display device and/or the video camera, as well as the spatial position of a part of a patient's body, via tracking means attached to them.
Such devices serve to visually assist a physician during diagnosis or while treating a patient. They should provide the option of combining images from the patient's interior with conventional video images, in order to facilitate treatment and diagnosis.
A video-based surgical targeting system is known from U.S. Pat. No. 5,765,561, which proposes, among other things, the use of a tracked video camera whose images are superimposed onto assigned images from radioscopic imaging methods and/or tomographic imaging methods on the statically arranged screen of a navigation system. A particular disadvantage of such an embodiment is that the physician wishing to view the images always has to look away from the patient, to the screen of the navigation system. He then can no longer keep exact track of the position of the camera, i.e. of the exterior position from which he is obtaining the superimposed images. The system is therefore somewhat awkward and does not enable direct inspection within the part of the patient's body.
The present invention provides a device for projecting patient image data from radioscopic imaging methods and/or tomographic imaging methods onto video images, which overcomes the disadvantages of the prior art cited above. In particular, it should be made possible for the treatment staff to obtain the desired image data directly and in direct view of the patient.
In accordance with the invention, the image display device and the video camera are assigned to each other and realized as a portable unit. Using this embodiment in accordance with the invention, the physician carrying out the treatment has the option of holding the image display device, together with the assigned camera, in front of the part of the patient's body, and therefore of reading the desired anatomical information while looking directly at the patient. It is possible in this respect to adjust the video image or the image from the radioscopic imaging method and/or tomographic imaging method to be more or less transparent, as desired, in the image shown on the image display device, in order to highlight the desired information most clearly within the display as a whole. Using the device in accordance with the invention, another step is taken towards creating a “glass patient”, simply by holding the image display device in front of the relevant part of the patient's body. Since the camera assigned to the image display device is tracked via the tracking means, the position of these two elements is always known, and the navigation system has information on how and from which direction the part of the patient's body is being viewed. In particular when performing minimally invasive surgery, in which the area of surgery is not exposed, the device in accordance with the invention thus offers the physician carrying out the treatment the option of again informing himself of the exact position of the parts of the patient's anatomy during the operation. Tracked or navigated instruments can of course also be shown on the image display device. The physician no longer has to look away from the patient in order to look into the interior of the patient's body.
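The adjustable transparency described above corresponds to alpha blending of the two registered images. A minimal sketch in Python with NumPy; the function name, array shapes and pixel values are illustrative only and do not come from the patent:

```python
import numpy as np

def blend_images(video: np.ndarray, tomo: np.ndarray, alpha: float) -> np.ndarray:
    """Blend a video frame with a co-registered tomographic slice.

    alpha = 0.0 shows only the video image, alpha = 1.0 only the
    tomographic image; intermediate values render one image more or
    less transparently over the other.
    """
    if video.shape != tomo.shape:
        raise ValueError("images must already be registered to the same size")
    return (1.0 - alpha) * video + alpha * tomo

# Example: a 2x2 grayscale video frame and tomographic slice.
video = np.full((2, 2), 100.0)
tomo = np.full((2, 2), 200.0)
blended = blend_images(video, tomo, 0.25)
print(blended)  # -> all pixels 125.0
```

A control element on the portable unit would then simply map a switch or slider position to the `alpha` value.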
In accordance with a preferred embodiment of the invention, the video camera is arranged on the rear side of the image display device. This results in a compact apparatus; the camera can be completely removed from the operator's field of view. It should be noted in principle that the camera can be attached at any point on the rear side of the image display device, or can be integrated into it, for example even in the area of the edge of the image display device.
The video camera is preferably spatially assigned to the image display device in a way which is predetermined and/or known to the system, and the tracking means is arranged on the image display device. Spatially assigning the video camera and the image display device in a way which is predetermined and/or known to the system opens up the option of tracking just one of these two apparatuses by means of the tracking means, since it is then also known exactly where the other apparatus is situated. The designer is then at liberty to arrange the tracking means either on the image display device itself or on the video camera, according to how said tracking means can best be detected by the navigation system. In optically based navigation systems, the tracking means can be an arrangement of reflecting or actively emitting markers, although it is of course also possible within the context of the present invention to perform navigation via a magnetic tracking system in which coils are tracked in a generated magnetic field.
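Because the camera is mounted to the display in a spatial relation known to the system, tracking one of the two suffices: the pose of the other follows by composing homogeneous transforms. A minimal sketch in Python with NumPy, using illustrative poses and a hypothetical factory-calibrated offset:

```python
import numpy as np

def pose_matrix(R: np.ndarray, t: np.ndarray) -> np.ndarray:
    """Build a 4x4 homogeneous transform from a 3x3 rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Pose of the tracked display in navigation-system coordinates
# (here: no rotation, display 10 units along x).
T_nav_display = pose_matrix(np.eye(3), np.array([10.0, 0.0, 0.0]))

# Fixed, pre-calibrated offset of the camera relative to the display
# (camera sits 2 units behind the display plane).
T_display_cam = pose_matrix(np.eye(3), np.array([0.0, 0.0, -2.0]))

# The camera pose follows by composing the two transforms.
T_nav_cam = T_nav_display @ T_display_cam
print(T_nav_cam[:3, 3])  # camera position in navigation coordinates
```

The same composition works in the other direction if the tracking means is mounted on the camera instead.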
In order to design the image display device to be easy to handle, it can be designed as a portable LCD flat screen.
Data transmission between the individual apparatuses of the device in accordance with the invention can be realized in different ways. On the one hand, it is possible to communicate the image data from the radioscopic imaging method and/or tomographic imaging method to be projected onto the video images to the image display device via radio interfaces of the navigation system; on the other hand, communicating the data via data cables is also conceivable. An embodiment in which the image display device is provided with its own energy supply (battery or power pack) and transmission is by radio ought to prove particularly advantageous within the context of handling the device in accordance with the invention, since this achieves the greatest freedom of handling.
In accordance with a preferred embodiment, the video camera should exhibit a small aperture and a shallow depth of field, so that only a small area of the detected image lies in the focal plane. Using such a camera, it is possible to determine the distance between the camera and the plane which appears in focus. If the spatial position of the camera is known from the navigation system, and the distance to the image plane is also known, then the computer system connected to the navigation system can calculate the spatial position of the video image in real time. This enables the video image and the image data from a radioscopic imaging method and/or tomographic imaging method carried out beforehand to be optimally assigned to each other.
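Given the camera pose from the navigation system and the known distance to the in-focus plane, locating the video image plane in space reduces to stepping along the viewing axis. A minimal sketch in Python with NumPy; the function name and the numeric values are illustrative assumptions, not taken from the patent:

```python
import numpy as np

def image_plane_center(cam_pos, view_dir, focal_dist):
    """Point on the in-focus image plane straight ahead of the camera.

    cam_pos    -- camera position from the navigation system
    view_dir   -- viewing direction of the camera (normalized internally)
    focal_dist -- distance to the focal plane, known from the
                  shallow-depth-of-field optics
    """
    view_dir = np.asarray(view_dir, dtype=float)
    view_dir = view_dir / np.linalg.norm(view_dir)
    return np.asarray(cam_pos, dtype=float) + focal_dist * view_dir

# Camera at the origin looking along +z, focal plane 350 units away.
center = image_plane_center([0.0, 0.0, 0.0], [0.0, 0.0, 1.0], 350.0)
print(center)  # the focal-plane center lies 350 units along the viewing axis
```

The tomographic slice to be displayed would then be the one intersecting this plane.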
In accordance with another advantageous embodiment of the device in accordance with the invention, a data transmission means is arranged on the video camera or on the image display device, wherein information on the image shown on the image display device is transmitted to the navigation system by means of said data transmission means. The image data to be projected, communicated from the navigation system to the image display device, are then adapted in size and position on the image display device such that they are in registration with the video images, i.e. a 1:1 representation of the two images in overlap is created. To this end, a contour-matching unit can be provided in the navigation system, said contour-matching unit superimposing the video images and the projected images, in particular via outer contour matching of the part of the patient's body.
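In its simplest form, the contour matching described above can be approximated by estimating the scale and translation that bring the outer contour of the projected image into registration with the contour in the video image. A minimal sketch in Python with NumPy; the function name, the no-rotation assumption and the sample contours are illustrative only:

```python
import numpy as np

def match_contours(projected_pts, video_pts):
    """Estimate scale and translation mapping the projected image's outer
    contour onto the video image's outer contour. Rotation is assumed
    negligible here (an assumption of this sketch, not of the patent)."""
    P = np.asarray(projected_pts, float)
    V = np.asarray(video_pts, float)
    cp, cv = P.mean(axis=0), V.mean(axis=0)
    # The ratio of RMS distances from the centroids gives the scale factor.
    scale = np.sqrt(((V - cv) ** 2).sum() / ((P - cp) ** 2).sum())
    # The translation maps the scaled projected centroid onto the video centroid.
    translation = cv - scale * cp
    return scale, translation

# A unit-square contour, and the same contour scaled by 2 and shifted by (5, 5).
square = [(0, 0), (1, 0), (1, 1), (0, 1)]
target = [(5, 5), (7, 5), (7, 7), (5, 7)]
scale, translation = match_contours(square, target)
print(scale, translation)  # -> 2.0 [5. 5.]
```

A practical contour-matching unit would extract the contours from the images first (e.g. by edge detection) before fitting the transform.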
It should also be noted here that it may be conceivable and advantageous within the context of the invention to attach various control and command input apparatuses to the portable unit for altering the combined image shown. For example, the focal plane both of the radioscopic imaging method or tomographic imaging method employed and of the video image shown can be altered using such control elements. As already mentioned above, it is also possible to provide control elements which alter the respective degree of transparency of one of the images with respect to the other.
An illuminating device for the part of the patient's body is advantageously provided on the camera, or in its vicinity on the image display device, for example as a ring light (LEDs, lamps, fluorescent tubes).
The invention will now be described in more detail by way of an embodiment, with reference to the drawings.
This type of image representation is made possible by tracking the spatial position of the camera 14 and of the image screen 10 relative to the leg 1 of the patient by means of a tracking means. In the present case, this tracking means is optical, namely an arrangement 12 of markers whose position is detected by the schematically represented navigation system 20. To this end, the navigation system 20 includes, for example, two spaced-apart infrared cameras 24, 26 and an infrared light emitter 22. The three-dimensional spatial position of the arrangement 12 of markers, and therefore also of the screen 10 and the firmly installed camera 14, can be determined from the two images from the infrared cameras 24, 26. Likewise, the spatial position of the leg 1 can be determined using a corresponding arrangement 3 of markers.
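Determining a marker's three-dimensional position from the two infrared camera images amounts to triangulating the point where the two viewing rays meet. A minimal sketch in Python with NumPy, assuming idealized cameras whose positions and viewing rays are already known; the function and the numeric setup are illustrative only:

```python
import numpy as np

def triangulate(o1, d1, o2, d2):
    """Midpoint of the shortest segment between two viewing rays,
    each given by a camera origin o and a unit direction d."""
    o1, d1, o2, d2 = (np.asarray(v, float) for v in (o1, d1, o2, d2))
    # Solve for ray parameters s, t minimizing |o1 + s*d1 - (o2 + t*d2)|.
    A = np.array([[d1 @ d1, -d1 @ d2],
                  [d1 @ d2, -d2 @ d2]])
    b = np.array([(o2 - o1) @ d1, (o2 - o1) @ d2])
    s, t = np.linalg.solve(A, b)
    return 0.5 * ((o1 + s * d1) + (o2 + t * d2))

# Two cameras 200 units apart, both sighting a marker at (0, 0, 500).
marker = np.array([0.0, 0.0, 500.0])
o1, o2 = np.array([-100.0, 0.0, 0.0]), np.array([100.0, 0.0, 0.0])
d1 = (marker - o1) / np.linalg.norm(marker - o1)
d2 = (marker - o2) / np.linalg.norm(marker - o2)
print(triangulate(o1, d1, o2, d2))  # recovers a point at (0, 0, 500)
```

In a real system the viewing rays would first be derived from the marker's pixel coordinates via the calibrated camera models.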
If the positions of the screen 10 and the camera 14 are known, the position of the current image plane can be detected as follows. The camera 14 has a small aperture and a shallow depth of field, such that the image only appears in focus in a particular focal plane. This allows the position of said image plane to be determined and an image from the CT or MR imaging method to be displayed on the screen 10 which lies exactly in said image plane. The important thing here is that the video image is shown at a scale of 1:1 with respect to the tomographic image. If the joint of a patient is then viewed using the device in accordance with the invention, one sees, for example, the contours of the patient's leg at the upper and lower edge of the screen. Using this information, the image from the tomographic imaging method can then be adapted such that it lies exactly over the video image, to ensure a realistic impression. The person viewing the screen 10 can then view on the screen both the video image, i.e. a real image, and the “interior” image from the tomographic imaging method, at any desired depth and with any adjusted transparency of the two images. These adjustments can be made via control devices (not shown), for example flip switches on the front side of the screen.
For the first time, therefore, the device in accordance with the invention gives the observer the option of looking into the interior treatment area in real time while looking at the patient, and of planning or adapting the treatment accordingly. He no longer has to look away from the patient to a firmly installed navigation screen, which is particularly advantageous when the patient's responses to particular medical measures are to be tested. Since the “interior” images from the radioscopic imaging method and/or tomographic imaging method are formed such that they represent virtual 3-D views of the interior parts of the body, the image as a whole can be made very vivid.
Although a radioscopic and/or tomographic image is therefore not taken in situ, a virtual “glass patient” is thus created.
With respect to MR/CT image detection, an MR/CT image is first detected beforehand, whereupon a three-dimensional image is reconstructed from the data obtained. Then, the object, for example a part of the patient's body, is spatially registered, i.e. the spatial relation of the position of the object to the MR/CT images is determined. This can be realized, for example, by spatially detecting an arrangement of markers situated on the object.
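Registering the tracked marker positions to the MR/CT data set is a point-based rigid registration. One standard approach for this, not specified in the patent itself, is the SVD-based Kabsch algorithm; a minimal sketch in Python with NumPy, using synthetic marker coordinates:

```python
import numpy as np

def register(marker_pts, image_pts):
    """Rigid registration (Kabsch algorithm): find rotation R and
    translation t such that R @ p + t maps each tracked marker position
    onto its counterpart located in the MR/CT volume."""
    P = np.asarray(marker_pts, float)
    Q = np.asarray(image_pts, float)
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)               # cross-covariance of centered points
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    D = np.diag([1.0, 1.0, d])              # guard against reflections
    R = Vt.T @ D @ U.T
    t = cq - R @ cp
    return R, t

# Markers in tracker coordinates, and the same markers in CT coordinates
# (here: rotated 90 degrees about z and shifted 10 units along x).
Rz = np.array([[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
P = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]], float)
Q = (Rz @ P.T).T + np.array([10.0, 0.0, 0.0])
R, t = register(P, Q)
print(np.allclose(R, Rz), np.allclose(t, [10, 0, 0]))  # -> True True
```

With `R` and `t` known, every voxel of the reconstructed 3-D image can be expressed in the navigation system's coordinates.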
When the video image detection and MR/CT image processing steps cited above have been performed, video/MR/CT image matching is performed, i.e. the two images are harmonized under computer guidance with respect to image position and image size. The MR/CT 3-D images, projected onto the video images, can then be superimposed on the image display device and can assist the physician performing the treatment in diagnosis and treatment.
|Cited Patent||Filing date||Publication date||Applicant||Title|
|US5526812||Oct 27, 1995||Jun 18, 1996||General Electric Company||Display system for enhancing visualization of body structures during medical procedures|
|US5694142||Nov 17, 1994||Dec 2, 1997||General Electric Company||Interactive digital arrow (d'arrow) three-dimensional (3D) pointing|
|US5715836||Feb 15, 1994||Feb 10, 1998||Kliegis; Ulrich||Method and apparatus for planning and monitoring a surgical operation|
|US5765561||Oct 7, 1994||Jun 16, 1998||Medical Media Systems||Video-based surgical targeting system|
|US5961456||Mar 28, 1996||Oct 5, 1999||Gildenberg; Philip L.||System and method for displaying concurrent video and reconstructed surgical views|
|US6038467||Jan 16, 1998||Mar 14, 2000||U.S. Philips Corporation||Image display system and image guided surgery system|
|US6477400 *||Aug 16, 1999||Nov 5, 2002||Sofamor Danek Holdings, Inc.||Fluoroscopic image guided orthopaedic surgery system with intraoperative registration|
|US6490467 *||Jun 26, 1998||Dec 3, 2002||Surgical Navigation Technologies, Inc.||Surgical navigation systems including reference and localization frames|
|US6640128 *||Apr 13, 2001||Oct 28, 2003||Brainlab Ag||Method and device for the navigation-assisted dental treatment|
|US20010035871||Mar 30, 2001||Nov 1, 2001||Johannes Bieger||System and method for generating an image|
|US20020082498||Oct 5, 2001||Jun 27, 2002||Siemens Corporate Research, Inc.||Intra-operative image-guided neurosurgery with augmented reality visualization|
|US20020140694||Mar 27, 2001||Oct 3, 2002||Frank Sauer||Augmented reality guided instrument positioning with guiding graphics|
|US20020163499||Mar 28, 2002||Nov 7, 2002||Frank Sauer||Method and apparatus for augmented reality visualization|
|EP0672389A2||Mar 14, 1995||Sep 20, 1995||Roke Manor Research Limited||Video-based system for computer assisted surgery and localisation|
|WO1996020421A1||Dec 22, 1995||Jul 4, 1996||Bernhard Braunecker||Microscope, in particular a stereomicroscope, and a method of superimposing two images|
|1||Sauer, Frank et al. "Augmented Reality Visualization in iMRI Operating Room: System Description and Pre-clinical Testing." Proceedings of SPIE, Medical Imaging. Jan. 23-26, 2002.|
|Citing Patent||Filing date||Publication date||Applicant||Title|
|US7463823||Jun 23, 2006||Dec 9, 2008||Brainlab Ag||Stereoscopic visualization device for patient image data and video images|
|US7853307||Jun 26, 2008||Dec 14, 2010||Veran Medical Technologies, Inc.||Methods, apparatuses, and systems useful in conducting image guided interventions|
|US7907987||Mar 15, 2011||University Of Florida Research Foundation, Inc.||System for delivering conformal radiation therapy while simultaneously imaging soft tissue|
|US7920909||Apr 25, 2006||Apr 5, 2011||Veran Medical Technologies, Inc.||Apparatus and method for automatic image guided accuracy verification|
|US8118818 *||Dec 15, 2006||Feb 21, 2012||Ao Technology Ag||Method and device for computer assisted distal locking of intramedullary nails|
|US8150495||Nov 7, 2008||Apr 3, 2012||Veran Medical Technologies, Inc.||Bodily sealants and methods and apparatus for image-guided delivery of same|
|US8190233||May 29, 2012||University Of Florida Research Foundation, Inc.||System for delivering conformal radiation therapy while simultaneously imaging soft tissue|
|US8442281 *||Apr 28, 2006||May 14, 2013||The Invention Science Fund I, Llc||Artificially displaying information relative to a body|
|US8483801||Nov 8, 2010||Jul 9, 2013||Veran Medical Technologies, Inc.||Methods, apparatuses, and systems useful in conducting image guided interventions|
|US8696549||Aug 22, 2011||Apr 15, 2014||Veran Medical Technologies, Inc.||Apparatus and method for four dimensional soft tissue navigation in endoscopic applications|
|US8750568||May 22, 2012||Jun 10, 2014||Covidien Lp||System and method for conformal ablation planning|
|US8781186||May 5, 2011||Jul 15, 2014||Pathfinder Therapeutics, Inc.||System and method for abdominal surface matching using pseudo-features|
|US8810640||May 16, 2011||Aug 19, 2014||Ut-Battelle, Llc||Intrinsic feature-based pose measurement for imaging motion compensation|
|US8876830||Aug 11, 2010||Nov 4, 2014||Zimmer, Inc.||Virtual implant placement in the OR|
|US9138165||Feb 22, 2013||Sep 22, 2015||Veran Medical Technologies, Inc.||Systems, methods and devices for forming respiratory-gated point cloud for four dimensional soft tissue navigation|
|US20050197564 *||Feb 17, 2005||Sep 8, 2005||University Of Florida Research Foundation, Inc.||System for delivering conformal radiation therapy while simultaneously imaging soft tissue|
|US20070019936 *||Jun 23, 2006||Jan 25, 2007||Rainer Birkenbach||Stereoscopic visualization device for patient image data and video images|
|US20070253614 *||Apr 28, 2006||Nov 1, 2007||Searete Llc, A Limited Liability Corporation Of The State Of Delaware||Artificially displaying information relative to a body|
|US20090054910 *||Dec 15, 2006||Feb 26, 2009||Ao Technology Ag||Method and device for computer assisted distal locking of intramedullary nails|
|U.S. Classification||600/427, 600/407|
|International Classification||A61B5/05, A61B5/055, A61B6/00, A61B19/00|
|Cooperative Classification||A61B2019/5289, A61B6/461, A61B19/5244, A61B2019/5291, A61B6/547, A61B2019/5255, A61B19/5212, A61B6/462, A61B19/5202, A61B6/00, A61B19/52, A61B5/055, A61B2019/5272, A61B2019/5229|
|European Classification||A61B6/46B2, A61B6/46B, A61B6/54H, A61B19/52H12, A61B19/52, A61B6/00|
|Jun 26, 2002||AS||Assignment|
Owner name: BRAINLAB AG, GERMANY
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:VILSMEIER, STEFAN;REEL/FRAME:013046/0062
Effective date: 20020610
|Nov 19, 2009||FPAY||Fee payment|
Year of fee payment: 4
|Nov 15, 2013||FPAY||Fee payment|
Year of fee payment: 8