Publication number: US 7050845 B2
Publication type: Grant
Application number: US 10/134,977
Publication date: May 23, 2006
Filing date: Apr 29, 2002
Priority date: Dec 18, 2001
Fee status: Paid
Also published as: DE50100535D1, DE50101677D1, EP1321105A1, EP1321105B1, EP1327421A1, EP1327421B1, US20030114741
Inventor: Stefan Vilsmeier
Original assignee: Brainlab AG
Projecting patient image data from radioscopic imaging methods and/or tomographic imaging methods onto video images
US 7050845 B2
Abstract
The invention relates to a device for projecting patient image data from radioscopic imaging methods and/or tomographic imaging methods onto video images, comprising an image display device (10), at least one video camera (14) and a computer-assisted navigation system which can detect the spatial positions of the image display device (10) and/or the video camera (14) and the spatial positions of a part of a patient's body (1) via tracking means (12, 3) attached to it, wherein the image display device (10) and the video camera (14) are assigned to each other and are realized as a portable unit.
Claims(14)
1. A device for projecting patient image data from radioscopic imaging methods and/or tomographic imaging methods onto video images, comprising an image display device, at least one video camera, and a computer-assisted navigation system which can detect the spatial positions of said image display device and/or said video camera and the spatial positions of a part of a patient's body via tracking means attached to it, wherein said image display device and said video camera are assigned to each other and are realized as a portable unit.
2. The device as set forth in claim 1, wherein said video camera is arranged on the rear side of said image display device.
3. The device as set forth in claim 1, wherein said video camera is spatially assigned to said image display device in a way which is predetermined and/or known to the system.
4. The device as set forth in claim 1, wherein said video camera is spatially assigned to said image display device in a way which is predetermined and/or known to the system.
5. The device as set forth in claim 1, wherein said image display device is a portable LCD flat screen.
6. The device as set forth in claim 1, wherein the image data from said radioscopic imaging method and/or tomographic imaging method to be projected onto said video images are communicated to said image display device via radio interfaces of said navigation system.
7. The device as set forth in claim 1, wherein the image data from said radioscopic imaging method and/or tomographic imaging method to be projected onto said video images are communicated to said image display device via a data cable of said navigation system.
8. The device as set forth in claim 1, wherein said video camera exhibits a small aperture and a low depth of field.
9. The device as set forth in claim 1, wherein a data transmission means is arranged on said video camera or on said image display device, by means of which information on the image shown on said image display device is transmitted to said navigation system, wherein the image data to be projected, from said radioscopic imaging methods and/or tomographic imaging methods, communicated from said navigation system to said image display device, are adapted in size and position on said image display device such that they are in registration with said video images.
10. The device as set forth in claim 9, wherein said navigation system comprises a contour-matching unit which superimposes said video images and said projected images, in particular via outer contour matching of said part of the patient's body.
11. The device as set forth in claim 1, wherein an illuminating device for said part of the patient's body is provided on said camera or in its vicinity on said image display device.
12. The device as set forth in claim 1, wherein control apparatus and command input apparatus for altering the combined image shown are attached to the portable unit.
13. The device as set forth in claim 12, wherein the control apparatus and command input apparatus alter the focal plane both of the radioscopic or tomographic image data employed and of the video image shown.
14. The device as set forth in claim 12, wherein the control apparatus and command input apparatus alter the respective degree of transparency of one of the images with respect to the other.
Description

The present invention relates to a device for projecting patient image data from radioscopic imaging methods and/or tomographic imaging methods onto video images. The device comprises an image display device, at least one video camera and a computer-assisted navigation system which can detect the spatial position of the image display device and/or the video camera and the spatial positions of a part of a patient's body via tracking means attached to it.

Such devices serve to visually assist a physician during diagnosis or while treating a patient. They should provide the option of combining images from the patient's interior with conventional video images, in order to facilitate treatment and diagnosis.

BACKGROUND OF THE INVENTION

A video-based surgical target system is known from U.S. Pat. No. 5,765,561, for which among other things the use of a tracked video camera is proposed, whose images are superimposed onto assigned images from radioscopic imaging methods and/or tomographic imaging methods on the statically arranged screen of a navigation system. A particular disadvantage of such an embodiment is that the physician wishing to view the images always has to look away from the patient, to the screen of the navigation system. He can then no longer pay exact attention to the position of the camera, i.e. the exterior position from which the superimposed images are being obtained. The system is therefore somewhat awkward and does not enable direct inspection within the part of the patient's body.

SUMMARY OF THE INVENTION

The present invention provides a device for projecting patient image data from radioscopic imaging methods and/or tomographic imaging methods onto video images, which overcomes the disadvantages of the prior art cited above. In particular, it should be made possible for the treatment staff to obtain the desired image data directly and in direct view of the patient.

In accordance with the invention, an image display device and the video camera are assigned to each other and realized as a portable unit. Using this embodiment in accordance with the invention, the physician carrying out the treatment has the option of holding the image display device, together with its assigned camera, in front of the part of the patient's body, and thus of reading the desired anatomical information while looking directly at the patient. It is possible in this respect to adjust the video image or the image from the radioscopic imaging method and/or tomographic imaging method to be more or less transparent, as desired, in the image shown on the image display device, in order to highlight specifically the desired information most clearly within the display as a whole. Using the device in accordance with the invention, another step is taken towards creating a “glass patient”, simply by holding the image display device in front of the relevant part of the patient's body. Since the assignment of the camera to the image display device is tracked via the tracking means, the position of these two elements is always known, and the navigation system has information on how and from which direction the part of the patient's body is being viewed. In particular when performing minimally invasive surgery, in which the area of surgery is not exposed, the device in accordance with the invention thus offers the physician carrying out the treatment the option of checking the exact position of the parts of the patient's anatomy during the operation. Tracked or navigated instruments can of course also be shown on the image display device. The physician no longer has to look away from the patient in order to look into the interior of the patient's body.

In accordance with a preferred embodiment of the invention, the video camera is arranged on the rear side of the image display device. This results in a compact apparatus; the camera can be completely removed from the operator's field of view. It should be noted in principle that the camera can be attached at any point on the rear side of the image display device, or can be integrated into it, for example even in the area of the edge of the image display device.

The video camera is preferably spatially assigned to the image display device in a way which is predetermined and/or known to the system, and the tracking means is arranged on the image display device. Spatially assigning the video camera and the image display device in a way which is predetermined and/or known to the system opens up the option of tracking just one of these two apparatus by means of the tracking means, since it is also known exactly where the other apparatus is situated. The designer is then also at liberty to arrange the tracking means either on the image display device itself or on the video camera, according to how said tracking means can best be detected by the navigation system. In optically based navigation systems, the tracking means can also be an arrangement of reflecting or actively emitting markers, although it is of course also possible within the context of the present invention to perform navigation via a magnetic tracking system in which coils are tracked in a generated magnetic field.

In order to design the image display device to be easy to handle, it can be designed as a portable LCD flat screen.

Data transmission between the individual apparatus of the device in accordance with the invention is realized in different ways. On the one hand, it is possible to communicate the image data from the radioscopic imaging method and/or tomographic imaging method to be projected onto the video images to the image display device via radio interfaces of the navigation system, while on the other hand communicating the data via data cables is also conceivable. One embodiment ought to prove particularly advantageous within the context of handling the device in accordance with the invention, in which the image display device is provided with its own energy supply (battery or power pack) and transmission is by radio, since this achieves the most extensive freedom of handling.

In accordance with a preferred embodiment, the video camera should exhibit a small aperture and a low depth of field, so that only a small area of the detected image is in the focal plane. Using such a camera, it is possible to determine the distance between the camera and the focal plane. If the spatial position of the camera is known from the navigation system, and the distance to the image plane is also known, then the computer system connected to the navigation system can calculate the spatial position of the video image in real time. This enables the video image and the image data from a radioscopic imaging method and/or tomographic imaging method carried out beforehand to be optimally assigned.
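This geometric relationship can be sketched in a few lines. The helper below is illustrative only (not taken from the patent): given the camera pose reported by the navigation system and the fixed focal distance, it returns a point on the in-focus plane and the viewing direction in world coordinates.

```python
import numpy as np

def focal_plane_in_world(cam_pose, focal_distance):
    """Locate the plane of sharpest focus in navigation (world) coordinates.

    cam_pose: 4x4 homogeneous transform mapping camera coordinates to world
              coordinates, as derived from the tracked markers.
    focal_distance: fixed distance from the camera centre to the in-focus
                    plane (set by the small aperture / low depth of field).
    Returns a point on the plane and its normal (the optical axis).
    """
    R = cam_pose[:3, :3]                      # camera orientation
    t = cam_pose[:3, 3]                       # camera centre in world frame
    view_dir = R @ np.array([0.0, 0.0, 1.0])  # optical axis = camera +z
    return t + focal_distance * view_dir, view_dir
```

With the plane known in world coordinates, the matching slice of the pre-operative volume can be selected for display.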

In accordance with another advantageous embodiment of the device in accordance with the invention, a data transmission means is arranged on the video camera or on the image display device, wherein information on the image shown on the image display device is transmitted to the navigation system by means of said data transmission means. The image data to be projected, communicated from the navigation system to the image display device, are then adapted in size and position on the image display device such that they are in registration with the video images, i.e. a 1:1 representation of the two images in overlap is created. To this end, a contour-matching unit can be provided in the navigation system, said contour-matching unit superimposing the video images and the projected images, in particular via outer contour matching of the part of the patient's body.
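As a rough illustration of what such a contour-matching unit computes, the sketch below estimates an isotropic scale and a translation from silhouette moments. `match_by_contour_moments` is a hypothetical, much-simplified stand-in for the unit described above, assuming binary silhouettes of the body part have already been extracted from both images; real systems match the full outer contour.

```python
import numpy as np

def match_by_contour_moments(video_mask, proj_mask):
    """Estimate scale and translation bringing the projected image's outer
    contour into registration with the body contour in the video image.

    video_mask, proj_mask: 2-D boolean arrays marking the body silhouette.
    Returns (scale, dy, dx): resize proj_mask by `scale`, then shift it by
    (dy, dx) to overlay it 1:1 on the video image.
    """
    def moments(mask):
        ys, xs = np.nonzero(mask)
        return len(ys), np.array([ys.mean(), xs.mean()])

    area_v, c_v = moments(video_mask)
    area_p, c_p = moments(proj_mask)
    scale = np.sqrt(area_v / area_p)   # isotropic scale from the area ratio
    shift = c_v - scale * c_p          # translation applied after scaling
    return scale, shift[0], shift[1]
```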

It should also be noted here that it may be conceivable and advantageous within the context of the invention to attach to the portable unit many and various control apparatus and command input apparatus for altering the combined image shown. For example, the focal plane both of the radioscopic imaging method or tomographic imaging method employed and of the video image shown can be altered using such control elements. As already mentioned above, it is also possible to provide control elements which alter the respective degree of transparency of one of the images with respect to the other.

An illuminating device for the part of the patient's body is advantageously provided on the camera or in its vicinity on the image display device, especially for example as a ring light (LEDs, lamps, fluorescent tubes).

BRIEF DESCRIPTION OF THE DRAWINGS

The invention will now be described in more detail, by way of an embodiment. In the drawings provided for this purpose, there is shown:

FIG. 1 a schematic representation of the use of a device in accordance with the invention for viewing the knee joint of a patient, in aspect;

FIG. 2 a lateral view of the device according to FIG. 1, together with a schematically represented navigation system; and

FIG. 3 a flow diagram of operating the device in accordance with the invention.

DETAILED DESCRIPTION

FIG. 1 schematically shows a device in accordance with the invention, using which the interior of a patient's joint may for example be inspected. The leg 1 of the patient is to be examined at the joint 5. To this end, the flat screen 10 designed in accordance with the invention is held directly in front of the joint, and an image then appears on the screen 10 containing both image information from the video image of the camera 14 (FIG. 2) and image information on the interior anatomical structure, the latter being communicated, for example by radio, from a navigation apparatus. To this end, a tomographic image (MR or CT image) of the patient or of the patient's leg 1 is produced beforehand.

As is indicated more clearly in FIG. 2, the camera 14 is situated, firmly attached, on the rear side of the screen 10.

This type of image representation is made possible by the spatial position of the camera 14 or of the image screen 10 being tracked around the leg 1 of the patient by means of a tracking means. In the present case, this tracking means is an optical tracking means, namely an arrangement 12 of markers whose position is detected by means of the schematically represented navigation system 20. To this end, the navigation system 20 includes for example two spaced-apart infrared cameras 24, 26 and an infrared light emitter 22. The three-dimensional spatial position of the arrangement 12 of markers, and therefore also of the screen 10 and the firmly installed camera 14, can be determined from the two images from the infrared cameras 24, 26. Furthermore, it is possible to determine the spatial position of the leg 1 using a corresponding arrangement 3 of markers.
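The stereo triangulation implied here can be illustrated with a generic midpoint method. The function below is a sketch, not the navigation system's actual algorithm, and it assumes the viewing rays toward a marker have already been extracted from the two infrared camera images:

```python
import numpy as np

def triangulate_marker(c1, d1, c2, d2):
    """Estimate a marker's 3-D position as the midpoint of the shortest
    segment between the two viewing rays.

    c1, c2: optical centres of the two infrared cameras (world coords).
    d1, d2: direction vectors of the rays toward the marker.
    """
    d1 = d1 / np.linalg.norm(d1)
    d2 = d2 / np.linalg.norm(d2)
    w0 = c1 - c2
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ w0, d2 @ w0
    denom = a * c - b * b          # near zero only for parallel rays
    s = (b * e - c * d) / denom    # closest-approach parameter on ray 1
    t = (a * e - b * d) / denom    # closest-approach parameter on ray 2
    p1 = c1 + s * d1
    p2 = c2 + t * d2
    return (p1 + p2) / 2.0
```

Running this for each marker of the arrangement 12 yields the marker positions from which the pose of the screen and camera can be derived.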

If the position of the screen 10 and the camera 14 is then known, the position of the current image plane can be determined as follows: the camera 14 has a small aperture and a low depth of field, such that the image only appears in focus in a particular focal plane. This allows the position of said image plane to be determined, and an image from the CT or MR imaging method which lies exactly in said image plane to be displayed on the screen 10. The important thing here is that the video image is shown at a scale of 1:1 with respect to the tomographic image. If the joint of a patient is then viewed using the device in accordance with the invention, one sees for example the contours of the patient's leg at the upper and lower edge of the screen. Using this information, the image from the tomographic imaging method can then be adapted such that it lies exactly over the video image, to ensure a realistic impression. The person viewing the screen 10 can then view on the screen both the video image, i.e. a real image, and the “interior” image from the tomographic imaging method, at any desired depth and with any adjusted transparency of the two images. These adjustments can be made via control devices (not shown), for example flip switches on the front side of the screen.
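The adjustable transparency of the two registered images amounts to simple alpha blending. A minimal sketch, assuming both images are already brought into 1:1 registration as described above:

```python
import numpy as np

def blend_overlay(video_rgb, slice_rgb, alpha):
    """Blend the video image with the registered MR/CT slice.

    Both images must already be in registration (same shape, 1:1 scale).
    alpha = 0 shows only the video image, alpha = 1 only the tomographic
    slice; intermediate values mix the two, as set via the unit's controls.
    """
    if video_rgb.shape != slice_rgb.shape:
        raise ValueError("images must be registered to the same geometry")
    out = (1.0 - alpha) * video_rgb.astype(float) + alpha * slice_rgb.astype(float)
    return np.clip(out, 0, 255).astype(np.uint8)
```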

For the first time, therefore, the device in accordance with the invention gives the observer the option of looking into the interior treatment area in real time while looking at the patient, and of planning or adapting the treatment accordingly. He no longer has to look away from the patient to a firmly installed navigation screen, which is particularly advantageous when the patient's responses to particular medical measures are to be tested. Since the “interior” images from the radioscopic imaging method and/or tomographic imaging method are formed such that they represent virtual 3-D views of the interior parts of the body, the image as a whole can be made very vivid.

Although a radioscopic and/or tomographic image is therefore not taken in situ, a virtual “glass patient” is thus created.

FIG. 3 shows the sequence of a method such as is performed using the device in accordance with the invention; the activity carried out by the computer of the navigation system is highlighted here in particular. The upper left side of the flow diagram relates to the processes in video image detection, while on the right-hand side image detection for the interior of the body is explained, using the example of MR/CT image detection. In video image detection, the focal plane is firstly determined, as has already been explained above. Then the distance from the focal plane to the camera is determined, whereupon the camera is registered by means of its tracking means in the navigation system. The determined spatial position of the camera with respect to the navigation system is then known.

With respect to MR/CT image detection, an MR/CT image is firstly detected beforehand, whereupon a three-dimensional image is reconstructed from the data obtained. Then, the object—for example, a part of the patient's body—is spatially registered, i.e. the spatial relation of the position of the object to the MR/CT images is determined. This can only be realized by spatially detecting an arrangement of markings situated on the object.
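The spatial registration step can be illustrated with a standard rigid point-set fit (the Kabsch algorithm). This is a generic sketch, not the patent's own implementation: it maps marker positions measured by the navigation system onto the same markers located in the MR/CT data set.

```python
import numpy as np

def register_rigid(markers_nav, markers_mrct):
    """Least-squares rigid (rotation + translation) registration.

    markers_nav, markers_mrct: (N, 3) arrays of corresponding marker
    positions in navigation-system and MR/CT coordinates, respectively.
    Returns R, t such that R @ p_nav + t approximates p_mrct.
    """
    cn = markers_nav.mean(axis=0)
    cm = markers_mrct.mean(axis=0)
    # Cross-covariance of the centred point sets
    H = (markers_nav - cn).T @ (markers_mrct - cm)
    U, _, Vt = np.linalg.svd(H)
    # Sign correction guards against a reflection solution
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = cm - R @ cn
    return R, t
```

The resulting transform ties the patient's tracked position to the pre-operative volume, so that reslicing the volume in the camera's focal plane becomes a pure coordinate computation.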

When the video image detection and MR/CT image processing steps cited above have been performed, then video/MR/CT-image matching is performed, i.e. the two images are harmonized under computer guidance with respect to image position and image size. Then, the MR/CT/3-D images, projected onto the video images, can be superimposed on the image display device, and can assist the physician performing the treatment in diagnosis and treatment.

Patent Citations
Cited Patent | Filing date | Publication date | Applicant | Title
US5526812 | Oct 27, 1995 | Jun 18, 1996 | General Electric Company | Display system for enhancing visualization of body structures during medical procedures
US5694142 | Nov 17, 1994 | Dec 2, 1997 | General Electric Company | Interactive digital arrow (d'arrow) three-dimensional (3D) pointing
US5715836 | Feb 15, 1994 | Feb 10, 1998 | Kliegis; Ulrich | Method and apparatus for planning and monitoring a surgical operation
US5765561 | Oct 7, 1994 | Jun 16, 1998 | Medical Media Systems | Video-based surgical targeting system
US5961456 | Mar 28, 1996 | Oct 5, 1999 | Gildenberg; Philip L. | System and method for displaying concurrent video and reconstructed surgical views
US6038467 | Jan 16, 1998 | Mar 14, 2000 | U.S. Philips Corporation | Image display system and image guided surgery system
US6477400 * | Aug 16, 1999 | Nov 5, 2002 | Sofamor Danek Holdings, Inc. | Fluoroscopic image guided orthopaedic surgery system with intraoperative registration
US6490467 * | Jun 26, 1998 | Dec 3, 2002 | Surgical Navigation Technologies, Inc. | Surgical navigation systems including reference and localization frames
US6640128 * | Apr 13, 2001 | Oct 28, 2003 | Brainlab AG | Method and device for the navigation-assisted dental treatment
US20010035871 | Mar 30, 2001 | Nov 1, 2001 | Johannes Bieger | System and method for generating an image
US20020082498 | Oct 5, 2001 | Jun 27, 2002 | Siemens Corporate Research, Inc. | Intra-operative image-guided neurosurgery with augmented reality visualization
US20020140694 | Mar 27, 2001 | Oct 3, 2002 | Frank Sauer | Augmented reality guided instrument positioning with guiding graphics
US20020163499 | Mar 28, 2002 | Nov 7, 2002 | Frank Sauer | Method and apparatus for augmented reality visualization
EP0672389A2 | Mar 14, 1995 | Sep 20, 1995 | Roke Manor Research Limited | Video-based system for computer assisted surgery and localisation
WO1996020421A1 | Dec 22, 1995 | Jul 4, 1996 | Bernhard Braunecker | Microscope, in particular a stereomicroscope, and a method of superimposing two images
Non-Patent Citations
1. Sauer, Frank et al. "Augmented Reality Visualization in iMRI Operating Room: System Description and Pre-clinical Testing." Proceedings of SPIE, Medical Imaging, Jan. 23-26, 2002.
Referenced by
Citing Patent | Filing date | Publication date | Applicant | Title
US7463823 | Jun 23, 2006 | Dec 9, 2008 | Brainlab AG | Stereoscopic visualization device for patient image data and video images
US7907987 | Feb 17, 2005 | Mar 15, 2011 | University of Florida Research Foundation, Inc. | System for delivering conformal radiation therapy while simultaneously imaging soft tissue
US8118818 * | Dec 15, 2006 | Feb 21, 2012 | AO Technology AG | Method and device for computer assisted distal locking of intramedullary nails
US8190233 | Oct 30, 2009 | May 29, 2012 | University of Florida Research Foundation, Inc. | System for delivering conformal radiation therapy while simultaneously imaging soft tissue
US8442281 * | Apr 28, 2006 | May 14, 2013 | The Invention Science Fund I, LLC | Artificially displaying information relative to a body
US8750568 | May 22, 2012 | Jun 10, 2014 | Covidien LP | System and method for conformal ablation planning
US8810640 | May 16, 2011 | Aug 19, 2014 | UT-Battelle, LLC | Intrinsic feature-based pose measurement for imaging motion compensation
US20090054910 * | Dec 15, 2006 | Feb 26, 2009 | AO Technology AG | Method and device for computer assisted distal locking of intramedullary nails
Classifications
U.S. Classification: 600/427, 600/407
International Classification: A61B5/05, A61B5/055, A61B6/00, A61B19/00
Cooperative Classification: A61B2019/5289, A61B6/461, A61B19/5244, A61B2019/5291, A61B6/547, A61B2019/5255, A61B19/5212, A61B6/462, A61B19/5202, A61B6/00, A61B19/52, A61B5/055, A61B2019/5272, A61B2019/5229
European Classification: A61B6/46B2, A61B6/46B, A61B6/54H, A61B19/52H12, A61B19/52, A61B6/00
Legal Events
Date | Code | Event | Description
Nov 15, 2013 | FPAY | Fee payment | Year of fee payment: 8
Nov 19, 2009 | FPAY | Fee payment | Year of fee payment: 4
Jun 26, 2002 | AS | Assignment | Owner name: BRAINLAB AG, GERMANY; Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:VILSMEIER, STEFAN;REEL/FRAME:013046/0062; Effective date: 20020610