
[0001]
Reference is hereby made to Provisional Application No. 60/212,199 filed on Jun. 16, 2000 in the names of Benedicte Bascle, Nassir Navab, and Bernhard Geiger and entitled “METHOD FOR NEEDLE PLACEMENT IN A FIXED NUMBER OF ITERATIONS USING PERSPECTIVE INVARIANTS AND METHOD FOR DETERMINING THE BEST ENTRY POINT FOR PERCUTANEOUS PROCEDURES”, whereof the disclosure is herein incorporated by reference.

[0002]
Reference is also herein made to the following documents whereof the disclosure is herein incorporated by reference: U.S. Pat. No. 6,028,912 “APPARATUS AND METHOD FOR POINT RECONSTRUCTION AND METRIC MEASUREMENT ON RADIOGRAPHIC IMAGES”; U.S. Pat. No. 5,930,329 “APPARATUS AND METHOD FOR DETECTION AND LOCALIZATION OF A BIOPSY NEEDLE OR SIMILAR SURGICAL TOOL IN A RADIOGRAPHIC IMAGE”; and pending U.S. patent application Ser. No. 09/408,929, Attorney Docket No. 99P7849, entitled “METHOD AND APPARATUS FOR VISUAL SERVOING OF A LINEAR APPARATUS” and filed on Sep. 30, 1999 in the name of inventor Benedicte Bascle.

[0003]
The present invention relates to the field of percutaneous procedures and, more specifically, to a method and apparatus for needle placement, such as for needle biopsy, and for determining an appropriate entry point for such a needle.

[0004]
In accordance with an aspect of the present invention, a method is provided for determining the best entry point for percutaneous procedures, given a primary target for the biopsy and a secondary target through which the needle must pass on its way to the primary target.

[0005]
In accordance with another aspect of the invention, a method is provided for positioning a biopsy needle from a given entry point to a given target.

[0006]
In accordance with another aspect of the invention, a method is provided for visual servoing of a needle in a plane in a fixed number of iterations.

[0007]
In accordance with an aspect of the present inventive concepts, it is herein shown how precise 3D alignment of a tool from a fixed entry point to a target can be achieved by performing visual servoing of the tool in 3 successive planes using two different views. Visual servoing of the needle or tool in each plane is achieved using a technique based on projective invariants. 3D alignment is obtained in exactly twelve iterations using the technique. If there are multiple (n) targets, the approach does not require n*12 iterations, but only 6*(n+1).

[0008]
In accordance with another aspect of the present inventive concepts, a method for finding the entry point to reach a given target while passing through a secondary target is herein described.

[0009]
The invention will be more fully understood from the following detailed description, in conjunction with the Drawing, in which

[0010]
FIG. 1 shows needle placement by visual servoing in 3 successive planes using 2 views;

[0011]
FIG. 2 shows visual servoing of a needle in a plane using cross-ratios;

[0012]
FIG. 3 shows needle orientation from fixed point F to multiple targets;

[0013]
FIG. 4 shows a best entry point to reach one target by passing through a secondary target;

[0014]
FIG. 5 shows a flow diagram or chart of a method in accordance with the invention for determining the best entry point for percutaneous procedures, given a primary target for the biopsy and a secondary target through which the needle must pass on its way to the primary target;

[0015]
FIG. 6 shows a flow diagram or chart of a method in accordance with the invention for positioning a biopsy needle from a given entry point to a given target; and

[0016]
FIG. 7 shows a flow diagram or chart of a method in accordance with the invention for visual servoing of a needle in a plane in a fixed number of iterations.

[0017]
With regard to needle placement by visual servoing in 3 successive planes using 2 views, reference is made to copending U.S. patent application No. 08/722,725, Attorney Docket No. 96P7553, entitled “APPARATUS AND METHOD FOR POSITIONING A BIOPSY NEEDLE” and filed Sep. 30, 1996 in the names of inventors Nassir Navab and Bernhard Geiger, whereof the disclosure is herein incorporated by reference.

[0018]
For illustrative purposes, it is assumed that imaging is provided by a simple uniplanar X-ray fluoroscope (C-arm) or any other imaging modality whose imaging process can be modeled by a pinhole camera model. The needle itself is manipulated by a mechanical device such as a passive or active robotic arm that allows arbitrary pivoting of the needle around its tip. The operator, physician, surgeon or doctor, chooses a fixed needle entry point F on the patient, and places the needle device in such a way that its needle tip is located at that entry point. No calibration of the setup or registration of the patient to the setup is required.

[0019]
The C-arm is then positioned so that the target area and a part of the needle are both visible on the X-ray image. The surgeon defines the projection t of the 3D anatomical target T in the image. At this point of the description, it is assumed that T remains static during the procedure.

[0020]
First, the mechanical device moves the needle in an arbitrary plane P_{1} containing F until the projection of the needle is aligned with the target t in the image (see FIG. 1a). This can be performed in 3 iterations using the visual servoing technique presented in the next section. The final position of the needle in plane P_{1} is called L_{1}.

[0021]
The system repeats this process by choosing a second plane P_{2 }containing F. The choice of P_{2 }is arbitrary. In practice, the system takes P_{2 }perpendicular to P_{1 }for precision purposes. The needle is rotated in plane P_{2 }until it is visually aligned to the target t in the image (see FIG. 1b). This is done as previously described by using the visual servoing technique presented in section 2. The position of the needle that gives visual alignment is called L_{2 }.

[0022]
The positions of the needle L_{1}⊂P_{1} and L_{2}⊂P_{2} define a unique plane P, which contains the X-ray source, the target point T and the fixed entry point F. This is essentially the maximum information that can be obtained from a single viewpoint.

[0023]
The physician needs to move the C-arm to a second viewing direction. The surgeon defines the projection t′ of the 3D target point T in the new image. Next, the needle is moved only in the plane P until the needle is once again visually aligned to the target in the image (see FIG. 1c). This is done using the visual servoing approach of section 2. This results in the final 3D alignment of the needle, the entry point and the anatomic target point. The needle is then ready to be inserted. The correctness of the alignment can also be checked by moving the C-arm to a third viewing direction.

[0024]
Visual Servoing of a Needle in a Plane Using Cross-Ratios

[0025]
With regard to visual servoing of a needle in a plane using cross-ratios, reference is made to the aforementioned U.S. patent application Ser. No. 09/408,929, Attorney Docket No. 99P7849, entitled “METHOD AND APPARATUS FOR VISUAL SERVOING OF A LINEAR APPARATUS”.

[0026]
In the previous section, it was shown how 3D alignment of a needle to a target can be achieved by performing visual servoing of the needle in three successive planes. There now follows an explanation of how the visual servoing of the needle in a plane is performed. This is a new technique based on projective invariants and is described as follows:

[0027]
Let π be the plane in which the needle is rotated, and F the fixed point around which the rotation is done. The initial orientation L_{1 }of the needle in plane π is arbitrary. T is the 3D target point.

[0028]
An image I is taken of the scene. The 3D position L_{1} of the needle projects onto line l_{1} in the image. The position of l_{1} is detected and stored in memory.

[0029]
The needle is rotated in plane π around fixed point F by an arbitrary amount θ_{1}. This brings it to position L_{2}. Another image is taken. The 3D line L_{2} projects onto 2D line l_{2} in the image. The position of l_{2} is detected and stored in memory.

[0030]
The needle is rotated again by an angle θ_{2}. This puts it into position L_{3}. Another image is obtained. L_{3} projects onto l_{3} in the image. l_{3} is detected and its position stored in memory.

[0031]
The intersection point of l_{1}, l_{2} and l_{3}, denoted f, is determined by least squares. Note that f is the projection of the fixed point F around which the needle is rotated.
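The patent does not prescribe a particular least-squares formulation for f. One common choice, sketched below in Python (NumPy and the homogeneous line representation (a, b, c) with a*x + b*y + c = 0 are illustrative assumptions, not part of the patent), minimizes the sum of squared perpendicular point-to-line distances:

```python
import numpy as np

def lsq_intersection(lines):
    # lines given as (a, b, c) with a*x + b*y + c = 0; normalize so that
    # a^2 + b^2 = 1, making each residual a perpendicular point-line distance
    A = np.asarray(lines, dtype=float)
    A /= np.linalg.norm(A[:, :2], axis=1, keepdims=True)
    # minimize sum_i (a_i x + b_i y + c_i)^2 over the point (x, y)
    p, *_ = np.linalg.lstsq(A[:, :2], -A[:, 2], rcond=None)
    return p

# three noise-free lines through (2, 3): x = 2, y = 3, x + y = 5
f = lsq_intersection([(1, 0, -2), (0, 1, -3), (1, 1, -5)])
assert np.allclose(f, [2.0, 3.0])
```

With noisy line measurements the same call returns the point closest, in the least-squares sense, to all detected lines.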

[0032]
Let t be the projection of the 3D target T in the image. We assume t remains static during the procedure. The position of t is given interactively by the surgeon.

[0033]
The line l_{t}=(ft) is constructed. Note that l_{t} is the 2D projection of the 3D position L_{t} of the needle that achieves visual servoing (i.e., the visual alignment of the needle and the target) and that we wish to estimate.

[0034]
l_{1}, l_{2}, l_{3} and l_{t} form a pencil of 2D lines. The cross-ratio c=(l_{1}, l_{2}, l_{3}, l_{t}) of these lines is calculated. This is done using an arbitrary line m that intersects all four lines. If q_{1}=l_{1}∩m, q_{2}=l_{2}∩m, q_{3}=l_{3}∩m and q_{t}=l_{t}∩m are the intersections of l_{1}, l_{2}, l_{3}, l_{t} with m, then

c=(l_{1}, l_{2}, l_{3}, l_{t})=(q_{1}, q_{2}, q_{3}, q_{t})=(q_{1}q_{3}*q_{2}q_{t})÷(q_{1}q_{t}*q_{2}q_{3}).

[0035]
Note that the value of c is invariant to the choice of the line m.
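As an illustration of this invariance, the minimal Python sketch below computes the cross-ratio of four collinear points under the convention of the preceding formula, with q_{i}q_{j} taken as the signed distance q_{j} minus q_{i} along m. The specific projective map h is a hypothetical example, not part of the patent:

```python
from fractions import Fraction

def cross_ratio(q1, q2, q3, qt):
    # (q1 q3 * q2 qt) / (q1 qt * q2 q3), with qi qj the signed distance qj - qi
    return ((q3 - q1) * (qt - q2)) / ((qt - q1) * (q3 - q2))

def h(x):
    # an arbitrary projective map of the line, x -> (2x + 1)/(x + 3)
    return (2 * x + 1) / (x + 3)

# the cross-ratio survives the projective map unchanged
pts = [Fraction(v) for v in (0, 1, 2, 3)]
assert cross_ratio(*pts) == cross_ratio(*map(h, pts)) == Fraction(4, 3)
```

Exact rational arithmetic is used here only to make the invariance check transparent; floating point works equally well in practice.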

[0036]
Cross-ratios are one of the invariants of projective geometry. Therefore the cross-ratio of a pencil of 3D lines is equal to the cross-ratio of the pencil of 2D lines formed by its perspective projections in an image. Consequently, the cross-ratio (L_{1}, L_{2}, L_{3}, L_{t}) of the four 3D lines L_{1}, L_{2}, L_{3} and L_{t} is equal to c, i.e. (L_{1}, L_{2}, L_{3}, L_{t})=(l_{1}, l_{2}, l_{3}, l_{t})=c.

[0037]
From (L_{1}, L_{2}, L_{3}, L_{t}), we estimate the angle θ_{t} necessary to rotate the needle from position L_{3} to L_{t}. The formula for θ_{t} comes from the relationship between the cross-ratio of four lines and the angle between these lines. This gives:

(L_{1}, L_{2}, L_{3}, L_{t})=(sin(θ_{1}+θ_{2})*sin(θ_{2}+θ_{t}))÷(sin(θ_{1}+θ_{2}+θ_{t})*sin θ_{2}).

[0038]
Using the fact that (L_{1}, L_{2}, L_{3}, L_{t})=c, the equation can be rewritten as follows:

(c-1) sin θ_{2} cos θ_{t}+(c sin θ_{2}/tan(θ_{1}+θ_{2})-cos θ_{2}) sin θ_{t}=0.

[0039]
This equation in θ_{t} is solved using the change of variable g=tan(θ_{t}/2). Note that there are in general 2 solutions to this equation. However, these solutions are equal modulo π, so that they define the same line L_{t}.
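Since the equation has the form a cos θ_{t} + b sin θ_{t} = 0, it can equivalently be solved with a single arctangent, which gives the same line as the tan(θ_{t}/2) substitution. A minimal Python sketch of this recovery step (an illustration under that equivalence, not the patent's implementation):

```python
import math

def cross_ratio_of_angles(t1, t2, tt):
    # (L1, L2, L3, Lt) expressed through the rotation angles between the lines
    return (math.sin(t1 + t2) * math.sin(t2 + tt)) / (math.sin(t1 + t2 + tt) * math.sin(t2))

def solve_theta_t(c, t1, t2):
    # (c - 1) sin t2 cos tt + (c sin t2 / tan(t1 + t2) - cos t2) sin tt = 0
    # has the form a cos tt + b sin tt = 0, so tan tt = -a / b; the two
    # solutions differ by pi and therefore define the same line Lt
    a = (c - 1.0) * math.sin(t2)
    b = c * math.sin(t2) / math.tan(t1 + t2) - math.cos(t2)
    return math.atan2(-a, b) % math.pi
```

A round trip checks consistency with the cross-ratio formula: computing c from known rotation angles and solving again returns the original θ_{t}.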

[0040]
The needle is rotated by angle θ_{t }from position L_{3 }to L_{t}. This achieves visual servoing. At position L_{t}, the needle is visually aligned to the target in the 2D image.

[0041]
Note that only visual alignment is achieved. Unless the 3D target T belongs to plane π, full 3D alignment is not achieved. As shown in section 1, complete 3D alignment can be obtained only by performing visual servoing of the needle in several successive planes.

[0042]
It should be noted that this visual servoing technique does not require any camera calibration. It also converges in exactly three iterations, contrary to most visual servoing approaches, which require a variable and typically larger number of iterations. This is important in X-ray applications where each new image increases the radiation exposure of both patient and surgeon.

[0043]
This visual servoing approach can be applied to any imaging device that can be approximated by a pinhole camera. In applications where the number of iterations is not critical, precision can be improved by considering n>3 successive needle positions L_{1}, L_{2}, . . . , L_{n}. The angle θ_{t} can then be estimated by least-squares approximation from all the possible cross-ratios between lines L_{1}, L_{2}, . . . , L_{n}.

[0044]
In accordance with the present inventive concepts, combining both approaches (see sections 1 and 2) ensures that 3D needle placement can be achieved in a fixed number (12) of iterations. This is very important, as it limits the radiation exposure of both surgeon and patient and is an advantage of the present method over prior art methods, which usually cannot guarantee the number of iterations they will need to converge.

[0045]
If there are several targets to align the needle to, the alignment to all n targets can be performed in 6*(n+1) iterations instead of 12*n iterations, since some of the steps of the alignment can be reused for several targets. The variation for orientation of a needle from a fixed point to multiple targets by visual servoing is as follows:

[0046]
It is herein assumed for the purpose of illustrative example that imaging is provided by a simple uniplanar X-ray fluoroscope (C-arm) or another imaging modality that can be approximated by a pinhole camera model. The needle itself is manipulated by a mechanical device such as a passive or active robotic arm that allows arbitrary pivoting of the needle around its tip. The surgeon chooses a fixed needle entry point F on the patient, and places the needle device in such a way that its tip is located at that entry point. No calibration of the setup or registration of the patient to the setup is required.

[0047]
The C-arm or other imaging modality is then positioned so that the target area and a part of the needle are both visible on the image. This position corresponds to the first image plane. The surgeon defines the projection t of the 3D anatomical target T in the image. At this point of the description, it is assumed that T remains static during the procedure. Other target points can be defined as necessary. To simplify the description of the approach and the figures, the case of 2 targets T and U is considered; however, this is not intended to be limiting, as the approach applies to n targets.

[0048]
Let P_{1 }and P_{2 }be two arbitrary and nonparallel planes containing F (see FIG. 3a). For details on the choice of P_{1 }and P_{2}, see discussion below.

[0049]
The mechanical device (passive mechanical arm or active robot) first places the needle in plane P_{1} at an arbitrary position L_{1}⊂P_{1} (see FIG. 3b). An image is taken. 3D line L_{1} projects onto 2D line l_{1} in this image (see FIG. 3c). Then the needle is rotated in plane P_{1} by an arbitrary angle θ_{1}. This brings it to position L_{2}, which projects onto 2D line l_{2} in a new image. The needle is again rotated, this time by an amount θ_{2}. This puts it into position L_{3}. Another image is obtained and the corresponding 2D line l_{3} is measured. The intersection point of l_{1}, l_{2} and l_{3}, denoted f, is determined by least squares. Note that f is the projection of the fixed point F around which the needle is rotated.

[0050]
Let t and u be the 2D projections of the 3D targets T and U in the X-ray image. They are given interactively by the surgeon.

[0051]
Let us consider T first. The line l_{t}=(ft) is constructed. Note that l_{t} is the 2D projection of a 3D line L_{t} in plane P_{1}. The rotation angle between L_{3} and L_{t} is denoted θ_{t}. First we calculate the cross-ratio c=(l_{1}, l_{2}, l_{3}, l_{t}) of the 4 intersecting 2D lines. This can be done using an arbitrary line m that intersects all four lines. If

q_{1}=l_{1}∩m, q_{2}=l_{2}∩m, q_{3}=l_{3}∩m, q_{t}=l_{t}∩m,

[0052]
then

c=(q_{1}q_{3}*q_{2}q_{t})÷(q_{1}q_{t}*q_{2}q_{3}).

[0053]
Note that the value of c is invariant to the choice of the line m. Since cross-ratios are one of the invariants of projective geometry, we have the following equation: (L_{1}, L_{2}, L_{3}, L_{t})=(l_{1}, l_{2}, l_{3}, l_{t})=c. And from the relationship between cross-ratios and angles, we can write the following formula:

(L_{1}, L_{2}, L_{3}, L_{t})=(sin(θ_{1}+θ_{2})*sin(θ_{2}+θ_{t}))÷(sin(θ_{1}+θ_{2}+θ_{t})*sin θ_{2}).

[0054]
Therefore, we can deduce the angle θ_{t} from the value of c measured in the image by using the following equation:

(c-1) sin θ_{2} cos θ_{t}+(c sin θ_{2}/tan(θ_{1}+θ_{2})-cos θ_{2}) sin θ_{t}=0.

[0055]
There are in general 2 solutions to this equation. However, these solutions are equal modulo π, so that they define the same line L_{t}. The needle is rotated by angle θ_{t} from position L_{3} to L_{t} (see FIG. 3d). This achieves visual servoing of the needle in plane P_{1}, i.e., the visual alignment of the needle and the target T. In the remainder of this description, L_{t} will be called D_{1}^{T} (see FIG. 3e). Note that the 3D alignment of the needle to T is not achieved yet.

[0056]
Similarly, the line l_{u}=(fu) can be constructed and the rotation angle θ_{u} that achieves visual servoing of the needle to target U can be deduced from the cross-ratio (l_{1}, l_{2}, l_{3}, l_{u}). The resulting position of the needle in plane P_{1} is denoted D_{1}^{U} (see FIG. 3e). Note that the same lines l_{1}, l_{2}, l_{3} are used to achieve visual servoing in plane P_{1} for all the targets.

[0057]
The same visual servoing procedure can be applied in plane P_{2} for each target. This defines 2 lines D_{2}^{T} and D_{2}^{U} belonging to plane P_{2} and visually aligned to the targets T and U in the image defined by the first position of the X-ray C-arm (see FIG. 3e).

[0058]
Let π_{FT}=D_{1}^{T}D_{2}^{T} be the plane defined by 3D lines D_{1}^{T} and D_{2}^{T}. Since both lines project to (ft) in the image plane, this plane contains F, T and the center of the camera corresponding to the first position of the X-ray source. We call this plane the viewing plane of target T and entry point F for the first image plane (see FIG. 3e). This is the maximum information we can get about T from a single viewpoint. Similarly, π_{FU}=D_{1}^{U}D_{2}^{U} is the viewing plane of target U and entry point F for the first image plane.

[0059]
At this point, the physician needs to move the C-arm to a second viewing direction. The surgeon also needs to define the 2D projections t′ and u′ of the 3D target points T and U in the new X-ray image.

[0060]
Then we find the position of the needle in plane π_{FT} that is visually aligned to target T in the new image (see FIG. 3f). This can be done by moving the needle first to D_{1}^{T}, then to D_{2}^{T}, then rotating it to a third arbitrary position and applying our cross-ratio based approach for visual servoing of the needle in a plane (see details above). This results in the complete 3D alignment of the needle, entry point F and target point T. The needle is then ready to be inserted to reach target T. The correctness of the alignment can also be checked by moving the C-arm to a third viewing direction.

[0061]
Similarly, the 3D orientation of the needle (FU) can be determined by moving the needle in plane π_{FU }and visually aligning it to target U in the image.

[0062]
Note that the complete orientation of the needle from one fixed point to one target only takes 12 iterations (or X-ray images). However, as described above, if there are n targets, we do not need to repeat the complete needle orientation procedure for each target. Careful step counting shows that only 6*(n+1) iterations are needed to determine the 3D orientations of the needle.

[0063]
Note that this visual servoing technique does not require any camera calibration. In addition, and contrary to most visual servoing approaches, which usually require a variable and often large number of iterations, it converges in a fixed number of iterations. This is important in X-ray applications where each new image increases the radiation exposure to both patient and surgeon.

[0064]
After 3D alignment of the needle, the insertion depth required to reach the target from the entry point can be estimated using cross-ratios. For this, we use markers mounted on the needle guide at known intervals. The cross-ratio of the positions of these markers, the entry point and the target is measured in the image. Since cross-ratios are projective invariants, the 3D distance from the entry point to the target can be deduced from the cross-ratio.
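A minimal sketch of this depth recovery follows. The two-marker setup, the axis coordinates, and the cross-ratio convention are illustrative assumptions (the patent does not specify them): markers at axis positions 0 and d, the entry point at known position e, and the target at unknown depth x, all collinear on the needle axis.

```python
def cr(q1, q2, q3, qt):
    # cross-ratio convention (q1 q3 * q2 qt) / (q1 qt * q2 q3),
    # with qi qj the signed distance qj - qi along the line
    return ((q3 - q1) * (qt - q2)) / ((qt - q1) * (q3 - q2))

def depth_from_cross_ratio(c, d, e):
    # markers at axis coordinates 0 and d (known spacing), entry point at e,
    # target at unknown depth x; cr(0, d, e, x) = e*(x - d) / (x*(e - d)) = c
    # solves to x = e*d / (e - c*(e - d))
    return e * d / (e - c * (e - d))

# round trip: target at x = 5 with markers 1 unit apart and entry point at 2
c = cr(0.0, 1.0, 2.0, 5.0)
assert abs(depth_from_cross_ratio(c, 1.0, 2.0) - 5.0) < 1e-12
```

In practice c would be measured from the 2D projections of the four collinear points in the image; by projective invariance it equals the 3D cross-ratio used above.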

[0065]
Visual servoing in a plane is most precise if the plane is parallel to the image plane. Therefore, ideally, P_{1} and P_{2} should be parallel to the image plane. However, for a line D_{1}^{T} in plane P_{1} and a line D_{2}^{T} in plane P_{2} to define a plane with good precision, P_{1} and P_{2} should ideally be perpendicular to each other. The compromise we found is to use perpendicular planes P_{1} and P_{2} that are tilted forty-five degrees with respect to the image plane. The error analysis simulations described below seem to support this choice.

[0066]
The visual servoing approach in accordance with the principles of the invention can be applied to any imaging device that can be approximated by a pinhole camera. In applications where the number of iterations is not critical, the precision of visual servoing in a plane can be improved by considering n>3 successive needle positions L_{1}, L_{2}, . . . , L_{n}. The angle θ_{t} can then be estimated by least-squares approximation from all the possible cross-ratios between lines L_{1}, L_{2}, . . . , L_{n}. Similarly, if more than two positions of the imaging device are used, then the viewing planes π_{FT} of the target T and entry point F corresponding to each camera pose can be intersected by least squares in order to determine the 3D orientation (FT). Using more camera positions increases precision.

[0067]
The following part, relating to a method for finding the best entry point for percutaneous procedures, is distinct from the foregoing material and applies to any imaging modality and any method for aligning a needle or linear tool to a target from a given entry point on the patient. It can be combined with the methods presented in sections 1 and 2, or with others.

[0068]
The approaches presented in the previous sections suppose that the entry point is fixed. However, in many applications, the “optimal” entry point might not be known. An example of this is vertebroplasty. Typically, in those procedures, the surgeon wants to reach a target point inside a vertebra and wants to use a given entry point into the vertebra. However, only a highly experienced surgeon is able to determine the corresponding entry point on the skin of the patient. In this section, we propose a new method to determine the entry point necessary for the needle to reach an anatomical target while passing through some given anatomical landmark. The method (illustrated by FIG. 4) is as follows:

[0069]
Let T be the primary target and U the secondary target that the needle must pass through. First we choose two arbitrary entry points F and G. Then we apply the technique presented in sections 1, 2 and 3 (or any other technique that performs the 3D alignment of a needle from an entry point to a target using any imaging modality) to determine the 3D orientations of the needle necessary to reach the two targets T and U from each entry point. This gives (FT), (FU), (GT), (GU). The intersection of the planes π_{FTU}=(FT)(FU) and π_{GTU}=(GT)(GU) gives the direction (TU) that passes through both targets. The mechanical device that holds the needle can servo the needle to this direction. By lowering the servoed needle to the skin of the patient, the surgeon can find the entry point proposed by the system for reaching target T through target U.
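The plane-intersection step can be sketched numerically as follows. The vector representation (needle directions as 3-vectors) and NumPy usage are assumptions for illustration, not part of the patent:

```python
import numpy as np

def entry_direction(FT, FU, GT, GU):
    # each pair of needle directions spans a plane through its entry point;
    # the plane normals are cross products, and the line (TU) common to
    # both planes is along the cross product of the two normals
    n_f = np.cross(FT, FU)
    n_g = np.cross(GT, GU)
    d = np.cross(n_f, n_g)
    return d / np.linalg.norm(d)

# example: T = (1, 2, 3), U = (4, 0, 1) seen from F = (0, 0, 0) and G = (5, 5, 5);
# the needle directions are T - F, U - F, T - G, U - G
d = entry_direction([1, 2, 3], [4, 0, 1], [-4, -3, -2], [-1, -5, -4])
assert np.allclose(np.cross(d, [3.0, -2.0, -2.0]), 0.0)  # parallel to U - T
```

The returned unit vector is the direction (TU); servoing the needle to this direction and lowering it to the skin yields the proposed entry point.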

[0070]
In accordance with an embodiment of the invention, a primary target for biopsy and a secondary target through which a biopsy needle must pass on its way to the primary target are given. The method for determining the best entry point for a percutaneous procedure in accordance with the invention comprises the steps shown in FIG. 5 for the embodiment being considered. First, first and second arbitrary entry points on a patient are selected, the second point being different from the first point. The following steps are then performed: determining the three-dimensional (3D) orientation of the needle at the first arbitrary entry point for pointing the needle at the primary target; determining the 3D orientation of the needle at the first arbitrary entry point for pointing the needle at the secondary target; determining the 3D orientation of the needle at the second arbitrary entry point for pointing the needle at the primary target; determining the 3D orientation of the needle at the second arbitrary entry point for pointing the needle at the secondary target; and determining a 3D line representing the intersection of a first plane containing the first arbitrary entry point, the primary target point, and the secondary target point, and a second plane containing the second arbitrary entry point, the primary target point, and the secondary target point, whereby the 3D line provides a position and orientation of the needle for performing needle biopsy of the primary target through the secondary target.

[0071]
In accordance with another embodiment of the invention, for use in conjunction with a C-arm imaging apparatus, a method for positioning a biopsy needle from a given entry point on a patient to a given target for biopsy comprises the steps shown in FIG. 6 for this embodiment. These comprise: positioning the C-arm in a desired first position, whereby an image formed with the C-arm in the first position is formed in a first image plane; storing location information of the given target in the first image plane; defining a first arbitrary plane including the given entry point; placing the needle at the given entry point; visually servoing the needle in the first arbitrary plane with respect to the location information in the first image plane to derive a first three-dimensional (3D) needle position; defining a second arbitrary plane including the given entry point, different from the first arbitrary plane; placing the needle at the given entry point; visually servoing the needle in the second arbitrary plane with respect to the location information in the image plane to derive a second three-dimensional (3D) needle position; determining a resulting plane defined by the first and second three-dimensional (3D) needle positions; placing the needle at a selected point in the resulting plane at an arbitrary orientation; positioning the C-arm in a desired second position, whereby an image formed with the C-arm in the second position is formed in a second image plane; storing location information of the given target in the second image plane; and visually servoing the needle around the entry point in the resulting plane with respect to the location information in the second image plane, whereby the needle is at the entry point and aimed at the target.

[0072]
In accordance with another embodiment of the invention, for use in conjunction with a C-arm imaging apparatus, for visual servoing of a needle in a plane in a fixed number of iterations, the following are given: a position of the C-arm whose imaging is modeled by an image plane, a given target point for needle biopsy, a given entry point on a patient, and a given plane within which the needle can rotate around the given entry point.

[0073]
The method comprises the steps shown in FIG. 7 for this embodiment. These include: placing the needle in an arbitrary orientation around the given entry point, the orientation being associated with an angle θ_{1} with respect to an arbitrary reference direction; obtaining a two-dimensional (2D) projection image of the needle in the image plane; measuring the position l_{1} of the 2D projection image; rotating the needle around the entry point by an angle Δθ to a position θ_{2}=(θ_{1}+Δθ) in the given plane; measuring the position l_{2} of the 2D projection image; rotating the needle around the entry point by an angle Δθ to a position θ_{3}=(θ_{1}+2*Δθ); measuring the position l_{3} of the 2D projection image; locating 2D points f and t in the 2D projection image that are the respective 2D projections of the entry point and the target point, the points f and t defining an image line l_{t}=(ft); calculating the cross-ratio c=(l_{1}, l_{2}, l_{3}, l_{t}); determining a 3D angle θ_{t} such that c=(θ_{1}, θ_{2}, θ_{3}, θ_{t}); and rotating the needle around the entry point by θ_{t} from position θ_{3} in the given plane, whereby the needle is positioned in three-dimensional space at the entry point along a direction such that the needle is visually aligned to the 2D projection image of the target in the image plane.
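The steps above can be exercised end to end in a small simulation. The sketch below is illustrative only: it assumes the needle plane is parallel to the image plane (so image angles track the 3D rotation angles directly), represents image lines in homogeneous coordinates, and uses NumPy; none of these choices are prescribed by the patent. It measures three projected lines, recovers f, builds l_t from f and t, computes the cross-ratio through an arbitrary transversal m, and solves for θ_t:

```python
import math
import numpy as np

def line_through(p, phi):
    # homogeneous line (a, b, c) through point p with direction angle phi
    a, b = -math.sin(phi), math.cos(phi)
    return np.array([a, b, -(a * p[0] + b * p[1])])

def intersect(l1, l2):
    # intersection of two homogeneous lines, dehomogenized
    p = np.cross(l1, l2)
    return p[:2] / p[2]

def solve_theta_t(c, t1, t2):
    # (c - 1) sin t2 cos tt + (c sin t2 / tan(t1 + t2) - cos t2) sin tt = 0,
    # i.e. a cos tt + b sin tt = 0; solutions mod pi give the same line
    a = (c - 1.0) * math.sin(t2)
    b = c * math.sin(t2) / math.tan(t1 + t2) - math.cos(t2)
    return math.atan2(-a, b) % math.pi

# simulated ground truth: needle plane parallel to the image plane, so the
# image angles of the projected lines track the 3D rotation angles directly
f0 = np.array([100.0, 200.0])          # projection of the fixed entry point
alpha, th1, th2, tht_true = 0.3, 0.35, 0.5, 0.4
phis = [alpha, alpha + th1, alpha + th1 + th2]
lines = [line_through(f0, p) for p in phis]          # measured l1, l2, l3

f = intersect(lines[0], lines[1])                    # recovers f0 here
phit = alpha + th1 + th2 + tht_true
t = f0 + 50.0 * np.array([math.cos(phit), math.sin(phit)])
lt = np.cross(np.append(f, 1.0), np.append(t, 1.0))  # line lt = (f t)

# cross-ratio via an arbitrary transversal m through the origin
m_phi = 2.5
m = line_through(np.array([0.0, 0.0]), m_phi)
u = np.array([math.cos(m_phi), math.sin(m_phi)])
q = [float(np.dot(intersect(l, m), u)) for l in lines + [lt]]
c = ((q[2] - q[0]) * (q[3] - q[1])) / ((q[3] - q[0]) * (q[2] - q[1]))

assert abs(solve_theta_t(c, th1, th2) - tht_true) < 1e-9
```

In a real procedure the lines l_1, l_2, l_3 come from detecting the needle in successive X-ray images rather than from a simulated pencil, but the cross-ratio computation and the θ_t recovery are the same.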

[0074]
The step of locating 2D points f and t is performed automatically in one embodiment and manually in another embodiment.

[0075]
The use and/or incorporation of computer information processing and the storage of data is contemplated, such as the use of a programmed digital computer or a dedicated computer chip or the like.

[0076]
While the present invention has been described by way of illustrative embodiments, it will be understood by one of skill in the art to which it pertains that various changes and modifications can be made without departing from the spirit of the invention. For example, where reference in the specification and in the claims is made to a biopsy needle, it will be understood that this may refer to a holder for such a needle, to permit alignment and manipulation of the needle itself as may be convenient. Such adjunct equipment is also described in the materials herein referenced. These and similar modifications are intended to be within the scope of the invention as defined by the claims that follow.