|Publication number||US5901199 A|
|Application number||US 08/890,776|
|Publication date||May 4, 1999|
|Filing date||Jul 11, 1997|
|Priority date||Jul 11, 1996|
|Also published as||US6125164, WO1998002091A1|
|Inventors||Martin J. Murphy, Richard S. Cox|
|Original Assignee||The Board Of Trustees Of The Leland Stanford Junior University|
This application claims priority from U.S. Provisional Application No. 60/021,588 entitled "HIGH-SPEED INTER-MODALITY IMAGE REGISTRATION VIA ITERATIVE FEATURE MATCHING" filed Jul. 11, 1996, which is herein incorporated by reference.
This invention relates to the field of medical imaging. More particularly, it relates to a real-time method of positioning therapeutic radiation beams with respect to a target area within a patient.
Radiation therapy is often used to treat cancerous tumors within a patient's body. An early diagnostic session is conducted in which the physician uses an imaging technique, such as computed tomography (CT) scanning or magnetic resonance imaging (MRI), to study the target area. He or she then decides the ideal placement and volume of the radiation beam(s) with respect to the target area. During the actual treatment, the radiation beams are focused directly at the target area, using the diagnostic studies as a position reference. Precise positioning of the radiation beams ensures that most of the radiation reaches the target cells, while also ensuring that the healthy cells surrounding the target cells are not affected. Unfortunately, it is often difficult to be certain that radiation beams are optimally positioned with respect to target cells. Often, a smaller total dose of radiation must be used in order to reduce the possibility of damage to healthy cells. The consequence, however, is that the radiation treatment becomes less effective.
In addition, radiotherapy often requires a patient to return for treatment over the course of several days. Repositioning a patient precisely each time can be time-consuming and frustrating.
Over the past decade, many methods have been devised to improve the alignment of radiation beams with the target area of a patient. An early method involves a rigid frame to physically hold in place the part of the patient's body to be treated. In one embodiment for treatment of target areas within a patient's skull, the frame is attached to a floorstand mounted in a Linac (linear accelerator) floor turret. This method is considered generally reliable and accurate, as it fixes the target area rather precisely with respect to the radiation beams. Unfortunately, due to the nature of the frame, it also greatly limits accessibility to the patient's skull. Target areas may be located in the skull where the Linac radiation beams cannot reach. In addition, it is extremely uncomfortable for the patient, who must remain in an awkward position for long periods of time.
Another method involves invasive techniques. U.S. Pat. No. 5,097,839 by Allen describes fiducial implants attached in a pattern to a patient's skull bones, underneath the skin. These implants are then used as fixed references when aligning radiation beams with the target area. These implants are an improvement over rigid frames in that they allow all target areas within a skull to be reached. However, because inserting the fiducial implants into the patient is a surgical procedure itself, the patient must often wait for several days until the radiation treatment. During this time, the target area may grow or otherwise change in shape, rendering inaccurate the early diagnostic analyses taken when the fiducial implants were put in place. In addition, the implants are often disfiguring and painful to the patient.
Another type of invasive technique involves placing tattoos on the patient's skin where the radiation beams are to enter. Although this is less intrusive than the fiducial implants, it has many of the same problems, such as the patient having to wait several days from the time of the tattoo procedure until the radiation treatment, thus giving time for the target area to grow or change shape. In addition, given the nature of tattoos, it is possible they may also change shape.
More recently, non-invasive, non-disfiguring alignment systems have been developed. These typically use signal processing to relate the CT or MRI data recording the position of the patient in the diagnostic setting to the position of the patient in the treatment setting. Many of these systems require a large amount of preprocessing, whereby data generated from the diagnostic scan is gathered and manipulated until it is usable in the treatment setting. The preprocessing step can take several days. During treatment, real-time images are compared with the preprocessing data and the patient or the radiation therapy beams are adjusted accordingly. Oftentimes, manual adjustment is necessary. Three degrees of freedom, corresponding to one plane, are typically allowed. The patient has greater freedom of movement than in the previously described techniques, but his movement is still confined. These systems are generally accurate and painless for the patient.
U.S. Pat. No. 5,295,200 by Boyer et al. describes a method of aligning radiation therapy beams using the Fast Fourier Transform (FFT) to compare the position of diagnostic images with the position of treatment images. In this invention, a large amount of complex data must be gathered and processed prior to treatment. Reference images collected during the diagnostic study are later used to position the patient during treatment.
U.S. Pat. No. 5,531,520 by Grimson et al. describes a method of image registration that takes into consideration patient movement within six degrees of freedom. It employs lasers to determine patient position and as such is confined to surface images. Thus, treatment beams must be aimed relative to tattoos or other markers on a patient's skin, which have the problems mentioned above.
These existing alignment methods require an extensive amount of time to process complex diagnostic data, usually restrict accuracy to three degrees of freedom, limit patient movement, and make adjustment of either the treatment beams or the patient difficult. In addition, they are unable to generate instant reference images with which to compare the present position of a patient. They also require manual operations to supplement automatic procedures.
Accordingly, it is a primary object of the present invention to provide a wholly automatic method of aligning radiation therapy beams with the treatment target of a patient. It is another object of the invention to provide such a method by making use of radiographic techniques. Yet another object of the invention is to decrease the time required for the preprocessing step, as well as reduce the complexity of data manipulation. A further object of this invention is to generate instant reference images derived from the diagnostic study with which to compare the present position of the treatment target. Another object of the invention is to allow a patient to move around freely during treatment. A sixth object of the invention is to measure patient movement within six degrees of freedom. A seventh object of the invention is to provide continuous adjustment of radiation therapy beams to improve precision during the course of treatment. A final object of this invention is to accommodate both diagnostic imaging and treatment of a patient in the same day.
These objects and advantages are attained by the present invention. This method begins with a diagnostic computed tomographic (CT) scan of the treatment target of a patient. The information from the CT scan is used to generate an intermediate 3-D image. This intermediate 3-D image will eventually be moved or rotated to mimic the position of the treatment target. During treatment, the position of the treatment target in relation to the radiation therapy beams is recorded using at least two x-ray cameras. Thus, both translational and rotational information is received. The treatment images are then processed to produce a feature vector, which is specific to that set of images and thus to the position of the treatment target.
Using the intermediate 3-D image generated during the diagnostic stage, at least two digitally reconstructed radiographs (DRRs) are produced. These radiographs are wholly artificial constructs that are judged to be similar to the treatment images. These DRRs are also processed to produce a feature vector. The difference between the two feature vectors is then calculated using a mathematical equation, for example the chi-squared statistic. If the difference falls below a minimum allowable value, then the position of the treatment target is deemed accurate enough to begin radiation therapy. However, if the difference does not fall below the minimum allowable value, then the intermediate 3-D image is iteratively moved within six degrees of freedom until successive DRRs produce an acceptable minimum difference with respect to the treatment images. Data from the repositioned intermediate 3-D image is used to adjust either the patient or the radiation therapy beams, thus achieving accurate alignment of the radiation beams and the treatment target.
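The iterative loop described above can be sketched in outline. Everything in this sketch is a hypothetical simplification: the two-parameter "pose", the one-dimensional stand-in for DRR generation, the identity feature extraction, and the gradient-descent step size are illustrative choices, not the patent's implementation.

```python
import numpy as np

def make_drr(pose):
    """Stand-in DRR generator: a feature vector that depends on the pose."""
    x = np.linspace(0.0, 1.0, 16)
    return np.sin(x + pose[0]) + pose[1] * x

def feature_vector(image):
    """Stand-in feature extraction (identity here; the patent masks key pixels)."""
    return image

def chi_squared(f1, f2):
    return float(np.sum((f1 - f2) ** 2))

def register(treatment_image, pose0, step=0.02, tol=1e-4, max_iter=500):
    """Iteratively move the pose until chi-squared falls below the minimum."""
    f_treat = feature_vector(treatment_image)
    pose = np.array(pose0, dtype=float)
    chi2 = chi_squared(feature_vector(make_drr(pose)), f_treat)
    for _ in range(max_iter):
        if chi2 < tol:
            break
        # Finite-difference gradient over each degree of freedom.
        grad = np.zeros_like(pose)
        for i in range(pose.size):
            trial = pose.copy()
            trial[i] += 1e-3
            grad[i] = (chi_squared(feature_vector(make_drr(trial)), f_treat) - chi2) / 1e-3
        pose -= step * grad
        chi2 = chi_squared(feature_vector(make_drr(pose)), f_treat)
    return pose, chi2

true_pose = np.array([0.3, -0.1])          # "actual" target position
found, chi2 = register(make_drr(true_pose), [0.0, 0.0])
```

The structure mirrors the method: generate a DRR for a trial pose, compare feature vectors, and adjust the pose until the difference is acceptably small.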
FIG. 1 is a flow chart illustrating the operation of the invention.
FIG. 2 shows a radiograph produced from a patient with a brain tumor.
FIG. 3 is a diagram of the apparatus, consisting of a Cyberknife connected to a computer system.
FIG. 4 is a diagram showing the diagnostic coordinate system.
FIG. 5 is a diagram showing the treatment coordinate system.
FIG. 6 is a diagram showing the three translational degrees of freedom in the preferred embodiment.
FIG. 7 is a diagram showing the three rotational degrees of freedom in the preferred embodiment.
FIG. 8 is a diagram illustrating the projection geometry for generating two digitally reconstructed radiographs (DRRs) from the intermediate 3-D image using a hypothetical camera model.
FIG. 9 shows a fluoroscopic image of an anthropomorphic phantom, a DRR of said anthropomorphic phantom, and a DRR marked with key pixels to locate anatomical edge features.
FIG. 10 contains graphs showing the empirical results for determining (x, y, z) translations, (α, β, γ) rotations one axis at a time, and composite (α, β) rotations for an anthropomorphic phantom.
FIG. 11 is a table showing the standard deviation of errors in measuring translations and rotations of the anthropomorphic phantom.
FIG. 12 is a graph illustrating the distribution of rotation measurement errors for the empirical tests. The distribution is approximately Gaussian with zero mean.
FIG. 13 is a graph showing the distribution of χ2 from the empirical tests.
FIG. 14 is a graph showing the correlation between χ2 and the errors in angle measurements from the empirical tests.
FIG. 15 is a graph showing the correlation between χ2 and the rotation angles around each axis from the empirical tests.
For illustrative purposes, the method of the invention is described for use with the Cyberknife, as shown in FIG. 3, and a patient 30 with a brain tumor 32, as shown in FIG. 2. Note that the method of this invention is not confined to use with the Cyberknife. In the preprocessing step, patient 30 undergoes a computed tomography (CT) scan of his skull. The CT scans are used to assemble an intermediate 3-D image 54, as shown in FIG. 8. Intermediate 3-D image 54 can be moved and rotated along both translational axes 50 and rotational axes 52 in the diagnostic coordinate system, as shown in FIGS. 6 and 7. In this representation, translational axes 50 are represented by dx, dy, and dz, while rotational axes 52 are represented by α, β, and γ. Thus up to six degrees of freedom of movement are allowed. Intermediate 3-D image 54 may exist solely as data within a computer or system of computers 44, or it may be visualized on a monitor.
In the preferred embodiment, intermediate 3-D image 54 is used to generate a set of at least two digitally reconstructed radiographs (DRRs) 56, as shown in FIG. 8. DRRs 56 are artificial 2-D images that show how intermediate 3-D image 54 would appear from different angles using a hypothetical camera model 58. Each DRR 56 of a set represents one hypothetical camera angle. These DRRs 56 can then be masked to isolate key pixels associated with anatomical edge features 64, as shown in FIG. 9. Where a complete image 60 would have 40,000 pixels, for example, a masked image 62 typically has 1,000 to 4,000 key pixels 64.
The set of masked DRRs 62 may then be used to generate a lookup table. The lookup table provides the first derivatives of the masked DRRs with respect to the translational and rotational coordinates of intermediate 3-D image 54. These calculations can be used later during treatment to match the actual patient images with intermediate 3-D image 54. The preprocessing procedure as described above requires about 20 seconds on a computer or system of computers 44 with 200 MHz processors.
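As a rough sketch of such a lookup table, one can precompute a finite-difference first derivative of the key-pixel values with respect to each degree of freedom. The `render_drr` function and the three-parameter pose below are hypothetical stand-ins for illustration, not the patent's DRR generator.

```python
import numpy as np

def render_drr(pose):
    """Stand-in DRR sampled at a handful of 'key pixels'."""
    x = np.linspace(0.0, 1.0, 8)
    return np.cos(2.0 * x + pose[0]) + pose[1] * x + pose[2]

def build_derivative_table(pose0, eps=1e-5):
    """Column i holds d(pixel value)/d(pose[i]) at the reference pose."""
    pose0 = np.array(pose0, dtype=float)
    base = render_drr(pose0)
    table = np.empty((base.size, pose0.size))
    for i in range(pose0.size):
        perturbed = pose0.copy()
        perturbed[i] += eps
        table[:, i] = (render_drr(perturbed) - base) / eps
    return table

table = build_derivative_table([0.0, 0.0, 0.0])
```

Precomputing these derivatives once, during preprocessing, is what lets the treatment-time matching step avoid re-deriving them on every iteration.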
During treatment, the patient is placed within the view of at least two radiographic cameras 42, in a position approximating his position during the diagnostic scan. The patient has complete freedom of movement, as all possible positions can be defined within six degrees of freedom using translational axes 50 and rotational axes 52, as shown in FIGS. 6 and 7. Translational axes 50 and rotational axes 52 of the treatment coordinate system, as shown in FIG. 5, are defined in the same manner as translational axes 50 and rotational axes 52 of the diagnostic coordinate system, as shown in FIG. 4. (On the Cyberknife, patient 30 has two x-ray cameras 42 and screens 36, which produce real-time radiographs 31 of treatment target 32). These real-time radiographs 31 may then be processed in the same manner as DRRs 56. Real-time radiographs 31 are masked to isolate key pixels associated with anatomical edge features 64. Masked real-time radiographs 31 are used to produce a first feature vector, which specifically identifies the position and orientation of treatment target 32 within the treatment coordinate system, as shown in FIG. 5, at the time real-time radiographs 31 were taken.
Next, intermediate 3-D image 54 is manipulated until its position emulates the position and orientation of treatment target 32. New DRRs 56 are then generated from intermediate 3-D image 54, as shown in FIG. 8. These DRRs 56 are masked to isolate the same key pixels 64 as in real-time radiographs 31 and processed to produce a second feature vector, which specifically identifies the position and orientation of the treatment target of intermediate 3-D image 54 within the diagnostic coordinate system, as shown in FIG. 4.
The two feature vectors are then compared using a mathematical equation, for example the chi-squared statistic. If treatment target 32 of patient 30 is positioned in the treatment coordinate system, as shown in FIG. 5, in precisely the same way as intermediate 3-D image 54 is positioned in the diagnostic coordinate system, as shown in FIG. 4, then the difference between the two feature vectors, or χ2, will be less than a designated minimum value. The system has then completed its determination of treatment target 32 position and orientation. This information is passed on to the beam delivery system 40 (e.g. the Cyberknife) and the radiation therapy beams 38 are allowed to operate.
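A minimal chi-squared comparison of the two feature vectors might look as follows. The feature values, noise scale `sigma`, and acceptance threshold are illustrative assumptions, not values from the patent.

```python
import numpy as np

def chi_squared(v_treatment, v_drr, sigma):
    """Sum of squared, noise-normalized component differences."""
    return float(np.sum(((v_treatment - v_drr) / sigma) ** 2))

CHI2_MIN = 1.0  # illustrative acceptance threshold, not the patent's value

v_real = np.array([1.00, 2.00, 3.00])   # from masked real-time radiographs
v_drr = np.array([1.01, 1.98, 3.02])    # from masked DRRs
aligned = chi_squared(v_real, v_drr, sigma=0.05) < CHI2_MIN
```

When `aligned` is true, the position determination is complete and the beams may operate; otherwise the system proceeds to the χ2 minimization step described next.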
If, however, treatment target 32 is not positioned in the same position and orientation as in the diagnostic coordinate system, the two feature vectors will exhibit χ2 greater than the designated minimum value. In this case, the system moves to the χ2 minimization step. The χ2 minimization process searches for a match between real-time radiographs 31 and DRRs 56 by iteratively varying the position and orientation of intermediate 3-D image 54 with respect to hypothetical camera model 58. For a radiographic imaging system consisting of at least two CCD fluoroscopes, hypothetical camera model 58 is a multi-parameter function which maps the CCD image plane to the fluoroscope screen, and the fluoroscope screen to the treatment coordinate system, as shown in FIG. 5. It accounts for magnification, relative positioning of the two fluoroscopes in the treatment room coordinate system, tilt of the image planes with respect to the fluoroscope screens (in three independent directions), and radial (spherical) distortion of the lens system. Radial distortion is modeled by the factor:
(1 + λR^2), so that x' = x(1 + λR^2) and y' = y(1 + λR^2), where x and y are the pixel coordinates on the CCD image plane, x' and y' are the corresponding coordinates on the fluoroscope screen, R^2 = x^2 + y^2, and λ is a free parameter that is determined when hypothetical camera model 58 is calibrated.
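Assuming the common single-parameter radial distortion model implied by the definitions above, the mapping can be sketched as:

```python
def distort(x, y, lam):
    """Map CCD image-plane coordinates to fluoroscope-screen coordinates
    using the radial factor (1 + lam * R^2)."""
    r2 = x * x + y * y
    factor = 1.0 + lam * r2
    return x * factor, y * factor

# R^2 = 0.25, so the factor is 1 + 0.1 * 0.25 = 1.025
xp, yp = distort(0.3, 0.4, lam=0.1)
```

In practice λ is not known in advance; it is fixed during the calibration fit described below.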
The parameters of hypothetical camera model 58 are determined by fitting DRRs 56 of a calibration phantom to actual radiographs 31 of the phantom, using the same χ2 minimization process as for patient position measurements. The residual errors in this calibration-fitting process are spatially random and on the order of the image pixel dimensions, indicating that there is no significant distortion left unaccounted for by the model.
The various positions of the treatment target are emulated in DRRs 56 by changing the point of view of hypothetical camera model 58 with respect to intermediate 3-D image 54. This movement is achieved by use of the lookup table created in the preprocessing step, as described above. The process continues until the difference between the two feature vectors falls below the designated minimum value. At this point, the new coordinates of hypothetical camera model 58 in the diagnostic coordinate system are translated into coordinates for the treatment coordinate system. This information is sent to beam delivery system 40 and radiation therapy beams 38 are realigned.
It is also possible to reposition patient 30. Radiation therapy beams 38 are then allowed to operate.
Translation of the intermediate 3-D image 54 rotation geometry into the treatment target geometry is accomplished most effectively via an Eulerian (body-fixed) rotation convention. The Eulerian convention properly represents the fact that the patient's rotational axes 52 are fixed in anatomy and not in an external reference frame. On the other hand, mechanical beam and patient alignment systems measure angles in the space-fixed (α, β, γ) convention illustrated in FIG. 7. Relating space-fixed rotations in (α, β, γ) to the rotations deduced by the image registration algorithm requires that the Eulerian DRR rotations be inverted to correspond to rotations of the object rather than of hypothetical camera model 58 (bearing in mind that sequential rotations do not commute) and then transformed to the space-fixed rotation convention. This transformation is summarized below.
Let us designate treatment target 32 in the skull of patient 30 by the vector X in the space-fixed camera coordinate system. When treatment target 32 rotates through the angles (α, β, γ), its coordinates in the fixed frame become
X" = [Λγ][Λβ][Λα]X
where [Λα], [Λβ], and [Λγ] are the standard single-axis rotation matrices for rotations through α, β, and γ about their respective coordinate axes.
This convention locates points in the anatomy of patient 30 in the fixed treatment coordinate frame that beam delivery system 40 refers to in directing treatment beam 38.
The DRR ray-tracing process works most effectively in a coordinate system fixed to the image plane in hypothetical camera model 58. The image plane of hypothetical camera model 58 is positioned within the diagnostic coordinate system, relative to intermediate 3-D image 54, through the three Eulerian rotations (φ0, θ0, ψ0). Rotations of the patient's anatomy with respect to x-ray cameras 42 can then be represented in the DRR process by moving the image plane of hypothetical camera model 58 through (φ0, θ0, ψ0) to the perturbed orientation (φ0 + dφ, θ0 + dθ, ψ0 + dψ). If a point in the patient's anatomy is located by the vector X' in the coordinate frame fixed to the image plane, then X and X' are related according to the transformation:
X' = [Λφ][Λθ][Λψ]X
where [Λφ], [Λθ], and [Λψ] are the single-axis rotation matrices for the Eulerian angles φ, θ, and ψ.
Let [E0] = [Λφ][Λθ][Λψ] define the complete Eulerian rotation matrix for angles (φ0, θ0, ψ0) and [E0 + dE] define the complete rotation matrix for rotations (φ0 + dφ, θ0 + dθ, ψ0 + dψ). The inverse transformation is [E^-1] = [Λ-ψ][Λ-θ][Λ-φ]. Following this convention, the vector locating the point X" in the treatment coordinate frame after a rigid-body rotation (dφ, dθ, dψ) that is modeled by the rotation of hypothetical camera model 58 is given by:
X" = [E0^-1][E0 + dE]X.
From this, the relationship between the space-fixed rotations (α, β, γ) and the body-fixed Eulerian rotations (dφ, dθ, dψ) is given by:
[E0^-1][E0 + dE] = [Λγ][Λβ][Λα]
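The matrix relation above can be checked numerically. The axis assignments below (φ about x, θ about y, ψ about z) are assumptions for illustration; the patent fixes its conventions in the figures, which are not reproduced here.

```python
import numpy as np

def rot_x(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[1.0, 0.0, 0.0], [0.0, c, -s], [0.0, s, c]])

def rot_y(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]])

def rot_z(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def euler(phi, theta, psi):
    """Complete Eulerian rotation [Lam_phi][Lam_theta][Lam_psi]."""
    return rot_x(phi) @ rot_y(theta) @ rot_z(psi)

phi0, theta0, psi0 = 0.2, -0.1, 0.4        # reference camera orientation
dphi, dtheta, dpsi = 0.01, 0.02, -0.015    # small perturbation

E0 = euler(phi0, theta0, psi0)
E0_dE = euler(phi0 + dphi, theta0 + dtheta, psi0 + dpsi)
M = np.linalg.inv(E0) @ E0_dE  # net body-fixed rotation [E0^-1][E0 + dE]
```

The product M is itself a proper rotation matrix, and for small (dφ, dθ, dψ) it stays close to the identity; the space-fixed angles (α, β, γ) would be extracted from its elements.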
The reduction of this method to practice has been demonstrated in the following tests. In the tests, the precision of measuring actual phantom rotations will be reported in (α, β, γ), while numerical simulations of rotation measurement will be reported in (φ, θ, ψ) Eulerian angles.
In the tests, an anthropomorphic skull phantom was set up on the treatment couch, with the inferior/superior axis along the x-ray camera x-axis, as it would be for a typical patient. The phantom's anterior direction was in the positive z direction. The three axes of rotation in the test setup corresponded to the space-fixed angles (α, β, γ), as defined in FIG. 7. A tiltmeter measured angles (α,β) relative to gravity and was calibrated on a milling machine's rotary table. The resulting calibration was accurate to better than 0.1 degrees. The remaining translational and rotational degrees of freedom were measured mechanically and had precisions of 0.25 mm and 0.25 degrees, respectively.
The phantom was imaged in a CT study of 74 slices 3.0 mm thick, with each image having 512×512 pixels 0.49 mm on a side. The treatment room imaging system operated at 100 kV and 25 mA. The radiographs were 200×200 pixels, with each pixel 1.30 mm on a side.
The phantom was moved step-by-step through motions along and around each axis separately and in composite multi-axis rotations. At each step, the imaging system acquired radiographs and the algorithm computed the phantom position and orientation relative to the CT study. The correspondence between the measured and actual change in position was recorded for each individual degree of freedom. For each test position, the minimum value of χ2 at convergence was recorded.
FIG. 10 illustrates the results for determining translations and rotations empirically. This figure displays the known translations and rotations along the abscissa and the deduced translations and rotations along the ordinate. The figure shows data for rotations (α, β, γ) around one axis at a time, and data for composite rotations in which α and β were varied simultaneously. The individual β and γ rotations each correspond to a composite rotation of φ and ψ in the Eulerian system used to model phantom position in the DRRs. The composite rotations involved all three Eulerian angles.
The deduced versus actual translations and rotations should follow straight lines of slope equal to one. The data displayed in the figures were fit to straight lines, in one case fixing the slope to one, and in the other case allowing the slope to be a free parameter. The variance of the individual points about the fitted line is the standard deviation. The results of analyzing the data via straight line fits are summarized in FIG. 11.
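The two straight-line fits described above can be sketched with synthetic stand-in data; the noise level, seed, and angle range are arbitrary illustrative choices, not the test data of FIG. 10.

```python
import numpy as np

rng = np.random.default_rng(0)
actual = np.linspace(-3.0, 3.0, 13)                    # known rotations (degrees)
deduced = actual + rng.normal(0.0, 0.1, actual.size)   # deduced values + noise

# Fit with the slope fixed to one: the best offset is the mean residual.
offset = float(np.mean(deduced - actual))

# Fit with the slope as a free parameter: ordinary least squares.
slope, intercept = np.polyfit(actual, deduced, 1)

# Standard deviation of the points about the fixed-slope line.
sigma = float(np.std(deduced - (actual + offset)))
```

For well-behaved measurements the free-parameter slope should come out near one, and `sigma` estimates the measurement error summarized in FIG. 11.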
FIG. 12 plots the frequency distribution of angle measurement errors. This distribution has an approximately Gaussian shape with zero mean, which is consistent with the conclusion that the process is not making systematic errors in determining the phantom position. The frequency distribution for χ2 at convergence is plotted in FIG. 13. This distribution has the basic functional form of a χ2 distribution for fits to data with randomly distributed statistical fluctuations. This again supports the conclusion that the process of extracting the image moments and edge coordinates that are fit by χ2 minimization is not making systematically biased determinations of the image features.
FIG. 14 illustrates the correspondence between the measurement error for each rotational component and the value of χ2 at convergence. The relationship is uncorrelated for values of χ2 less than one and rotation errors less than about one degree. This is consistent with the supposition that once the position determination has gotten close, the minimization routine encounters a χ2 surface without a sharply defined minimum. Values of χ2 greater than one have a positive correlation with increasing error in the angle determinations. This is a valuable property, as it allows one to use the magnitude of χ2 to flag those rare instances where a poor determination of the patient orientation has been made.
The correlation between χ2 and the magnitude of rotation around each axis is displayed in FIG. 15. For the α and β rotations there is no apparent correlation, which indicates that the algorithm's precision and reliability is uniform over the full range of allowable orientations around these axes. The plot for the γ rotations shows greater difficulty in establishing orientation in one of the two directions.
The therapy beam alignment method described in the present invention can be used to direct radiation beams to any part of the body containing radiographic features. In addition, it is obvious that this method may also be used to align instruments other than radiation beams with objects other than disease regions of a patient's body. For example, this method would allow precise positioning of fabrication tools with respect to a manufactured object.
Thus, it is obvious that modifications and variations of the present invention are possible. Therefore, it is to be understood that the scope of the invention should be based on the following claims.
|Cited Patent||Filing date||Publication date||Applicant||Title|
|US5097839 *||Feb 13, 1990||Mar 24, 1992||Allen George S||Apparatus for imaging the anatomy|
|US5295200 *||Dec 3, 1992||Mar 15, 1994||Board Of Regents, The University Of Texas System||Method and apparatus for determining the alignment of an object|
|US5531520 *||Sep 1, 1994||Jul 2, 1996||Massachusetts Institute Of Technology||System and method of registration of three-dimensional data sets including anatomical body data|
|US5673300 *||Jun 11, 1996||Sep 30, 1997||Wisconsin Alumni Research Foundation||Method of registering a radiation treatment plan to a patient|
|US5740225 *||Dec 6, 1996||Apr 14, 1998||Kabushiki Kaisha Toshiba||Radiation therapy planning method and its system and apparatus|
|US5784431 *||Oct 29, 1996||Jul 21, 1998||University Of Pittsburgh Of The Commonwealth System Of Higher Education||Apparatus for matching X-ray images with reference images|
|US8229069||Aug 7, 2009||Jul 24, 2012||Oraya Therapeutics, Inc.||Orthovoltage radiotherapy|
|US8229073||Jul 11, 2011||Jul 24, 2012||Oraya Therapeutics, Inc.||Orthovoltage radiotherapy|
|US8238517||Aug 7, 2009||Aug 7, 2012||Oraya Therapeutics, Inc.||Orthovoltage radiotherapy|
|US8280491||Jun 3, 2010||Oct 2, 2012||Accuray Incorporated||Image guided radiosurgery method and apparatus using registration of 2D radiographic images with digitally reconstructed radiographs of 3D scan data|
|US8295435||Jan 15, 2009||Oct 23, 2012||Accuray Incorporated||Cardiac target tracking|
|US8295437||Jul 11, 2011||Oct 23, 2012||Oraya Therapeutics, Inc.||Orthovoltage radiotherapy|
|US8306185 *||Jul 28, 2005||Nov 6, 2012||Koninklijke Philips Electronics N.V.||Radiotherapeutic treatment plan adaptation|
|US8306186||Jun 13, 2011||Nov 6, 2012||Oraya Therapeutics, Inc.||Orthovoltage radiotherapy|
|US8320524||Apr 6, 2010||Nov 27, 2012||Oraya Therapeutics, Inc.||Orthovoltage radiotherapy|
|US8351574 *||Feb 18, 2010||Jan 8, 2013||Kabushiki Kaisha Toshiba||X-ray diagnosis apparatus for creating a display image from three-dimensional medical image data|
|US8363783||Apr 15, 2008||Jan 29, 2013||Oraya Therapeutics, Inc.||Method and device for ocular alignment and coupling of ocular structures|
|US8385502||Sep 14, 2010||Feb 26, 2013||Varian Medical Systems, Inc.||Treatment planning simulation and verification system|
|US8442185||Mar 22, 2011||May 14, 2013||Oraya Therapeutics, Inc.||Orthovoltage radiosurgery|
|US8457277||Apr 6, 2010||Jun 4, 2013||Oraya Therapeutics, Inc.||Orthovoltage radiosurgery|
|US8457372||Sep 30, 2008||Jun 4, 2013||Accuray Incorporated||Subtraction of a segmented anatomical feature from an acquired image|
|US8494116||Sep 21, 2010||Jul 23, 2013||Oraya Therapeutics, Inc.||Methods and devices for orthovoltage ocular radiotherapy and treatment planning|
|US8503609||Sep 7, 2010||Aug 6, 2013||Oraya Therapeutics, Inc.||Methods and devices for detecting, controlling, and predicting radiation delivery|
|US8506558||Feb 6, 2008||Aug 13, 2013||Oraya Therapeutics, Inc.||System and method for performing an ocular irradiation procedure|
|US8512236||Feb 6, 2008||Aug 20, 2013||Oraya Therapeutics, Inc.||System and method for positioning and stabilizing an eye|
|US8526700||Oct 5, 2011||Sep 3, 2013||Robert E. Isaacs||Imaging system and method for surgical and interventional medical procedures|
|US8588367 *||Oct 16, 2007||Nov 19, 2013||Koninklijke Philips N.V.||Motion compensation in quantitative data analysis and therapy|
|US8611497||Nov 14, 2011||Dec 17, 2013||Oraya Therapeutics, Inc.||Portable orthovoltage radiotherapy|
|US8630388||Jan 29, 2013||Jan 14, 2014||Oraya Therapeutics, Inc.||Method and device for ocular alignment and coupling of ocular structures|
|US8672836||Jan 30, 2008||Mar 18, 2014||The Penn State Research Foundation||Method and apparatus for continuous guidance of endoscopy|
|US8675935||Nov 16, 2011||Mar 18, 2014||The Penn State Research Foundation||Fast 3D-2D image registration method with application to continuously guided endoscopy|
|US8681938 *||Jan 17, 2013||Mar 25, 2014||Varian Medical Systems, Inc.||Treatment planning simulation and verification system|
|US8690894||May 20, 2011||Apr 8, 2014||Restoration Robotics, Inc.||Automated system for harvesting or implanting follicular units|
|US8693634 *||Mar 19, 2010||Apr 8, 2014||Hologic Inc||System and method for generating enhanced density distribution in a three dimensional model of a structure for use in skeletal assessment using a limited number of two-dimensional views|
|US8761336||Jan 6, 2012||Jun 24, 2014||Oraya Therapeutics, Inc.||Orthovoltage radiotherapy|
|US8787524||Nov 6, 2012||Jul 22, 2014||Oraya Therapeutics, Inc.||Orthovoltage radiotherapy|
|US8792704||Aug 27, 2013||Jul 29, 2014||Saferay Spine Llc||Imaging system and method for use in surgical and interventional medical procedures|
|US8811660||Mar 27, 2009||Aug 19, 2014||Restoration Robotics, Inc.||Object-tracking systems and methods|
|US8825137 *||Mar 9, 2008||Sep 2, 2014||Xiaodong Wu||Repositionable gynecological applicator for image-guided radiosurgery (IGRS) and image-guided radiation therapy (IGRT) for localized treatment of gynecological tumors|
|US8837675||Dec 5, 2011||Sep 16, 2014||Oraya Therapeutics, Inc.||Ocular radiosurgery|
|US8848869||Aug 2, 2013||Sep 30, 2014||Oraya Therapeutics, Inc.||Methods and devices for detecting, controlling, and predicting radiation delivery|
|US8848974||Sep 29, 2008||Sep 30, 2014||Restoration Robotics, Inc.||Object-tracking systems and methods|
|US8855267||May 13, 2013||Oct 7, 2014||Oraya Therapeutics, Inc.||Orthovoltage radiosurgery|
|US8911453||Jun 30, 2011||Dec 16, 2014||Restoration Robotics, Inc.||Methods and systems for directing movement of a tool in hair transplantation procedures|
|US8920406||Feb 6, 2008||Dec 30, 2014||Oraya Therapeutics, Inc.||Device and assembly for positioning and stabilizing an eye|
|US8923479||Jan 13, 2014||Dec 30, 2014||Oraya Therapeutics, Inc.||Method and device for ocular alignment and coupling of ocular structures|
|US8995618||Dec 17, 2013||Mar 31, 2015||Oraya Therapeutics, Inc.||Portable orthovoltage radiotherapy|
|US9014454 *||May 20, 2011||Apr 21, 2015||Varian Medical Systems, Inc.||Method and apparatus pertaining to images used for radiation-treatment planning|
|US9025727||Jul 23, 2013||May 5, 2015||Oraya Therapeutics, Inc.||Methods and devices for orthovoltage ocular radiotherapy and treatment planning|
|US9036777 *||Sep 14, 2012||May 19, 2015||Kabushiki Kaisha Toshiba||Medical image processing apparatus|
|US9037215||Jan 24, 2008||May 19, 2015||The Penn State Research Foundation||Methods and apparatus for 3D route planning through hollow organs|
|US20020077543 *||Jun 26, 2001||Jun 20, 2002||Robert Grzeszczuk||Method and apparatus for tracking a medical instrument based on image registration|
|US20040114718 *||Nov 26, 2003||Jun 17, 2004||Elekta Ab||Radiotherapy apparatus and operating method|
|US20040264640 *||Jun 25, 2003||Dec 30, 2004||Myles Jeremy R.||Treatment planning, simulation, and verification system|
|US20040267113 *||Dec 22, 2003||Dec 30, 2004||Euan Thomson||Apparatus and method for radiosurgery|
|US20050047544 *||Aug 29, 2003||Mar 3, 2005||Dongshan Fu||Apparatus and method for registering 2D radiographic images with images reconstructed from 3D scan data|
|US20050049477 *||Aug 29, 2003||Mar 3, 2005||Dongshan Fu||Apparatus and method for determining measure of similarity between images|
|US20050049478 *||Aug 29, 2003||Mar 3, 2005||Gopinath Kuduvalli||Image guided radiosurgery method and apparatus using registration of 2D radiographic images with digitally reconstructed radiographs of 3D scan data|
|US20060002601 *||Jun 30, 2004||Jan 5, 2006||Accuray, Inc.||DRR generation using a non-linear attenuation model|
|US20060002615 *||Jun 30, 2004||Jan 5, 2006||Accuray, Inc.||Image enhancement method and system for fiducial-less tracking of treatment targets|
|US20060002630 *||Jun 30, 2004||Jan 5, 2006||Accuray, Inc.||Fiducial-less tracking with non-rigid image registration|
|US20060002631 *||Jun 30, 2004||Jan 5, 2006||Accuray, Inc.||ROI selection in image registration|
|US20060002632 *||Jun 30, 2004||Jan 5, 2006||Accuray, Inc.||Motion field generation for non-rigid image registration|
|US20100215149 *||Feb 18, 2010||Aug 26, 2010||Kabushiki Kaisha Toshiba||X-ray diagnosis apparatus|
|US20100266099 *||Oct 16, 2007||Oct 21, 2010||Koninklijke Philips Electronics N. V.||Motion compensation in quantitative data analysis and therapy|
|US20110231162 *||Sep 22, 2011||Krishna Ramamurthi||System and Method for Generating Enhanced Density Distribution in a Three Dimensional Model of a Structure for Use in Skeletal Assessment Using a Limited Number of Two-Dimensional Views|
|US20120294497 *||Nov 22, 2012||Varian Medical Systems, Inc.||Method and Apparatus Pertaining to Images Used for Radiation-Treatment Planning|
|US20130010924 *||Sep 14, 2012||Jan 10, 2013||Toshiba Medical Systems Corporation||Medical image processing apparatus|
|US20130151218 *||Jan 17, 2013||Jun 13, 2013||Jeremy R. Myles||Treatment planning simulation and verification system|
|CN101138010B||Mar 9, 2006||May 18, 2011||Koninklijke Philips Electronics N.V.||Image processing system and method for registration of two-dimensional with three-dimensional volume data during interventional procedures|
|CN101478918B||May 30, 2007||Jul 27, 2011||Accuray Incorporated||Parallel stereovision geometry in image-guided radiosurgery|
|DE19953177A1 *||Nov 4, 1999||Jun 21, 2001||Brainlab Ag||Method for positioning a patient exactly for radiation therapy or surgery, involving comparison of landmark positions in an X-ray image with reconstructed image data to determine positioning errors|
|DE102006012970A1 *||Mar 21, 2006||Oct 4, 2007||Siemens Ag||Method for determining the correlation level between extracted images and corresponding object images, in which individual similarity measures are accumulated into a total similarity measure|
|DE102009040392A1 *||Sep 7, 2009||Apr 7, 2011||Siemens Aktiengesellschaft||Method and apparatus for registering a first image data set to a second image data set|
|EP1779288A2 *||Mar 15, 2005||May 2, 2007||Accuray, Inc.||Fiducial-less tracking with non-rigid image registration|
|EP2119397A1 *||May 15, 2008||Nov 18, 2009||BrainLAB AG||Determining calibration information for an x-ray machine|
|WO2002019936A2||Sep 7, 2001||Mar 14, 2002||Cbyon Inc||Virtual fluoroscopic system and method|
|WO2005000102A2 *||Jun 10, 2004||Jan 6, 2005||Accuray Inc||Apparatus and method for radiosurgery|
|WO2005024721A2 *||Aug 20, 2004||Mar 17, 2005||Accuray Inc||2d/3d image registration in image-guided radiosurgery|
|WO2006011925A2 *||Mar 15, 2005||Feb 2, 2006||Accuray Inc||Fiducial-less tracking with non-rigid image registration|
|WO2006011928A2 *||Mar 18, 2005||Feb 2, 2006||Accuray Inc||Image enhancement method and system for fiducial-less tracking of treatment targets|
|WO2006011935A2 *||Mar 30, 2005||Feb 2, 2006||Accuray Inc||Roi selection in image registration|
|WO2008021245A2 *||Aug 10, 2007||Feb 21, 2008||Accuray Inc||Image segmentation for drr generation and image registration|
|WO2012127724A1 *||Oct 24, 2011||Sep 27, 2012||Mitsubishi Heavy Industries, Ltd.||Control device for radiation therapy device, processing method and programme for same|
|U.S. Classification||378/65, 606/33|
|International Classification||A61N5/01, A61N5/10, A61B19/00, A61B6/08|
|Cooperative Classification||A61B6/583, A61B6/08, A61N2005/1061, A61N2005/1062, A61N5/1069, A61N5/1065, A61B2019/5289, A61N5/1049|
|European Classification||A61N5/10E1, A61B6/08|
|Feb 2, 1998||AS||Assignment|
Owner name: LELAND STANFORD JUNIOR UNIVERSITY, BOARD OF TRUSTEES OF THE
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:MURPHY, MARTIN J.;COX, RICHARD S.;REEL/FRAME:008931/0805
Effective date: 19980127
|Nov 1, 2002||FPAY||Fee payment|
Year of fee payment: 4
|Oct 30, 2006||FPAY||Fee payment|
Year of fee payment: 8
|Nov 4, 2010||FPAY||Fee payment|
Year of fee payment: 12