US20060291717A1 - Method to define the 3D oblique cross-section of anatomy at a specific angle and be able to easily modify multiple angles of display simultaneously


Info

Publication number
US20060291717A1
Authority
US
United States
Prior art keywords
viewport
cross
control line
control
image
Prior art date
Legal status
Granted
Application number
US11/202,777
Other versions
US7496222B2
Inventor
Christopher Mussack
Litao Yan
Cheryl Jones
David Mack
Current Assignee
General Electric Co
Original Assignee
General Electric Co
Priority date
Filing date
Publication date
Application filed by General Electric Co filed Critical General Electric Co
Priority to US11/202,777 (granted as US7496222B2)
Assigned to GENERAL ELECTRIC COMPANY. Assignors: JONES, CHERYL RUTH; MACK, DAVID CHARLES; MUSSACK, CHRISTOPHER JOSEPH; YAN, LITAO
Priority to JP2006170843A (granted as JP5113351B2)
Priority to CN2006101064701A (granted as CN1891175B)
Publication of US20060291717A1
Application granted
Publication of US7496222B2
Status: Expired - Fee Related
Adjusted expiration

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 6/00: Apparatus for radiation diagnosis, e.g. combined with radiation therapy equipment
    • A61B 6/032: Transmission computed tomography [CT]
    • A61B 6/463: Displaying means characterised by displaying multiple images or images and diagnostic data on one display
    • A61B 6/465: Displaying means adapted to display user selection data, e.g. graphical user interface, icons or menus
    • A61B 6/466: Displaying means adapted to display 3D data
    • A61B 6/5223: Processing of medical diagnostic data generating planar views from image data, e.g. extracting a coronal view from a 3D image
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 19/00: Manipulating 3D models or images for computer graphics
    • G06T 2219/008: Cut plane or projection plane definition
    • G06T 2219/028: Multiple view windows (top-side-front-sagittal-orthogonal)

Definitions

  • the present invention generally relates to the viewing of medical images at several angles. More specifically, the present invention relates to defining a three-dimensional ("3D") oblique cross-section of an anatomy at a specific angle and to modifying additional angles of display simultaneously.
  • Current systems and methods allow for the viewing of a 3D volume of an object, such as a patient's anatomy, in one or more two-dimensional (“2D”) cross-sectional images or cross-sectional stack of images of the 3D volume.
  • the angle of display in the 2D cross-sectional images may be manipulated by a user.
  • the angle of display in the 2D images may be manipulated or adjusted by rotating the image about a line (such as a control line, for example) displayed on a 2D image.
  • movement of a control line in a first 2D image can affect the angle of display of a 2D image in another, subsequently viewed 2D image.
  • current systems may display a control line over a first 2D cross-sectional image.
  • This control line can represent a 2D image plane in which a view of the 3D image is presented in a subsequent 2D image.
  • current systems permit the user to then move a control line in the second 2D image to adjust the 2D image plane of the next subsequent 2D image.
  • the movement of a control line in one 2D image affects the angle of display or 2D image plane in all subsequent 2D images.
  • the angles of display or image planes in all subsequent 2D images are adjusted. In other words, changing the angle of display in a first 2D image has a domino effect of changing the angle of display in all subsequent 2D images.
  • Moving control lines so as to adjust angles of display in multiple 2D images is used to obtain a final 2D image that is positioned correctly according to a user's needs.
  • a major drawback of the existing methods is that the user can only see one prescribed cross-sectional view at a time. This implies that while the user is looking at the final view he can no longer see the coronal view; likewise, while performing the first step the user cannot see the final view. As a result, if the user cannot obtain the view through the disk correctly, the user may need to go back to the first step to adjust the pseudo-sagittal view from the coronal image. However, the user will no longer be able to see if the adjustments made are correct on the final image. This will typically result in multiple iterations before a correct view through the disk can be obtained.
  • the present invention provides a method to modify one or more angles of cross-section in one or more cross-sectional images of a 3D volume simultaneously.
  • the method includes providing a plurality of viewports, where each of the viewports is configured to display one or more 2D images, moving a control line in a selected viewport, and altering an angle of cross-section of at least one of the 2D images in at least one viewport other than the selected viewport simultaneous with the step of moving the control line.
  • the 2D images can represent one or more cross-sections of the 3D volume.
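The core operation described here is resampling the 3D volume on an arbitrary plane to produce a 2D cross-section. The patent text does not give an implementation; the following is a minimal sketch under the assumption that the volume is a voxel array, using SciPy interpolation. All names (`oblique_slice`, `u`, `v`) are chosen for illustration.

```python
import numpy as np
from scipy.ndimage import map_coordinates

def oblique_slice(volume, center, u, v, size=(64, 64)):
    """Sample a 2D cross-section of `volume` on the plane spanned by the
    (ideally orthonormal) vectors u and v through `center`, all expressed
    in voxel coordinates. Linear interpolation (order=1) is used."""
    u, v, center = map(np.asarray, (u, v, center))
    rows = np.arange(size[0]) - size[0] / 2
    cols = np.arange(size[1]) - size[1] / 2
    rr, cc = np.meshgrid(rows, cols, indexing="ij")
    # 3 x N voxel coordinates of every sample point on the plane
    pts = (center[:, None]
           + u[:, None] * rr.ravel()
           + v[:, None] * cc.ravel())
    return map_coordinates(volume, pts, order=1).reshape(size)

# demo: a volume whose voxel value equals its axis-0 index; a slice spanned
# by the other two axes at index 5 should therefore be uniformly 5
vol = np.arange(11, dtype=float)[:, None, None] * np.ones((1, 64, 64))
section = oblique_slice(vol, center=(5, 32, 32), u=(0, 1, 0), v=(0, 0, 1),
                        size=(8, 8))
```

Tilting `u` or `v` out of the axis planes yields the oblique cross-sections discussed throughout the description.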
  • the present invention also provides a computer-readable storage medium including a set of instructions for a computer.
  • the set of instructions includes a display routine and an angle of cross-section modifying routine.
  • the display routine is configured to display a 2D image representative of a cross-section of a 3D volume in each of a plurality of viewports.
  • the modifying routine is configured to alter an angle of cross-section of at least one of the 2D images based on and simultaneous with a movement of a control line in a selected viewport.
  • the present invention also provides a method for adjusting an angle of cross-section in at least one of a plurality of 2D images.
  • the method includes providing a plurality of viewports including first, second and final viewports, displaying a plurality of control lines including first and second control lines, moving the first control line, and adjusting a first angle of display of a second 2D image and a second angle of display of a final 2D image simultaneous with moving the first control line.
  • the first, second and final viewports are configured to display first, second and final 2D images, respectively.
  • the first, second and final 2D images each representative of one or more cross-sections of a 3D volume.
  • the first control line is displayed in the first viewport and the second control line is displayed in the second viewport.
  • the first control line is configured to represent a cross-sectional plane of the second 2D image at a first angle of cross-section, and the second control line is configured to represent a cross-sectional plane of the final 2D image at a second angle of cross-section.
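The chain of viewports and control lines summarized above can be sketched as a small data model in which moving a control line immediately re-angles every subsequent viewport. This is a hypothetical illustration, not the patented implementation; `Viewport`, `ControlLine`, and `move_control_line` are invented names.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ControlLine:
    angle_deg: float = 0.0  # orientation of the line within its viewport

@dataclass
class Viewport:
    name: str
    cross_section_angle: float = 0.0  # angle of cross-section of the displayed 2D image
    control_line: Optional[ControlLine] = None
    next_viewport: Optional["Viewport"] = None  # viewport whose image plane this line defines

def move_control_line(viewport: Viewport, new_angle: float) -> None:
    """Moving a control line re-angles every subsequent viewport in the chain."""
    assert viewport.control_line is not None
    delta = new_angle - viewport.control_line.angle_deg
    viewport.control_line.angle_deg = new_angle
    vp = viewport.next_viewport
    while vp is not None:  # domino effect on all subsequent viewports
        vp.cross_section_angle += delta
        vp = vp.next_viewport

# first -> second -> final, mirroring viewports 120, 130, 110
final = Viewport("final")
second = Viewport("second", control_line=ControlLine(), next_viewport=final)
first = Viewport("first", control_line=ControlLine(), next_viewport=second)
move_control_line(first, 30.0)
```

Because every downstream viewport is updated inside the same call, the user sees all dependent views change together rather than paging between steps.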
  • FIG. 1 illustrates a screenshot of multiple viewports according to an embodiment of the invention.
  • FIG. 2 illustrates a flowchart for a method to define the 3D oblique cross-section of anatomy at a specific angle and be able to easily modify multiple angles of display simultaneously in accordance with an embodiment of the present invention.
  • FIG. 3 illustrates a flowchart for a method to define the 3D oblique cross-section of anatomy at a specific angle and be able to easily modify multiple angles of display simultaneously in accordance with another embodiment of the present invention.
  • FIG. 4 illustrates first control viewport with additional control markings in accordance with an embodiment of the present invention.
  • FIG. 5 illustrates a system for simultaneously modifying one or more angles of display or angles of cross-section in one or more cross-sectional images or cross-sectional stack of images of a 3D volume according to an embodiment of the present invention.
  • FIG. 6 illustrates first and second control viewports with additional rotational control markings in accordance with an embodiment of the present invention.
  • FIG. 5 illustrates a system 600 for simultaneously modifying one or more angles of display or angles of cross-section in one or more cross-sectional images or cross-sectional stack of images of a 3D volume according to an embodiment of the present invention.
  • System 600 includes a computing device 610 and a computer-readable storage medium 620 .
  • Computing device 610 may include any one or more interconnected machines capable of carrying out operations based on one or more sets of instructions. While a personal computer is shown as device 610 in FIG. 5, the various embodiments of the present invention are not limited to a personal computer; any one or more interconnected machines capable of carrying out operations based on one or more sets of instructions may comprise device 610.
  • a set of instructions may include a software application or program, for example.
  • Medium 620 may include any computer-readable storage medium, such as a local and/or remote memory.
  • medium 620 may include a computer hard drive (internal or external) or a memory in a server accessible via a network connection.
  • Medium 620 includes a memory 630 and one or more sets of instructions including a display routine 640 and an angle of cross-section modifying routine 650 .
  • Memory 630 may include any portion of medium 620 dedicated to the storage of one or more sets of instructions (for example, software applications), imaging studies, and/or images.
  • Display routine 640 may include any set(s) of instruction(s) capable of directing computing device 610 to carry out one or more tasks.
  • modifying routine 650 may include one or more set(s) of instruction(s) capable of directing computing device 610 to carry out one or more tasks.
  • display routine 640 and modifying routine 650 may be written and carried out in any suitable computer-programming language.
  • Display routine 640 and/or modifying routine 650 may be implemented locally (that is, put into operation by a processor of a local computer or workstation) or remotely (that is, put into operation by a processor of a remote computer, workstation or server), for example.
  • a user may employ system 600 to view a plurality of 2D cross-sectional images or cross-sectional stack of images of a 3D volume.
  • at least one technical effect of display routine 640 and modifying routine 650 is to permit a user to simultaneously modify one or more angles of display or angles of cross-section in one or more cross-sectional images or stacks of images of a 3D volume.
  • a user may access one or more images and/or imaging studies of a patient anatomy using device 610 .
  • the images and/or imaging studies may be stored locally on a memory 630 in device 610 or in a memory 630 remote from device 610 and accessible via one or more network connections, for example.
  • a user may load or run a software application to examine the image(s) and/or imaging study(ies) on device 610 .
  • device 610 loads display routine 640 to display one or more 2D images.
  • the 2D image(s) may be presented in one or more viewports.
  • a viewport may include a subscreen or subdivision of a display screen of device 610 .
  • FIG. 1 illustrates a screenshot 100 of multiple viewports according to an embodiment of the invention.
  • Screenshot 100 is a visual representation of the viewports displayed on device 610 when display routine 640 is implemented or carried out by device 610 .
  • screenshot 100 may be a visual representation of computer software running on device 610 .
  • Screenshot 100 includes multiple viewports 110 , 120 , 130 and 140 . While four viewports are illustrated in FIG. 1 , any number of viewports may be used in accordance with the present invention. For example, two or more viewports may be used in accordance with embodiments of the present invention.
  • Each of viewports 110 , 120 , 130 , 140 may include a 2D cross-sectional image of a 3D imaged object at a given angle of display.
  • a user may select which 2D image is displayed in each viewport.
  • the 2D image(s) displayed in each viewport is preset.
  • a 2D image displayed in first viewport 120 is displayed at a preset angle of display.
  • the angle of display of a 2D image displayed in first viewport 120 can be freely rotated to any position by a user of device 610.
  • a user may employ an input device connected to device 610 to freely rotate the angle of display of a 2D image displayed in first viewport 120.
  • At least one technical effect of display routine 640 is to create a sufficient number of viewports to display all 2D images simultaneously.
  • display routine 640 can be used to display a number of viewports sufficient to display all 2D images desired by a user of device 610 .
  • the viewports created by display routine 640 may be provided in a sequence.
  • viewport 120 may be the first viewport
  • viewport 130 may be the second viewport
  • viewport 140 may be the third viewport
  • viewport 110 may be the final viewport, for example.
  • the angle of display or angle of cross-section for each 2D image or stack of images may be equivalent or differ in all viewports or in any subset of viewports.
  • a user may select the angle of display and/or plane for each 2D image displayed in each viewport.
  • the exact angle and/or plane at which the image is viewed in one or more viewports may be preset.
  • the angle of display or image plane of a 2D image displayed in one viewport is orthogonal to the angle of display or image plane of a 2D image displayed in a previous viewport.
  • the angle of display or image plane of a 2D image displayed in one viewport is orthogonal to the angle of display or image plane of a 2D image displayed in a previous viewport and to the angle of display or image plane of a 2D image displayed in a subsequent viewport.
  • a 2D image displayed in a viewport is centered at a control line of a 2D image displayed in a previous viewport.
  • each control viewport 120 , 130 , 140 may display a cross sectional view of a 3D volume.
  • viewport 120 may display a coronal (anterior) view of a 3D volume of a patient's spine.
  • viewport 130 may display an oblique cross-sectional view of the same 3D volume and viewport 140 may display a sagittal (left) cross-sectional view of the same 3D volume, for example.
  • Another viewport, e.g., viewport 110 in screenshot 100 of FIG. 1, may display a final view of the 3D volume.
  • Final viewport 110 may include another cross sectional image (such as another oblique cross sectional image of the 3D image) defined by the movement of control lines 122 , 132 in control viewports 120 , 130 , as described in more detail below.
  • One or more viewports created by display routine 640 can include a control line.
  • viewports 120 and 130 of screenshot 100 each include control lines 122 and 132 , respectively.
  • a control line 122 , 132 can represent a plane in which a view of the 3D image is presented in a subsequent control viewport and/or in final viewport 110 .
  • control line 122 in first control viewport 120 may represent the 2D image plane of a 3D volume displayed in second control viewport 130 and control line 132 in second control viewport 130 may represent the 2D image plane of the 3D volume displayed in final viewport 110 .
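The geometric relationship just described can be sketched as follows: a control line drawn in a viewport, together with that viewport's own viewing normal, determines the cutting plane shown in the next viewport — the cutting plane contains the line and the viewport normal, so it appears edge-on as the line. This is an illustrative construction, not quoted from the patent; names are invented.

```python
import numpy as np

def plane_from_control_line(p0, p1, viewport_normal):
    """Cutting plane containing the control line p0->p1 and the viewport's
    viewing normal.

    Returns (point_on_plane, unit_normal). The next viewport's 2D image is
    the 3D volume resampled on this plane, centered at the control line.
    """
    p0, p1, n_vp = map(np.asarray, (p0, p1, viewport_normal))
    line_dir = p1 - p0
    normal = np.cross(line_dir, n_vp)  # in-plane of the viewport, perpendicular to the line
    normal = normal / np.linalg.norm(normal)
    center = (p0 + p1) / 2.0           # subsequent image centered at the control line
    return center, normal

# demo: a vertical line in a coronal viewport (normal along +y) defines a
# sagittal cutting plane (normal along x)
center, normal = plane_from_control_line(p0=(0, 0, 0), p1=(0, 0, 2),
                                         viewport_normal=(0, 1, 0))
```

Flipping the sign of the returned normal corresponds to viewing the plane from the opposite side, which is what the directional markers described below disambiguate.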
  • each viewport in a subset of the viewports displayed by display routine 640 includes a control line.
  • each viewport may include additional markings to help orient a user.
  • a viewport may include a directional marker 124 , 134 .
  • Directional marker 124 , 134 may represent the direction at which the 3D volume is presented in final viewport 110 as a 2D image.
  • control line 122 in viewport 120 represents the 2D cross-section in which the 3D volume is presented in final viewport 110 .
  • Without directional marker 124 in viewport 120, it may be unclear from which direction the image plane (defined by control line 122) is viewed.
  • For example, without directional marker 124 pointing down and to the left of control viewport 120, it may be unclear whether the 2D image plane defined by control line 122 is being viewed from the right or the left as displayed in final viewport 110. However, with the inclusion of directional marker 124, it is clear that the 2D image in final viewport 110 represents the 2D image plane defined by control line 122 as viewed from the right side of control line 122.
  • directional marker 134 in viewport 130 may assist a user in determining that the image plane defined by control line 132 is being viewed (in final viewport 110 ) from the bottom of control line 132 .
  • directional marker 124 , 134 is represented by a triangle. However, in other embodiments, directional marker 124 , 134 may be represented by any other symbol or marking that conveys the direction at which an image plane defined by a control line is viewed. For example, directional marker 124 , 134 may be represented by an arrow or line.
  • Directional marker 124 , 134 may also include a control line center point.
  • the center point may represent a center point along a control line.
  • directional marker 124 can include a circle, for example, indicating the center of control line 122 .
  • directional marker 134 can include a circle, for example, indicating the center of control line 132 . While a circle is used for the center point in directional marker 124 , 134 , any point, line, or geometric object may be used to indicate the center point of a control line.
  • the center point of a control line may also be represented by a separate marker, such as markers 116, 146 in FIG. 1.
  • the “X” in final viewport 110 marked with 116 and the “X” in third control viewport 140 marked with 146 represent the center point of control lines.
  • While an "X" is used for center points 116, 146, any point, line, or geometric object may be used in its place.
  • a user may employ an input device connected to or included in device 610 to move one or more control lines in one or more viewports.
  • As a control line is moved, an angle of display or angle of cross-section of a 2D image or stack of images displayed in another viewport is accordingly modified or altered by modifying routine 650.
  • at least one technical effect of modifying routine 650 is to modify or alter an angle of display or angle of cross-section of a 2D image or stack of cross-sectional images based on and simultaneous with the movement of a control line in another 2D image.
  • a control line in one viewport represents an image plane of at least one other 2D image displayed in at least one other viewport at a given angle of display.
  • As the control line moves, the image plane of the at least one other 2D image displayed in the at least one other viewport simultaneously changes.
  • modifying routine 650 causes device 610 to alter the angle of display or angle of cross-section of a 2D image or stack of images as a corresponding control line is moved.
  • modifying routine 650 causes an angle of display or angle of cross-section of another 2D image in another viewport (such as second, third or final viewports 130, 140, 110) to make a corresponding change at the same time that the control line is moved, for example.
  • modifying routine 650 permits a user to witness the real time effects of moving a control line in one viewport on the angle of display or angle of cross-section of at least one other 2D image in at least one other viewport. Such a simultaneous change allows for the effects of moving a control line in one control viewport to be viewed in one or more subsequent viewports.
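One common way to realize this kind of simultaneous, real-time update is an observer pattern, in which each dependent viewport registers a callback on the control line that defines its plane and is notified on every drag event. This is a generic sketch under that assumption, not the patent's stated mechanism; all names are invented.

```python
class ObservableControlLine:
    """Control line that notifies linked viewports on every move."""

    def __init__(self):
        self.angle = 0.0
        self._listeners = []

    def on_move(self, callback):
        """Register a viewport re-render callback."""
        self._listeners.append(callback)

    def move(self, angle):
        """Update the line and fire every listener during the same event,
        so dependent viewports redraw in step with the drag."""
        self.angle = angle
        for cb in self._listeners:
            cb(angle)

# demo: moving one line re-renders both the second and the final viewport
updates = []
line = ObservableControlLine()
line.on_move(lambda a: updates.append(("second", a)))  # re-render second viewport
line.on_move(lambda a: updates.append(("final", a)))   # re-render final viewport
line.move(15.0)
```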
  • the terms and phrases “real time,” “at the same time,” and “simultaneous” do not exclude any small delay inherent in the processing of images and/or sets of instructions (e.g., display routine 640 and/or modifying routine 650 ) by a device 610 .
  • various embodiments of the present invention provide for the ability of a user to witness the effects of moving a control line in a first 2D image on other 2D images, all while viewing both the first and the other 2D images.
  • a user viewing a first, second and final 2D images all at the same time may move a control line in the first and/or second images (multiple times and in any order), and witness the effects of moving the control line(s) on the display of the first and/or second and/or final 2D images all at the same time.
  • each of the viewports displayed by display routine 640 may be linked so that a control line in one viewport controls the angle of display or angle of cross-section in another 2D image or stack of images displayed in another viewport.
  • first viewport 120 may be linked to second viewport 130 and second viewport 130 may be linked to final viewport 110 so that movement of first control line 122 in first viewport 120 controls the angle of display of a 2D image displayed in second viewport 130 , and movement of second control line 132 controls the angle of display of a 2D image displayed in final viewport 110 .
  • modifying routine 650 alters an angle of display or angle of cross-section of all 2D images displayed in viewports subsequent to the viewport in which a control line is moved.
  • modifying routine 650 alters the angle of display of the 2D image displayed in second control viewport 130 , which consequently causes the angle of display of the 2D image displayed in third control viewport 140 to be altered by modifying routine 650 , which then consequently causes the angle of display of the 2D image displayed in final viewport 110 to be altered by modifying routine 650 , for example.
  • the movement of a control line in one viewport in a sequence of viewports therefore causes a domino-like effect on the angle(s) of display in the 2D images displayed in all subsequent viewports.
  • modifying routine 650 is capable of modifying a plurality of angles of display in a plurality of 2D images in any order selected by a user. For example, a user may first move control line 122 in first viewport 120, which causes modifying routine 650 to alter the angles of display of the 2D images displayed in second, third and final viewports 130, 140, 110. Next, the user may move control line 132 in second viewport 130, which then causes modifying routine 650 to alter the angles of display or angle of cross-section of the 2D images displayed in third and final viewports 140, 110, for example.
  • the user may move control line 122 in first control viewport 120 again, thereby causing modifying routine 650 to alter the angles of display of the 2D images displayed in second, third and final viewports 130 , 140 , 110 , for example.
  • a user of system 600 may repeatedly move multiple control lines in multiple viewports in order to achieve a 2D image in a final viewport at a desired angle of display or angle of cross-section.
  • the present invention provides a user with the ability to adjust two or more cross-sectional images or stacks of images at the same time without having to page back and forth between different image processing steps. Therefore, the present invention also minimizes the number of iterations required to correctly view desired anatomy in a 2D cross-sectional image.
  • one or more control lines in one or more viewports may include additional markings or controls.
  • FIG. 4 illustrates first control viewport 120 with additional control markings 126 in accordance with an embodiment of the present invention.
  • at least one technical effect of display routine 640 is to display a control line, such as line 122, along with one or more markings, such as the dashed lines of control markings 126 shown in FIG. 4, to represent slice thickness. Slice thickness relates to how "thick" a 2D image is.
  • a user can prescribe how thick a 2D image is; it is as if the user stacked a number of 2D images on top of each other and the display of the final 2D image is a combination of the images.
  • There are several ways to combine the pixels. For example, each pixel in the final 2D image can be the highest value pixel in the stack of 2D images. In another example, each pixel in the final 2D image can be an average of all the pixels in the stack of 2D images.
  • the stack of 2D images is referred to as a “slab.”
  • Slice thickness refers to the thickness of this slab. For medical images slice thickness can be measured in mm. That is, slice thickness can be measured in real-world space.
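The two slab-combination rules just described (highest-value pixel versus average), and the conversion from a millimeter thickness to a slice count given the inter-slice spacing, can be sketched with NumPy. `render_slab` and `slab_slice_count` are invented names for illustration, not functions from the patent.

```python
import numpy as np

def render_slab(slices, mode="max"):
    """Combine a stack of 2D slices (a 'slab') into one displayed image.

    mode='max'  -> maximum intensity projection: each output pixel is the
                   highest value through the slab.
    mode='mean' -> average projection: each output pixel is the mean of the
                   corresponding pixels through the slab.
    """
    stack = np.stack(slices, axis=0)
    return stack.max(axis=0) if mode == "max" else stack.mean(axis=0)

def slab_slice_count(thickness_mm, spacing_mm):
    """Number of slices spanned by a slab of the given real-world thickness."""
    return max(1, int(round(thickness_mm / spacing_mm)))

# demo: a two-slice slab with uniform values 1 and 3
slab = [np.full((2, 2), 1.0), np.full((2, 2), 3.0)]
mip = render_slab(slab, mode="max")
avg = render_slab(slab, mode="mean")
n = slab_slice_count(5.0, 1.25)  # 5 mm slab at 1.25 mm spacing
```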
  • markings 126 may be displayed in any one or more viewports displayed by display routine 640 .
  • the slice thickness indicated by markings 126 may be adjusted by a user of device 610 .
  • a user may employ an input device connected to device 610 to adjust the slice thickness of a 2D image by moving markings 126 .
  • one or more control lines in one or more viewports may include additional markings or controls to aid in the movement of a control line.
  • FIG. 6 illustrates first and second control viewports 120 , 130 with additional rotational control markings 128 , 138 in accordance with an embodiment of the present invention.
  • at least one technical effect of display routine 640 is to display rotational control markings 128 , 138 on one or more ends of control lines 122 , 132 as shown in FIG. 6 .
  • Rotational control markings 128 , 138 can be used to aid a user in rotating a control line.
  • a user may employ an input device, such as a mouse or stylus, to click or select one of control markings 128 , 138 .
  • the user may move the input device and cause the corresponding control line to rotate.
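Dragging a rotational marker can be modeled as re-aiming the grabbed endpoint toward the cursor while keeping the line's center and length fixed. A minimal 2D sketch with hypothetical names (the patent does not specify this math):

```python
import math

def rotate_control_line(p0, p1, drag_point):
    """Rotate the control line p0-p1 about its center point so that the
    endpoint the user grabbed (p1's rotational marker) points toward
    drag_point. Center and half-length are preserved."""
    cx, cy = (p0[0] + p1[0]) / 2, (p0[1] + p1[1]) / 2
    half = math.hypot(p1[0] - cx, p1[1] - cy)
    theta = math.atan2(drag_point[1] - cy, drag_point[0] - cx)
    dx, dy = half * math.cos(theta), half * math.sin(theta)
    return (cx - dx, cy - dy), (cx + dx, cy + dy)  # new p0, p1

# demo: dragging the marker straight above the center turns a horizontal
# line of length 2 into a vertical one through the same center (1, 0)
new_p0, new_p1 = rotate_control_line((0.0, 0.0), (2.0, 0.0),
                                     drag_point=(1.0, 5.0))
```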
  • While circles are used in FIG. 6 as rotational control markings 128, 138, any other geometric shape or object may be used in accordance with an embodiment of the present invention.
  • While markings 128, 138 are shown displayed in first and second control viewports 120, 130 along with control lines 122, 132 in FIG. 6, markings 128, 138 may be displayed in any one or more viewports displayed by display routine 640 along with any control line.
  • FIG. 2 illustrates a flowchart for a method 200 to define the 3D oblique cross-section of anatomy at a specific angle and be able to easily modify multiple angles of display simultaneously in accordance with an embodiment of the present invention.
  • Method 200 may be used in conjunction with screenshot 100 of FIG. 1 in an embodiment of the present invention.
  • First control viewport 120 includes a cross-sectional view of a 3D image, such as a 2D cross-sectional image or stack of images.
  • a user may select the exact angle and/or plane at which the image is viewed in first control viewport 120 .
  • the exact angle and/or plane at which the image is viewed in viewport 120 may be preset.
  • Second control viewport 130 includes a second cross sectional view of the 3D image.
  • a user may select the exact angle and/or plane at which the image is viewed in second control viewport 130 .
  • the exact angle and/or plane at which the image is viewed in second control viewport 130 may be preset.
  • At step 240, a determination is made as to whether additional control viewports are to be selected or defined. If so, method 200 proceeds to step 265 and an additional control viewport is selected or defined.
  • the additional control viewport can include, for example, third control viewport 140 .
  • Third control viewport 140 includes a third cross sectional view of the 3D volume. A user may select the exact angle and/or plane at which the image is viewed in third control viewport 140 . In another embodiment, the exact angle and/or plane at which the image is viewed in third control viewport 140 may be preset.
  • each control viewport 120 , 130 , 140 may display a cross sectional view of a 3D volume.
  • control viewport 120 may display a coronal (anterior) view of a 3D volume of a patient's spine.
  • control viewport 130 may display an oblique cross sectional view of the same 3D volume and control viewport 140 may display a sagittal (left) cross sectional view of the same 3D volume, for example.
  • Another viewport e.g., final viewport 110 in screen shot 100 of FIG. 1 may display a final view of the 3D volume.
  • Final viewport 110 may include another cross sectional image (such as another oblique cross sectional image of the 3D volume) defined by the movement of control lines 122 , 132 in control viewports 120 , 130 .
  • After the additional, or Nth, control viewport is defined or selected at step 265, method 200 proceeds to step 240 where another determination is made as to whether an additional control viewport is to be selected or defined. In this way, method 200 proceeds in a loop between steps 240 and 265 until all N control viewports are defined or selected. Once the total number of control viewports has been defined or selected, method 200 proceeds from step 240 to step 260. In an embodiment, a user may select the total number of control viewports. In another embodiment, the total number of control viewports may be preset.
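The viewport-selection loop of steps 240 and 265 can be sketched as follows. This is an illustrative sketch only; the `Viewport` class, its field names, and the preset-angle map are assumptions for illustration, not part of the disclosed method.

```python
from dataclasses import dataclass

@dataclass
class Viewport:
    """One viewport in the ordered chain (names are illustrative)."""
    name: str
    angle_deg: float = 0.0   # angle of cross-section; preset or user-selected

def build_viewport_chain(total, preset_angles=None):
    """Mirror of the step 240/265 loop: keep selecting control viewports
    until the (preset or user-selected) total is reached, then provide
    the final viewport."""
    preset_angles = preset_angles or {}
    chain = []
    while len(chain) < total:                  # step 240 decision
        idx = len(chain) + 1                   # step 265: add the Nth control viewport
        chain.append(Viewport(f"control-{idx}", preset_angles.get(idx, 0.0)))
    chain.append(Viewport("final"))            # e.g., final viewport 110
    return chain

# three control viewports (as in FIG. 1), the second preset to 45 degrees
chain = build_viewport_chain(3, {2: 45.0})
```

A user-driven implementation would replace the `preset_angles` lookup with an interactive selection, but the loop structure is the same.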
  • At step 260, a control line 122, 132 is defined or selected in one or more of the control viewports.
  • For example, control line 122 may be selected or defined in control viewport 120 and/or control line 132 may be selected or defined in control viewport 130 of FIG. 1.
  • A control line 122, 132 can represent a cross-sectional plane in which a view of the 3D volume is presented in a subsequent control viewport and/or in final viewport 110.
  • For example, control line 122 in first control viewport 120 may represent the 2D image plane or cross-section of a 3D volume displayed in second control viewport 130; similarly, control line 132 in second control viewport 130 may represent the 2D image plane or cross-section of the 3D volume displayed in final viewport 110.
  • At step 270, a determination is made as to whether a control line 122 in a first control viewport 120 is moving or has been moved. If control line 122 is moving or has been moved (for example, by a user selecting the line 122 with an input device such as a mouse or stylus and moving line 122 in viewport 120), method 200 proceeds to step 275.
  • At step 275, the control line and 2D view of the 3D volume displayed in subsequent control viewports and the final viewport are moved based on the movement of the control line in the first viewport. For example, as control line 122 in first control viewport 120 is moved, the cross-sectional plane defined by control line 122 changes. Therefore, as control line 122 moves, the 2D image or stack of images displayed in second control viewport 130 makes a corresponding change. In addition, the 2D images displayed in subsequent viewports (for example, second and third control viewports 130, 140 and final viewport 110) and/or control lines 132, 142 in subsequent viewports 130, 140 may also make a corresponding change. Such a simultaneous change allows the effects of moving a control line in one control viewport to be viewed in one or more subsequent viewports. After step 275, method 200 proceeds to step 280.
  • If a determination is made at step 270 that control line 122 in first control viewport 120 is not or was not moved, then method 200 proceeds to step 280.
  • At step 280, a determination is made as to whether control line 132 in second control viewport 130 is moving or has been moved. If control line 132 is moving or has been moved (for example, by a user selecting the line 132 with an input device such as a mouse or stylus and moving line 132 in viewport 130), method 200 proceeds to step 285.
  • At step 285, the 2D view of the 3D volume displayed in subsequent control viewports and/or the final viewport is moved based on the movement of the control line in the second viewport. For example, as control line 132 in second control viewport 130 is moved, the cross-sectional plane defined by control line 132 changes. Therefore, as control line 132 in second control viewport 130 moves, the 2D image displayed in subsequent control viewports (e.g., third control viewport 140) and/or in final viewport 110 makes a corresponding change. In addition, control lines in subsequent viewports (e.g., control line 142 in third control viewport 140) may also make corresponding changes.
  • In an embodiment, any movement of control line 132 in second control viewport 130 does not affect the image or control line 122 displayed in the previous first control viewport 120.
  • Any control line 132 movement only affects the display of images in subsequent control viewports and in final viewport 110 .
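The forward-only update rule of steps 270 through 285 can be sketched with a toy model in which each viewport's displayed angle is derived from the control-line angles of the viewports before it. The dictionary fields and the angle arithmetic are assumptions for illustration, not the patented implementation.

```python
def recompute_views(viewports):
    """Toy rendering pass: viewport i's displayed cut angle is derived
    from the control lines of all viewports before it (here, a running sum)."""
    running = 0.0
    for vp in viewports:
        vp["view_angle"] = running
        running += vp.get("line_angle", 0.0)

def move_control_line(viewports, index, new_angle):
    """Steps 270-285: moving a control line re-renders every *subsequent*
    viewport (and the final view); earlier viewports are untouched."""
    earlier = [vp["view_angle"] for vp in viewports[:index + 1]]
    viewports[index]["line_angle"] = new_angle
    recompute_views(viewports)
    # forward-only invariant: views at or before the moved line are unchanged
    assert [vp["view_angle"] for vp in viewports[:index + 1]] == earlier

# two control viewports plus a final viewport
chain = [{"line_angle": 10.0}, {"line_angle": 20.0}, {}]
recompute_views(chain)
move_control_line(chain, 1, 30.0)   # only the final view changes
```

Because each view depends only on the lines of *earlier* viewports, one downstream pass after a move necessarily leaves the previous viewports' images and control lines intact, matching the behavior described above.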
  • After step 285, method 200 proceeds to step 290.
  • If a determination is made at step 280 that control line 132 in second control viewport 130 is not or was not moved, then method 200 proceeds to step 290.
  • At step 290, a determination is made as to whether a control line in the next control viewport is moving or has been moved. If a control line in the next control viewport is moving or has been moved (for example, by a user selecting the line 142 in third control viewport 140 with an input device such as a mouse or stylus and moving line 142 in viewport 140), method 200 proceeds to step 295.
  • At step 295, the 2D image or stack of images of the 3D volume displayed in subsequent control viewports and/or the final viewport is moved based on the movement of the control line in the previous control viewport.
  • In addition, control lines in subsequent control viewports are also moved based on the movement of the control line in the previous control viewport. For example, as control line 142 in third control viewport 140 is moved, the cross-section defined by control line 142 changes. Therefore, as the cross-section or 2D image plane changes in third control viewport 140 (again, as defined by control line 142), the 2D image displayed in subsequent control viewports and in final viewport 110 makes a corresponding change. In addition, the control lines displayed in subsequent control viewports also make a corresponding change.
  • In an embodiment, any movement of control line 142 in third control viewport 140 does not affect the image or control lines 122, 132 displayed in the previous first and second control viewports 120, 130.
  • Any control line 142 movement only affects the display of images and control lines in subsequent control viewports and in final viewport 110 .
  • At step 297, a determination is made as to whether a user is able to see desired anatomy correctly based on the previous movement of one or more control lines. If the desired anatomy is able to be correctly viewed according to a user's wishes, method 200 proceeds to step 299 where the user views the anatomy. If the desired anatomy is not able to be correctly viewed, method 200 proceeds back to step 270, where one or more of the control lines may be moved in order to properly position the anatomy for the user.
  • Steps 270 through 297 continue in a loop until the desired anatomy is positioned in a manner preferential to a user (that is, until the desired anatomy is able to be viewed).
  • In an embodiment, method 200 may proceed from any one or more of steps 280, 285, 290 and 295 to steps 270, 280, or 290.
  • In another embodiment, method 200 may proceed from any one or more of steps 290 and 295 to step 280.
  • Method 200 may proceed by determining whether a control line has been moved or is moving in a subsequent control viewport (e.g., at step 280 or 290), then changing the view in all viewports subsequent to that control viewport (at step 285 or 295, respectively), and then determining whether a control line in a previous control viewport has been moved or is being moved (e.g., by proceeding back to step 270 or 280).
  • For example, a control line in an (N−1)th control viewport may be moved, thereby causing the views in all subsequent control viewports (e.g., the Nth viewport and the final viewport) to change, as described above.
  • Then, a control line in a previous control viewport (e.g., an (N−2)th, (N−3)th, (N−4)th, and so on, control viewport) may be moved, thereby causing the views in all subsequent viewports to change, as described above.
  • FIG. 3 illustrates a flowchart for a method 400 to define the 3D oblique cross-section of anatomy at a specific angle and be able to easily modify multiple angles of display simultaneously in accordance with another embodiment of the present invention.
  • Method 400 may be used in conjunction with screenshot 100 of FIG. 1 in an embodiment of the present invention.
  • First control viewport 120 includes a cross-sectional view of a 3D volume.
  • In an embodiment of the present invention, a user may select the exact angle and/or plane at which the image is viewed in first control viewport 120.
  • In another embodiment, the exact angle and/or cross-sectional plane at which the image or stack of images is viewed in first control viewport 120 may be preset.
  • A control line 122 is defined or selected in first control viewport 120, as described above.
  • For example, control line 122 may be selected or defined in control viewport 120 and/or control line 132 may be selected or defined in control viewport 130 of FIG. 1.
  • At step 440, a determination is made as to whether a control line 122 in first control viewport 120 is moving or has been moved, as described above. If control line 122 is moving or has been moved (for example, by a user selecting the line 122 with an input device such as a mouse or stylus and moving line 122 in viewport 120), method 400 proceeds to step 445.
  • At step 445, the control line and cross-sectional image or stack of images of the 3D volume displayed in subsequent control viewports and the final viewport are moved based on the movement of the control line in the first viewport, as described above.
  • For example, as control line 122 in first control viewport 120 is moved, the cross-sectional plane defined by control line 122 changes. Therefore, as the cross-sectional plane changes in first control viewport 120 (again, as defined by control line 122), the cross-sectional image or stack of images displayed in subsequent control viewports (e.g., second and third control viewports 130, 140) and in final viewport 110 makes a corresponding change.
  • In addition, any control lines 132, 142 in subsequent viewports 130, 140 may also make a corresponding change.
  • If a determination is made at step 440 that control line 122 in first control viewport 120 is not or was not moved, then method 400 proceeds to step 450.
  • At step 450, a second control viewport 130 is selected or defined, as described above.
  • Second control viewport 130 includes a second cross sectional image or stack of images of the 3D volume.
  • In an embodiment, a user may select the exact angle and/or cross-sectional plane at which the image or stack of images is viewed in second control viewport 130.
  • In another embodiment, the exact angle and/or cross-sectional plane at which the image is viewed in second control viewport 130 may be preset.
  • At step 460, a determination is made as to whether control line 132 in second control viewport 130 is moving or has been moved, as described above. If control line 132 is moving or has been moved (for example, by a user selecting the line 132 with an input device such as a mouse or stylus and moving line 132 in viewport 130), method 400 proceeds to step 465.
  • At step 465, the cross-sectional image or stack of images of the 3D volume displayed in subsequent control viewports and the final viewport is moved based on the movement of the control line in the second viewport, as described above.
  • For example, as control line 132 in second control viewport 130 is moved, the angle of display or angle of cross-section of the cross-sectional image or stack of images defined by control line 132 changes. Therefore, as the cross-sectional plane changes in second control viewport 130 (again, as defined by control line 132), the 2D image or stack of images displayed in subsequent control viewports (e.g., third control viewport 140) and in final viewport 110 makes a corresponding change.
  • In addition, control lines in subsequent viewports may also make corresponding changes.
  • In an embodiment, any movement of control line 132 in second control viewport 130 does not affect the image or control line 122 displayed in the previous first control viewport 120.
  • Any control line 132 movement only affects the display of images in subsequent control viewports 140 and in final viewport 110 .
  • After step 465, method 400 proceeds to step 470.
  • If a determination is made at step 460 that control line 132 in second control viewport 130 is not or was not moved, then method 400 proceeds to step 470.
  • At step 470, a determination is made as to whether additional control viewports are selected or defined, as described above. If additional control viewports are to be selected or defined, then method 400 proceeds to step 480 and an additional control viewport is selected or defined, as described above.
  • The additional control viewport can include, for example, third control viewport 140.
  • Third control viewport 140 includes a third cross-sectional image or stack of images of the 3D volume. In an embodiment, a user may select the exact angle and/or cross-sectional plane at which the image or stack of images is viewed in third control viewport 140. In another embodiment, the exact angle and/or cross-sectional plane at which the image or stack of images is viewed in third control viewport 140 may be preset.
  • After the additional, or Nth, control viewport is defined or selected at step 480, method 400 proceeds to step 485 where a control line is defined or selected in the additional control viewport defined or selected at step 480, as described above. From step 485, method 400 proceeds back to step 470 where another determination is made as to whether an additional control viewport is to be selected or defined, as described above. In this way, method 400 proceeds in a loop among steps 470, 480 and 485 until all N control viewports are defined or selected. Once the total number of control viewports has been defined or selected, method 400 proceeds from step 470 to step 490. In an embodiment, a user may select the total number of control viewports. In another embodiment, the total number of control viewports may be preset.
  • At step 490, a determination is made as to whether a control line in any control viewport is moving or has been moved, as described above. If a control line in any control viewport is moving or has been moved (for example, by a user selecting the line 142 in third control viewport 140 with an input device such as a mouse or stylus and moving line 142 in viewport 140), method 400 proceeds to step 495.
  • At step 495, the cross-sectional image or stack of images of the 3D volume displayed in subsequent control viewports and the final viewport is moved based on the movement of the control line in the previous control viewport, as described above.
  • In addition, control lines in subsequent control viewports may also be moved based on the movement of the control line in the previous control viewport. For example, as control line 142 in third control viewport 140 is moved, the cross-sectional plane defined by control line 142 changes. Therefore, as the cross-sectional plane changes in third control viewport 140 (again, as defined by control line 142), the 2D image or stack of images displayed in subsequent control viewports and in final viewport 110 makes a corresponding change.
  • In addition, the control lines displayed in subsequent control viewports also make a corresponding change.
  • In an embodiment, any movement of control line 142 in third control viewport 140 does not affect the image or control lines 122, 132 displayed in the previous first and second control viewports 120, 130.
  • Any control line 142 movement only affects the display of images and control lines in subsequent control viewports and in final viewport 110 .
  • If it is determined at step 490 that no control lines in any of the control viewports have been moved, method 400 proceeds to step 497 where a user views the anatomy.
  • Steps 490 and 495 continue in a loop until all control viewports defined or selected in steps 470 and 480 have been examined to determine whether a control line in each viewport has been moved. For example, a user may first move a control line in a first control viewport, followed by moving a control line in a second control viewport, followed by an additional movement of the control line in the first control viewport.
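The step 490/495 loop, including the interleaved sequence of moves in the example above (first viewport, then second, then first again), can be sketched as follows. The toy angle model, in which a viewport's displayed angle is the sum of all earlier control-line angles, is an assumption for illustration only.

```python
def apply_moves(line_angles, moves):
    """Steps 490/495 as a loop: each (viewport index, new angle) move is
    applied in turn, and after every move all later views are re-derived.
    Returns the displayed angles after the last move; the last entry is
    the final viewport's angle."""
    views = []
    for index, new_angle in moves:
        line_angles[index] = new_angle
        # re-render: view i depends only on the lines of viewports 0..i-1
        views = [sum(line_angles[:i]) for i in range(len(line_angles) + 1)]
    return views

# move line 1, then line 2, then line 1 again (as in the example above)
final = apply_moves([0.0, 0.0, 0.0], [(0, 10.0), (1, 20.0), (0, 5.0)])
```

Because each re-render pass depends only on the current line angles, the end state after the loop is the same regardless of the order in which the moves were made.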
  • This invention minimizes the number of iterations required to visualize specific anatomy by visualizing the multiple steps at the same time.

Abstract

The present invention provides a system and a method for simultaneously modifying one or more angles of cross-sections in one or more cross-sectional images or stacks of images of a 3D volume. A plurality of viewports is displayed to a user. Each viewport includes a 2D cross-sectional image or stack of images and all 2D images or stacks of images are displayed simultaneously. One or more viewports may also include a control line representative of an angle of cross-section or image plane of a 2D image or stack of images displayed in another viewport. If a control line is moved in a selected viewport, the angle of cross-section or image plane of a 2D image or stack of images in another viewport is accordingly altered simultaneous with such control line movement.

Description

    RELATED APPLICATIONS
  • This application claims the benefit of U.S. Provisional Application No. 60/695,327, filed Jun. 23, 2005, entitled “A Method to Define the 3D Oblique Cross-Section of Anatomy at a Specific Angle and be Able to Easily Modify Multiple Angles of Display Simultaneously.” The disclosure of the '327 application is hereby incorporated by reference in its entirety.
  • FEDERALLY SPONSORED RESEARCH OR DEVELOPMENT
  • Not applicable.
  • BACKGROUND OF THE INVENTION
  • The present invention generally relates to the viewing of medical images at several angles. More specifically, the present invention relates to defining a three-dimensional (“3D”) oblique cross-section of an anatomy at a specific angle and to modify additional angles of display simultaneously.
  • Current systems and methods allow for the viewing of a 3D volume of an object, such as a patient's anatomy, in one or more two-dimensional (“2D”) cross-sectional images or cross-sectional stack of images of the 3D volume. The angle of display in the 2D cross-sectional images may be manipulated by a user.
  • The angle of display in the 2D images may be manipulated or adjusted by rotating the image about a line (such as a control line, for example) displayed on a 2D image. In general, movement of a control line in a first 2D image can affect the angle of display of a 2D image in another, subsequently viewed 2D image. For example, current systems may display a control line over a first 2D cross-sectional image. This control line can represent a 2D image plane in which a view of the 3D image is presented in a subsequent 2D image. By moving the control line in the first 2D image, the 2D image plane (and consequently the 2D image representation of the 3D image) in the second 2D image will change when the second 2D image is viewed. Moreover, current systems permit the user to then move a control line in the second 2D image to adjust the 2D image plane of the next subsequent 2D image. The movement of a control line in one 2D image affects the angle of display or 2D image plane in all subsequent 2D images. Moreover, when a control line in a first 2D image is moved, the angles of display or image planes in all subsequent 2D images are adjusted. In other words, changing the angle of display in a first 2D image has a domino effect of changing the angle of display in all subsequent 2D images.
  • The movement of control lines so as to adjust angles of display in multiple 2D images is used to obtain a final 2D image that is positioned correctly according to a user's needs. By moving control lines in one or more 2D images so as to affect the angles of display or the angle of the cross-section in subsequent 2D images (including the final 2D image), a user is able to obtain a preferred angle of display in the final 2D image.
  • However, while current systems may display more than a single 2D image at a time, these systems typically display only a single step of the double oblique process. That is, current systems display one image with a control line, and another image with the result(s) of moving that control line. Current systems do not include multiple control lines in multiple images, thereby prohibiting the display of multiple steps of the double oblique process. Therefore, users are unable to move multiple control lines in multiple 2D images and are therefore unable to witness multiple steps of the double oblique process simultaneously.
  • Thus, current systems and methods do not allow the easy manipulation of the angle of display or the angle of cross-section in multiple cross-sectional images or cross-sectional stacks of images of a 3D volume in multiple viewports or displays. For example, when attempting to view a specific portion of an anatomy in a medical image in 3D Maximum Intensity Pixel (“MIP”)/Multi-Planar Reformat (“MPR”) mode, free rotation, or rotation about a single line cannot always position the desired anatomy correctly in one or more of the subsequent viewports. In other words, a user of a software application attempting to view a portion of an anatomy in a particular position may need to position the anatomical portion using multiple rotations about several lines in multiple viewports.
  • For example, when attempting to obtain a view through a spinal disk on a patient with scoliosis (curved spine), the user would need to complete a two-step cross-sectioning process using existing methods:
      • 1. Prescribe a pseudo-sagittal view by defining a line along the spine on a coronal view.
      • 2. Prescribe a view straight through the disk by defining a line on the pseudo-sagittal view.
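Step 1 above can be expressed as simple plane geometry: a line drawn on a displayed view prescribes the cut plane that contains both the line and the viewing direction. A minimal sketch under assumed patient axes (this is a geometric illustration, not the patented implementation):

```python
import numpy as np

def prescribe_plane_normal(view_normal, line_dir):
    """A line drawn on a displayed view prescribes the cut plane containing
    both the line and the viewing direction; the new plane's normal is the
    cross product of the in-plane line direction and the view normal."""
    n = np.cross(line_dir, view_normal)
    return n / np.linalg.norm(n)

# Step 1 of the example: a line along the spine drawn on the coronal view.
coronal_normal = np.array([0.0, 1.0, 0.0])   # assumed anterior-posterior axis
spine_line = np.array([0.0, 0.0, 1.0])       # head-foot direction (straight-spine case)
sagittal_normal = prescribe_plane_normal(coronal_normal, spine_line)
# -> a left-right normal, i.e. a (pseudo-)sagittal cut
```

For a scoliotic spine the drawn line is tilted, so the resulting normal tilts with it, which is exactly why the prescribed view is only pseudo-sagittal.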
  • As described above, a major drawback of the existing methods is that the user can only see one prescribed cross-sectional view at a time. This implies that while the user is looking at the final view he can no longer see the coronal view; likewise, while performing the first step the user cannot see the final view. As a result, if the user cannot obtain the view through the disk correctly, the user may need to go back to the first step to adjust the pseudo-sagittal view from the coronal image. However, the user will no longer be able to see if the adjustments made are correct on the final image. This will typically result in multiple iterations before a correct view through the disk can be obtained.
  • The above example shows that the existing oblique cross-sectioning methods are oftentimes time-consuming when used to obtain a view through multiple cross-sections. Therefore, a need exists for a more efficient method for obtaining a view through multiple cross-sections of an image.
  • BRIEF SUMMARY OF THE INVENTION
  • The present invention provides a method to modify one or more angles of cross-section in one or more cross-sectional images of a 3D volume simultaneously. The method includes providing a plurality of viewports, where each of the viewports is configured to display one or more 2D images, moving a control line in a selected viewport, and altering an angle of cross-section of at least one of the 2D images in at least one viewport other than the selected viewport simultaneous with the step of moving the control line. The 2D images can represent one or more cross-sections of the 3D volume.
  • The present invention also provides a computer-readable storage medium including a set of instructions for a computer. The set of instructions includes a display routine and an angle of cross-section modifying routine. The display routine is configured to display a 2D image representative of a cross-section of a 3D volume in each of a plurality of viewports. The modifying routine is configured to alter an angle of cross-section of at least one of the 2D images based on and simultaneous with a movement of a control line in a selected viewport.
  • The present invention also provides a method for adjusting an angle of cross-section in at least one of a plurality of 2D images. The method includes providing a plurality of viewports including first, second and final viewports, displaying a plurality of control lines including first and second control lines, moving the first control line, and adjusting a first angle of display of a second 2D image and a second angle of display of a final 2D image simultaneous with moving the first control line. The first, second and final viewports are configured to display first, second and final 2D images, respectively. The first, second and final 2D images are each representative of one or more cross-sections of a 3D volume. The first control line is displayed in the first viewport and the second control line is displayed in the second viewport. The first control line is configured to represent a cross-sectional plane of the second 2D image at a first angle of cross-section and the second control line is configured to represent a cross-sectional plane of the final 2D image at a second angle of cross-section.
  • BRIEF DESCRIPTION OF SEVERAL VIEWS OF THE DRAWINGS
  • FIG. 1 illustrates a screenshot of multiple viewports according to an embodiment of the invention.
  • FIG. 2 illustrates a flowchart for a method to define the 3D oblique cross-section of anatomy at a specific angle and be able to easily modify multiple angles of display simultaneously in accordance with an embodiment of the present invention.
  • FIG. 3 illustrates a flowchart for a method to define the 3D oblique cross-section of anatomy at a specific angle and be able to easily modify multiple angles of display simultaneously in accordance with another embodiment of the present invention.
  • FIG. 4 illustrates first control viewport with additional control markings in accordance with an embodiment of the present invention.
  • FIG. 5 illustrates a system for simultaneously modifying one or more angles of display or angles of cross-section in one or more cross-sectional images or cross-sectional stack of images of a 3D volume according to an embodiment of the present invention.
  • FIG. 6 illustrates first and second control viewports with additional rotational control markings in accordance with an embodiment of the present invention.
  • The foregoing summary, as well as the following detailed description of certain embodiments of the present invention, will be better understood when read in conjunction with the appended drawings. For the purpose of illustrating the invention, certain embodiments are shown in the drawings. It should be understood, however, that the present invention is not limited to the arrangements and instrumentality shown in the attached drawings.
  • DETAILED DESCRIPTION OF THE INVENTION
  • FIG. 5 illustrates a system 600 for simultaneously modifying one or more angles of display or angles of cross-section in one or more cross-sectional images or cross-sectional stack of images of a 3D volume according to an embodiment of the present invention. System 600 includes a computing device 610 and a computer-readable storage medium 620. Computing device 610 may include any one or more interconnected machines capable of carrying out operations based on one or more sets of instructions. While a personal computer is shown as device 610 in FIG. 5, the various embodiments of the present invention are not limited to a personal computer. Any one or more interconnected machines capable of carrying out operations based on one or more sets of instructions may comprise device 610. A set of instructions may include a software application or program, for example. Medium 620 may include any computer-readable storage medium, such as a local and/or remote memory. For example, medium 620 may include a computer hard drive (internal or external) or a memory in a server accessible via a network connection.
  • Medium 620 includes a memory 630 and one or more sets of instructions including a display routine 640 and an angle of cross-section modifying routine 650. Memory 630 may include any portion of medium 620 dedicated to the storage of one or more sets of instructions (for example, software applications), imaging studies, and/or images. Display routine 640 may include any set(s) of instruction(s) capable of directing computing device 610 to carry out one or more tasks. Similarly, modifying routine 650 may include one or more set(s) of instruction(s) capable of directing computing device 610 to carry out one or more tasks. As understood by one of ordinary skill in the art, display routine 640 and modifying routine 650 may be written and carried out in any suitable computer-programming language. Display routine 640 and/or modifying routine 650 may be implemented locally (that is, put into operation by a processor of computer or workstation) or remotely (that is, put into operation by a processor of a remote computer, workstation or server), for example.
  • A user may employ system 600 to view a plurality of 2D cross-sectional images or cross-sectional stack of images of a 3D volume. In accordance with an embodiment of the present invention, at least one technical effect of display routine 640 and modifying routine 650 is to permit a user to simultaneously modify one or more angles of display or angles of cross-section in one or more cross-sectional images or stacks of images of a 3D volume. For example, a user may access one or more images and/or imaging studies of a patient anatomy using device 610. The images and/or imaging studies may be stored locally on a memory 630 in device 610 or in a memory 630 remote from device 610 and accessible via one or more network connections, for example.
  • A user may load or run a software application to examine the image(s) and/or imaging study(ies) on device 610. In an embodiment of the present invention, device 610 loads display routine 640 to display one or more 2D images. The 2D image(s) may be presented in one or more viewports. A viewport may include a subscreen or subdivision of a display screen of device 610.
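A minimal sketch of such a subdivision of the display into subscreens, assuming a simple grid layout (the disclosure does not prescribe any particular layout):

```python
import math

def viewport_rects(screen_w, screen_h, n):
    """Divide a display into a grid of subscreens, one per viewport.
    Returns (x, y, width, height) tuples in row-major order; the grid
    shape is a layout assumption for illustration."""
    cols = math.ceil(math.sqrt(n))
    rows = math.ceil(n / cols)
    w, h = screen_w // cols, screen_h // rows
    return [((i % cols) * w, (i // cols) * h, w, h) for i in range(n)]

# four viewports, as in screenshot 100 of FIG. 1
rects = viewport_rects(800, 600, 4)
```

Each rectangle would then host one 2D cross-sectional image, with the final viewport conventionally assigned to one of the subscreens.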
  • FIG. 1 illustrates a screenshot 100 of multiple viewports according to an embodiment of the invention. Screenshot 100 is a visual representation of the viewports displayed on device 610 when display routine 640 is implemented or carried out by device 610. For example, screenshot 100 may be a visual representation of computer software running on device 610. Screenshot 100 includes multiple viewports 110, 120, 130 and 140. While four viewports are illustrated in FIG. 1, any number of viewports may be used in accordance with the present invention. For example, two or more viewports may be used in accordance with embodiments of the present invention.
  • Each of viewports 110, 120, 130, 140 may include a 2D cross-sectional image of a 3D imaged object at a given angle of display. In an embodiment of the present invention, a user may select which 2D image is displayed in each viewport. In another embodiment of the present invention, the 2D image(s) displayed in each viewport is preset.
  • In an embodiment of the present invention, a 2D image displayed in first viewport 120 is displayed at a preset angle of display. In another embodiment of the present invention, the angle of display of a 2D image displayed in first viewport 120 can be free rotated to any position by a user of device 610. For example, a user may employ an input device connected to device 610 to free rotate the angle of display of a 2D image displayed in first viewport 120.
  • At least one technical effect of display routine 640 is to create a sufficient number of viewports to display all 2D images simultaneously. In other words, display routine 640 can be used to display a number of viewports sufficient to display all 2D images desired by a user of device 610.
  • In an embodiment of the present invention, the viewports created by display routine 640 may be provided in a sequence. In other words, viewport 120 may be the first viewport, viewport 130 may be the second viewport, viewport 140 may be the third viewport, and viewport 110 may be the final viewport, for example.
  • The angle of display or angle of cross-section for each 2D image or stack of images may be equivalent or differ in all viewports or in any subset of viewports. In an embodiment of the present invention, a user may select the angle of display and/or plane for each 2D image displayed in each viewport. In another embodiment of the present invention, the exact angle and/or plane at which the image is viewed in one or more viewports may be preset.
  • In an embodiment of the present invention, the angle of display or image plane of a 2D image displayed in one viewport is orthogonal to the angle of display or image plane of a 2D image displayed in a previous viewport. In another embodiment, the angle of display or image plane of a 2D image displayed in one viewport is orthogonal to the angle of display or image plane of a 2D image displayed in a previous viewport and to the angle of display or image plane of a 2D image displayed in a subsequent viewport.
  • In an embodiment of the present invention, a 2D image displayed in a viewport is centered at a control line of a 2D image displayed in a previous viewport.
  • As described above, each control viewport 120, 130, 140 may display a cross sectional view of a 3D volume. For example, in FIG. 1, viewport 120 may display a coronal (anterior) view of a 3D volume of a patient's spine. In addition, viewport 130 may display an oblique cross sectional view of the same 3D volume and viewport 140 may display a sagittal (left) cross sectional view of the same 3D volume, for example. Another viewport (e.g., viewport 110) in screen shot 100 of FIG. 1 may display a final view of the 3D volume. Final viewport 110 may include another cross sectional image (such as another oblique cross sectional image of the 3D volume) defined by the movement of control lines 122, 132 in control viewports 120, 130, as described in more detail below.
  • One or more viewports created by display routine 640 can include a control line. For example, viewports 120 and 130 of screenshot 100 each include control lines 122 and 132, respectively. A control line 122, 132 can represent a plane in which a view of the 3D image is presented in a subsequent control viewport and/or in final viewport 110. For example, control line 122 in first control viewport 120 may represent the 2D image plane of a 3D volume displayed in second control viewport 130 and control line 132 in second control viewport 130 may represent the 2D image plane of the 3D volume displayed in final viewport 110.
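The geometric relationship between a control line and the image plane it defines can be sketched as follows. This is an illustrative sketch, not the patented implementation: it assumes the control line's endpoints have already been mapped into the 3D volume's coordinate space, and that the new cutting plane contains the control line while lying perpendicular to the image plane of the viewport in which the line is drawn.

```python
import numpy as np

def plane_from_control_line(p0, p1, viewport_normal):
    """Return (center, normal) of the cutting plane defined by a control line.

    The new plane contains the control line and the viewing direction of the
    viewport in which the line is drawn (i.e., it is perpendicular to that
    viewport's image plane).  The center point corresponds to the markers
    116/146 described in the text.
    """
    p0 = np.asarray(p0, dtype=float)
    p1 = np.asarray(p1, dtype=float)
    line_dir = p1 - p0
    line_dir /= np.linalg.norm(line_dir)
    # Plane containing both the line and the viewing direction:
    normal = np.cross(line_dir, np.asarray(viewport_normal, dtype=float))
    normal /= np.linalg.norm(normal)
    center = (p0 + p1) / 2.0
    return center, normal
```

For example, a horizontal control line drawn in an axial viewport (viewing direction along z) defines a plane whose normal lies in the axial image plane, perpendicular to the line.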
  • In an embodiment of the present invention, each viewport in a subset of the viewports displayed by display routine 640 includes a control line.
  • In an embodiment of the present invention, each viewport may include additional markings to help orient a user. For example, a viewport may include a directional marker 124, 134. Directional marker 124, 134 may represent the direction at which the 3D volume is presented in final viewport 110 as a 2D image. In other words, control line 122 in viewport 120 represents the 2D cross-section in which the 3D volume is presented in final viewport 110. Without directional marker 124 in viewport 120, the direction from which the image plane (defined by control line 122) is viewed may be unclear. For example, without directional marker 124 pointing down and to the left of control viewport 120, it may be unclear whether the 2D image plane defined by control line 122 is being viewed from the right or the left as displayed in final viewport 110. However, with the inclusion of directional marker 124, it is clear that the 2D image in final viewport 110 represents a 2D image plane defined by control line 122 as viewed from the right side of control line 122.
  • In another example, directional marker 134 in viewport 130 may assist a user in determining that the image plane defined by control line 132 is being viewed (in final viewport 110) from the bottom of control line 132.
  • In an embodiment of the present invention, directional marker 124, 134 is represented by a triangle. However, in other embodiments, directional marker 124, 134 may be represented by any other symbol or marking that conveys the direction at which an image plane defined by a control line is viewed. For example, directional marker 124, 134 may be represented by an arrow or line.
  • Directional marker 124, 134 may also include a control line center point. The center point may represent a center point along a control line. In other words, directional marker 124 can include a circle, for example, indicating the center of control line 122. In another example, directional marker 134 can include a circle, for example, indicating the center of control line 132. While a circle is used for the center point in directional marker 124, 134, any point, line, or geometric object may be used to indicate the center point of a control line.
  • The center point of a control line may also be represented by a center point as represented by 116, 146 in FIG. 1. In other words, the “X” in final viewport 110 marked with 116 and the “X” in third control viewport 140 marked with 146 represent the center point of control lines. However, while an “X” is used as center points 116, 146, any point, line, or geometric object may be used in its place.
  • A user may employ an input device connected to or included in device 610 to move one or more control lines in one or more viewports. When a control line in one viewport is moved, an angle of display or angle of cross section of a 2D image or stack of images displayed in another viewport is accordingly modified or altered by modifying routine 650. In other words, in accordance with an embodiment of the present invention, at least one technical effect of modifying routine 650 is to modify or alter an angle of display or angle of cross-section of a 2D image or stack of cross-sectional images based on and simultaneous with the movement of a control line in another 2D image.
  • As described above, a control line in one viewport represents an image plane of at least one other 2D image displayed in at least one other viewport at a given angle of display. When the control line is moved, the image plane of the at least one other 2D image displayed in the at least one other viewport simultaneously changes.
  • As described above, at least one technical effect of modifying routine 650 is to cause device 610 to alter the angle of display or angle of cross-section of a 2D image or stack of images as a corresponding control line is moved. In other words, as a user moves control line 122 displayed in a first viewport 120, modifying routine 650 causes an angle of display or angle of cross-section of another 2D image in another viewport (such as second, third or final viewports 130, 140, 110) to make a corresponding change at the same time that the control line is moved, for example. By doing so, modifying routine 650 permits a user to witness the real time effects of moving a control line in one viewport on the angle of display or angle of cross-section of at least one other 2D image in at least one other viewport. Such a simultaneous change allows for the effects of moving a control line in one control viewport to be viewed in one or more subsequent viewports.
  • As understood by one of ordinary skill in the art, the terms and phrases “real time,” “at the same time,” and “simultaneous” do not exclude any small delay inherent in the processing of images and/or sets of instructions (e.g., display routine 640 and/or modifying routine 650) by device 610. Instead, various embodiments of the present invention provide for the ability of a user to witness the effects of moving a control line in a first 2D image on other 2D images, all while viewing both the first and other 2D images. For example, a user viewing a first, second and final 2D images all at the same time may move a control line in the first and/or second images (multiple times and in any order), and witness the effects of moving the control line(s) on the display of the first and/or second and/or final 2D images all at the same time.
  • In an embodiment of the present invention, each of the viewports displayed by display routine 640 may be linked so that a control line in one viewport controls the angle of display or angle of cross-section in another 2D image or stack of images displayed in another viewport. For example, first viewport 120 may be linked to second viewport 130 and second viewport 130 may be linked to final viewport 110 so that movement of first control line 122 in first viewport 120 controls the angle of display of a 2D image displayed in second viewport 130, and movement of second control line 132 controls the angle of display of a 2D image displayed in final viewport 110.
  • In an embodiment of the present invention, modifying routine 650 alters an angle of display or angle of cross-section of all 2D images displayed in viewports subsequent to the viewport in which a control line is moved. In other words, if control line 122 in first control viewport 120 is moved, modifying routine 650 then alters the angle of display of the 2D image displayed in second control viewport 130, which consequently causes the angle of display of the 2D image displayed in third control viewport 140 to be altered by modifying routine 650, which then consequently causes the angle of display of the 2D image displayed in final viewport 110 to be altered by modifying routine 650, for example. The movement of a control line in one viewport in a sequence of viewports therefore causes a domino-like effect on the angle(s) of display in the 2D images displayed in all subsequent viewports.
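The domino-like propagation described above can be sketched as a linked sequence of viewports, where moving a control line re-angles every subsequent viewport but leaves previous viewports untouched. The `Viewport` class and the uniform angle propagation below are illustrative assumptions, not the actual modifying routine 650:

```python
class Viewport:
    def __init__(self, name):
        self.name = name
        self.angle = 0.0   # angle of display of the 2D image in this viewport
        self.next = None   # subsequent viewport in the sequence, if any

def move_control_line(viewport, delta_angle):
    """Rotating a control line in one viewport updates every subsequent
    viewport in the chain, but never a previous one (the 'domino' effect).

    Hypothetical simplification: each subsequent viewport receives the same
    rotation; a real system would recompute each cross-section plane.
    """
    updated = []
    vp = viewport.next
    while vp is not None:
        vp.angle += delta_angle
        updated.append(vp.name)
        vp = vp.next
    return updated
```

Moving a control line in the first viewport of a four-viewport chain would thus update the second, third, and final viewports while leaving the first unchanged.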
  • In an embodiment of the present invention, modifying routine 650 is capable of modifying a plurality of angles of display in a plurality of 2D images in any order selected by a user. For example, a user may first move control line 122 in first viewport 120, which causes modifying routine 650 to alter the angles of display of the 2D images displayed in second, third and final viewports 130, 140, 110. Next, the user may move control line 132 in second viewport 130, which then causes modifying routine 650 to alter the angles of display or angle of cross-section of the 2D images displayed in third and final viewports 140, 110, for example. Finally, the user may move control line 122 in first control viewport 120 again, thereby causing modifying routine 650 to alter the angles of display of the 2D images displayed in second, third and final viewports 130, 140, 110, for example. In this way, a user of system 600 may repeatedly move multiple control lines in multiple viewports in order to achieve a 2D image in a final viewport at a desired angle of display or angle of cross-section. In other words, the present invention provides a user with the ability to adjust two or more cross-sectional images or stacks of images at the same time without having to page back and forth between different image processing steps. Therefore, the present invention also minimizes the number of iterations required to correctly view desired anatomy in a 2D cross-sectional image.
  • In an embodiment of the present invention, one or more control lines in one or more viewports may include additional markings or controls. FIG. 4 illustrates first control viewport 120 with additional control markings 126 in accordance with an embodiment of the present invention. In accordance with an embodiment of the present invention, at least one technical effect of display routine 640 is to display a control line such as control line 122 along with one or more markings such as the dashed lines of control markings 126 (for example) shown in FIG. 4 to represent slice thickness. Slice thickness relates to how “thick” a 2D image is. For example, a user can prescribe how thick a 2D image is; it is as if the user stacked a number of 2D images on top of each other and the display of the final 2D image is a combination of the images. There are several ways to combine the pixels. For example, each pixel in the final 2D image can be the highest value pixel in the stack of 2D images. In another example, each pixel in the final 2D image can be an average of all the pixels in the stack of 2D images. The stack of 2D images is referred to as a “slab.” Slice thickness refers to the thickness of this slab. For medical images, slice thickness can be measured in millimeters (mm). That is, slice thickness can be measured in real-world space.
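The two pixel-combination strategies described above, highest value versus average, can be sketched as follows. This is a minimal NumPy illustration; the function name and array layout are assumptions for the example, not the patent's implementation:

```python
import numpy as np

def render_slab(slab, mode="mip"):
    """Collapse a stack of 2D slices (a 'slab') into a single 2D image.

    slab : array-like of shape (n_slices, height, width).  The slab's
    real-world slice thickness would be n_slices times the per-slice
    spacing in mm.
    """
    slab = np.asarray(slab, dtype=float)
    if mode == "mip":   # maximum intensity projection: brightest pixel wins
        return slab.max(axis=0)
    if mode == "avg":   # each output pixel is the mean through the slab
        return slab.mean(axis=0)
    raise ValueError(f"unknown combination mode: {mode!r}")
```

For a two-slice slab, "mip" keeps the brighter of the two values at each pixel, while "avg" returns their mean.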
  • While dashed lines are used in FIG. 4 as additional control markings 126, any geometric shape or object may be used to indicate a slice thickness in accordance with an embodiment of the present invention. In addition, while markings 126 are shown as displayed in first control viewport 120 in FIG. 4, markings 126 may be displayed in any one or more viewports displayed by display routine 640.
  • In an embodiment of the present invention, the slice thickness indicated by markings 126 may be adjusted by a user of device 610. For example, a user may employ an input device connected to device 610 to adjust the slice thickness of a 2D image by moving markings 126.
  • In an embodiment of the present invention, one or more control lines in one or more viewports may include additional markings or controls to aid in the movement of a control line. FIG. 6 illustrates first and second control viewports 120, 130 with additional rotational control markings 128, 138 in accordance with an embodiment of the present invention. In accordance with an embodiment of the present invention, at least one technical effect of display routine 640 is to display rotational control markings 128, 138 on one or more ends of control lines 122, 132 as shown in FIG. 6. Rotational control markings 128, 138 can be used to aid a user in rotating a control line. For example, a user may employ an input device, such as a mouse or stylus, to click or select one of control markings 128, 138. Once a control marking 128, 138 is selected, the user may move the input device and cause the corresponding control line to rotate. While circles are used in FIG. 6 as rotational control markings 128, 138, any geometric shape or object may be used in accordance with an embodiment of the present invention. In addition, while markings 128, 138 are shown as displayed in first and second control viewports 120, 130 along with control lines 122, 132 in FIG. 6, markings 128, 138 may be displayed in any one or more viewports displayed by display routine 640 along with any control line.
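The effect of dragging a rotational control marking 128, 138 can be sketched as rotating the control line's endpoints about its center point. This is a minimal 2D sketch under the assumption that the endpoints are viewport coordinates; a real system would also recompute and redisplay the linked cross-section as the line rotates:

```python
import math

def rotate_about_center(p0, p1, theta):
    """Rotate a control line's endpoints by theta radians about its center,
    as when a user drags a rotational control marking at a line's end."""
    cx = (p0[0] + p1[0]) / 2.0
    cy = (p0[1] + p1[1]) / 2.0
    c, s = math.cos(theta), math.sin(theta)

    def rot(p):
        dx, dy = p[0] - cx, p[1] - cy
        # Standard 2D rotation about the center point (cx, cy)
        return (cx + c * dx - s * dy, cy + s * dx + c * dy)

    return rot(p0), rot(p1)
```

Because the rotation is about the line's own center, the center point (cf. markers 116, 146) stays fixed while both endpoints sweep around it.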
  • FIG. 2 illustrates a flowchart for a method 200 to define the 3D oblique cross-section of anatomy at a specific angle and be able to easily modify multiple angles of display simultaneously in accordance with an embodiment of the present invention. Method 200 may be used in conjunction with screenshot 100 of FIG. 1 in an embodiment of the present invention.
  • First, at step 210, one or more images are selected to be viewed on a display of a workstation or computer. Next, at step 220, a first control viewport 120 is selected or defined. First control viewport 120 includes a cross-sectional view of a 3D volume, such as a 2D cross-sectional image or stack of images. A user may select the exact angle and/or plane at which the image is viewed in first control viewport 120. In another embodiment, the exact angle and/or plane at which the image is viewed in viewport 120 may be preset.
  • Next, at step 230, a second control viewport 130 is selected or defined. Second control viewport 130 includes a second cross sectional view of the 3D image. A user may select the exact angle and/or plane at which the image is viewed in second control viewport 130. In another embodiment, the exact angle and/or plane at which the image is viewed in second control viewport 130 may be preset.
  • Next, at step 240, a determination is made as to whether additional control viewports are selected or defined. If additional control viewports are to be selected or defined, then method 200 proceeds to step 265 and an additional control viewport is selected or defined. The additional control viewport can include, for example, third control viewport 140. Third control viewport 140 includes a third cross sectional view of the 3D volume. A user may select the exact angle and/or plane at which the image is viewed in third control viewport 140. In another embodiment, the exact angle and/or plane at which the image is viewed in third control viewport 140 may be preset.
  • As described above, each control viewport 120, 130, 140 may display a cross sectional view of a 3D volume. For example, control viewport 120 may display a coronal (anterior) view of a 3D volume of a patient's spine. In addition, control viewport 130 may display an oblique cross sectional view of the same 3D volume and control viewport 140 may display a sagittal (left) cross sectional view of the same 3D volume, for example. Another viewport (e.g., final viewport 110) in screen shot 100 of FIG. 1 may display a final view of the 3D volume. Final viewport 110 may include another cross sectional image (such as another oblique cross sectional image of the 3D volume) defined by the movement of control lines 122, 132 in control viewports 120, 130.
  • After the additional, or Nth, control viewport is defined or selected at step 265, method 200 proceeds to step 240 where another determination is made as to whether an additional control viewport is to be selected or defined. In this way, method 200 proceeds in a loop between steps 240 and 265 until all Nth control viewports are defined or selected. Once the total number of control viewports has been defined or selected, method 200 proceeds from step 240 to step 260. A user may select the total number of control viewports. In another embodiment, the total number of control viewports may be preset.
  • At step 260, a control line 122, 132 is defined or selected in one or more of the control viewports. For example, control line 122 may be selected or defined in control viewport 120 and/or control line 132 may be selected or defined in control viewport 130 of FIG. 1. A control line 122, 132 can represent a cross-sectional plane in which a view of the 3D volume is presented in a subsequent control viewport and/or in final viewport 110. For example, control line 122 in first control viewport 120 may represent the 2D image plane or cross-section of a 3D volume displayed in second control viewport 130; control line 132 in second control viewport 130 may represent the 2D image plane or cross-section of the 3D volume displayed in final viewport 110.
  • Next, at step 270, a determination is made as to whether a control line 122 in a first control viewport 120 is moving or has been moved. If control line 122 is moving or has been moved (for example, by a user selecting the line 122 with an input device such as a mouse or stylus and moving line 122 in viewport 120), method 200 proceeds to step 275.
  • At step 275, the control line and 2D view of the 3D volume displayed in subsequent control viewports and the final viewport are moved based on the movement of the control line in the first viewport. For example, as control line 122 in first control viewport 120 is moved, the cross-sectional plane defined by control line 122 changes. Therefore, as control line 122 moves, the 2D image or stack of images displayed in second control viewport 130 makes a corresponding change. In addition, the 2D images displayed in subsequent viewports (for example, second and third control viewports 130, 140 and final viewport 110) and/or control lines 132, 142 in subsequent viewports 130, 140 may also make a corresponding change. Such a simultaneous change allows for the effects of moving a control line in one control viewport to be viewed in one or more subsequent viewports. After step 275, method 200 proceeds to step 280.
  • If a determination is made that control line 122 in first control viewport 120 is not or was not moved, then method 200 proceeds to step 280.
  • At step 280, a determination is made as to whether control line 132 in second control viewport 130 is moving or has been moved. If control line 132 is moving or has been moved (for example, by a user selecting the line 132 with an input device such as a mouse or stylus and moving line 132 in viewport 130), method 200 proceeds to step 285.
  • At step 285, the 2D view of the 3D volume displayed in subsequent control viewports and/or the final viewport is moved based on the movement of the control line in the second viewport. For example, as control line 132 in second control viewport 130 is moved, the cross-sectional plane defined by control line 132 changes. Therefore, as control line 132 in second control viewport 130 moves, the 2D image displayed in subsequent control viewports (e.g., third control viewport 140) and/or in final viewport 110 makes a corresponding change. In addition, control lines in subsequent viewports (e.g., control line 142 in third control viewport 140) may also make corresponding changes. In other words, any movement of control line 132 in second control viewport 130 does not affect the image or control line 122 displayed in the previous first control viewport 120. Any control line 132 movement only affects the display of images in subsequent control viewports and in final viewport 110. After step 285, method 200 proceeds to step 290.
  • If a determination is made at step 280 that control line 132 in second control viewport 130 is not or was not moved, then method 200 proceeds to step 290.
  • At step 290, a determination is made as to whether a control line in the next control viewport is moving or has been moved. If a control line in the next control viewport is moving or has been moved (for example, by a user selecting the line 142 in third control viewport 140 with an input device such as a mouse or stylus and moving line 142 in viewport 140), method 200 proceeds to step 295.
  • At step 295, the 2D image or stack of images of the 3D volume displayed in subsequent control viewports and/or the final viewport is moved based on the movement of the control line in the previous control viewport. In addition, control lines in subsequent control viewports are also moved based on the movement of the control line in the previous control viewport. For example, as control line 142 in third control viewport 140 is moved, the cross-section defined by control line 142 changes. Therefore, as the cross-section or 2D image plane changes in third control viewport 140 (again, as defined by control line 142), the 2D image displayed in subsequent control viewports and in final viewport 110 makes a corresponding change. In addition, the control lines displayed in subsequent control viewports also make a corresponding change. In other words, any movement of control line 142 in third control viewport 140 does not affect the image or control lines 122, 132 displayed in the previous first and second control viewports 120, 130. Any control line 142 movement only affects the display of images and control lines in subsequent control viewports and in final viewport 110.
  • After step 290 and/or step 295, method 200 proceeds to step 297 where a determination is made as to whether a user is able to see the desired anatomy correctly based on the previous movement of one or more control lines. If the desired anatomy is able to be correctly viewed according to a user's wishes, method 200 proceeds to step 299 where the user views the anatomy. If the desired anatomy is not able to be correctly viewed, method 200 proceeds back to step 270, where one or more of the control lines may be moved in order to properly position the anatomy for the user.
  • Steps 270 through 297 continue in a loop until the desired anatomy is positioned in a manner preferential to a user (that is, until the desired anatomy is able to be viewed).
  • In an embodiment, method 200 may proceed from any one or more of steps 280, 285, 290 and 295 to steps 270, 280, or 290. In addition, method 200 may proceed from any one or more of steps 290 and 295 to step 280. In other words, method 200 may proceed by determining whether a control line has been moved or is moved in a subsequent control viewport (e.g., at step 280 or 290, and then changing the view in all viewports subsequent to that control viewport at step 285 or 295, respectively) and then determining whether a control line in a previous viewport control has been moved or is being moved (e.g., by proceeding back to step 270 or 280). For example, a control line in an (N−1) control viewport may be moved, thereby causing the views in all subsequent control viewports (e.g., the Nth viewport and the final viewport) to change, as described above. Next, a control line in a previous control viewport (e.g., an (N−2), (N−3), (N−4) and so on control viewport) may be moved, thereby causing the views in all subsequent viewports to change, as described above.
  • FIG. 3 illustrates a flowchart for a method 400 to define the 3D oblique cross-section of anatomy at a specific angle and be able to easily modify multiple angles of display simultaneously in accordance with another embodiment of the present invention. Method 400 may be used in conjunction with screenshot 100 of FIG. 1 in an embodiment of the present invention.
  • First, at step 410, one or more images are selected to be viewed on a display of a workstation or computer, as described above. Next, at step 420, a first control viewport 120 is selected or defined, as described above. First control viewport 120 includes a cross-sectional view of a 3D volume. A user may select the exact angle and/or plane at which the image is viewed in first control viewport 120. In another embodiment, the exact angle and/or cross-sectional plane at which the image or stack of images is viewed in first control viewport 120 may be preset.
  • At step 430, a control line 122 is defined or selected in first control viewport 120, as described above. For example, control line 122 may be selected or defined in control viewport 120 and/or control line 132 may be selected or defined in control viewport 130 of FIG. 1.
  • Next, at step 440, a determination is made as to whether a control line 122 in first control viewport 120 is moving or has been moved, as described above. If control line 122 is moving or has been moved (for example, by a user selecting the line 122 with an input device such as a mouse or stylus and moving line 122 in viewport 120), method 400 proceeds to step 445.
  • At step 445, the control line and cross-sectional image or stack of images of the 3D volume displayed in subsequent control viewports and the final viewport are moved based on the movement of the control line in the first viewport, as described above. For example, as control line 122 in first control viewport 120 is moved, the cross-sectional plane defined by control line 122 changes. Therefore, as the cross-sectional plane changes in first control viewport 120 (again, as defined by control line 122), the cross-sectional image or stack of images displayed in subsequent control viewports (e.g., second and third control viewports 130, 140) and in final viewport 110 makes a corresponding change. In addition, any control lines 132, 142 in subsequent viewports 130, 140 may also make a corresponding change. After step 445, method 400 proceeds to step 450.
  • If a determination is made at step 440 that control line 122 in first control viewport 120 is not or was not moved, then method 400 proceeds to step 450.
  • At step 450, a second control viewport 130 is selected or defined, as described above. Second control viewport 130 includes a second cross sectional image or stack of images of the 3D volume. A user may select the exact angle and/or cross-sectional plane at which the image or stack of images is viewed in second control viewport 130. In another embodiment, the exact angle and/or cross-sectional plane at which the image is viewed in second control viewport 130 may be preset.
  • At step 460, a determination is made as to whether control line 132 in second control viewport 130 is moving or has been moved, as described above. If control line 132 is moving or has been moved (for example, by a user selecting the line 132 with an input device such as a mouse or stylus and moving line 132 in viewport 130), method 400 proceeds to step 465.
  • At step 465, the cross-sectional image or stack of images of the 3D volume displayed in subsequent control viewports and the final viewport is moved based on the movement of the control line in the second viewport, as described above. For example, as control line 132 in second control viewport 130 is moved, the angle of display or angle of cross-section of the cross-sectional image or stack of images defined by control line 132 changes. Therefore, as the cross-sectional plane changes in second control viewport 130 (again, as defined by control line 132), the 2D image or stack of images displayed in subsequent control viewports (e.g., third control viewport 140) and in final viewport 110 makes a corresponding change. In addition, control lines in subsequent viewports (e.g., control line 142 in third control viewport 140) may also make corresponding changes. In other words, any movement of control line 132 in second control viewport 130 does not affect the image or control line 122 displayed in the previous first control viewport 120. Any control line 132 movement only affects the display of images in subsequent control viewports 140 and in final viewport 110. After step 465, method 400 proceeds to step 470.
  • If a determination is made at step 460 that control line 132 in second control viewport 130 is not or was not moved, then method 400 proceeds to step 470.
  • Next, at step 470, a determination is made as to whether additional control viewports are selected or defined, as described above. If additional control viewports are to be selected or defined, then method 400 proceeds to step 480 and an additional control viewport is selected or defined, as described above. The additional control viewport can include, for example, third control viewport 140. Third control viewport 140 includes a third cross sectional image or stack of images of the 3D volume. A user may select the exact angle and/or cross-sectional plane at which the image or stack of images is viewed in third control viewport 140. In another embodiment, the exact angle and/or cross-sectional plane at which the image or stack of images is viewed in third control viewport 140 may be preset.
  • After the additional, or Nth, control viewport is defined or selected at step 480, method 400 proceeds to step 485 where a control line is defined or selected in the additional control viewport defined or selected at step 480, as described above. From step 485, method 400 proceeds back to step 470 where another determination is made as to whether an additional control viewport is to be selected or defined, as described above. In this way, method 400 proceeds in a loop among steps 470, 480 and 485 until all Nth control viewports are defined or selected. Once the total number of control viewports has been defined or selected, method 400 proceeds from step 470 to step 490. A user may select the total number of control viewports. In another embodiment, the total number of control viewports may be preset.
  • At step 490, a determination is made as to whether a control line in any control viewport is moving or has been moved, as described above. If a control line in the next control viewport is moving or has been moved (for example, by a user selecting the line 142 in third control viewport 140 with an input device such as a mouse or stylus and moving line 142 in viewport 140), method 400 proceeds to step 495.
  • At step 495, the cross-sectional image or stack of images of the 3D volume displayed in subsequent control viewports and the final viewport is moved based on the movement of the control line in the previous control viewport, as described above. In addition, control lines in subsequent control viewports may also be moved based on the movement of the control line in the previous control viewport. For example, as control line 142 in third control viewport 140 is moved, the cross-sectional plane defined by control line 142 changes. Therefore, as the cross-sectional plane changes in third control viewport 140 (again, as defined by control line 142), the 2D image or stack of images displayed in subsequent control viewports and in final viewport 110 makes a corresponding change. In addition, the control lines displayed in subsequent control viewports also make a corresponding change. In other words, any movement of control line 142 in third control viewport 140 does not affect the image or control lines 122, 132 displayed in the previous first and second control viewports 120, 130. Any control line 142 movement only affects the display of images and control lines in subsequent control viewports and in final viewport 110.
  • If it is determined at step 490 that no control lines in any of the control viewports have been moved, method 400 proceeds to step 497 where a user views the anatomy.
  • Steps 490 and 495 continue in a loop until all control viewports defined or selected in steps 470 and 480 have been examined to determine whether a control line in each viewport has been moved. For example, a user may first move a control line in a first control viewport, followed by moving a control line in a second control viewport, followed by an additional movement of the control line in the first control viewport.
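The loop of steps 490 through 497 (check whether any control line moved, re-render the downstream views if so, otherwise let the user view the anatomy) might be sketched like this. The `run` function and the simulated event list are hypothetical; a real implementation would be driven by mouse or stylus input.

```python
# Minimal sketch of the step 490/495/497 loop. Each event is a
# (viewport_index, new_angle) pair representing one control-line
# move; the log records which downstream viewports each move forced
# to re-render. Once the event list is exhausted, control falls
# through to step 497 (the user views the anatomy).

def run(events, n_viewports):
    """events in user order, e.g. move a line in the first viewport,
    then the second, then the first again (as in the example above)."""
    log = []
    for k, _angle in events:
        # Step 495: only viewports after k are affected by this move.
        affected = list(range(k + 1, n_viewports))
        log.append((k, affected))
    return log

log = run([(0, 5.0), (1, 12.0), (0, 7.0)], 4)
```

Each pass through the loop re-renders only the viewports downstream of the moved line, so earlier viewports keep their images across repeated adjustments.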
  • This invention minimizes the number of iterations required to visualize specific anatomy by displaying the multiple steps at the same time.
  • Current solutions to this problem require a multiple-step approach that often calls for multiple iterations: the user adjusts the first angle, then goes to a second page to adjust the second angle, and so on. If, while adjusting the second angle, the user decides to further adjust the first angle, the user must go back to the original image and can no longer see the second oblique angle. This invention allows the user to see the original image, the first oblique and the second oblique (and even a third oblique if desired), and to adjust each while viewing the others.
  • While particular elements, embodiments and applications of the present invention have been shown and described, it is understood that the invention is not limited thereto since modifications may be made by those skilled in the art, particularly in light of the foregoing teaching. It is therefore contemplated by the appended claims to cover such modifications and incorporate those features that come within the spirit and scope of the invention.

Claims (21)

1. A method to modify one or more angles of cross-section in one or more cross-sectional images of a three-dimensional (“3D”) volume simultaneously, said method including:
providing a plurality of viewports each configured to display a two-dimensional (“2D”) image, each said 2D image representative of a cross-section of said 3D volume;
moving a control line in a selected viewport of said plurality of viewports; and
simultaneous with said moving step, altering an angle of said cross-section of at least one of said 2D images in at least one viewport other than said selected viewport.
2. The method of claim 1, wherein said control line is configured to represent a cross-sectional plane of said 3D volume in at least one of said 2D images other than said 2D image displayed in said selected viewport.
3. The method of claim 1, wherein said providing step includes providing each of said plurality of viewports in a sequence and said altering step includes altering an angle of said cross-section in each of said 2D images displayed in said viewports subsequent to said selected viewport.
4. The method of claim 3, further including:
moving a second control line in a second selected viewport, said second selected viewport being prior to said selected viewport in said sequence of viewports; and
simultaneous with moving said second control line, altering an angle of said cross-section in each of said 2D images displayed in said viewports subsequent to said second selected viewport.
5. The method of claim 1, wherein said 2D images displayed in said viewports are capable of being viewed simultaneously.
6. The method of claim 1, further including displaying a control line in each of said viewports, each of said control lines configured to represent said cross-section of said 3D volume displayed in at least one other viewport.
7. The method of claim 6, wherein said altering step includes altering a position of at least one of said control lines in at least one viewport other than said selected viewport simultaneously with said moving step.
8. The method of claim 1, wherein said control line includes an indication of a slice thickness.
9. A computer-readable storage medium including a set of instructions for a computer, said instructions including:
a display routine configured to display a two-dimensional (“2D”) image representative of a cross-section of a three-dimensional (“3D”) volume in each of a plurality of viewports; and
an angle of cross-section modifying routine configured to alter an angle of said cross-section of at least one of said 2D images based on and simultaneous with a movement of a control line in a selected viewport.
10. The set of instructions of claim 9, wherein said control line is configured to represent a cross-sectional plane of said 3D volume in at least one of said 2D images other than said 2D image displayed in said selected viewport.
11. The set of instructions of claim 9, wherein each of said plurality of viewports is arranged in a sequence and said modifying routine is configured to alter an angle of said cross-section in each of said 2D images displayed in each of said viewports subsequent to said selected viewport.
12. The set of instructions of claim 11, wherein said modifying routine is configured to alter an angle of said cross-section of each of said 2D images displayed in said viewports subsequent to a second selected viewport when a second control line in said second selected viewport is moved, said second selected viewport prior to said selected viewport in said sequence of viewports.
13. The set of instructions of claim 12, wherein said second control line is moved after said control line in said selected viewport is moved.
14. The set of instructions of claim 9, wherein said display routine is configured to display said 2D images simultaneously.
15. The set of instructions of claim 9, wherein said display routine is configured to display a control line in each of said viewports, each of said control lines configured to represent a cross-sectional plane of a 2D image displayed in at least one other viewport.
16. The set of instructions of claim 15, wherein said modifying routine is configured to alter a position of at least one of said control lines in at least one viewport other than said selected viewport simultaneously with said movement.
17. The set of instructions of claim 9, wherein said control line includes an indication of a slice thickness.
18. A method for adjusting an angle of cross-section in at least one of a plurality of two-dimensional (“2D”) images, said method including:
providing a plurality of viewports including a first viewport configured to display a first 2D image, a second viewport configured to display a second 2D image and a final viewport configured to display a final 2D image, said first, second and final 2D images each representative of one or more cross-sections of a three-dimensional (“3D”) volume;
displaying a plurality of control lines including a first control line displayed in said first viewport and a second control line displayed in said second viewport, said first control line configured to represent a cross-sectional plane of said second 2D image at a first angle of cross-section and said second control line configured to represent a cross-sectional plane of said final 2D image at a second angle of cross-section;
moving said first control line; and
simultaneous with said moving step, adjusting said first angle of cross-section of said second 2D image and said second angle of cross-section of said final 2D image.
19. The method of claim 18, further including:
moving said second control line; and
simultaneous with said step of moving said second control line, adjusting said second angle of cross-section of said final 2D image.
20. The method of claim 19, wherein said step of moving said second control line occurs prior to said step of moving said first control line.
21. The method of claim 18, wherein each of said first, second and final viewports and said first, second and final 2D images are capable of being viewed simultaneously during said moving step.
US11/202,777 2005-06-23 2005-08-12 Method to define the 3D oblique cross-section of anatomy at a specific angle and be able to easily modify multiple angles of display simultaneously Expired - Fee Related US7496222B2 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
US11/202,777 US7496222B2 (en) 2005-06-23 2005-08-12 Method to define the 3D oblique cross-section of anatomy at a specific angle and be able to easily modify multiple angles of display simultaneously
JP2006170843A JP5113351B2 (en) 2005-06-23 2006-06-21 Method for defining a 3D oblique section of an anatomical structure at a specific angle and allowing a number of display angles to be easily modified simultaneously
CN2006101064701A CN1891175B (en) 2005-06-23 2006-06-23 Method to define the 3d oblique cross-section of anatomy and be able to easily modify multiple angles of display simultaneously

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US69532705P 2005-06-23 2005-06-23
US11/202,777 US7496222B2 (en) 2005-06-23 2005-08-12 Method to define the 3D oblique cross-section of anatomy at a specific angle and be able to easily modify multiple angles of display simultaneously

Publications (2)

Publication Number Publication Date
US20060291717A1 true US20060291717A1 (en) 2006-12-28
US7496222B2 US7496222B2 (en) 2009-02-24

Family

ID=37567408

Family Applications (1)

Application Number Title Priority Date Filing Date
US11/202,777 Expired - Fee Related US7496222B2 (en) 2005-06-23 2005-08-12 Method to define the 3D oblique cross-section of anatomy at a specific angle and be able to easily modify multiple angles of display simultaneously

Country Status (3)

Country Link
US (1) US7496222B2 (en)
JP (1) JP5113351B2 (en)
CN (1) CN1891175B (en)

Families Citing this family (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8199168B2 (en) * 2005-11-15 2012-06-12 General Electric Company System and method for 3D graphical prescription of a medical imaging volume
US8290225B2 (en) * 2005-12-14 2012-10-16 Koninklijke Philips Electronics N.V. Method and device for relating medical 3D data image viewing planes to each other
US8126108B2 (en) * 2007-04-18 2012-02-28 Agency For Science, Technology And Research Method and apparatus for reorientated resconstruction of computed tomography images of planar objects
US8934604B2 (en) * 2007-09-28 2015-01-13 Kabushiki Kaisha Toshiba Image display apparatus and X-ray diagnostic apparatus
US20090153548A1 (en) * 2007-11-12 2009-06-18 Stein Inge Rabben Method and system for slice alignment in diagnostic imaging systems
US8977016B2 (en) * 2009-11-10 2015-03-10 General Electric Company Method and system for checking the diagnostic quality of a medical system
US8798227B2 (en) * 2010-10-15 2014-08-05 Kabushiki Kaisha Toshiba Medical image processing apparatus and X-ray computed tomography apparatus
US8641210B2 (en) 2011-11-30 2014-02-04 Izi Medical Products Retro-reflective marker including colored mounting portion
US20140358004A1 (en) * 2012-02-13 2014-12-04 Koninklijke Philips N.V. Simultaneous ultrasonic viewing of 3d volume from multiple directions
US8661573B2 (en) 2012-02-29 2014-03-04 Izi Medical Products Protective cover for medical device having adhesive mechanism
US20130328874A1 (en) * 2012-06-06 2013-12-12 Siemens Medical Solutions Usa, Inc. Clip Surface for Volume Rendering in Three-Dimensional Medical Imaging
US9567432B2 (en) 2012-09-17 2017-02-14 The Board Of Trustees Of The Leland Stanford Junior University Lignin poly(lactic acid) copolymers
GB2542114B (en) * 2015-09-03 2018-06-27 Heartfelt Tech Limited Method and apparatus for determining volumetric data of a predetermined anatomical feature
CN111248941A (en) * 2018-11-30 2020-06-09 深圳迈瑞生物医疗电子股份有限公司 Ultrasonic image display method, system and equipment
JP7346066B2 (en) * 2019-04-19 2023-09-19 キヤノンメディカルシステムズ株式会社 Medical information processing device and medical information processing method
CN110989901B (en) * 2019-11-29 2022-01-18 北京市商汤科技开发有限公司 Interactive display method and device for image positioning, electronic equipment and storage medium

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5371778A (en) * 1991-11-29 1994-12-06 Picker International, Inc. Concurrent display and adjustment of 3D projection, coronal slice, sagittal slice, and transverse slice images
US5734384A (en) * 1991-11-29 1998-03-31 Picker International, Inc. Cross-referenced sectioning and reprojection of diagnostic image volumes
US5986662A (en) * 1996-10-16 1999-11-16 Vital Images, Inc. Advanced diagnostic viewer employing automated protocol selection for volume-rendered imaging
US6603868B1 (en) * 1998-11-24 2003-08-05 Siemens Aktiengesellschaft Method and apparatus for processing and playback of images at a display monitor
US6801643B2 (en) * 1995-06-01 2004-10-05 Medical Media Systems Anatomical visualization system
US7061484B2 (en) * 2002-11-27 2006-06-13 Voxar Limited User-interface and method for curved multi-planar reformatting of three-dimensional volume data sets
US7315304B2 (en) * 2004-04-15 2008-01-01 Edda Technology, Inc. Multiple volume exploration system and method

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2991088B2 (en) * 1995-06-30 1999-12-20 株式会社島津製作所 Medical image display device
JP2003325514A (en) * 2002-05-16 2003-11-18 Aloka Co Ltd Ultrasonic diagnostic apparatus

Cited By (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070097086A1 (en) * 2005-10-31 2007-05-03 Battles Amy E Viewing device having a touch pad
US8552988B2 (en) * 2005-10-31 2013-10-08 Hewlett-Packard Development Company, L.P. Viewing device having a touch pad
US20070229540A1 (en) * 2006-03-28 2007-10-04 Xanavi Informatics Corporation On-Vehicle Stereoscopic Display Device
US20080074427A1 (en) * 2006-09-26 2008-03-27 Karl Barth Method for display of medical 3d image data on a monitor
US20100146423A1 (en) * 2008-12-10 2010-06-10 Isabelle Duchene Method of operating a device for controlling home automation equipment
US9015613B2 (en) * 2008-12-10 2015-04-21 Somfy Sas Method of operating a device for controlling home automation equipment
US9098927B2 (en) * 2009-08-21 2015-08-04 Kabushiki Kaisha Toshiba Image processing apparatus for diagnostic imaging and method thereof
US20120134566A1 (en) * 2009-08-21 2012-05-31 Kabushiki Kaisha Toshiba Image processing apparatus for diagnostic imaging and method thereof
US20110222753A1 (en) * 2010-03-11 2011-09-15 Virtual Radiologic Corporation Adjusting Radiological Images
US10339710B2 (en) 2011-05-06 2019-07-02 Koninklijke Philips N.V. Medical image system and method
US20150261908A1 (en) * 2012-12-03 2015-09-17 Taiwan Semiconductor Manufacturing Company, Ltd. Method of generating a set of defect candidates for wafer
US9703919B2 (en) * 2012-12-03 2017-07-11 Taiwan Semiconductor Manufacturing Company, Ltd. System and method of filtering actual defects from defect information for a wafer
US20160135760A1 (en) * 2013-06-18 2016-05-19 Canon Kabushiki Kaisha Control device for controlling tomosynthesis imaging, imaging apparatus, imaging system, control method, and program for causing computer to execute the control method
US11020065B2 (en) * 2013-06-18 2021-06-01 Canon Kabushiki Kaisha Control device for controlling tomosynthesis imaging, imaging apparatus, imaging system, control method, and program for causing computer to execute the control method
WO2015172726A1 (en) * 2014-05-14 2015-11-19 同方威视技术股份有限公司 Image display method
US20150332498A1 (en) * 2014-05-14 2015-11-19 Nuctech Company Limited Image display methods
US9805504B2 (en) * 2014-05-14 2017-10-31 Nuctech Company Limited Image display methods

Also Published As

Publication number Publication date
CN1891175A (en) 2007-01-10
JP5113351B2 (en) 2013-01-09
US7496222B2 (en) 2009-02-24
CN1891175B (en) 2010-09-01
JP2007000627A (en) 2007-01-11

Similar Documents

Publication Publication Date Title
US7496222B2 (en) Method to define the 3D oblique cross-section of anatomy at a specific angle and be able to easily modify multiple angles of display simultaneously
US10561466B2 (en) Automated planning systems for pedicle screw placement and related methods
US7397475B2 (en) Interactive atlas extracted from volume data
EP2080170B1 (en) Combined intensity projection
JP5224451B2 (en) Projection image creation apparatus, method and program
WO2012063653A1 (en) Medical image display device and medical image display method
JP5866177B2 (en) Image processing apparatus and image processing method
EP2017789B1 (en) Projection image generation apparatus and program
US20070116334A1 (en) Method and apparatus for three-dimensional interactive tools for semi-automatic segmentation and editing of image objects
JP5976003B2 (en) Method and system for visualizing volume datasets
Hachaj et al. Visualization of perfusion abnormalities with GPU-based volume rendering
CN111430012A (en) System and method for semi-automatically segmenting 3D medical images using real-time edge-aware brushes
JP5461782B2 (en) Camera image simulator program
JP6114266B2 (en) System and method for zooming images
AU2019325414A1 (en) A virtual tool kit for radiologists
US9082217B1 (en) Avoidance-based ray tracing for volume rendering
US9035945B1 (en) Spatial derivative-based ray tracing for volume rendering
US20010017624A1 (en) Presentation device
Sveinsson et al. ARmedViewer, an augmented-reality-based fast 3D reslicer for medical image data on mobile devices: A feasibility study
US20120208160A1 (en) Method and system for teaching and testing radiation oncology skills
CN109242964A (en) The treating method and apparatus of 3 D medical model
JP3301654B2 (en) Medical image processing equipment
JP6106293B2 (en) Image processing apparatus, image processing system, and image processing method
US20130114785A1 (en) Method for the medical imaging of a body part, in particular the hand
JPH03228755A (en) Operation support device

Legal Events

Date Code Title Description
AS Assignment

Owner name: GENERAL ELECTRIC COMPANY, NEW YORK

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:MUSSACK, CHRISTOPHER JOSEPH;YAN, LITAO;JONES, CHERYL RUTH;AND OTHERS;REEL/FRAME:016895/0214

Effective date: 20050811

FEPP Fee payment procedure

Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

STCF Information on status: patent grant

Free format text: PATENTED CASE

FPAY Fee payment

Year of fee payment: 4

FPAY Fee payment

Year of fee payment: 8

FEPP Fee payment procedure

Free format text: MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

LAPS Lapse for failure to pay maintenance fees

Free format text: PATENT EXPIRED FOR FAILURE TO PAY MAINTENANCE FEES (ORIGINAL EVENT CODE: EXP.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

STCH Information on status: patent discontinuation

Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362

FP Lapsed due to failure to pay maintenance fee

Effective date: 20210224