WO2003021850A2 - On-line image processing and communication system - Google Patents

On-line image processing and communication system

Info

Publication number
WO2003021850A2
Authority
WO
WIPO (PCT)
Prior art keywords
state parameters
image data
telecommunications network
server
receiving stations
Prior art date
Application number
PCT/US2002/027895
Other languages
French (fr)
Other versions
WO2003021850A3 (en)
Inventor
Hui Hu
Yi Sun
Original Assignee
H Innovation, Inc.
Priority date
Filing date
Publication date
Application filed by H Innovation, Inc. filed Critical H Innovation, Inc.
Priority to AU2002332797A priority Critical patent/AU2002332797A1/en
Publication of WO2003021850A2 publication Critical patent/WO2003021850A2/en
Publication of WO2003021850A3 publication Critical patent/WO2003021850A3/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/14 Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G06F 3/1454 Digital output to display device; Cooperation and interconnection of the display device with other functional units involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 30/00 ICT specially adapted for the handling or processing of medical images
    • G16H 30/20 ICT specially adapted for the handling or processing of medical images for handling medical images, e.g. DICOM, HL7 or PACS
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 30/00 ICT specially adapted for the handling or processing of medical images
    • G16H 30/40 ICT specially adapted for the handling or processing of medical images for processing medical images, e.g. editing
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 40/00 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
    • G16H 40/60 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices
    • G16H 40/67 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices for remote operation

Abstract

An image data manipulation system is described in which users located remotely from an image data storage library (10) may participate in a collaborative image data rendering and evaluation session. The system includes the exchange of state parameters between the client computer (300) of the user controlling the image rendering, the session driver, and a server computer (100) which relays updated state parameters to other client computers participating in a session. The state parameters are used to update the view on each user's computer so that all of the participants' displays remain in sync with that of the session driver. The server performs extensive image rendering tasks for which the remote clients are not equipped and transmits newly processed image data to the clients as appropriate. One embodiment for educational applications utilizes pre-stored image data sets, which eliminates the need to transmit large blocks of image data over a network during a collaborative session.

Description

TITLE OF THE INVENTION
ON-LINE IMAGE PROCESSING AND COMMUNICATION SYSTEM
BACKGROUND OF THE INVENTION
[0001] The present invention generally relates to miniPACS (Picture Archiving and Communications System) or teleradiology systems, specifically to miniPACS/teleradiology systems with remote volume data processing, visualization, and multi-user conferencing capability. In our previous patent application, United States Patent Application Serial No. 09/434,088, we presented a miniPACS/teleradiology system with remote volume data rendering and visualization capability. The present invention is directed to additional features and enhancements of the architecture described therein.
[0002] Teleradiology is a means for electronically transmitting radiographic patient images and consultative text from one location to another.
Teleradiology systems have been widely used by healthcare providers to expand the geographic and/or time coverage of their service and to efficiently utilize the time of healthcare professionals with specialty and subspecialty training and skills (e.g., radiologists). The result is improved healthcare service quality, decreased delivery time, and reduced costs.
[0003] One drawback to some existing teleradiology systems, however, is the lack of the ability for radiologists to communicate interactively with their colleagues and referring physicians from disparate locations for the purpose of consultation, education, and collaborative studies. Collaboration is especially important for studies using volumetric images, where the ability to interactively manipulate the volumetric images and simultaneously view the processed images is essential for rapid and effective communication between the multiple participants involved.
[0004] There are numerous methods and systems providing multimedia, network-based conferencing capability. However, these methods and systems only support shared viewing of texts, documents, and videos. Furthermore, a radiology conferencing system presents unique obstacles. For example, the size of the data to be transmitted could be very large and the requirement on image (picture) quality could be very high. To be clinically useful, the transmission should be interactively "on-demand" in nature. There are ongoing efforts to develop radiology conferencing capabilities for the communication of two-dimensional (2D) images. However, none of these systems supports interactive communication of volumetric/three-dimensional (3D) images.
[0005] As a result, there exists a need for a miniPACS/teleradiology system with network-based conferencing capability supporting synchronized distribution and viewing of interactively processed volumetric images. Further, there exists a need for an improved method and procedure for the management of multi-center trials involving volumetric images.
SUMMARY OF THE INVENTION
[0006] The present invention provides a computer architecture for a client/server-based advanced image processing and rendering system. The present invention further provides a computer architecture to support multi-user concurrent usage of the processing server. The present invention includes a method and apparatus that combines network-based conferencing capability with remote interactive advanced image processing capability. The present invention enables users from disparate locations to interactively manipulate images and simultaneously view the processed images in an independent or synchronized fashion. The present invention further enables a user to interactively view and manipulate the images without having to download the entire volumetric data set. The present invention also includes improved methods and procedures for radiology consultation and multi-center trial management involving volumetric images using the above-mentioned technology.
[0007] The present invention may be used for radiology consultation. In one step, the acquisition of 2D or 3D/volumetric image/data sets, or the retrieval of previously acquired image/data sets, is performed. The volumetric data set could be three-dimensional in space, or two- or three-dimensional in space and one-dimensional in time, e.g., time-resolved spatial data sets. In another step, the data is moved to a server, which could be the scanner workstation itself or a separate computer connected to a network, and which has the conferencing software. In another step, client software is initiated by one or more remote users. Each user is able to remotely access and manipulate the 2D as well as volumetric/3D images with full processing capabilities, including Multiplanar Reformat (MPR), Maximum Intensity Projection (MIP), Volume Rendering, image segmentation, etc. As described in the preferred embodiment, a user may send an image processing request, such as an MPR request, to the server; the server renders the images accordingly and sends the result back. In another step, each user is able to interactively manipulate volumetric images without transferring the entire dataset, employing an "on-demand" image transmission method.
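By way of illustration only, the sketch below shows how such a client request/response exchange might look: the client asks a rendering server for a single MPR image and receives only the small 2D result rather than the full volumetric data set. The endpoint, message fields, and HTTP transport are assumptions made for this example; the description does not prescribe a particular interface.

```python
# Hypothetical sketch of the "on-demand" rendering request described in the
# summary: the client sends an image-processing request (e.g. an MPR reformat)
# and receives only the rendered 2D image, never the full volumetric data set.
import json
import urllib.request

SERVER_URL = "http://render-server.example/render"   # assumed endpoint, not from the patent

def request_mpr(dataset_id: str, plane: str, slice_position: float) -> bytes:
    """Ask the rendering server for one MPR slice and return the 2D image bytes."""
    request_body = json.dumps({
        "dataset": dataset_id,           # which volumetric study to render from
        "method": "MPR",                 # could equally be "MIP" or "VolumeRendering"
        "parameters": {
            "plane": plane,              # e.g. "axial", "coronal", "sagittal"
            "position": slice_position,  # location of the cut plane
        },
    }).encode("utf-8")

    req = urllib.request.Request(
        SERVER_URL,
        data=request_body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as response:
        return response.read()           # only the rendered 2D result crosses the network

if __name__ == "__main__":
    image_bytes = request_mpr("study-001", "coronal", 42.0)
    print(f"received {len(image_bytes)} bytes of rendered image data")
```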
BRIEF DESCRIPTION OF THE DRAWINGS
[0008] Fig. 1 depicts a block diagram of the present invention.
[0009] Fig. 2 depicts an alternative diagram of the present invention.
[0010] Fig. 3 depicts a flowchart of a method of the present invention.
[0011] Fig. 4 depicts a description of state parameters which may be used in one embodiment.
[0012] Fig. 5 depicts a flowchart of the state parameter updating method.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENT
[0013] Fig. 1 depicts the teleradiology system described in our previous patent application, United States Patent Application Serial No. 09/434,088. The teleradiology system includes a data transmitting station 100, a receiving station 300, and a network 200 connecting the transmitting station 100 and receiving station 300. The system may also include a data security system 34 which extends into the transmitting station 100, receiving station 300, and network 200. Receiving station 300 comprises a data receiver 26, a send request 22, a user interface 32, a data decompressor 28, a display system 30, a central processing system 24, and data security 34. Transmitting station 100 comprises a data transmitter 16, a receive request 20, a data compressor 14, a volume data rendering generator 12, a central processing system 18, and data security 34.
[0014] Many image visualization and processing tasks (such as volume rendering) consist of multiple interactive sub-tasks. For example, visualizing a dataset consists of at least two steps (subtasks): 1) generating a processed image to be displayed; 2) displaying the image. In a client/server-based image processing system, some subtasks are performed by the client and the others by the server. Using the above example, generating the processed image to be displayed can be performed in its entirety on the server, or partially on the server and partially on the client. Displaying the processed image is performed on the client.
[0015] Referring now to Fig. 2, a system is shown wherein, as contemplated in the present invention, several receiving stations 300a-e have access over a network 200 to a transmitting station. Each of the receiving stations 300a-e is structured similarly to the receiving station 300 shown in Fig. 1. To use the client/server terminology, the transmitting station may be considered the server and the receiving stations the clients.
[0016] Referring now to Fig. 3, a flow chart representing steps performed according to one preferred embodiment is shown. At step 401, one or more users initiate a session by logging in to the server from one of the receiving stations 300a-e. At step 402, one of the logged-in users issues a command to form a conference and identifies a list of users who may participate in the conference. The user who initiates the conference, e.g., the user at receiving station 300a shown in Fig. 2, may be designated as the conference "driver" by default. The conference driver may be, for example, a consulting radiologist. Other designated conference participants may join the conference. Alternatively, the driver may review the list of users logged in and select persons to participate in the conference. The other participants may be, for example, 3D technologists, other radiologists, referring physicians, or other healthcare personnel. The driver has the ability to accept or reject a request to join. Alternatively, the driver may designate that the conference is "open," i.e., that other users may join in without an express authorization being made by the driver.
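The conference-formation logic of steps 401 and 402, together with the driver-transfer rule discussed later in paragraph [0028], can be pictured as a small session object on the server. This is an illustrative sketch only; the class and method names are invented, and the patent does not specify such a structure.

```python
class Conference:
    """Toy model of conference formation (steps 401-402) and driver transfer ([0028])."""

    def __init__(self, initiator: str, invited: list[str], open_conference: bool = False):
        self.driver = initiator              # the initiating user is the driver by default
        self.invited = set(invited)          # users named when the conference was formed
        self.participants = {initiator}
        self.open = open_conference          # an "open" conference needs no driver approval

    def request_join(self, user: str, driver_approves: bool = False) -> bool:
        """Admit a participant if the conference is open, the user was invited,
        or the driver explicitly accepts the join request."""
        if self.open or user in self.invited or driver_approves:
            self.participants.add(user)
            return True
        return False

    def transfer_driver(self, requester: str, current_driver_approves: bool) -> bool:
        """Hand the driver privilege to another participant upon approval."""
        if requester in self.participants and current_driver_approves:
            self.driver = requester
            return True
        return False
```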
[0017] At step 403, the driver initiates a processing command from the client side. In a preferred operation, the driver, using interface 32, specifies: 1) at least one image data set to be visualized; 2) at least one data rendering method to be used; 3) the rendering parameters used by each rendering method; 4) data compression parameters; and 5) the data transmission parameters for controlling data transmission over network 200. Examples of state parameters are provided in Fig. 4. In particular, the driver may, via user interface 32, adjust rendering parameters, e.g., viewpoint, spatial region, and value range of the data to be rendered, and other settings. The techniques for setting and adjusting these parameters include 1) using preset protocols for some typical settings; 2) inputting a specific setting with a keyboard, a mouse, and/or other input devices; and/or 3) interactive navigation using a mouse, a trackball, a joystick, a keyboard, and/or other navigating devices. The driver may also, via user interface 32, edit (including process) patient data, e.g., remove the bone structures, in a manner similar to current volume data rendering/visualization systems. With the teleradiology system of the invention, the driver can, via user interface 32, define and adjust data rendering methods and parameters, control what is to be rendered, transmitted, and visualized next, and eventually obtain the final rendering result. A central processing system 24 on the driver's receiving station receives and validates the driver's request. The central processing system 24 then issues the request, which is sent via send request 22 to transmitting station 100 through network 200.
[0018] At step 404, the central processing system 18 on the transmitting station 100 receives the request via receive request 20. Coordinated by central processing system 18, volume data rendering generator 12 accesses from image data source 10 the image data set which the user has specified, and then generates the data rendering result based on the data rendering method and parameters which the user has specified. The rendering result may be a 2D image, much smaller in size than the original data set.
[0019] At step 405, the data transmitter 16 on transmitting station 100 transmits the compressed data via network 200 to the data receiver 26 on those receiving stations 300a-e which have sent a request for image data, i.e., on demand, based on the data transmission parameters which the user has specified. The on-demand feature of the present invention will be described further in connection with Fig. 5. For the teleradiology system of the invention, the preferred transmission medium (i.e., network 200) may be an intranet, the Internet (including Internet2), or a direct dial-up using a telephone line with a modem. The preferred data transmission protocol is the standard TCP/IP, although the method may be adapted to accommodate other protocols. Furthermore, for some transmission media (e.g., Internet2), user 400 can control certain aspects (e.g., the priority level, the speed) of data transmission by selecting transmission parameters via user interface 32.
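The items specified in step 403 and carried through steps 404 and 405 are examples of the "state parameters" of Fig. 4 that the description returns to below. A minimal record of this kind might look as follows; the field names are assumptions standing in for the actual contents of Fig. 4, not a definition of them.

```python
# Illustrative container for the state parameters listed in step 403.
from dataclasses import dataclass, field, asdict

@dataclass
class StateParameters:
    """Illustrative stand-in for the state parameters of Fig. 4 (field names assumed)."""
    dataset_id: str                            # 1) image data set to be visualized
    rendering_method: str = "MPR"              # 2) MPR, MIP, volume rendering, ...
    rendering_params: dict = field(default_factory=lambda: {
        "viewpoint": (0.0, 0.0, 1.0),          # 3) viewpoint, spatial region,
        "window_level": (400, 40),             #    value range / window-level settings
    })
    compression: str = "jpeg"                  # 4) data compression parameters
    transmission: dict = field(default_factory=lambda: {"priority": "normal"})  # 5) transmission parameters
    version: int = 0                           # bumped each time a new subtask is performed

def differs(local: "StateParameters", remote: "StateParameters") -> bool:
    """True when the two copies are out of sync and the local display must be updated."""
    return asdict(local) != asdict(remote)
```

In this sketch, incrementing `version` whenever a subtask completes gives clients a cheap way to notice that their local copy has gone stale.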
[0020] At step 406, the central processing systems 24 of the various receiving stations 300a-e coordinate the client-side processing. If needed, data decompressor 28 decompresses (or restores) the rendering result. The central processing system 24 may also perform further image processing and operations. The final image is computed based on the field of view and the image window/level (i.e., brightness/contrast) settings currently prescribed by the conference driver.
[0021] At step 407, the display systems 30 at receiving stations 300a-e display the computed image and other parameters. Via user interface 32, the driver may further modify parameters, including 1) the image data set to be visualized, 2) the data rendering method to be used, 3) the rendering parameters used, and 4) the data transmission parameters used. This process goes on until a satisfactory rendering and visualization result is obtained.
[0022] The set of image processing and display parameters, collectively called state parameters, keeps track of the effect of image processing, performed either at the server or at a client, and, if needed, synchronizes the display (viewing) of multiple users. Examples of state parameters are given in Fig. 4. Each time a new subtask is performed, this set of state parameters is updated at the server. Any further image processing and display task will be performed based on this set of updated state parameters.
[0023] In one embodiment, the resulting images are "pulled" to the clients from the server. When a client with the driver authorization prescribes an operation, regardless of whether this operation is performed on the client, the server, or both, the state parameters will be updated on both the server and the driving client to reflect the resultant change due to this operation. Other clients periodically compare their local copy of the state parameters with the copy on the server. If differences are found that require updating the local display, that client will issue an update request. Again, depending on the division of subtasks, some requests are fulfilled by the client only, while others require that the server send updated images or information.
[0024] Referring now to Fig. 5, the steps involved in state parameter updating will be described. State parameter updating is controlled by the client-side conferencing software running on receiving stations 300a-e. At step 501, a check is made with a system clock, or another timing source, to determine whether the amount of time that has elapsed since the last state parameter update, Δt, has reached a predetermined timing parameter, Pt, which determines the frequency with which the state parameters are updated. If Δt ≥ Pt, then step 502 is performed. If Δt < Pt, then control returns to the beginning of the routine. For example, Pt may be 0.25 to 0.5 seconds. At step 502, one of the receiving stations 300 sends a request to the transmitting station for the current state parameters associated with the current conferencing session. At step 503, the receiving station 300 compares the state parameters which have been stored locally to the state parameters received from the transmitting station in response to the request made in step 502. If the client state parameters and the server state parameters are equal, then control returns to the beginning of the routine. If the two sets of parameters are not equal, this implies that additional subtasks have been specified by the conference driver, and the routine proceeds to step 504. At step 504, the client sends a request for new image data if the parameters that have changed indicate that new image data has been generated. On the other hand, if only state parameters relating to, for example, brightness or contrast level are changed, then no new image data need be requested, because this change can be processed on the data already stored at the client. At step 505, the client state parameters are set equal to the updated server state parameters. At step 506, Δt is set equal to zero.
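The pull-style update procedure of Fig. 5 (steps 501-506) can be sketched as a periodic polling routine. The 0.25-second value for Pt comes from the description above; the client object and its methods are hypothetical stand-ins for the receiving station's request, comparison, and display machinery.

```python
# Sketch of the client-side state-parameter polling loop of Fig. 5.
import time

POLL_INTERVAL = 0.25   # P_t; the description suggests 0.25 to 0.5 seconds

def polling_loop(client):
    """Pull-style update loop of Fig. 5; `client` is a hypothetical receiving-station object."""
    last_update = time.monotonic()
    while client.session_active():
        # Step 501: if the time since the last update is still below P_t, keep waiting.
        if time.monotonic() - last_update < POLL_INTERVAL:
            time.sleep(0.01)                   # avoid busy-waiting between checks
            continue
        # Step 502: request the current state parameters for this conferencing session.
        server_state = client.fetch_server_state()
        # Step 503: identical local and server copies mean nothing needs to change.
        if server_state != client.local_state:
            # Step 504: request new image data only when the changed parameters imply
            # new server-side rendering; window/level changes reuse data already held.
            if client.requires_new_image(server_state):
                client.display(client.fetch_new_image(server_state))
            else:
                client.reapply_display_settings(server_state)
            # Step 505: adopt the server's copy of the state parameters locally.
            client.local_state = server_state
        # Step 506: reset the elapsed-time counter to zero.
        last_update = time.monotonic()
```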
[0025] What has just been described is an on-demand image transmission method. Unlike existing conferencing systems, image transmission occurs only when needed, and therefore the network utilization efficiency is greatly improved.
[0026] In an alternative embodiment, a "push" implementation is utilized. In the push implementation, state parameters are transmitted to the clients whenever they are changed. New image data is also transmitted if, as described above, the change in the state parameters required new server-side image processing.
[0027] In another alternative embodiment, all remote conference participants may already have a copy of the same data set on their local disks. This may be the case for training or educational applications in which a standard set of data is utilized. In this case, no image data transmission is required over the network. Based on the state parameters maintained on the server, the conferencing software running on each participant's computer will generate the new image using the local copy of the data and local computing resources and will synchronize the image display. This embodiment is useful when the conference participants have only a relatively narrow-bandwidth connection, such as a phone line, which is adequate for communicating the state parameters interactively but not for transmitting large data files, such as images, at a rate allowing real-time interaction. Updated state parameters in this embodiment may be transmitted to the clients in either a push implementation or a pull implementation.
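In this local-data embodiment, only the state parameters ever cross the network, and each participant re-renders its own copy of the data set. A minimal sketch follows, reusing the illustrative StateParameters record above and assuming a hypothetical local rendering callable and display object supplied by the participant's conferencing software.

```python
# Sketch of the local-data embodiment ([0027]): no image data travels over the
# network; each participant re-renders from its own copy when updated state
# parameters arrive (whether pushed by the server or pulled by the client).
def on_state_parameters_received(render, display, new_state, local_state):
    """Apply a state-parameter update on a participant that already holds the data locally.

    `render` is the participant's local rendering callable and `display` its viewer;
    both are hypothetical stand-ins for the conferencing software's own components.
    """
    if new_state == local_state:
        return local_state                     # displays are already synchronized
    image = render(                            # all rendering happens on the local copy
        method=new_state.rendering_method,
        params=new_state.rendering_params,
    )
    display.show(image)                        # bring this participant's view in sync
    return new_state                           # adopt the driver's parameters as the local copy
```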
[0028] As part of the preferred embodiment, any participant in a conference may request to become the driver. Upon approval from the current driver, the driver privilege may be switched to the requesting participant. The new driver will then have full control of the image/data set under study, i.e., the ability to define new state parameters. The new driver, e.g., a surgeon, may fine-tune the 3D model or other parameters to achieve the best view for his intended application.
[0029] The present invention may also be applied to multi-center trial studies, where constant communication of comprehensive information, including images and data, is needed between multiple participants. One example is a Magnetic Resonance Angiography (MRA) multi-center trial. In an MRA study, a 3D volumetric data set, comprised of a stack of 2D images, is acquired. This 3D volumetric data set is processed to extract the vascular structure while minimizing the interference of other, unwanted structures. In order to select the highest quality protocols and design the most effective trial, the participants need to view not only the acquisition protocol and the original 2D images, but also the processed 3D MRA images in detail.
[0030] The existing multi-center trial procedures face several challenges. First, in order to reach consensus on trial protocols, principal investigators from participating institutions may need to travel to different locations multiple times, making this process time consuming and expensive. Second, the current procedure of site selection, training, and trial monitoring requires frequent travel by the trial monitors to the various participating sites, making this process heavily dependent on the trial monitors' travel schedules and availability. Third, the current process calls for transferring all of the patient data/images to a centralized location, demanding a significant amount of pre-work to modify study headers and preserve patient privacy.
[0031] The present invention provides an optimized method for multi-center trial management using the teleradiology conferencing technology. This method is designed to optimize the workflow and management of various tasks, such as protocol selection, training/education, trial monitoring, and data management for expert reading.
[0032] The steps for future multi-center trial management using the present invention include: 1) Using the teleradiology conferencing techniques described herein to choose a trial protocol;
2) Subsequently using the training embodiment of the teleradiology conferencing techniques described herein to conduct interactive conferences hosted by the sites experienced in the selected protocols to provide training/education to other participating sites using the mechanism described in the above section;
3) Using the teleradiology conferencing techniques described herein to conduct interactive conferences between the trial monitor and individual participating sites to review images, in order to assure quality and compliance during trial process;
4) Using the teleradiology techniques described in our application, U.S. Serial No. 09/434,088, and the present invention to allow an expert reader to remotely review, and interactively process if needed, 2D/3D image sets stored at centralized or disparate locations, without physically transmitting the entire image sets; and
5) Reporting: the expert reader will report blind-read results using the integrated reporting tools provided by a system based on the present invention.
[0033] While the present invention has been described in its preferred embodiments, it is understood that the words which have been used are words of description, rather than limitation, and that changes may be made without departing from the true scope and spirit of the invention in its broader aspects. Thus, the scope of the present invention is defined by the claims that follow.

Claims

CLAIMS
What we claim is:
1. A system for remote manipulation of image data comprising: a telecommunications network; an image data storage library; an image processing server coupled to the telecommunications network and further coupled to the image data storage library; and a plurality of receiving stations coupled to the telecommunications network, each of the plurality of receiving stations having a memory for storing local copies of state parameters, wherein at least one of the receiving stations transmits state parameters through the telecommunications network to the image processing server, and wherein the image processing server receives image data from the image data storage library and processes the image data in accordance with the received state parameters, and wherein the image processing server transmits processed image data through the telecommunications network to the receiving station.
2. The system of Claim 1 wherein at least one of the receiving stations transmits a request for processed image data through the telecommunications network and wherein the image processing server transmits processed image data to the receiving station through the telecommunications network in response to the request.
3. The system of Claim 1 wherein the image processing server transmits processed image data to at least one receiving station upon the completion of the processing of image data.
4. The system of Claim 1 wherein a first one of the plurality of receiving stations includes a user interface means for altering the local copy of state parameters and a means for transmitting the local copy of the state parameters, and wherein the first one of the plurality of receiving stations transmits a copy of the local copy of the state parameters through the telecommunications network to the image processing server.
5. The system of Claim 1 wherein a first one of the plurality of receiving stations includes a user interface means for altering the local copy of state parameters and a means for transmitting the local copy of the state parameters, and wherein the first one of the plurality of receiving stations transmits a copy of the local copy of the state parameters through the telecommunications network to the image processing server and wherein at least one other of the plurality of receiving stations receives the state parameters through the telecommunications network from the image processing server and stores a local copy in the memory of the at least one other of the plurality of receiving stations.
6. The system of Claim 4 wherein the at least one other receiving station transmits a request for processed image data through the telecommunications network and wherein the image processing server transmits processed image data to the receiving station through the telecommunications network in response to the request.
7. The system of Claim 4 wherein the image processing server transmits processed image data to the at least one other receiving station upon the completion of the processing of image data.
8. The system of Claim 4 further comprising means for authorizing only the first one of the plurality of receiving stations to alter and transmit the local copy of state parameters, and means for removing the authorization from the first one of the plurality of receiving stations and granting the authorization to another one of the plurality of receiving stations.
9. The system of Claim 1 further comprising a display means coupled to a user input device, the user input device including a means for manipulating the movement of a cursor displayed on the display means and a means for causing text and graphics to be displayed on the display means, wherein the state parameters include parameters indicating the location of the cursor on the display means and the location and content of text and graphics on the display means.
10. The system of Claim 1 wherein the image data are comprised of volumetric data.
11. A system for remote manipulation of image data comprising: a telecommunications network; an image data storage library; an image processing server coupled to the telecommunications network and further coupled to the image data storage library; and a plurality of receiving stations coupled to the telecommunications network; wherein the image processing server includes a first memory for storing a server set of state parameters, a server-side machine-readable medium, and a server-side processor that executes a first program stored in the server-side machine-readable medium, the first program causing the server-side processor to perform the steps of: controlling the reception of an update set of state parameters over the telecommunications network; controlling the determination of whether the received update set of state parameters differs from the server set of state parameters in a manner which requires new processing of the image data; controlling the processing of image data according to the update set of state parameters; controlling the transmission of the update set of state parameters from the image processing server to the receiving stations; and controlling the transmission of new image data from the image processing server to the receiving stations if the update set of state parameters required processing of image data at the image processing server; and wherein the plurality of receiving stations include a second memory for storing a local set of state parameters, a client-side machine-readable medium, and a client-side processor that executes a second program stored in the client-side machine-readable medium, the second program causing the client-side processor to perform the steps of: controlling the transmission of a request for new state parameters to the image processing server through the telecommunications network; controlling the reception of state parameters from the image processing server over the telecommunications network; controlling the determination of whether the received state parameters differ from the local set of state parameters and whether the received state parameters require non-local processing of image data; and controlling the transmission of a request for updated image data from the receiving station to the image processing server if a determination is made in the determining step that the received state parameters require non-local processing of image data.
12. The system of Claim 11 wherein at least one of the plurality of receiving stations includes a user interface means for altering a set of state parameters stored in the memory and a transmission means coupled to the telecommunications network for transmitting the set of state parameters stored in the memory.
13. The system of Claim 11 wherein at least one of the receiving stations transmits a request for processed image data through the telecommunications network and wherein the image processing server transmits processed image data to the receiving station through the telecommunications network in response to the request.
14. The system of Claim 11 wherein the image processing server transmits processed image data to at least one receiving station upon the completion of the processing of image data.
15. A system for remote manipulation of image data comprising: a telecommunications network; a communications server coupled to the telecommunications network; a plurality of receiving stations, the receiving stations including a first memory for storing a local set of state parameters, a first client-side machine-readable medium containing a pre-stored set of image data, a second client-side machine-readable medium, and a client-side processor that executes a program stored in the second machine-readable medium, the program causing the processor to perform the steps of: controlling the transmission of a request for new state parameters to the communications server through the telecommunications network; controlling the reception of state parameters from the communications server over the telecommunications network; controlling the determination of whether the received state parameters differ from the local set of state parameters; and controlling the processing of the pre-stored image data based on the received state parameters; wherein the communications server includes a second memory for storing a server set of state parameters, a server-side machine-readable medium, and a server-side processor that executes a second program stored in the third machine-readable medium, the second program causing the second processor to perform the steps of: controlling the reception by the communications server of an update set of state parameters over the telecommunications network; and controlling the transmission of the update set of state parameters from the communications server to the receiving stations.
16. The system of Claim 15 wherein at least one of the plurality of receiving stations includes a user interface means for altering a set of state parameters stored in the memory and a transmission means coupled to the telecommunications network for transmitting the set of state parameters stored in the memory.
17. The system of Claim 15 wherein at least one of the receiving stations processes image data stored in the local memory according to the received set of state parameters.
PCT/US2002/027895 2001-08-31 2002-08-30 On-line image processing and communication system WO2003021850A2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU2002332797A AU2002332797A1 (en) 2001-08-31 2002-08-30 On-line image processing and communication system

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US09/945,479 2001-08-31
US09/945,479 US7039723B2 (en) 2001-08-31 2001-08-31 On-line image processing and communication system

Publications (2)

Publication Number Publication Date
WO2003021850A2 true WO2003021850A2 (en) 2003-03-13
WO2003021850A3 WO2003021850A3 (en) 2003-11-06

Family

ID=25483155

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2002/027895 WO2003021850A2 (en) 2001-08-31 2002-08-30 On-line image processing and communication system

Country Status (3)

Country Link
US (1) US7039723B2 (en)
AU (1) AU2002332797A1 (en)
WO (1) WO2003021850A2 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2007111755A2 (en) * 2005-12-29 2007-10-04 Cytyc Corporation Scalable architecture for maximizing slide throughput

Families Citing this family (104)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8352400B2 (en) 1991-12-23 2013-01-08 Hoffberg Steven M Adaptive pattern recognition based controller apparatus and method and human-factored interface therefore
DE19835215C2 (en) * 1998-08-05 2000-07-27 Mannesmann Vdo Ag Combination instrument
US7966078B2 (en) 1999-02-01 2011-06-21 Steven Hoffberg Network media appliance system and method
US6621918B1 (en) 1999-11-05 2003-09-16 H Innovation, Inc. Teleradiology systems for rendering and visualizing remotely-located volume data sets
US7830399B2 (en) * 2000-10-04 2010-11-09 Shutterfly, Inc. System and method for manipulating digital images
US6934698B2 (en) * 2000-12-20 2005-08-23 Heart Imaging Technologies Llc Medical image management system
US8166381B2 (en) * 2000-12-20 2012-04-24 Heart Imaging Technologies, Llc Medical image management system
US7418480B2 (en) * 2001-10-25 2008-08-26 Ge Medical Systems Global Technology Company, Llc Medical imaging data streaming
EP1306791A3 (en) * 2001-10-25 2006-06-07 Siemens Aktiengesellschaft System for advising physicians under licence controlled constraints
US20030086595A1 (en) * 2001-11-07 2003-05-08 Hui Hu Display parameter-dependent pre-transmission processing of image data
US7426539B2 (en) * 2003-01-09 2008-09-16 Sony Computer Entertainment America Inc. Dynamic bandwidth control
US7685262B2 (en) * 2003-01-24 2010-03-23 General Electric Company Method and system for transfer of imaging protocols and procedures
EP1597703B1 (en) * 2003-02-18 2018-07-04 Koninklijke Philips N.V. Volume visualization using tissue mix
US7827139B2 (en) * 2004-04-15 2010-11-02 Citrix Systems, Inc. Methods and apparatus for sharing graphical screen data in a bandwidth-adaptive manner
US7680885B2 (en) * 2004-04-15 2010-03-16 Citrix Systems, Inc. Methods and apparatus for synchronization of data set representations in a bandwidth-adaptive manner
US20060002315A1 (en) * 2004-04-15 2006-01-05 Citrix Systems, Inc. Selectively sharing screen data
US20060031779A1 (en) * 2004-04-15 2006-02-09 Citrix Systems, Inc. Selectively sharing screen data
US7580867B2 (en) * 2004-05-04 2009-08-25 Paul Nykamp Methods for interactively displaying product information and for collaborative product design
KR100606785B1 (en) * 2004-07-07 2006-08-01 엘지전자 주식회사 Synchronization method of video and iamge data in system for providing remote multimedia service through network
WO2006014480A2 (en) * 2004-07-08 2006-02-09 Actuality Systems, Inc. Architecture for rendering graphics on output devices over diverse connections
US20060025667A1 (en) * 2004-07-29 2006-02-02 Edward Ashton Method for tumor perfusion assessment in clinical trials using dynamic contrast enhanced MRI
US20060031187A1 (en) * 2004-08-04 2006-02-09 Advizor Solutions, Inc. Systems and methods for enterprise-wide visualization of multi-dimensional data
US7920152B2 (en) 2004-11-04 2011-04-05 Dr Systems, Inc. Systems and methods for viewing medical 3D imaging volumes
US7970625B2 (en) 2004-11-04 2011-06-28 Dr Systems, Inc. Systems and methods for retrieval of medical data
US7885440B2 (en) 2004-11-04 2011-02-08 Dr Systems, Inc. Systems and methods for interleaving series of medical images
US7660488B2 (en) 2004-11-04 2010-02-09 Dr Systems, Inc. Systems and methods for viewing medical images
US7787672B2 (en) 2004-11-04 2010-08-31 Dr Systems, Inc. Systems and methods for matching, naming, and displaying medical images
US20060101064A1 (en) 2004-11-08 2006-05-11 Sharpcast, Inc. Method and apparatus for a file sharing and synchronization system
US8296441B2 (en) 2005-01-14 2012-10-23 Citrix Systems, Inc. Methods and systems for joining a real-time session of presentation layer protocol data
US8340130B2 (en) * 2005-01-14 2012-12-25 Citrix Systems, Inc. Methods and systems for generating playback instructions for rendering of a recorded computer session
US8935316B2 (en) 2005-01-14 2015-01-13 Citrix Systems, Inc. Methods and systems for in-session playback on a local machine of remotely-stored and real time presentation layer protocol data
US20060159432A1 (en) * 2005-01-14 2006-07-20 Citrix Systems, Inc. System and methods for automatic time-warped playback in rendering a recorded computer session
US8200828B2 (en) * 2005-01-14 2012-06-12 Citrix Systems, Inc. Systems and methods for single stack shadowing
US8230096B2 (en) 2005-01-14 2012-07-24 Citrix Systems, Inc. Methods and systems for generating playback instructions for playback of a recorded computer session
US8428969B2 (en) 2005-01-19 2013-04-23 Atirix Medical Systems, Inc. System and method for tracking medical imaging quality
US8443040B2 (en) * 2005-05-26 2013-05-14 Citrix Systems Inc. Method and system for synchronizing presentation of a dynamic data set to a plurality of nodes
US20060287593A1 (en) * 2005-06-20 2006-12-21 General Electric Company System and method providing communication in a medical imaging system
US7853483B2 (en) * 2005-08-05 2010-12-14 Microsoft Corporation Medium and system for enabling content sharing among participants associated with an event
US20070033142A1 (en) * 2005-08-05 2007-02-08 Microsoft Corporation Informal trust relationship to facilitate data sharing
US8191008B2 (en) * 2005-10-03 2012-05-29 Citrix Systems, Inc. Simulating multi-monitor functionality in a single monitor environment
US7890573B2 (en) * 2005-11-18 2011-02-15 Toshiba Medical Visualization Systems Europe, Limited Server-client architecture in medical imaging
CA2636819A1 (en) * 2006-01-13 2007-07-19 Diginiche Inc. System and method for collaborative information display and markup
US20080028323A1 (en) * 2006-07-27 2008-01-31 Joshua Rosen Method for Initiating and Launching Collaboration Sessions
MX2009001575A (en) * 2006-08-10 2009-04-24 Maria Gaos System and methods for content conversion and distribution.
US8238678B2 (en) * 2006-08-30 2012-08-07 Siemens Medical Solutions Usa, Inc. Providing representative image information
JP5111811B2 (en) * 2006-09-07 2013-01-09 ジーイー・メディカル・システムズ・グローバル・テクノロジー・カンパニー・エルエルシー Image operation history management system, modality, and server device
US8054241B2 (en) * 2006-09-14 2011-11-08 Citrix Systems, Inc. Systems and methods for multiple display support in remote access software
US7791559B2 (en) * 2006-09-14 2010-09-07 Citrix Systems, Inc. System and method for multiple display support in remote access software
WO2008064258A2 (en) * 2006-11-20 2008-05-29 Mckesson Information Solutions Llc Interactive viewing, asynchronous retrieval, and annotation of medical images
US7953614B1 (en) 2006-11-22 2011-05-31 Dr Systems, Inc. Smart placement rules
US20080133736A1 (en) * 2006-11-30 2008-06-05 Ava Mobile, Inc. System, method, and computer program product for tracking digital media in collaborative environments
WO2008079219A1 (en) * 2006-12-19 2008-07-03 Bruce Reiner Pacs portal with automated data mining and software selection
US9098840B2 (en) * 2007-08-22 2015-08-04 Siemens Aktiengesellschaft System and method for providing and activating software licenses
US8392529B2 (en) 2007-08-27 2013-03-05 Pme Ip Australia Pty Ltd Fast file server methods and systems
US8654139B2 (en) * 2007-08-29 2014-02-18 Mckesson Technologies Inc. Methods and systems to transmit, view, and manipulate medical images in a general purpose viewing agent
US8520978B2 (en) * 2007-10-31 2013-08-27 Mckesson Technologies Inc. Methods, computer program products, apparatuses, and systems for facilitating viewing and manipulation of an image on a client device
US10311541B2 (en) 2007-11-23 2019-06-04 PME IP Pty Ltd Multi-user multi-GPU render server apparatus and methods
WO2011065929A1 (en) 2007-11-23 2011-06-03 Mercury Computer Systems, Inc. Multi-user multi-gpu render server apparatus and methods
US9904969B1 (en) 2007-11-23 2018-02-27 PME IP Pty Ltd Multi-user multi-GPU render server apparatus and methods
WO2009067675A1 (en) 2007-11-23 2009-05-28 Mercury Computer Systems, Inc. Client-server visualization system with hybrid data processing
US8548215B2 (en) 2007-11-23 2013-10-01 Pme Ip Australia Pty Ltd Automatic image segmentation of a volume by comparing and correlating slice histograms with an anatomic atlas of average histograms
WO2009094673A2 (en) 2008-01-27 2009-07-30 Citrix Systems, Inc. Methods and systems for remoting three dimensional graphics
US8230319B2 (en) * 2008-01-31 2012-07-24 Microsoft Corporation Web-based visualization, refresh, and consumption of data-linked diagrams
US20090216557A1 (en) * 2008-02-24 2009-08-27 Kyle Lawton Software System for Providing Access Via Pop-Up Windows to Medical Test Results and Information Relating Thereto
US9032295B1 (en) 2008-03-19 2015-05-12 Dropbox, Inc. Method for displaying files from a plurality of devices in a multi-view interface and for enabling operations to be performed on such files through such interface
US8019900B1 (en) 2008-03-25 2011-09-13 SugarSync, Inc. Opportunistic peer-to-peer synchronization in a synchronization system
US9141483B1 (en) 2008-03-27 2015-09-22 Dropbox, Inc. System and method for multi-tier synchronization
US9342320B2 (en) * 2008-05-16 2016-05-17 Mckesson Technologies Inc. Method for facilitating cooperative interaction between software applications
US8386560B2 (en) * 2008-09-08 2013-02-26 Microsoft Corporation Pipeline for network based server-side 3D image rendering
WO2010042578A1 (en) * 2008-10-08 2010-04-15 Citrix Systems, Inc. Systems and methods for real-time endpoint application flow control with network structure component
US8380533B2 (en) 2008-11-19 2013-02-19 DR Systems Inc. System and method of providing dynamic and customizable medical examination forms
US20100202510A1 (en) * 2009-02-09 2010-08-12 Kyle Albert S Compact real-time video transmission module
US8650498B1 (en) 2009-05-04 2014-02-11 SugarSync, Inc. User interface for managing and viewing synchronization settings in a synchronization system
US8701167B2 (en) * 2009-05-28 2014-04-15 Kjaya, Llc Method and system for fast access to advanced visualization of medical scans using a dedicated web portal
US10226303B2 (en) * 2009-05-29 2019-03-12 Jack Wade System and method for advanced data management with video enabled software tools for video broadcasting environments
US10517688B2 (en) * 2009-05-29 2019-12-31 Jack Wade Method for enhanced data analysis with specialized video enabled software tools for medical environments
US11744668B2 (en) * 2009-05-29 2023-09-05 Jack Wade System and method for enhanced data analysis with specialized video enabled software tools for medical environments
US10142641B2 (en) * 2016-06-01 2018-11-27 Jack Wade System and method for parallel image processing and routing
US11744441B2 (en) * 2009-05-29 2023-09-05 Jack Wade System and method for enhanced data analysis with video enabled software tools for medical environments
US11539971B2 (en) * 2009-05-29 2022-12-27 Jack Wade Method for parallel image processing and routing
US8712120B1 (en) 2009-09-28 2014-04-29 Dr Systems, Inc. Rules-based approach to transferring and/or viewing medical images
FR2957432B1 (en) * 2010-03-09 2012-04-27 Olivier Cadou Method and system for remotely controlling a display screen
US20110238618A1 (en) * 2010-03-25 2011-09-29 Michael Valdiserri Medical Collaboration System and Method
US9092551B1 (en) 2011-08-11 2015-07-28 D.R. Systems, Inc. Dynamic montage reconstruction
US8615159B2 (en) 2011-09-20 2013-12-24 Citrix Systems, Inc. Methods and systems for cataloging text in a recorded session
US20130147832A1 (en) * 2011-12-07 2013-06-13 Ati Technologies Ulc Method and apparatus for remote extension display
US9043766B2 (en) * 2011-12-16 2015-05-26 Facebook, Inc. Language translation using preprocessor macros
US9633125B1 (en) 2012-08-10 2017-04-25 Dropbox, Inc. System, method, and computer program for enabling a user to synchronize, manage, and share folders across a plurality of client devices and a synchronization server
US10057318B1 (en) 2012-08-10 2018-08-21 Dropbox, Inc. System, method, and computer program for enabling a user to access and edit via a virtual drive objects synchronized to a plurality of synchronization clients
US9495604B1 (en) 2013-01-09 2016-11-15 D.R. Systems, Inc. Intelligent management of computerized advanced processing
US8976190B1 (en) 2013-03-15 2015-03-10 Pme Ip Australia Pty Ltd Method and system for rule based display of sets of images
US10540803B2 (en) 2013-03-15 2020-01-21 PME IP Pty Ltd Method and system for rule-based display of sets of images
US11183292B2 (en) 2013-03-15 2021-11-23 PME IP Pty Ltd Method and system for rule-based anonymized display and data export
US9509802B1 (en) 2013-03-15 2016-11-29 PME IP Pty Ltd Method and system for transferring data to improve responsiveness when sending large data sets
US11244495B2 (en) 2013-03-15 2022-02-08 PME IP Pty Ltd Method and system for rule based display of sets of images using image content derived parameters
US10070839B2 (en) 2013-03-15 2018-09-11 PME IP Pty Ltd Apparatus and system for rule based visualization of digital breast tomosynthesis and other volumetric images
KR20150034061A (en) * 2013-09-25 2015-04-02 Samsung Electronics Co., Ltd. Method and apparatus for setting imaging environment by using signals received from a plurality of clients
GB201505506D0 (en) * 2015-03-31 2015-05-13 Optimed Ltd 3D scene co-ordinate capture & storage
US10929508B2 (en) 2015-04-30 2021-02-23 Merge Healthcare Solutions Inc. Database systems and interactive user interfaces for dynamic interaction with, and indications of, digital medical image data
US11599672B2 (en) 2015-07-31 2023-03-07 PME IP Pty Ltd Method and apparatus for anonymized display and data export
US9984478B2 (en) 2015-07-28 2018-05-29 PME IP Pty Ltd Apparatus and method for visualizing digital breast tomosynthesis and other volumetric images
US10909679B2 (en) 2017-09-24 2021-02-02 PME IP Pty Ltd Method and system for rule based display of sets of images using image content derived parameters
US11334596B2 (en) 2018-04-27 2022-05-17 Dropbox, Inc. Selectively identifying and recommending digital content items for synchronization
KR20220001312A (en) * 2020-06-29 2022-01-05 Samsung Electronics Co., Ltd. Method and apparatus for controlling transmission and reception of data in a wireless communication system

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6222551B1 (en) * 1999-01-13 2001-04-24 International Business Machines Corporation Methods and apparatus for providing 3D viewpoint selection in a server/client arrangement

Family Cites Families (114)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4222076A (en) 1978-09-15 1980-09-09 Bell Telephone Laboratories, Incorporated Progressive image transmission
US4475104A (en) 1983-01-17 1984-10-02 Lexidata Corporation Three-dimensional display system
US4748511A (en) 1984-06-07 1988-05-31 Raytel Systems Corporation Teleradiology system
US4910609A (en) 1984-06-07 1990-03-20 Raytel Systems Corporation Teleradiology system
US4625289A (en) 1985-01-09 1986-11-25 Evans & Sutherland Computer Corp. Computer graphics system of general surface rendering by exhaustive sampling
US4737921A (en) 1985-06-03 1988-04-12 Dynamic Digital Displays, Inc. Three dimensional medical image display system
US5005126A (en) 1987-04-09 1991-04-02 Prevail, Inc. System and method for remote presentation of diagnostic image information
US5216596A (en) 1987-04-30 1993-06-01 Corabi International Telemetrics, Inc. Telepathology diagnostic network
US5038302A (en) 1988-07-26 1991-08-06 The Research Foundation Of State University Of New York Method of converting continuous three-dimensional geometrical representations into discrete three-dimensional voxel-based representations within a three-dimensional voxel-based system
US4987554A (en) 1988-08-24 1991-01-22 The Research Foundation Of State University Of New York Method of converting continuous three-dimensional geometrical representations of polygonal objects into discrete three-dimensional voxel-based representations thereof within a three-dimensional voxel-based system
US4985856A (en) 1988-11-10 1991-01-15 The Research Foundation Of State University Of New York Method and apparatus for storing, accessing, and processing voxel-based data
US5027110A (en) 1988-12-05 1991-06-25 At&T Bell Laboratories Arrangement for simultaneously displaying on one or more display terminals a series of images
US5205289A (en) 1988-12-23 1993-04-27 Medical Instrumentation And Diagnostics Corporation Three-dimensional computer graphics simulation and computerized numerical optimization for dose delivery and treatment planning
US5101475A (en) 1989-04-17 1992-03-31 The Research Foundation Of State University Of New York Method and apparatus for generating arbitrary projections of three-dimensional voxel-based data
CA2051939A1 (en) 1990-10-02 1992-04-03 Gary A. Ransford Digital data registration and differencing compression system
EP0487110B1 (en) 1990-11-22 1999-10-06 Kabushiki Kaisha Toshiba Computer-aided diagnosis system for medical use
US5730146A (en) 1991-08-01 1998-03-24 Itil; Turan M. Transmitting, analyzing and reporting EEG data
US5291401A (en) 1991-11-15 1994-03-01 Telescan, Limited Teleradiology system
US5448686A (en) * 1992-01-02 1995-09-05 International Business Machines Corporation Multi-resolution graphic representation employing at least one simplified model for interactive visualization applications
US5442733A (en) 1992-03-20 1995-08-15 The Research Foundation Of State University Of New York Method and apparatus for generating realistic images using a discrete representation
US5441047A (en) 1992-03-25 1995-08-15 David; Daniel Ambulatory patient health monitoring techniques utilizing interactive visual communication
US5360971A (en) 1992-03-31 1994-11-01 The Research Foundation State University Of New York Apparatus and method for eye tracking interface
US5384862A (en) 1992-05-29 1995-01-24 Cimpiter Corporation Radiographic image evaluation apparatus and method
US5321520A (en) 1992-07-20 1994-06-14 Automated Medical Access Corporation Automated high definition/resolution image storage, retrieval and transmission system
US6283761B1 (en) 1992-09-08 2001-09-04 Raymond Anthony Joao Apparatus and method for processing and/or for providing healthcare information and/or healthcare-related information
US5517021A (en) 1993-01-19 1996-05-14 The Research Foundation State University Of New York Apparatus and method for eye tracking interface
IL104636A (en) 1993-02-07 1997-06-10 Oli V R Corp Ltd Apparatus and method for encoding and decoding digital signals
US5590271A (en) 1993-05-21 1996-12-31 Digital Equipment Corporation Interactive visualization environment with improved visual programming interface
EP0626635B1 (en) 1993-05-24 2003-03-05 Sun Microsystems, Inc. Improved graphical user interface with method for interfacing to remote devices
US5544283A (en) 1993-07-26 1996-08-06 The Research Foundation Of State University Of New York Method and apparatus for real-time volume rendering from an arbitrary viewing direction
US5432871A (en) 1993-08-04 1995-07-11 Universal Systems & Technology, Inc. Systems and methods for interactive image data acquisition and compression
JP2605598B2 (en) 1993-08-20 1997-04-30 NEC Corporation Fingerprint image transmission device
US5377258A (en) 1993-08-30 1994-12-27 National Medical Research Council Method and apparatus for an automated and interactive behavioral guidance system
US5408249A (en) 1993-11-24 1995-04-18 Radiation Measurements, Inc. Bit extension adapter for computer graphics
US5469353A (en) 1993-11-26 1995-11-21 Access Radiology Corp. Radiological image interpretation apparatus and method
US5660176A (en) 1993-12-29 1997-08-26 First Opinion Corporation Computerized medical diagnostic and treatment advice system
US5482043A (en) 1994-05-11 1996-01-09 Zulauf; David R. P. Method and apparatus for telefluoroscopy
US5594842A (en) 1994-09-06 1997-01-14 The Research Foundation Of State University Of New York Apparatus and method for real-time volume visualization
AU699764B2 (en) 1994-09-06 1998-12-17 Research Foundation Of The State University Of New York, The Apparatus and method for real-time volume visualization
US5838906A (en) 1994-10-17 1998-11-17 The Regents Of The University Of California Distributed hypermedia method for automatically invoking external application providing interaction and display of embedded objects within a hypermedia document
US5883976A (en) 1994-12-28 1999-03-16 Canon Kabushiki Kaisha Selectively utilizing multiple encoding methods
US5594935A (en) 1995-02-23 1997-01-14 Motorola, Inc. Interactive image display system of wide angle images comprising an accounting system
US5649173A (en) 1995-03-06 1997-07-15 Seiko Epson Corporation Hardware architecture for image generation and manipulation
US5882206A (en) 1995-03-29 1999-03-16 Gillio; Robert G. Virtual surgery system
US6807558B1 (en) * 1995-06-12 2004-10-19 Pointcast, Inc. Utilization of information “push” technology
US5797515A (en) 1995-10-18 1998-08-25 Adds, Inc. Method for controlling a drug dispensing system
US5805118A (en) 1995-12-22 1998-09-08 Research Foundation Of The State Of New York Display protocol specification with session configuration and multiple monitors
US5715823A (en) 1996-02-27 1998-02-10 Atlantis Diagnostics International, L.L.C. Ultrasonic diagnostic imaging system with universal access to diagnostic information and images
US5603323A (en) 1996-02-27 1997-02-18 Advanced Technology Laboratories, Inc. Medical ultrasonic diagnostic system with upgradeable transducer probes and other features
US5988862A (en) * 1996-04-24 1999-11-23 Cyra Technologies, Inc. Integrated system for quickly and accurately imaging and modeling three dimensional objects
US5903775A (en) 1996-06-06 1999-05-11 International Business Machines Corporation Method for the sequential transmission of compressed video information at varying data rates
US5862346A (en) * 1996-06-28 1999-01-19 Metadigm Distributed group activity data network system and corresponding method
US5917929A (en) 1996-07-23 1999-06-29 R2 Technology, Inc. User interface for computer aided diagnosis system
JP3688822B2 (en) 1996-09-03 2005-08-31 Toshiba Corporation Electronic medical record system
US5682328A (en) 1996-09-11 1997-10-28 Bbn Corporation Centralized computer event data logging system
US5971767A (en) 1996-09-16 1999-10-26 The Research Foundation Of State University Of New York System and method for performing a three-dimensional virtual examination
US6331116B1 (en) 1996-09-16 2001-12-18 The Research Foundation Of State University Of New York System and method for performing a three-dimensional virtual segmentation and examination
US6343936B1 (en) 1996-09-16 2002-02-05 The Research Foundation Of State University Of New York System and method for performing a three-dimensional virtual examination, navigation and visualization
US5986662A (en) 1996-10-16 1999-11-16 Vital Images, Inc. Advanced diagnostic viewer employing automated protocol selection for volume-rendered imaging
US5974446A (en) 1996-10-24 1999-10-26 Academy Of Applied Science Internet based distance learning system for communicating between server and clients wherein clients communicate with each other or with teacher using different communication techniques via common user interface
US5987345A (en) 1996-11-29 1999-11-16 Arch Development Corporation Method and system for displaying medical images
US6195340B1 (en) 1997-01-06 2001-02-27 Kabushiki Kaisha Toshiba Wireless network system and wireless communication apparatus of the same
JPH10215342A (en) 1997-01-31 1998-08-11 Canon Inc Image display method and device
US5836877A (en) 1997-02-24 1998-11-17 Lucid Inc System for facilitating pathological examination of a lesion in tissue
US6253228B1 (en) * 1997-03-31 2001-06-26 Apple Computer, Inc. Method and apparatus for updating and synchronizing information between a client and a server
US5941945A (en) 1997-06-18 1999-08-24 International Business Machines Corporation Interest-based collaborative framework
US6008813A (en) 1997-08-01 1999-12-28 Mitsubishi Electric Information Technology Center America, Inc. (Ita) Real-time PC based volume rendering system
US6369812B1 (en) 1997-11-26 2002-04-09 Philips Medical Systems, (Cleveland), Inc. Inter-active viewing system for generating virtual endoscopy studies of medical diagnostic data with a continuous sequence of spherical panoramic views and viewing the studies over networks
JPH11239165A (en) 1998-02-20 1999-08-31 Fuji Photo Film Co Ltd Medical network system
US6166732A (en) 1998-02-24 2000-12-26 Microsoft Corporation Distributed object oriented multi-user domain with multimedia presentations
US6088702A (en) 1998-02-25 2000-07-11 Plantz; Scott H. Group publishing system
US6654785B1 (en) * 1998-03-02 2003-11-25 Hewlett-Packard Development Company, L.P. System for providing a synchronized display of information slides on a plurality of computer workstations over a computer network
US6105055A (en) 1998-03-13 2000-08-15 Siemens Corporate Research, Inc. Method and apparatus for asynchronous multimedia collaboration
US6313841B1 (en) 1998-04-13 2001-11-06 Terarecon, Inc. Parallel volume rendering system with a resampling module for parallel and perspective projections
US6260021B1 (en) 1998-06-12 2001-07-10 Philips Electronics North America Corporation Computer-based medical image distribution system and method
US6230162B1 (en) 1998-06-20 2001-05-08 International Business Machines Corporation Progressive interleaved delivery of interactive descriptions and renderers for electronic publishing of merchandise
US6674430B1 (en) * 1998-07-16 2004-01-06 The Research Foundation Of State University Of New York Apparatus and method for real-time volume processing and universal 3D rendering
US6407743B1 (en) * 1998-10-20 2002-06-18 Microsoft Corporation System and method for morphing based on multiple weighted parameters
US6532017B1 (en) * 1998-11-12 2003-03-11 Terarecon, Inc. Volume rendering pipeline
US6297799B1 (en) 1998-11-12 2001-10-02 James Knittel Three-dimensional cursor for a real-time volume rendering system
US6266733B1 (en) 1998-11-12 2001-07-24 Terarecon, Inc Two-level mini-block storage system for volume data sets
US6369816B1 (en) 1998-11-12 2002-04-09 Terarecon, Inc. Method for modulating volume samples using gradient magnitudes and complex functions over a range of values
US6404429B1 (en) 1998-11-12 2002-06-11 Terarecon, Inc. Method for modulating volume samples with gradient magnitude vectors and step functions
US6211884B1 (en) 1998-11-12 2001-04-03 Mitsubishi Electric Research Laboratories, Inc Incrementally calculated cut-plane region for viewing a portion of a volume data set in real-time
JP2000149047A (en) 1998-11-12 2000-05-30 Mitsubishi Electric Inf Technol Center America Inc Volume rendering processor
US6512517B1 (en) 1998-11-12 2003-01-28 Terarecon, Inc. Volume rendering integrated circuit
US6411296B1 (en) 1998-11-12 2002-06-25 Terarecon, Inc. Method and apparatus for applying modulated lighting to volume data in a rendering pipeline
US6426749B1 (en) 1998-11-12 2002-07-30 Terarecon, Inc. Method and apparatus for mapping reflectance while illuminating volume data in a rendering pipeline
US6342885B1 (en) 1998-11-12 2002-01-29 Tera Recon Inc. Method and apparatus for illuminating volume data in a rendering pipeline
US6356265B1 (en) 1998-11-12 2002-03-12 Terarecon, Inc. Method and apparatus for modulating lighting with gradient magnitudes of volume data in a rendering pipeline
US6166544A (en) 1998-11-25 2000-12-26 General Electric Company MR imaging system with interactive image contrast control
US6310620B1 (en) 1998-12-22 2001-10-30 Terarecon, Inc. Method and apparatus for volume rendering with multiple depth buffers
US6381029B1 (en) 1998-12-23 2002-04-30 Etrauma, Llc Systems and methods for remote viewing of patient images
US6132269A (en) 1999-03-09 2000-10-17 Outboard Marine Corporation Cantilever jet drive package
US6615264B1 (en) * 1999-04-09 2003-09-02 Sun Microsystems, Inc. Method and apparatus for remotely administered authentication and access control
US6407737B1 (en) 1999-05-20 2002-06-18 Terarecon, Inc. Rendering a shear-warped partitioned volume data set
US6952741B1 (en) * 1999-06-30 2005-10-04 Computer Sciences Corporation System and method for synchronizing copies of data in a computer system
US6424346B1 (en) 1999-07-15 2002-07-23 Tera Recon, Inc. Method and apparatus for mapping samples in a rendering pipeline
US6476810B1 (en) 1999-07-15 2002-11-05 Terarecon, Inc. Method and apparatus for generating a histogram of a volume data set
US6421057B1 (en) 1999-07-15 2002-07-16 Terarecon, Inc. Configurable volume rendering pipeline
IL148130A0 (en) * 1999-08-16 2002-09-12 Force Corp Z System of reusable software parts and methods of use
US6618751B1 (en) * 1999-08-20 2003-09-09 International Business Machines Corporation Systems and methods for publishing data with expiration times
US6621918B1 (en) * 1999-11-05 2003-09-16 H Innovation, Inc. Teleradiology systems for rendering and visualizing remotely-located volume data sets
US7051110B2 (en) * 1999-12-20 2006-05-23 Matsushita Electric Industrial Co., Ltd. Data reception/playback method and apparatus and data transmission method and apparatus for providing playback control functions
US6847365B1 (en) * 2000-01-03 2005-01-25 Genesis Microchip Inc. Systems and methods for efficient processing of multimedia data
US6704024B2 (en) * 2000-08-07 2004-03-09 Zframe, Inc. Visual content browsing using rasterized representations
US6879996B1 (en) * 2000-09-13 2005-04-12 Edward W. Laves Method and apparatus for displaying personal digital assistant synchronization data using primary and subordinate data fields
US6760755B1 (en) 2000-09-22 2004-07-06 Ge Medical Systems Global Technology Company, Llc Imaging system with user-selectable prestored files for configuring communication with remote devices
US6614447B1 (en) * 2000-10-04 2003-09-02 Terarecon, Inc. Method and apparatus for correcting opacity values in a rendering pipeline
US6680735B1 (en) * 2000-10-04 2004-01-20 Terarecon, Inc. Method for correcting gradients of irregular spaced graphic data
US7877518B2 (en) * 2000-11-30 2011-01-25 Access Systems Americas, Inc. Method and apparatus for updating applications on a mobile device via device synchronization
JP3766608B2 (en) * 2001-05-02 2006-04-12 Terarecon, Inc. 3D image display device in network environment
JP3704492B2 (en) * 2001-09-11 2005-10-12 Terarecon, Inc. Reporting system in network environment
US20030086595A1 (en) * 2001-11-07 2003-05-08 Hui Hu Display parameter-dependent pre-transmission processing of image data

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2007111755A2 (en) * 2005-12-29 2007-10-04 Cytyc Corporation Scalable architecture for maximizing slide throughput
WO2007111755A3 (en) * 2005-12-29 2007-11-22 Cytyc Corp Scalable architecture for maximizing slide throughput
US7870284B2 (en) 2005-12-29 2011-01-11 Cytyc Corporation Scalable architecture for maximizing slide throughput

Also Published As

Publication number Publication date
US20030055896A1 (en) 2003-03-20
US7039723B2 (en) 2006-05-02
AU2002332797A1 (en) 2003-03-18
WO2003021850A3 (en) 2003-11-06

Similar Documents

Publication Publication Date Title
US7039723B2 (en) On-line image processing and communication system
CN1921495B (en) Distributed image processing for medical images
US8924864B2 (en) System and method for collaboratively communicating on images and saving those communications and images in a standard known format
US6654032B1 (en) Instant sharing of documents on a remote server
US8762856B2 (en) System and method for collaborative information display and markup
CN101427257B (en) Tracking and editing a resource in a real-time collaborative session
US6684255B1 (en) Methods and apparatus for transmission and rendering of complex 3D models over networks using mixed representations
EP0757486B1 (en) Conferencing system, terminal apparatus communication method and storage medium for storing the method
JP5269952B2 (en) Medical image communication system and medical image communication method
US6601087B1 (en) Instant document sharing
KR100304394B1 (en) Animation reuse in three dimensional virtual reality
US20140074913A1 (en) Client-side image rendering in a client-server image viewing architecture
US10638089B2 (en) System and method of collaboratively communication on images via input illustrations and have those illustrations auto erase
US20070115282A1 (en) Server-client architecture in medical imaging
KR100647164B1 (en) Local caching of images for on-line conferencing programs
JP2006101522A (en) Video conference system, video conference system for enabling participant to customize cooperation work model, and method for controlling mixing of data stream for video conference session
KR20020007945A (en) Enlarged Digital Image Providing Method and Apparatus Using Data Communication Networks
US10404763B2 (en) System and method for interactive and real-time visualization of distributed media
CN105493501A (en) Virtual video camera
EP2669830A1 (en) Preparation and display of derived series of medical images
CA2566638A1 (en) Method and system for remote and adaptive visualization of graphical image data
US20080126487A1 (en) Method and System for Remote Collaboration
Sun et al. A hybrid remote rendering method for mobile applications
US20130207990A1 (en) Imaging service for automating the display of images
Schumann et al. Applying augmented reality techniques in the field of interactive collaborative design

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A2

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BY BZ CA CH CN CO CR CU CZ DE DM DZ EC EE ES FI GB GD GE GH HR HU ID IL IN IS JP KE KG KP KR LC LK LR LS LT LU LV MA MD MG MN MW MX MZ NO NZ OM PH PL PT RU SD SE SG SI SK SL TJ TM TN TR TZ UA UG US UZ VC VN YU ZA ZM

Kind code of ref document: A2

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ OM PH PL PT RO RU SD SE SG SI SK SL TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A2

Designated state(s): GH GM KE LS MW MZ SD SL SZ UG ZM ZW AM AZ BY KG KZ RU TJ TM AT BE BG CH CY CZ DK EE ES FI FR GB GR IE IT LU MC PT SE SK TR BF BJ CF CG CI GA GN GQ GW ML MR NE SN TD TG

Kind code of ref document: A2

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR IE IT LU MC NL PT SE SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 EP: the EPO has been informed by WIPO that EP was designated in this application
122 EP: PCT application non-entry in European phase
NENP Non-entry into the national phase

Ref country code: JP

WWW WIPO information: withdrawn in national office

Country of ref document: JP