WO1997034995A1 - Analytical imaging system and process - Google Patents

Analytical imaging system and process

Info

Publication number
WO1997034995A1
Authority
WO
WIPO (PCT)
Prior art keywords
images
photon
activity
video
image
Application number
PCT/US1997/003467
Other languages
French (fr)
Inventor
Robert J. Silver
Original Assignee
Marine Biological Laboratory
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Application filed by Marine Biological Laboratory filed Critical Marine Biological Laboratory
Priority to AU23183/97A priority Critical patent/AU2318397A/en
Publication of WO1997034995A1 publication Critical patent/WO1997034995A1/en

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01N: INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00: Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/62: Systems in which the material investigated is excited whereby it emits light or causes a change in wavelength of the incident light
    • G01N21/63: Systems in which the material investigated is excited whereby it emits light or causes a change in wavelength of the incident light, optically excited
    • G01N21/64: Fluorescence; Phosphorescence
    • G01N21/645: Specially adapted constructive features of fluorimeters
    • G01N21/6456: Spatially resolved fluorescence measurements; Imaging
    • G01N21/6458: Fluorescence microscopy
    • G01N21/6428: Measuring fluorescence of fluorescent products of reactions or of fluorochrome labelled reactive substances, e.g. measuring quenching effects, using measuring "optrodes"
    • G02B: OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B21/00: Microscopes
    • G02B21/32: Micromanipulators structurally combined with microscopes
    • G01N2021/6417: Spectrofluorimetric devices

Definitions

  • The signal path for one or more of the VCRs may be equipped with noise reduction devices to reduce the noise level in the signal, and a VITS (video information test signal) and VIRS (vertical interval reference signal) video standard reference signal generator to provide a set of standard reference signals in image space not used by the detector faceplate of the camera.
  • VITS video information test signal
  • VIRS vertical interval reference signal
  • Such reference signals insure accurate reference of recorded signals at each step of transmission to downstream video image reception points.
  • The visual and emissive data from VCRs 72, 74 is provided to first server 16 for preprocessing to permit temporal or spatial noise to be filtered, and to permit the extraction of features of interest and importance to the understanding of the activity being observed.
  • Server 16 has two analog to digital (A/D) conversion units 104, 106 for receiving and converting the received analog data into digital data suitable for initial processing and/or for transmission to other servers.
  • A/D analog to digital
  • First server 16 is preferably a reduced instruction set computer (RISC), such as an IBM RS/6000, or a high performance personal computer with high computational power.
  • Server 16 is equipped for video-rate input, throughput, and output at a rate equal to or in excess of 50 megabytes per second, and is optimized for real time, interrupt-driven I/O activity.
  • Server 16 sends digitized images via high speed fiber optic link 108 to second server 22.
  • The microchannel ports and bus speed allow a total of four video input sensors, as well as one channel to control other features, such as focal plane motion and shutters.
  • Server 16 controls any microstepper motors 112, shutters, filter wheels, a motion controller 116, and other devices associated with primary imaging activities via controller 100.
  • Actuation of microstepper motor 112 alters the microscope (e.g., focus, focal plane axis, orientation, thickness, X-axis, Y-axis, Z-axis, and rotational stage movements) and the optical configuration (e.g., DIC, POL, lens or filter or other modulator position, orientation and/or actions, or stereoscopic).
  • The user preferably controls these functions from one of workstations 140, 142 (Fig. 2) by selecting icons with a pointing device, such as a mouse or a trackball. Because the activity and the analysis thereof can be observed in real-time, a user can make changes from his/her workstation during the activity.
  • Server 16 can be programmed to initiate motion control of the microscope in response to particular sequences or patterns of activity.
  • One such computer-initiated response changes the microscope's focal plane in discrete time-dependent steps, cycling from top to bottom, then returning to the top surface of the specimen.
  • Such images provide information for tomographic reconstructions in low to moderate resolution in support of on-line experiments, and at high resolution in post-acquisitional analyses.
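As an illustration of such a computer-initiated focal sweep, the sketch below (in Python, with hypothetical `set_focus_um` and `grab_frame` callables standing in for the motion-control and capture interfaces, which the patent does not name) steps the focal plane from the top of the specimen to the bottom and back in discrete, time-stamped increments so the resulting frames can be assembled into a tomographic stack.

```python
import time

def focal_sweep(set_focus_um, grab_frame, top_um, bottom_um, step_um, dwell_s=0.033):
    """Cycle the focal plane top -> bottom -> top in discrete steps,
    returning (timestamp, focus_position, frame) triples for a z-stack."""
    stack = []
    down = [top_um - i * step_um for i in range(int((top_um - bottom_um) / step_um) + 1)]
    for z in down + list(reversed(down)):
        set_focus_um(z)                  # command the microstepper-driven focus
        time.sleep(dwell_s)              # let roughly one video frame period elapse
        stack.append((time.time(), z, grab_frame()))
    return stack

# Usage with stand-in hardware functions (both are placeholders, not real APIs):
if __name__ == "__main__":
    frames = iter(range(1000))
    sections = focal_sweep(lambda z: None, lambda: next(frames),
                           top_um=10.0, bottom_um=0.0, step_um=1.0, dwell_s=0.0)
    print(len(sections), "optical sections captured")
```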
  • Server 16 also has a video display interface 122 through which images and other information can be displayed on a high resolution video display 120 that can be color or monochromatic.
  • Server 16 may also record the coordinates for each position at which the microscope is set to record and later study the path of viewing of a given specimen during a given observation, to optimize the path for future observations, or to reconstruct the shape of a specimen following a tracing of the structure using the focal point as a probe of the surface features of the specimen.
  • Second server 22, which is also preferably a computer similar to that used for server 16, serves as the primary local unit for image processing, analysis, and interactive graphics.
  • Server 22 performs arithmetic, logical, and interactive imaging functions, such as rotational reorientations and three-dimensional polygon rendering, i.e., the interactive visualization and manipulation of complex shapes, preferably facilitated by a resident graphics program such as PHIGS+ or HIDEM, within the hardware of server 22.
  • Server 22 provides images to graphics monitors 130, 132 to present two-dimensional images for each video sensor, as well as a display 134 for stereoscopic projections and tomographic constructions, such as a StereoGraphics Z-screen, a CrystalEyes display, a holographic display, a viewer mounted heads-up display, or a high resolution three-dimensional display.
  • Server 22 preferably has a higher level of computational, processing, analysis, and display power and memory than server 16 to permit bitmap animations and to control the graphics monitors and the display.
  • Two graphical workstations 140, 142, linked via a network 144 to servers 16, 22, are equipped with software modules 146, 148 for system control and data management during simulation experiments, and for programming and systems management.
  • Programming personnel can participate directly in an experiment by redirecting computational analyses during the course of an experiment or helping to address 'what if' questions.
  • Microscope system 10, servers 16, 22, and workstations 140, 142 are all preferably located at the site of activity under study.
  • Server 22 provides data over a transmission line 150 to a third server 24, preferably also a RISC computer, which is programmed for data compression, for archiving in the library of signatures, and for database management.
  • Server 24 uses a direct block multiplex (BMUX) channel 160 or other suitable wide channel line connection to a remote supercomputer system 40.
  • BMUX direct block multiplex
  • Such proximity may be extended with suitable high speed, broad bandwidth digital information transfer methods.
  • Supercomputer system 40 may consist of one or more high performance computational engines, such as a massively parallel computational engine, such as a 1024-processor or 2048-processor Princeton Engine.
  • The supercomputer system may be designed to use electronic and/or optical processing of images and may be of a SIMD or MIMD architecture, using processors operating synchronously or asynchronously, or in various combinations of the above. Depending upon its design architecture, operating system, size, and capability, this system can enable the user of the system to perform analysis at high levels of sophistication.
  • System 40 may be located at the same site or distributed over a number of sites with appropriate communications.
  • The harnessing of the various computational engines may be predetermined by the operator or determined during the course of a particular application or process by the operator or by one of the computational engines by using an algorithm designed to assess computational activity and needs and to recruit needed and appropriate hardware and software resources from among local and remote sites during the course of an application or process using the process described herein.
  • The powerful processing system 40 allows numerous and sophisticated image processing applications and analysis to be performed.
  • The analysis preferably includes at least comparing visual signals to signals representing point emissive data, and correlating these signals.
  • The data may also be manipulated and digitally or optically filtered; for example, the data may be sampled at periodic intervals to determine if there are periodic characteristics to the point emissive data that can be classified and distinguished as a particular type of signal or as a noise of a particular variety.
  • The point emissive data can be filtered to remove random noise and to remove data generated from background emissions and not related to the emissions caused by the activity under study.
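A minimal sketch of that kind of filtering, assuming the point emissive data has been reduced to a per-frame photon-count trace (one value per video frame): a running mean estimates the slowly varying background, a threshold separates candidate emission frames from noise, and a Fourier transform checks for periodic structure. The function name, window length, and threshold are illustrative assumptions, not values from the patent.

```python
import numpy as np

def filter_point_emissions(counts, background_window=30, threshold_sigma=3.0):
    """Separate candidate photon-emission events from background noise in a
    per-frame photon-count trace, and probe the residual for periodicity."""
    counts = np.asarray(counts, dtype=float)
    # Estimate slowly varying background with a running mean.
    kernel = np.ones(background_window) / background_window
    background = np.convolve(counts, kernel, mode="same")
    residual = counts - background
    noise = residual.std()
    events = residual > threshold_sigma * noise           # frames with likely real emission
    # Look for periodic structure in the residual (signal vs. random noise).
    spectrum = np.abs(np.fft.rfft(residual - residual.mean()))
    dominant_bin = int(np.argmax(spectrum[1:]) + 1)
    return events, dominant_bin

rng = np.random.default_rng(0)
trace = rng.poisson(2, 600).astype(float)
trace[::50] += 20                                         # synthetic periodic flashes
events, peak_bin = filter_point_emissions(trace)
print(events.sum(), "event frames, dominant frequency bin:", peak_bin)
```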
  • The results are transmitted to the site of the observed activity via communication links. These results are stored in local information base 26 in a signature library for later study and for comparison with new activity data.
  • The image processor produces signatures for an activity under study, retrieves signatures from the signature library, and compares the signatures of the activity under study to the signatures stored in the signature library. If there is a match, correspondence, or correlation between the signature under study and one or more signatures stored in the library, the image processor can transmit information about such a correlation to the display unit in real-time.
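One way such a comparison could be computed is sketched below, assuming each signature has been flattened to a numeric vector (for example, a time series of regional intensities); the library, the correlation measure, and the 0.8 threshold are illustrative assumptions rather than parameters taken from the patent.

```python
import numpy as np

def best_library_match(signature, library, min_correlation=0.8):
    """Compare an observed spatio-temporal signature (flattened to a 1-D
    vector) against a dict of stored signatures; return the best match
    if its normalized correlation exceeds the threshold, else None."""
    sig = np.asarray(signature, float).ravel()
    sig = (sig - sig.mean()) / (sig.std() + 1e-12)
    best_name, best_score = None, -1.0
    for name, stored in library.items():
        ref = np.asarray(stored, float).ravel()
        ref = (ref - ref.mean()) / (ref.std() + 1e-12)
        score = float(np.dot(sig, ref)) / len(sig)        # Pearson-style correlation
        if score > best_score:
            best_name, best_score = name, score
    return (best_name, best_score) if best_score >= min_correlation else None

library = {"anaphase_onset": np.sin(np.linspace(0, 6, 100)),
           "quiescent": np.zeros(100) + 0.01}
observed = np.sin(np.linspace(0, 6, 100)) + 0.05 * np.random.default_rng(1).normal(size=100)
print(best_library_match(observed, library))
```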
  • The processing performed by the image processor preferably includes computational edge detection and enhancement procedures "on-the-fly." For edge extraction of a specimen, gray scale values of edges are determined, all other gray scale values are removed from the image, and the remaining background values are brought to a common, intermediate value. This procedure results in a dark curve representing the edge of the object(s) of interest on a neutral or gray background. A low pass spatial filter is applied to minimize the effects of image aliasing, thereby computationally sharpening the image.
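A rough sketch of that procedure in NumPy, assuming gradient-magnitude thresholding stands in for "gray scale values of edges are determined" and a 3x3 box filter stands in for the low pass spatial filter; these substitutions, and the edge_fraction parameter, are my assumptions rather than choices stated in the patent.

```python
import numpy as np

def extract_edges(image, edge_fraction=0.1, neutral=128):
    """Render edges as a dark curve on a neutral gray field, then low-pass
    filter the result, following the procedure described above."""
    img = np.asarray(image, dtype=float)
    gy, gx = np.gradient(img)
    magnitude = np.hypot(gx, gy)
    threshold = np.quantile(magnitude, 1.0 - edge_fraction)
    out = np.full_like(img, float(neutral))      # non-edge pixels -> common gray value
    out[magnitude >= threshold] = 0.0            # edge pixels -> dark curve
    # 3x3 box filter as a simple low-pass to suppress aliasing artifacts.
    padded = np.pad(out, 1, mode="edge")
    smoothed = sum(padded[i:i + out.shape[0], j:j + out.shape[1]]
                   for i in range(3) for j in range(3)) / 9.0
    return smoothed

demo = np.zeros((64, 64)); demo[16:48, 16:48] = 200.0    # bright square on a dark field
edges = extract_edges(demo)
print(edges.shape, float(edges.min()), float(edges.max()))
```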
  • The optically determined shape of the specimen serves as the framework within which other images (such as flashes of light of aequorin luminescence as discussed below) will be located by superposition of the real-time data onto a 3D model. From this reconstruction, the image processor synthesizes a left and right eye view of the specimen. By displaying these two slightly differing images, one to each eye, stereo perception can be simulated on the computer screen so that the image of the specimen appears truly 3D. Sound can also be used to produce auditory cues to aid in the perception and recognition of spatial and temporal patterns of the luminescence signals.
  • Separate tones and pitches can be assigned to each position of the video sampling grid, and such tones and pitches can be output as detectable sound in monaural or stereoscopic formats.
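The sketch below shows one plausible tone assignment, not the patent's own: each cell of a 10x10 sampling grid gets its own pitch, loudness follows cell intensity, and columns are panned between left and right channels. It only synthesizes a stereo sample buffer; driving real audio hardware is left out.

```python
import numpy as np

def sonify_grid(intensity_grid, duration_s=0.25, sample_rate=8000, base_hz=220.0):
    """Map a grid of intensities to tones: one pitch per grid cell, loudness
    proportional to intensity, and columns panned left to right."""
    grid = np.asarray(intensity_grid, dtype=float)
    t = np.linspace(0.0, duration_s, int(sample_rate * duration_s), endpoint=False)
    left = np.zeros_like(t)
    right = np.zeros_like(t)
    rows, cols = grid.shape
    for r in range(rows):
        for c in range(cols):
            # Spread the cells over roughly two octaves above the base pitch.
            pitch = base_hz * 2 ** ((r * cols + c) / 50.0)
            tone = grid[r, c] * np.sin(2 * np.pi * pitch * t)
            pan = c / max(cols - 1, 1)                    # 0 = left, 1 = right
            left += (1.0 - pan) * tone
            right += pan * tone
    stereo = np.stack([left, right], axis=1)
    peak = np.abs(stereo).max() or 1.0
    return stereo / peak                                  # normalized stereo samples

samples = sonify_grid(np.random.default_rng(2).random((10, 10)))
print(samples.shape)   # (2000, 2) samples ready for an audio output device
```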
  • Image information dependent audible output may serve to assist the observer in detecting patterns of structure and/or activity, as well as changes in such patterns of structure and/or activity.
  • Observations can additionally be recorded on an audio track of the videotapes by using a microphone connected directly or indirectly to the video tape recorder or other recorders.
  • The SMPTE time code recorded simultaneously on an audio channel or other suitable recording channel separate from that used for voice recording can be used to insure accurate video frame registration of image information among each separate spectral band and of combinations of spectral bands produced in analog and/or digital modes during or subsequent to the initial observation.
  • A VITS signal can be introduced into the recording path of one or more separate video channels to insure that, following standardization of video system performance (e.g., black and white levels, color bars, black burst, VIRS, etc.) prior to the beginning of each recording or observation, the signals received at remote locations can be adjusted to compensate for losses and noise incurred during transmission of the video image and other signals.
  • When a fluorescent calcium-sensitive reporter, such as fura-2 or calcium green, is used, computational methods should compensate for the effects of diffusion of the calcium-reporter complex and for the buffering effects of the reporter on local concentrations of calcium ions.
  • These emissions follow patterns that have both spatial and temporal signatures; i.e., the cell releases calcium ions from intracellular stores which concentrate in particular places called microdomains within the cell.
  • The localized concentration of calcium ions varies over time.
  • Calcium ions may be controlling factors or may be transmitters of controls for regulating the process necessary for controlling mitosis.
  • Figs. 4(a)-4(c) are exemplary photographs showing, respectively, (a) images representing calcium ions and cell division side-by-side; (b) images representing calcium ions and cell division superimposed; and (c) simultaneous capture of images representing calcium ions and cell division.
  • The specimen includes a cell disposed on a transparent microscope slide.
  • A micromanipulation and microinjection system 200 can be used to inject into the cell quantities of one or more samples for use in observing the cell and its activity. Such microinjection is a well known process.
  • Cellular activity may be observed by following changes in localized concentrations or displacement of charged or uncharged particles using one or more electrodes specifically designed to measure such localized concentrations or displacements of particles or waves outside the cell.
  • The microscope allows the cell or a portion of the cell to be illuminated with a quantity of light. Cellular activity or photolytic release of a compound or compounds that may or may not influence cellular activity can then be observed.
  • Known processes may include the photolytic uncaging of reagents, fluorescence photobleaching, fluorescent recovery after photobleaching, imaging and/or manipulations using acoustic wave energy, laser-mediated ablation, manipulations with high energy particles such as alpha particles or x-rays, the use of laser tweezers, and the use of laser scissors.
  • Such manipulations may be under the control of the computer system with motion and illumination control system 12 interacting with, and directed by, local servers 16, 22, and 24, local multiprocessor 43, and remote supercomputer 40, to provide for spatially and temporally accurate manipulations of the specimen.
  • Those manipulations may be the result of the computer system recognizing new or previously catalogued information within patterns observed from the cell, and/or other instructions provided by the observer.
  • During mitosis, calcium ions and other particles are observed, and the concentration of such particles may be determined, at various locations within the cell.
  • Aequorin, a luminescent photoprotein from the jellyfish Aequoria, serves as a calcium reporter because it emits a photon when it binds calcium ions.
  • This provides a means of bit-slicing through Ca2+ concentration levels, and of visualizing Ca2+ displacements and/or changes in Ca2+ concentrations.
  • The luminescence of aequorin appears as discrete flashes on an otherwise dark background. While some regions of a cell have repeated flashes, other regions appear dark for all but stochastic emissions from the cell or shot noise from the imaging and recording system.
  • A first video frame contains the maximal signal and subsequent frame intensities are linearly reduced to background within two additional frames.
  • Each camera provides video pixels at a resolution preferably corresponding to a square of 500 nm per side.
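A sketch of how that temporal profile could be used to locate flash onsets in a recorded image stack: a pixel is flagged when it peaks in one frame and falls back toward background over the next two. The stack layout, the 0.34 decay factor, and the detection threshold are illustrative assumptions.

```python
import numpy as np

def find_flash_onsets(frames, background=0.0, min_peak=5.0):
    """Locate flash onsets in a (time, y, x) image stack, using the profile
    described above: a maximal first frame followed by a roughly linear
    decay back to background within two additional frames."""
    stack = np.asarray(frames, dtype=float) - background
    onsets = []
    for t in range(stack.shape[0] - 2):
        peak, next1, next2 = stack[t], stack[t + 1], stack[t + 2]
        prev = stack[t - 1] if t > 0 else np.zeros_like(peak)
        is_onset = (peak >= min_peak) & (prev < peak) & (next1 < peak) & (next2 <= 0.34 * peak)
        ys, xs = np.nonzero(is_onset)
        onsets.extend((t, int(y), int(x)) for y, x in zip(ys, xs))
    return onsets

# At 500 nm per pixel, this 10 x 10 synthetic field spans about 5 micrometers.
stack = np.zeros((6, 10, 10))
stack[2, 4, 4], stack[3, 4, 4], stack[4, 4, 4] = 9.0, 5.0, 1.0   # one synthetic flash
print(find_flash_onsets(stack))   # expected: [(2, 4, 4)]
```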
  • The microscope preferably uses a Nikon plan apochromatic 20X/0.75 NA objective lens, which has a brightness rating of 14.0625. In some circumstances it is preferable to use a Zeiss plan apochromatic water immersion 40X/0.75 numerical aperture objective lens, which has a brightness rating of 3.156.
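The 14.0625 figure is consistent with a common convention for transmitted-light image brightness, 10^4 x (NA / magnification)^2; the two-line check below assumes that convention, which the patent itself does not state.

```python
def brightness_rating(numerical_aperture, magnification):
    """Transmitted-light brightness figure of merit: 1e4 * (NA / M)**2 (assumed convention)."""
    return 1e4 * (numerical_aperture / magnification) ** 2

print(brightness_rating(0.75, 20))   # 14.0625, matching the 20X/0.75 NA example above
```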
  • The image of a cell is preferably formed with a set of high performance, rectified differential interference contrast (DIC) or polarized light (POL) optics. These lenses enable one to optically section a living cell into very narrow image slices, with minimal interference from out-of-focus images.
  • DIC rectified differential interference contrast
  • POL polarized light
  • Images are preferably detected with an SIT camera and a dual microchannel plate intensified Saticon tube camera for indicating the incidence of single photons on a target.
  • The microstepper motor driver controls the stage and focus to allow precise positioning of the vertical focal plane to within a single focal plane step.
  • A digital microstepper motor driver, controllable through a microcontroller, permits remote operation, a capability that is essential for single photon video observations.
  • The image processor controls data acquisition, processing, analysis, and support with software residing in the computer, and also controls, directly or indirectly, hardware for the A/D conversion of the signal or sequential, full-frame video frames from each camera sampling spectral bands, and the recording of such images in digital format on devices such as video disk recorders, clustered magnetic hard drive arrays, or other such devices.
  • The system preferably permits digitization of such video signals to 10 bits per byte of data and permits transmission of the image information as a digitized signal over appropriate communications pathways. Such a system also permits the D/A conversion of the image information for playback and transmission.
  • Software drivers and other tools are provided for performing quantitative, real time, matrix-based regional subsampling of the intensity of video images.
  • Examples of software tools for a personal computer system may preferably include tools for (1) digitizing the image brightness of a 10 x 10 rectilinear sampling grid of one hundred contiguous 5 x 5 pixel subregions; (2) summing the photonic intensity of each of the box subregions plotted against time in a 10 by 10 array; and (3) producing a gray scale rendition of the files from the summing program.
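A compact sketch of items (1) through (3), under the assumption that each frame is a plain NumPy array at least 50 x 50 pixels: the field is digitized as a 10 x 10 grid of contiguous 5 x 5 pixel boxes, each box's photonic intensity is summed over time, and the totals are rescaled to an 8-bit gray-scale rendition.

```python
import numpy as np

def grid_intensities(frame, grid=10, box=5):
    """Digitize image brightness on a grid x grid array of contiguous
    box x box pixel subregions (item 1 above)."""
    f = np.asarray(frame, dtype=float)[:grid * box, :grid * box]
    return f.reshape(grid, box, grid, box).sum(axis=(1, 3))

def accumulate(frames, grid=10, box=5):
    """Sum each subregion's photonic intensity over time (item 2) and
    rescale the totals to an 8-bit gray-scale rendition (item 3)."""
    totals = np.zeros((grid, grid))
    for frame in frames:
        totals += grid_intensities(frame, grid, box)
    span = totals.max() - totals.min() or 1.0
    gray = np.round(255 * (totals - totals.min()) / span).astype(np.uint8)
    return totals, gray

frames = np.random.default_rng(3).poisson(1.0, size=(30, 50, 50))
totals, gray = accumulate(frames)
print(totals.shape, gray.dtype)   # (10, 10) uint8
```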
  • The playback rate ranges from one frame per second (33 msec of recording time) to single frame readout of selected frames.
  • Table I is a list of cellular attributes that can currently be imaged:
  • Luminescence / adenosine triphosphate (ATP) production / photon counting
  • Luminescence / translation of messenger RNA yielding peptide and protein production / photon counting
  • Cytoskeletal components, e.g., actin, tubulin, myosin, dynein, kinesin, etc.
  • The SIT or intensified CCD low light level camera captures a DIC image of a cell illuminated with 710 nm light projected through a condenser lens to an objective lens.
  • The photon counting camera sees the 465 nm photons emitted by the aequorin luminescence reaction.
  • A second photon counting camera can be used to see photons emitted from the cleavage of a specific luciferin analog within the 500 nm to 580 nm range, e.g., for ATP or alkaline phosphatase, by the luciferin-luciferase (firefly tail) reaction.
  • A fourth or further sensor can be added if the bandpass of a third reporter (luminescent or fluorescent) is within a trough of the other three illumination modes.
  • The target phosphors routinely used in the photon counting cameras, while optimal between 450 nm and 550 nm, render these cameras "blind" above 650 nm, thereby facilitating the use of 700+ nm light for DIC and POL visual images.
  • Multiple concurrent images of the cell can be produced, including one showing the whole cell organization; a second
  • The analog output signal of each camera is recorded on a respective VCR, and then is sent to a first computer of the image processor.
  • These images form high resolution tomographic reconstructions in post-acquisition analyses, or low to moderate resolution in support of on-line experimentation. Real-time digital mixing of the image records from individual video sensors permits superimposition.
  • The imaging system generates a set of rich visual imagery at a very high rate.
  • The relevant information to be analyzed may be temporal, 3D, or multispectral, and may involve correlating data from several different imaging sensors.
  • A portion of this system involves the standard computer vision techniques that have been established. Standard software exists for analysis of multichannel image data (e.g., maximum likelihood classification, clustering procedures), temporal image patterns (e.g., Fourier transform, change detection procedures), and spatial patterns (e.g., 2D Fourier transforms, geographic information systems). These tools can be merged and optimized to create a combined spectral, spatial, and temporal pattern analysis procedure.
  • Such programs may be run on clustered workstations or supercomputers.
  • For each half of a dividing cell, for example, the edge of each object visualized in one spectral band different from that used for calcium, together with the calcium-dependent emission exhibited by aequorin-based luminescence as visualized in the aequorin luminescence spectral band, may be superimposed onto pairs of Cartesian coordinate planes using the mitotic poles as the origin.
  • The X-axis is described by the pole-to-pole axis; the Y-axis is projected through that pole and is perpendicular to the X-axis of that cell; and the Z-axis is parallel to the optical axis of the imaging system, i.e., the objective lens.
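A sketch of re-expressing image coordinates in that pole-based frame: the origin is placed at one pole, X runs pole-to-pole, Y is perpendicular to X within the image plane, and Z stays along the optical axis. The (x, y, z) point format and the choice of pole A as origin are assumptions for illustration.

```python
import numpy as np

def to_pole_frame(points_xyz, pole_a, pole_b):
    """Express (x, y, z) image coordinates in a frame whose origin is pole A,
    whose X-axis runs pole-to-pole, whose Y-axis is perpendicular to it in
    the image plane, and whose Z-axis stays along the optical axis."""
    a = np.asarray(pole_a, float)
    b = np.asarray(pole_b, float)
    x_axis = b - a
    x_axis[2] = 0.0                                    # pole-to-pole direction in the image plane
    x_axis /= np.linalg.norm(x_axis)
    y_axis = np.array([-x_axis[1], x_axis[0], 0.0])    # rotated 90 degrees in-plane
    z_axis = np.array([0.0, 0.0, 1.0])                 # optical axis of the objective
    rotation = np.vstack([x_axis, y_axis, z_axis])     # rows are the new basis vectors
    return (np.asarray(points_xyz, float) - a) @ rotation.T

poles = ([10.0, 10.0, 0.0], [30.0, 10.0, 0.0])
flashes = [[20.0, 15.0, 1.2], [12.0, 8.0, 0.0]]
print(to_pole_frame(flashes, *poles))
```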
  • The system tracks particles as trajectories following an initial curve to allow one to extrapolate across discontinuous portions of the particle's path due to optical-mechanical distortions in a cell and meandering within and outside the depth of field of an image plane (e.g., 1.2 micron thick) used for initial image acquisition.
  • Such "filled-in" trajectories will then be recomputed to provide the best possible fit for the edge.
  • An equation that best fits the curve form can be determined through iterative curve fit algorithms.
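One simple form such an iterative fit could take is sketched below: a polynomial is refit several times with outliers softly down-weighted, and the resulting equation is then evaluated across the missing frames. The polynomial degree, weighting scheme, and data are illustrative assumptions, not the patent's algorithm.

```python
import numpy as np

def fit_trajectory(t, y, degree=3, iterations=5):
    """Iteratively fit y(t) with a polynomial, down-weighting outliers each
    pass, and return a callable that fills in missing (discontinuous) spans."""
    t = np.asarray(t, float)
    y = np.asarray(y, float)
    weights = np.ones_like(t)
    for _ in range(iterations):
        coeffs = np.polyfit(t, y, degree, w=weights)
        residuals = np.abs(y - np.polyval(coeffs, t))
        scale = np.median(residuals) + 1e-9
        weights = 1.0 / (1.0 + (residuals / scale) ** 2)   # softly reject outliers
    return np.poly1d(coeffs)

observed_t = np.array([0, 1, 2, 3, 7, 8, 9, 10], float)    # frames 4-6 were lost
observed_y = observed_t ** 2 * 0.1 + 2.0
curve = fit_trajectory(observed_t, observed_y)
print([round(float(curve(t)), 3) for t in (4, 5, 6)])      # filled-in positions
```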
  • Such computational analysis of image data is used to solve for surface topography of isosurfaces depicting cellular or chemical activities and object spatial deformations and displacements over time.
  • A variety of UNIX-based X-window display environments can be used for 2D and 3D data plots and other computational visualizations such as simple X-Y plots, histograms, surfaces, and volume renderings. Shaded polygon images are possible with a graphics environment and graphics engine, preferably with hardware residing on one computer and software on another. Wavefront software on the computers produces credible images, and is currently used to make videotape for both local and remote users. Still images can also be displayed in an X-windows environment.
  • A form of X-movie displays Wavefront frames on a workstation screen fast enough to preview animation and can be sent to remote users by fast mail.
  • A volume rendering program can be used to convert 3D data sets to surfaces. The resulting images can be displayed in X-windows or another suitable format, or saved as a compressed raster for later display in an X-window or for transmission to remote workstations.
  • The intermediate videotape step can be omitted and animated sequences can be constructed directly on an intermediate video rate memory device, such as a video disk recorder, optical memory disk recorder, or RAID memory disk array, and then displayed on the screen of the user's workstation.
  • One benefit of the system is the development of a 3D volume rendered virtual mitotic cell (including intracellular organelles and other compartments) constructed from real time tomographic data and multifaceted in vivo and in vitro experimental data.
  • This virtual mitotic cell will, in turn, be subject to computational experimental manipulations of a wide spectrum of cell physiological and biochemical (molecular and ionic) parameters associated with mitosis and the cell cycle. Large intracellular structures limit the diffusion of
  • Predicting cellular and intracellular behavior from various conceptual models thus is greatly facilitated, and the presence or absence of important unifying precepts can be clarified.
  • Candidate models are run multiple times, and a mean model solution is computed.
  • One application of particular importance for quantal emission photon imaging is analysis of photon scatter and luminescence image noise.
  • One application of this effort is to determine the source of the higher number of photons distributed throughout the image field, yet outside the boundaries of the labeled cell.
  • Monte Carlo simulations, similar to those developed to model regulation of acetylcholine receptors and ion channels, can help establish a model that describes the ultimate source of these scattered photons.
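A minimal Monte Carlo sketch in that spirit, under simplified assumptions of my own (a circular cell in the 2-D image plane, a fixed number of isotropic scattering steps with exponentially distributed step lengths): it estimates what fraction of photons emitted inside the labeled cell would be detected outside its boundary.

```python
import numpy as np

def scatter_simulation(n_photons=100_000, cell_radius=10.0,
                       mean_free_path=2.0, max_scatters=5, seed=0):
    """Monte Carlo estimate of the fraction of photons emitted inside a
    circular cell that end up outside its boundary after scattering."""
    rng = np.random.default_rng(seed)
    # Emit photons uniformly inside the cell (2-D image plane approximation).
    r = cell_radius * np.sqrt(rng.random(n_photons))
    theta = 2 * np.pi * rng.random(n_photons)
    pos = np.column_stack([r * np.cos(theta), r * np.sin(theta)])
    for _ in range(max_scatters):
        step = rng.exponential(mean_free_path, n_photons)
        angle = 2 * np.pi * rng.random(n_photons)
        pos += np.column_stack([step * np.cos(angle), step * np.sin(angle)])
    outside = np.hypot(pos[:, 0], pos[:, 1]) > cell_radius
    return outside.mean()

print(f"{scatter_simulation():.1%} of simulated photons land outside the cell boundary")
```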
  • One imaging system of the present invention can be used alone, or imaging systems can be combined to probe the effects of various agonists and antagonists of various metabolic and structural or other activities.
  • The system performs computational comparisons of image information from experiments with living cells and experiments performed on computational modelings of the system under study with modeling systems operating on computers and workstations, such as the object-oriented DEEM environment developed by the Argonne National Laboratory.
  • This system compares the output of living cells and computational simulations and indicates likely areas for improvement of the accuracy and precision of the computational model, as well as indicating experimental approaches for studies of living cells.
  • This approach is scalable to specimens other than living cells that may be studied with this system. This approach also provides for establishment of a dialog with the specimen being studied, with the potential outcomes including imposing an observer-deemed manipulation and/or regulation upon the specimen under observation.
  • The system would utilize the information obtained through the capture, processing, and analysis of image information, including spatial and temporal patterns of chemical, physical, and other reactions and transformations associated with specific events inherent to the specimen, and utilize that information to direct the regulation of the process occurring, about to occur, or capable of occurring in the specimen.
  • An example of such an application would be the use of spatially and temporally regulated generation of intracellular calcium signals, such as those detected with aequorin in dividing cells, wherein the localized elevations of intracellular calcium concentration are achieved by flash photolysis of caged calcium or other caged compounds that would elicit the increase in localized intracellular calcium by other natural or designed methods.
  • Such an application can be readily scaled upward to include treatment of tissues, organs, organ systems, and entire people, and can be scaled downward to range from single cells to regions of cells and components of cells, including organelles, molecules, and groups of molecules.
  • While the invention has been described for use with a microscope, it can be used on a long-range basis for tasks such as assessment of crops, or environmental and industrial activity. These activities can be analyzed by obtaining images using two or more different spectral bands, and comparing the data contained in each of the images to develop information relevant to the subject under observation.
  • Images can be obtained from imaging devices located in aircraft, balloons, spacecraft or satellites, and can include the health of a particular crop, cloud formation and other meteorological information that portend a storm or other condition, state of disease of a forest, chemical composition of smoke emitted from a factory, or the growth state of plants located downstream of a site of effluent discharge into a body of water.
  • The system can also be used in the field of endoscopy for assessment of the metabolic state of a patient's tissue or organ by measuring photonic spectral properties of that tissue or organ.
  • Specific metabolic parameters of a tissue can be determined directly or indirectly as a function of the absorption of light of particular wavelengths and spectral bands.
  • Various optically active reporter reagents can be used in conjunction with simple absorption methods to assess other metabolic parameters.
  • An imaging system according to the present invention can provide real-time assessments while visualizing the tissue in question, even during the performance of a surgical procedure.
  • The ability to observe cells and tissues can allow processes to be observed and recorded from living bodies, including, for example, effects of toxins or radiation on tissues or cells.
  • The observation of such cells is not limited to animal cells, but can also apply to plant cells, and to other entities. What is claimed is:

Abstract

An image system captures and records optical and photon images of an activity, such as cellular phenomena. The images are simultaneously recorded on videotape and displayed. Image data is processed and stored for later analysis and for comparison to new data.

Description

ANALYTICAL IMAGING SYSTEM AND PROCESS
Field Of The Invention
This invention relates to a system and process for imaging and analyzing a specimen during an activity.
Background of the Invention
Living cells and tissues perform and coordinate hundreds to thousands of individual processes, and control the location, orientation, and state of assembly of many structural components in the course of normal life. These processes are usually performed by, and structures are comprised of, specialized groups and classes of molecules. Biologists have used light microscopes to study these processes, both in living cells and in cells that have been preserved at particular points in the cells' lives. Study of these processes or structures involves the detection of molecules or reactions as signals that are often processed and analyzed to help the biologist learn and understand the particular process or structure. Such detection typically relies on a characteristic interaction of light with the molecules responsible for the process or structure that is subject to study. Because components such as molecules are dynamic in living cells and act in concert with, and rely upon, interactions among similar and dissimilar components, it is desirable to study the relationship of a component with one or more other components in a cell.
Summary of the Invention
The present invention includes an image processing system and process for imaging a specimen during an activity, such as biological, physical, or chemical activity, to analyze the specimen during the activity. The system receives different spectral images, preferably a visible image and a low intensity photonic image, and synchronously records them, preferably in real time on an image-recording medium, such as a video cassette recorder (VCR) or an optical disk recorder. These images can be displayed later in juxtaposition or in superposition and processed for further analysis. In preferred embodiments, the system includes an image receiving device that includes a microscope with at least one beam-splitter. One output from the beam-splitter provides a visual image, while another output is filtered to pass only photons of a characteristic wavelength, phase, or orientation as a result of the activity. These images are preferably recorded with cameras with frame sampling times and rates synchronized by a common timing device. The cameras may operate at the same video field and frame rates or at integral multiples of such video field or frame rates to achieve increased temporal resolution for a given spectral band or other optical property of the specimen being observed, or to accommodate special image and information display devices. The image at the back focal plane of the objective lens may be directed to any camera in the system with a Bertrand lens or similar device to provide the diffraction image of the specimen.
The data is digitized and then processed and analyzed by an image processing system that may be at the site of the activity or remote. The image processing system is programmed to analyze the image data, and to store and classify signals to establish spatial and temporal signatures of the observed activity. These signatures are accumulated and stored together to provide a library of signatures. The observed activity can be displayed on-line and can also be continuously compared with the signatures stored in the library to determine a correspondence or correlation as an activity is progressing. Variables that affect the activity, such as chemicals or other conditions such as heat or light, can be modified to control that activity while the specimen is under observation. In an exemplary application of this system and process, a visible image is that of a reproductive division of a cell (mitosis) produced by a microscope, e.g., with a signal influenced by changes in the localized refractive index as revealed through the use of polarized light or through methods of phase contrast enhancement, such as phase contrast, differential interference contrast, or modulation contrast optics. Another image indicates an interaction of specific ions or other compounds, such as calcium, with an ion-sensitive photonic reporter or other suitable means. Thus, a spatial-temporal image of photons attributable to and indicative of the presence of particular ions or other components at a characteristic concentration is recorded during mitosis. In this example, further processing preferably includes rendering the calcium-dependent photon spectrum visible to the human eye with a color assigned to the signals detected by the camera whose input spectral band is tuned to detect the photonic reporting of calcium by the interaction of a calcium specific reporter molecule and a calcium ion. A single color may be used, or it can be varied to reflect at least one of a number of factors, including a duration of the photon emission from particular locations, a temporal or spatial frequency of photon emissions relative to each other, and/or specific structural, chemical, or physical features or events in the cell. The emissions are preferably displayed in an overlying manner to show the locations and time durations of the emissions during mitosis.
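As a concrete illustration of that overlay step, the sketch below (an assumption-laden Python example, not the patent's implementation) converts a grayscale DIC frame to RGB and paints detected photon events on top of it, shifting the assigned color from green toward red as the emission duration grows; the array names and the specific green-to-red mapping are illustrative choices.

```python
import numpy as np

def overlay_photon_events(dic_frame, photon_counts, durations, max_duration=10.0):
    """Render a grayscale DIC frame as RGB and paint detected photon events
    on top, with color varying from green (brief) to red (long-lasting)."""
    gray = np.asarray(dic_frame, dtype=float)
    gray = (gray - gray.min()) / (gray.max() - gray.min() + 1e-12)
    rgb = np.repeat(gray[..., None], 3, axis=2)           # grayscale -> RGB
    events = np.asarray(photon_counts) > 0
    fraction = np.clip(np.asarray(durations, float) / max_duration, 0.0, 1.0)
    rgb[events, 0] = fraction[events]                      # red channel grows with duration
    rgb[events, 1] = 1.0 - fraction[events]                # green channel fades with duration
    rgb[events, 2] = 0.0
    return rgb

dic = np.random.default_rng(4).normal(0.5, 0.1, (64, 64))
counts = np.zeros((64, 64)); counts[30:33, 30:33] = 1
durations = np.zeros((64, 64)); durations[30:33, 30:33] = 7.0
print(overlay_photon_events(dic, counts, durations).shape)   # (64, 64, 3)
```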
The present invention allows a user to understand the role of any component or components and its relationship with one or more other components in an image by studying the dynamics of the interactions of these components and processing and analyzing the characteristic signals of several different signals and their interrelationships during the course of the activity being studied, i.e., in real time. Such processing and analysis permits more expeditious assimilation of information for the observer, and permits an observer to manipulate a subject and monitor the effects of such manipulation during the course of the observation. The invention provides for comparisons among spectra; for analysis of the relationships among various spectra or spectral bands in any combination of parametric axes; for comparison between simulations of data and system under study; and for analysis of relationships among various spectra and various kinetic simulations and other computational models of the data and system under study, in any combination of parametric axes. Accordingly, the invention is a powerful system and method for analyzing processes subject to multiple spectra. Other features and advantages will be apparent from the following detailed description, the drawings and from the claims.
Brief Description of the Drawings
Fig. 1 is a pictorial block diagram of an imaging system according to the present invention. Figs. 2-3 are pictorial block diagrams of components of Fig. 1 shown in more detail.
Figs. 4(a)-4(c) are images produced according to the present invention.
Detailed Description
The present invention relates to a system and method for image analysis, and is particularly useful with a microscope system for observing biological processes. While a biological example is described in detail, the invention has broad applicability to other physical, chemical, and biological activities, and can be used on a microscopic, endoscopic, or long-range basis.
Referring to Fig. 1, a specimen is observed with a microscope system 10 under control of a motion and illumination control system 12. Microscope system 10 provides images to a video system 14 that captures and records data from the specimen, including at least visual data and point emissive spectral data. The microscope may be equipped with a Bertrand lens or similar device and a beam steering device to direct the image at the back focal plane of the objective lens to any camera in the system to provide the diffraction image of the specimen in a given spectral band.
Video system 14 provides the data to a first local server 16, which digitizes the data and causes digital images to be stored in a digital video recorder 20. Server 16 also provides the digital images at a high rate of speed to a second server 22, which performs image analysis and provides the analyzed data to a third server 24. Server 24 is primarily responsible for data compression, for archiving in a local information base 26, and for transmitting data to a remote supercomputer system or systems 40 for processing. This transmission can be through a hard-wired line 28 or via a local uplink/downlink system 30, a local satellite dish 32, a satellite 34, a remote satellite dish 36, and a remote uplink/downlink system 38. Taken together, servers 16, 22, and 24, and remote supercomputer system(s) 40 can be considered an image processor 42. While the functions of the image processor are allocated in a particular way with certain local and remote tasks, these various functions can be combined or allocated in other ways, and can be performed with other shared resources, such as a network of workstations.
A local multiprocessor 43 may be used to process and analyze images immediately upon acquisition, to perform post-acquisition analysis, and to display images. Such a local multiprocessor can permit computational processing, analysis, and control functions when the time delays due to velocity limits and bidirectional information transfer imposed by the distance separating the microscope and the remote processors would be intolerable for the process being studied. Such a local multiprocessor could be a Princeton Engine, which is available from the David Sarnoff Research Center in Princeton, New Jersey, or a Princeton Engine connected via a HIPPI interface to a supercomputer or cluster of workstations.
Figs. 2-3 are block diagrams that illustrate the components of Fig. 1 in more detail. While the system shown here is a two-camera device, other cameras and sensors can be provided for recording additional information by using additional beam splitting. Referring particularly to Fig. 2, microscope system 10 and control system 12 include a microscope controlled by a microcomputer controller and display 100 via an interface 102 in server 16. Server 16 and controller 100 control illumination, focusing of lenses, filters, motors, and other controls that can be mechanically, hydraulically, pneumatically, magnetically, or electrically driven, to manipulate light incident to or emanating from the specimen under study.
Controller 100 causes a monochromatic source 50, preferably a fiber optic monochrome illuminator, to generate light with a single wavelength in a well-defined spectral bandpass. The source may include a monochromator and a bandpass limiting adjustable slit, a special bandpass limiting interference filter, or a similar device. The light is optically coupled through an optical fiber 52 to a polarizer 54, and the resulting polarized light is provided to a compensator 56, a rectifier 58, and a condenser lens 60. The resulting light illuminates an underside of a transparent stage 62 on which rests the specimen under study (not shown).
The end of the illuminating fiber, and thus the point source of light for the illumination of the specimen, may be provided with a further device (not shown) for positional control to permit positioning of the point source of light at alternating positions synchronized to the scan rate for video frames or fields, thereby providing oblique illumination and generation of stereoscopic images using a single objective lens and a single condenser lens. An objective lens 64 receives an image of the specimen during the activity and provides the image through a beam splitter 66 that directs the image to two separate and different cameras 68, 70. Camera 70 is preferably a photon counting camera, such as a model made by Hamamatsu Photonics, and camera 68 is preferably a silicon intensified target (SIT) low level light camera that captures a differential interference contrast (DIC) image of the activity illuminated with the light generated by source 50. Other cameras and beam splitters can be provided to receive further images, such as a second photon counting camera.
An infrared camera can also be provided to detect thermal emissions and to produce a thermal signature of the activity. The infrared camera allows assessment of the infrared absorptive and emissive properties of the specimen being observed. Such thermal emissions are a function of the physical activity of the specimen in an emissive region, including changes in temperature associated with changes in chemical, electrical, and/or physical activity, rates of reactions or changes in concentration of particular components or classes of components of the specimen, or other thermodynamic properties of the specimen. The infrared camera is positioned at an optimal location for detection of image information in the bandpass of interest. The optics of the microscope may be fabricated from infrared transmissive or reflective materials, such as germanium, germanium composites, or gallium composites, with properties that are compatible with the conditions of the observations being performed. Cameras 68, 70 receive analog data and provide the received data to respective VCRs 72, 74 to be recorded. The VCRs are driven by a common external horizontal and vertical sync signal source 80 at the level of full video frames and preferably corresponding video image fields (e.g., using subcarrier phase locking), and are preferably equipped with SMPTE (Society of Motion Picture and Television Engineers) time code generators operated to ensure sequential numbering of all video frames in a manner that provides temporally accurate time references among and between sets of original recordings. This numbering expedites the registration of multiple image sets. The SMPTE time code generator may also be set to operate on a time signal provided by an external standard, including the Global Positioning Satellite system.
The signal path for one or more of the VCRs may be equipped with noise reduction devices to reduce the noise level in the signal, and a VITS (video information test signal) and VIRS (vertical interval reference signal) video standard reference signal generator to provide a set of standard reference signals in image space not used by the detector faceplate of the camera. Such reference signals ensure accurate reference of recorded signals at each stage of transmission to downstream video image reception points. With such a microscope and camera system, multiple concurrent images of the specimen can be captured and recorded.
The visual and emissive data provided from VCRs 72, 74 is provided to first server 16 for preprocessing to permit temporal or spatial noise to be filtered, and to permit the extraction of features of interest and importance to the understanding of the activity being observed. Server 16 has two analog to digital (A/D) conversion units 104, 106 for receiving and converting the received analog data into digital data suitable for initial processing and/or for transmission to other servers.
First server 16 is preferably a reduced instruction set computer (RISC), such as an IBM RS/6000, or a high performance personal computer with high computational power. Server 16 is equipped for video-rate input, throughput, and output at a rate equal to or in excess of 50 megabytes per second, and is optimized for real time, interrupt-driven I/O activity. Server 16 sends digitized images via high speed fiber optic link 108 to second server 22. In the RS/6000, for example, the microchannel ports and bus speed allow a total of four video input sensors, as well as one channel to control other features, such as focal plane motion and shutters.
In addition to acquiring and digitizing images, server 16 controls any microstepper motors 112, shutters, filter wheels, a motion controller 116, and other devices associated with primary imaging activities via controller 100. Actuation of microstepper motor 112 alters the microscope (e.g., focus, focal plane axis, orientation, thickness, X-axis, Y-axis, Z-axis, and rotational stage movements) and the optical configuration (e.g., DIC, POL, lens or filter or other modulator position, orientation and/or actions, or stereoscopic). The user preferably controls these functions from one of workstations 140, 142 (Fig. 2) by selecting icons with a pointing device, such as a mouse or a trackball. Because the activity and the analysis thereof can be observed in real-time, a user can make changes from his/her workstation during the activity.
Server 16 can be programmed to initiate motion control of the microscope in response to particular sequences or patterns of activity. One such computer-initiated response changes the microscope's focal plane in discrete time-dependent steps, cycling from top to bottom, then returning to the top surface of the specimen. Such images provide information for tomographic reconstructions in low to moderate resolution in support of on-line experiments, and at high resolution in post-acquisitional analyses. Server 16 also has a video display interface 122 through which images and other information can be displayed on a high resolution video display 120 that can be color or monochromatic.
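The focal-plane cycling described above can be sketched as a simple acquisition loop. The sketch below is illustrative only; move_focus_to and grab_frame are hypothetical stand-ins for the motion-control and digitizer interfaces, which the description does not specify as an API.

    # Minimal sketch of computer-initiated focal-plane cycling for a
    # tomographic image stack; move_focus_to() and grab_frame() are
    # hypothetical stand-ins for the motion-control and video interfaces.
    def acquire_z_stack(move_focus_to, grab_frame, z_top_um, z_bottom_um, step_um):
        """Step the focal plane from top to bottom, grabbing one frame per
        step, then return the focus to the top surface of the specimen."""
        stack = []
        z = z_top_um
        while z >= z_bottom_um:
            move_focus_to(z)          # e.g., drive the microstepper motor
            stack.append((z, grab_frame()))
            z -= step_um
        move_focus_to(z_top_um)       # cycle back to the top surface
        return stack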
Server 16 may also record the coordinates for each position at which the microscope is set, to record and later study the path of viewing of a given specimen during a given observation, to optimize the path for future observations, or to reconstruct the shape of a specimen following a tracing of the structure using the focal point as a probe of the surface features of the specimen.
Referring to Fig. 3, second server 22, which is also preferably a computer similar to that used for server 16, serves as the primary local unit for image processing, analysis, and interactive graphics. Server 22 performs arithmetic, logical, and interactive imaging functions, such as rotational reorientations and three-dimensional polygon rendering, i.e., the interactive visualization and manipulation of complex shapes, preferably facilitated by a resident graphics program such as PHIGS+ or HIDEM, within the hardware of server 22. In addition, server 22 provides images to graphics monitors 130, 132 to present two-dimensional images for each video sensor, as well as a display 134 for stereoscopic projections and tomographic constructions, such as a StereoGraphics Z-screen, a CrystalEyes display, a holographic display, a viewer mounted heads-up display, or a high resolution three-dimensional display. Server 22 preferably has a higher level of computational, processing, analysis, and display power and memory than server 16 to permit bitmap animations and to control the graphics monitors and the display.
Two graphical workstations 140, 142, linked via a network 144 to servers 16, 22, are equipped with software modules 146, 148 for system control and data management during simulation experiments, and for programming and systems management. Thus, programming personnel can participate directly in an experiment by redirecting computational analyses during the course of an experiment or helping to address 'what if' questions. Microscope system 10, servers 16, 22, and workstations 140, 142 are all preferably located at the site of the activity under study.
Server 22 provides data over a transmission line 150 to a third server 24, preferably also a RISC computer, which is programmed for data compression, for archiving in the library of signatures, and for database management. Server 24 uses a direct block multiplex (BMUX) channel 160 or other suitable wide channel line connection to a remote supercomputer system 40. The use of such a line, which requires that server 24 be in close proximity to supercomputer system 40, permits on-line use of the supercomputer's multiple processors for computationally intensive activities, including three-dimensional and tomographic visualizations, intra- and inter-video sensor relational kinetics and pattern analyses, computational models, and kinetic simulations. Such proximity may be extended with suitable high speed, broad bandwidth digital information transfer methods.
Supercomputer system 40 may consist of one or more high performance computational engines, such as a massively parallel computational engine, for example a 1024-processor or 2048-processor Princeton Engine. The supercomputer system may be designed to use electronic and/or optical processing of images and may be of a SIMD or MIMD architecture, using processors operating synchronously or asynchronously, or in various combinations of the above. Depending upon its design architecture, operating system, size, and capability, this system can enable the user of the system to perform analysis at high levels of sophistication. System 40 may be located at the same site or distributed over a number of sites with appropriate communications. The harnessing of the various computational engines may be predetermined by the operator or determined during the course of a particular application or process by the operator or by one of the computational engines, by using an algorithm designed to assess computational activity and needs and to recruit needed and appropriate hardware and software resources from among local and remote sites during the course of an application or process using the process described herein. The powerful processing system 40 allows numerous and sophisticated image processing applications and analyses to be performed. The analysis preferably includes at least comparing visual signals to signals representing point emissive data, and correlating these signals. The data may also be manipulated and digitally or optically filtered; for example, the data may be sampled at periodic intervals to determine if there are periodic characteristics to the point emissive data that can be classified and distinguished as a particular type of signal or as a noise of a particular variety. The point emissive data can be filtered to remove random noise and to remove data generated from background emissions and not related to the emissions caused by the activity under study.
With remote processing, when the visual and point emissive data are processed, the results are transmitted to the site of the observed activity via communication links. These results are stored in local information base 26 in a signature library for later study and for comparison with new activity data. During operation, the image processor produces signatures for an activity under study, retrieves signatures from the signature library, and compares the signatures of the activity under study to the signatures stored in the signature library. If there is a match, correspondence, or correlation between the signature under study and one or more signatures stored in the library, the image processor can transmit information about such a correlation to the display unit in real-time.
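As a rough illustration of the signature-matching step, the sketch below scores a newly computed activity signature against each stored library signature with a correlation coefficient and reports the best match above a threshold. The signature format, the Pearson-correlation metric, and the threshold value are all assumptions; the description does not prescribe a particular matching algorithm.

    import numpy as np

    def best_library_match(signature, library, threshold=0.9):
        """Compare an activity signature (1D array) against a dict of stored
        library signatures of the same length; return (name, score) of the
        best correlation above threshold, or (None, threshold) if none."""
        best_name, best_score = None, threshold
        for name, stored in library.items():
            score = np.corrcoef(signature, stored)[0, 1]
            if score > best_score:
                best_name, best_score = name, score
        return best_name, best_score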
The processing performed by the image processor preferably includes computational edge detection and enhancement procedures performed "on-the-fly." For edge extraction of a specimen, gray scale values of edges are determined, all other gray scale values are removed from the image, and the remaining background values are brought to a common, intermediate value. This procedure results in a dark curve representing the edge of the object(s) of interest on a neutral or gray background. A low pass spatial filter is applied to minimize the effects of image aliasing, thereby computationally sharpening the image.
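A minimal numerical sketch of the edge-extraction step described above: keep gray values where the local gradient is strong, flatten everything else to an intermediate background level, and then apply a small low-pass (mean) filter. The specific gradient operator, percentile threshold, and kernel size are assumptions, since the description states the procedure only in general terms.

    import numpy as np

    def extract_edges(image, edge_percentile=90.0, background_level=128):
        """Keep gray values at strong gradients, set the rest to a neutral
        background, then low-pass filter with a 3x3 mean kernel."""
        img = np.asarray(image, dtype=float)
        gy, gx = np.gradient(img)
        grad = np.hypot(gx, gy)
        edges = grad >= np.percentile(grad, edge_percentile)
        out = np.full_like(img, float(background_level))
        out[edges] = img[edges]              # retain edge gray values only
        # 3x3 mean filter to suppress aliasing artifacts
        padded = np.pad(out, 1, mode="edge")
        smoothed = sum(padded[i:i + out.shape[0], j:j + out.shape[1]]
                       for i in range(3) for j in range(3)) / 9.0
        return smoothed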
Further, for a given activity, the extent to which spatial and temporal patterns may change during a cell cycle under normal or abnormal activation can be determined. Analysis of diffusion, refractive index, or other physical, chemical, hydrodynamic or other properties or processes of the specimen under study can be performed. Visualization is achieved by optically sectioning the specimen followed by three-dimensional (3D) reconstruction. The system translates each object within an optical section into a digital contour. Image reconstruction involves identifying contours of each object and connecting those two-dimensional (2D) contours to form 3D shells. The optically determined shape of the specimen serves as the framework within which other images (such as flashes of light of aequorin luminescence as discussed below) will be located by superposition of the real-time data onto a 3D model. From this reconstruction, the image processor synthesizes a left and right eye view of the specimen. By displaying these two slightly differing images, one to each eye, stereo perception can be simulated on the computer screen so that the image of the specimen appears truly 3D.

Sound can also be used to produce auditory cues to aid in the perception and recognition of spatial and temporal patterns of the luminescence signals. Separate tones and pitches can be assigned to each position of the video sampling grid, and such tones and pitches can be output as detectable sound in monaural or stereoscopic formats. Such image information dependent audible output may serve to assist the observer in detecting patterns of structure and/or activity, as well as changes in such patterns of structure and/or activity.

Observations can additionally be recorded on an audio track of the videotapes by using a microphone connected directly or indirectly to the video tape recorder or other recorders. The SMPTE time code, recorded simultaneously on an audio channel or other suitable recording channel separate from that used for voice recording, can be used to ensure accurate video frame registration of image information among each separate spectral band and of combinations of spectral bands produced in analog and/or digital modes during or subsequent to the initial observation. In addition, a VITS signal can be introduced into the recording path of one or more separate video channels to ensure that, following standardization of video system performance (e.g., black and white levels, color bars, black burst, VIRS, etc.) prior to the beginning of each recording or observation, the signals received at remote locations can be adjusted to compensate for losses and noise incurred during transmission of the video image and other signals. At times it may be useful to check the video system standardization during an observation or series of observations and analyses to ensure that no deviation, gradation and/or drift has occurred in those values. In such cases it is the preferred practice to monitor the on-line performance of the video systems with a video waveform monitor and other such devices. Other data, such as that from recordings of cellular activity, may be recorded to other available channels on the video tape or other recording medium.
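The audible-cue idea described above can be sketched as a simple mapping from grid position and photon intensity to tone frequency and loudness. The mapping below (one base pitch per grid cell, amplitude scaled by the peak count) is purely illustrative; the description does not prescribe any particular tone assignment.

    import numpy as np

    def grid_to_tones(counts, base_hz=220.0, step_hz=20.0):
        """Map a grid of photon counts to (frequency, amplitude) pairs, one
        tone per grid position; amplitude is normalized to the peak count."""
        counts = np.asarray(counts, dtype=float)
        peak = counts.max() if counts.max() > 0 else 1.0
        tones = []
        for (row, col), value in np.ndenumerate(counts):
            freq = base_hz + step_hz * (row * counts.shape[1] + col)
            tones.append((freq, value / peak))
        return tones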
EXAMPLE — CALCIUM ION CONCENTRATION DURING MITOSIS

Some of the capabilities and options of the system of the present invention are described in more detail by way of a particular example.
All living organisms are composed of cells which reproduce through mitosis, a well-known and widely-studied process by which a cell reproduces by dividing itself into two or more cells. During mitosis, there is a change in distribution of protons found inside a cell and of intracellular free ions, such as calcium. It has been found that the concentration of intracellular free calcium ions undergoes substantial changes prior to, during, and after mitosis. Some studies indicate that the ion concentration of calcium may be a cause of, or at least closely correlated with, the control of mitosis and other processes of the cell.
The division of one cell into two cells is clearly visible with a microscope. Moreover, variations in a level of calcium during cellular activity can be detected because photons emitted by a luminescent calcium-sensitive reporter are emitted from cells that are labelled with the reporter.
The approaches and methods described herein may also be applied in those cases in which it may be useful to use a fluorescent calcium-sensitive reporter, such as fura-2 or calcium green, to follow such changes in calcium concentrations in cells. When fluorescent reporters are used to analyze the distribution of calcium ions, computational methods should compensate for the effects of diffusion of the calcium-reporter complex and for the buffering effects of the reporter on local concentrations of calcium ions. These emissions follow patterns that have both spatial and temporal signatures; i.e., the cell releases calcium ions from intracellular stores which concentrate in particular places, called microdomains, within the cell. The localized concentration of calcium ions varies over time. Calcium ions may be controlling factors or may be transmitters of controls for regulating the processes necessary for controlling mitosis.
Figs. 4(a)-4(c) are exemplary photographs showing, respectively, (a) images representing calcium ions and cell division side-by-side; (b) images representing calcium ions and cell division superimposed; and (c) simultaneous capture of images representing calcium ions and cell division.

In this exemplary embodiment, the specimen includes a cell disposed on a transparent microscope slide. Referring again to Fig. 2, a micromanipulation and microinjection system 200 can be used to inject into the cell quantities of one or more samples for use in observing the cell and its activity. Such microinjection is a well known process. By following the electrical properties of a cell using electrodes that measure the flow or waves of ions, electrons, or other charged or uncharged particles across membranes of the cell, cellular activity may be observed by following changes in localized concentrations or displacement of charged or uncharged particles using one or more electrodes specifically designed to measure such localized concentrations or displacements of particles or waves outside the cell.

The microscope allows the cell or a portion of the cell to be illuminated with a quantity of light. Cellular activity, or photolytic release of a compound or compounds that may or may not influence cellular activity, can then be observed. Known processes may include the photolytic uncaging of reagents, fluorescence photobleaching, fluorescence recovery after photobleaching, imaging and/or manipulations using acoustic wave energy, laser-mediated ablation, manipulations with high energy particles such as alpha particles or x-rays, the use of laser tweezers, and the use of laser scissors. Such manipulations may be under the control of the computer system, with motion and illumination control system 12 interacting with, and directed by, local servers 16, 22, and 24, local multiprocessor 43, and remote supercomputer 40, to provide for spatially and temporally accurate manipulations of the specimen. Those manipulations may be the result of the computer system recognizing new or previously catalogued information within patterns observed from the cell, and/or other instructions provided by the observer. During mitosis, calcium ions and other particles are observed so that the concentration of such particles may be determined at various locations within the cell. Aequorin, a luminescent photoprotein from the jellyfish
Aequorea, serves as a calcium reporter because it emits a photon upon the binding of one Ca²⁺ at a particular intracellular free calcium (Ca²⁺ᵢ) concentration, and provides graded sensitivity to different concentrations of Ca²⁺ᵢ as a means of bit-slicing through Ca²⁺ᵢ concentration levels and displacements and/or changes in Ca²⁺ concentrations. The luminescence of aequorin appears as discrete flashes on an otherwise dark background. While some regions of a cell have repeated flashes, other regions appear dark for all but stochastic emissions from the cell or shot noise from the imaging and recording system.
Ca²⁺ᵢ signals are localized in time and space. Based on a timing resolution of 30 Hz video circuitry, it appears that individual Ca²⁺ᵢ flashes have a lifetime of less than 100 msec. Typically, a first video frame contains the maximal signal and subsequent frame intensities are linearly reduced to background within two additional frames.
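The frame-level behavior just described (peak intensity in one frame, a fall to background within about two more frames at video rate) can be sketched as a simple event detector over a per-region intensity trace. The threshold logic and the fixed 33.3 msec frame time are assumptions for illustration.

    def detect_flashes(trace, background, threshold, frame_ms=33.3):
        """Find flash events in a per-region intensity trace sampled at video
        rate; report (start_frame, duration_ms) for each run of frames whose
        intensity exceeds background by more than threshold."""
        events = []
        i = 0
        while i < len(trace):
            if trace[i] - background > threshold:
                j = i
                while j < len(trace) and trace[j] - background > threshold:
                    j += 1
                events.append((i, (j - i) * frame_ms))
                i = j
            else:
                i += 1
        return events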
Ca²⁺ᵢ transients are clustered within the cell: while some regions of the cytoplasm are very active, others appear to be dormant. Rapid photon pulses or bursts are observed around the nucleus before nuclear envelope breakdown (NEB). Discrete, rapid emissions are found localized in the mitotic pole region during mitosis, and low frequency, high amplitude emissions are seen at the site of the contractile ring immediately before and during cytokinesis.
Experimental results indicate that there is a physiological link between a temporally regulated transient elevation in Ca²⁺ᵢ and the control of mitotic events. Thus, it is important to determine if the concentration of Ca²⁺ᵢ increases in association with the events of NEB and the onset of anaphase, during which time the chromosomes are segregated to the daughter cells. Aequorin in one of a number of forms can be a usable reagent for the study of Ca²⁺ᵢ in dividing cells.

When a two-camera system as shown in Fig. 2 is used for studying dividing cells, each camera provides video pixels at a resolution preferably corresponding to a square of 500 nm per side. A resolution of about 365 nm compares favorably with the maximum limit of spatial resolution (ultimately resolution = f_c = (0.5)(wavelength)/NA) attainable at a standard 546 nm working illumination for microscopy. Because brightness decreases with magnification, doubling the magnification of the objective lens decreases brightness by a factor of four (holding NA constant). The microscope preferably uses a Nikon plan apochromatic 20X/0.75 NA objective lens, which has a brightness rating of 14.0625. In some circumstances it is preferable to use a Zeiss plan apochromatic water immersion 40X/0.75 numerical aperture objective lens, which has a brightness rating of 3.516. Optically, the image of a cell is preferably formed with a set of high performance, rectified differential interference contrast (DIC) or polarized light (POL) optics. These lenses enable one to optically section a living cell into very narrow image slices, with minimal interference from out-of-focus images. Images are preferably detected with an SIT camera and a dual microchannel plate intensified Saticon tube camera for indicating the incidence of single photons on a target. Mechanically, the microstepper motor driver controls the stage and focus to allow precise positioning of the vertical focal plane to within a single focal plane step of 31.5 nm (+/- 0.8% over 6 x 10^5 steps), and to within 100 nm in the horizontal specimen plane. A digital microstepper motor driver, controllable through a microcontroller, permits remote operation, a capability that is essential for single photon video observations.
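The resolution and brightness figures quoted above are consistent with the standard relations below. Note that the brightness-rating formula (NA/M)^2 x 10^4 is inferred from the quoted 20X/0.75 value of 14.0625; it is an assumption for illustration, not a definition given in this description.

    def resolution_nm(wavelength_nm, na):
        """Resolution limit used above: f_c = 0.5 * wavelength / NA."""
        return 0.5 * wavelength_nm / na

    def brightness_rating(na, magnification):
        """Transmitted-light brightness index, assumed to be (NA/M)^2 * 1e4."""
        return (na / magnification) ** 2 * 1.0e4

    print(resolution_nm(546, 0.75))       # ~364 nm, matching the ~365 nm figure
    print(brightness_rating(0.75, 20))    # 14.0625 for the 20X/0.75 objective
    print(brightness_rating(0.75, 40))    # ~3.52 for the 40X/0.75 objective
    # Doubling M at constant NA divides the rating by four, as stated above.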
The image processor controls data acquisition, processing, analysis, and support with software residing in the computer, and also controls, directly or indirectly, hardware for the A/D conversion of the signal or sequential, full-frame video frames from each camera sampling spectral bands, and the recording of such images in digital format on devices such as video disk recorders, clustered magnetic hard drive arrays, or other such devices. The system preferably permits digitization of such video signals to 10 bits per byte of data and permits transmission of the image information as a digitized signal over appropriate communications pathways. Such a system also permits the D/A conversion of the image information for playback and transmission. Software drivers and other tools are provided for performing quantitative, real time, matrix-based regional subsampling of the intensity of video images.
Examples of software tools for a personal computer system may preferably include tools for (1) digitizing the image brightness of a 10 x 10 rectilinear sampling grid of one hundred contiguous 5 x 5 pixel subregions; (2) summing the photonic intensity of each of the box subregions plotted against time in a 10 by 10 array; and (3) producing a gray scale rendition of the files from the summing program. The playback rate ranges from one frame per second (33 msec of recording time) to single frame readout of selected frames.
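A compact sketch of tools (1) and (2) above: divide a frame into a 10 x 10 grid of contiguous 5 x 5 pixel boxes (a 50 x 50 pixel region of interest) and sum the intensity per box; repeated per frame, this yields a 10 x 10 array versus time. Function names and the choice of the frame origin for the region of interest are illustrative assumptions.

    import numpy as np

    def grid_intensity(frame, grid=10, box=5):
        """Sum pixel intensity within each box of a grid x grid array of
        contiguous box x box subregions, starting at the frame origin."""
        roi = np.asarray(frame, dtype=float)[:grid * box, :grid * box]
        return roi.reshape(grid, box, grid, box).sum(axis=(1, 3))

    def grid_time_series(frames):
        """Stack per-frame 10x10 sums into a (frames, 10, 10) time series."""
        return np.stack([grid_intensity(f) for f in frames])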
The development of luminescent and fluorescent probes for intracellular chemical events permits the study of cellular regulation at the level of the single cell. Due to the photon emissive properties of individual reporters, as well as the absorption properties of the cells to be studied, one can follow at least two different chemical pathways in a single cell. For instance, Ca²⁺ᵢ is monitored in a cell microinjected with aequorin, while at the same time ATP concentration or alkaline phosphatase activity is monitored in the same cell also microinjected with luciferase and appropriate luciferin analogues. Quantitative imaging of other cellular properties is possible.
Table I is a list of cellular attributes that can currently be imaged:

Table I

Reaction or Process

Bio-luminescence and chemi-luminescence
Intracellular free Ca²⁺ concentration
Alkaline phosphatase activity
Intracellular reduction-oxidation potential
Conventional illumination methods
Microtubule assembly
Nucleus and other cellular organelles
Fluorescent probes

Quantitative Imaging Mode

Aequorin/photon counting
Aequorin/ratio photon counting
Luciferase/photon counting
Luminescence/photon counting
Luminescence/superoxide production/photon counting
Luminescence/adenosine triphosphate (ATP) production/photon counting
Luminescence/translation of messenger RNA yielding peptide and protein production/photon counting
Luminescence/transcription of messenger RNA as a determinant of gene activity/photon counting
Birefringence, POL and DIC (transmission and/or reflective optics)
Specific/selective intracellular ions (e.g., calcium, magnesium, potassium, chloride)
Phospholipase A2
Phospholipase C
Intracellular pH
Voltage potential across membranes
Mitochondrial distribution
Endoplasmic reticulum distribution
Distribution of Golgi bodies
Distribution of nuclei and/or other organelles and structures
Acidic endomembrane organelles
Chromosome replication, condensation, movement
Lipid analogues/fluorescence
Lipid fluidity and metabolism
BCECF/fluorescence
Voltage sensitive fluorescent dye
Rhodamine 123/fluorescence
Di-I dyes/fluorescence
Acridine orange/fluorescence
DNA/vital dye fluorescence
Protein kinases and/or phosphatases
Proteolytic enzyme activity
Cytoskeletal components (e.g., actin, tubulin, myosin, dynein, kinesin, etc.)
Adenosine triphosphate (ATP)
Cyclic adenosine monophosphate (cAMP)
Fluorescent molecules
Fluorescent derivatives of molecules and/or other bodies
Fluorescence energy transfer within molecules and/or other bodies
Fluorescence energy transfer among molecules and/or other bodies
Diffusion or other translocation of ions, molecules and/or other bodies
Heat emitted due to chemical, molecular and/or other activity
Changes in molecular or other level(s) of physical and/or chemical organization and orientation
The SIT or intensified CCD low light level camera captures a DIC image of a cell illuminated with 710 nm light projected through a condenser lens to an objective lens. The photon counting camera sees the 465 nm photons emitted from a Ca²⁺ᵢ-dependent aequorin reaction. A second photon counting camera can be used to see photons emitted from the cleavage of a specific luciferin analog within the 500 nm to 580 nm range, e.g., for ATP or alkaline phosphatase, by the luciferin-luciferase (firefly tail) reaction. A fourth or further sensors can be added if the bandpass of a third reporter (luminescent or fluorescent) is within a trough of the other three illumination modes. The target phosphors routinely used in the photon counting cameras, while optimal between 450 nm and 550 nm, render these cameras "blind" above 650 nm, thereby facilitating the use of 700+ nm light for DIC and POL visual images.

Multiple concurrent images of the cell can be produced, including one showing the whole cell organization; a second showing Ca²⁺ᵢ; and a third showing where another activity is located. The analog output signal of each camera is recorded on a respective VCR, and then is sent to a first computer of the image processor. These images form high resolution tomographic reconstructions in post-acquisition analyses, or low to moderate resolution reconstructions in support of on-line experimentation. Real-time digital mixing of the image records from individual video sensors permits superimposition of the signals due to Ca²⁺ᵢ or other visualized parameters upon structures within the cell, and permits on-line relational analyses of Ca²⁺ᵢ and other intracellular regulatory events in normal and experimentally manipulated single cells. To take full advantage of the quantitative nature of aequorin luminescence, a detailed statistical analysis is applied to the emission patterns of activities. One use of the invention is to determine the spatial and temporal frequencies of Ca²⁺ᵢ for each region of the cell, and to correlate these patterns with particular features within cells (e.g., nuclei, mitotic poles, cleavage furrows). In this way one may move 2D imaging over time from a semi-quantitative or quantitative form to quantitative and relational forms. Pattern analysis algorithms stored by the image processing system are applied to the video image records to discern geometric relationships (i.e., bilateral spatial symmetry, spatial branching, and temporal patterns) within the image sets.
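One way to obtain the per-region temporal frequencies mentioned above is a discrete Fourier transform of each region's photon-count trace. The sketch below assumes the grid time series from the earlier subsampling sketch and a 30 Hz frame rate; it is offered as an illustration, not as the prescribed analysis.

    import numpy as np

    def dominant_frequencies(series, frame_rate_hz=30.0):
        """series: (frames, rows, cols) photon-count time series.
        Returns a (rows, cols) array of the dominant temporal frequency (Hz)
        for each grid region, ignoring the DC component."""
        series = np.asarray(series, dtype=float)
        spectra = np.abs(np.fft.rfft(series - series.mean(axis=0), axis=0))
        freqs = np.fft.rfftfreq(series.shape[0], d=1.0 / frame_rate_hz)
        return freqs[np.argmax(spectra[1:], axis=0) + 1]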
The imaging system generates a set of rich visual imagery at a very high rate. Depending upon the application, the relevant information to be analyzed may be temporal, 3D, or multispectral, and may involve correlating data from several different imaging sensors. A portion of this system involves standard computer vision techniques that have been established. Standard software exists for analysis of multichannel image data (e.g., maximum likelihood classifiers, clustering procedures), temporal image patterns (e.g., Fourier transforms, change detection procedures), and spatial patterns (e.g., 2D Fourier transforms, geographic information systems). These tools can be merged and optimized to create a combined spectral, spatial, and temporal pattern analysis procedure. Such programs may be run on clustered workstations or supercomputers.
In each half of a dividing cell, for example, the edge of each object, visualized in one spectral band different from that used for calcium (the calcium-dependent emission exhibited by aequorin-based luminescence, as visualized in the aequorin luminescence spectral band), may be superimposed onto pairs of Cartesian coordinate planes using the mitotic poles as the origin. The X-axis is described by the pole-to-pole axis, the Y-axis is projected through that pole and is perpendicular to the X-axis of that cell, and the Z-axis is parallel to the optical axis of the imaging system, the objective lens. The edges of some structures to be imaged will not always remain confined to the optical plane of each observation. In such cases, the system tracks particles as trajectories following an initial curve to allow one to extrapolate across discontinuous portions of the particle's path due to optical-mechanical distortions in a cell and meandering within and outside the depth of field of an image plane (e.g., 1.2 micron thick) used for initial image acquisition. Such "filled-in" trajectories will then be recomputed to provide the best possible fit for the edge. By using appropriate tracking and multidimensional curve and surface fitting software on a cluster of workstations or a high performance computational engine, an equation that best fits the curve form through iterative curve fit algorithms can be determined. Such computational analysis of image data is used to solve for the surface topography of isosurfaces depicting cellular or chemical activities and object spatial deformations and displacements over time.
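The trajectory "filling-in" can be illustrated with an ordinary least-squares polynomial fit to the observed portion of a particle path, evaluated across the missing frames. The description refers generally to iterative curve and surface fitting software, so the polynomial model and degree here are only assumptions.

    import numpy as np

    def fill_trajectory(times, positions, missing_times, degree=3):
        """Fit x(t) and y(t) polynomials to observed (t, (x, y)) samples and
        evaluate them at the missing time points to bridge gaps in the path."""
        times = np.asarray(times, dtype=float)
        positions = np.asarray(positions, dtype=float)   # shape (n, 2)
        fits = [np.polyfit(times, positions[:, k], degree) for k in range(2)]
        return np.column_stack([np.polyval(f, missing_times) for f in fits])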
A variety of UNIX-based X-window display environments can be used for 2D and 3D data plots and other computational visualizations such as simple X-Y plots, histograms, surfaces, and volume renderings. Shaded polygon images are possible with a graphics environment and graphics engine, preferably with hardware residing on one computer and software on another. Wavefront software on the computers produces credible images, and is currently used to make videotape for both local and remote users. Still images can also be displayed in an X-windows environment. A form of X-movie displays Wavefront frames on a workstation screen fast enough to preview animation and can be sent to remote users by fast mail. A volume rendering program can be used to convert 3D data sets to surfaces. The resulting images can be displayed in X-windows or another suitable format, or saved as a compressed raster for later display in an X-window or for transmission to remote workstations.
With sufficient network bandwidth, the intermediate videotape step can be omitted and animated sequences can be constructed directly on an intermediate video rate memory device, such as a video disk recorder, optical memory disk recorder, or RAID memory disk array, and then displayed on the screen of the user's workstation.

One benefit of the system is the development of a 3D volume rendered virtual mitotic cell (including intracellular organelles and other compartments) constructed from real time tomographic data and multifaceted in vivo and in vitro experimental data. This virtual mitotic cell will, in turn, be subject to computational experimental manipulations of a wide spectrum of cell physiological and biochemical (molecular and ionic) parameters associated with mitosis and the cell cycle.

Large intracellular structures limit the diffusion of particles such as Ca²⁺ throughout the cell. These structures can affect interaction with other intracellular reactants such as Ca²⁺-dependent enzymes. If those reactants are transiently bound to intracellular membranes, the chemical kinetics of the reaction can be very complex. The complex 3D geometry of the reaction space makes this kinetic problem a prime candidate for the use of Monte Carlo modeling techniques, including discrete Monte Carlo modeling techniques. The Monte Carlo method for solving chemical kinetics problems has a distinct advantage in that 3D diffusion of reactants in a space with complex boundaries is easily handled. In contrast, simultaneous partial differential equation methods for problem solving, which are preferred for 1D or 2D problems having simple geometric symmetry, become extremely cumbersome and inefficient as the number of dimensions and the complexity of the bounding surfaces increase. The data sets and number of simultaneous parameters for this problem are so large that only with the use of a supercomputer or clustered high computational powered workstations will we be able to arrive at a solution in a timely fashion.
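A bare-bones example of the Monte Carlo approach favored above: random-walk diffusion of Ca2+-like particles inside a spherical boundary with reflection at the membrane. Realistic simulations of this kind would also track binding, buffering, and complex organelle geometry; the sphere, step size, and particle count here are illustrative assumptions only.

    import numpy as np

    def diffuse_in_sphere(n_particles=1000, n_steps=500, step=0.02, radius=1.0, seed=0):
        """Random-walk diffusion inside a reflecting sphere; returns final
        particle positions (n_particles, 3) in units of the sphere radius."""
        rng = np.random.default_rng(seed)
        pos = np.zeros((n_particles, 3))
        for _ in range(n_steps):
            pos += rng.normal(scale=step, size=pos.shape)
            r = np.linalg.norm(pos, axis=1)
            outside = r > radius
            # Reflect escapers back inside by folding them about the boundary.
            pos[outside] *= ((2.0 * radius - r[outside]) / r[outside])[:, None]
        return pos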
Predicting cellular and intracellular behavior from various conceptual models thus is greatly facilitated, and the presence or absence of important unifying precepts can be clarified. Candidate models are run multiple times, and a mean model solution is derived.

One application of particular importance for quantal emission photon imaging is analysis of photon scatter and luminescence image noise. One application of this effort is to determine the source of the higher number of photons distributed throughout the image field, yet outside the boundaries of the labeled cell. These are the result of internal system scattering, that is, a deflection from particles within the cell and from the various surfaces within the entire cell and the imaging systems. Monte Carlo simulations, similar to those developed to model regulation of acetylcholine receptors and ion channels, can help establish a model that describes the ultimate source of these scattered photons.
One imaging system of the present invention can be used to determine the properties of Ca²⁺ᵢ release, buffering, and re-uptake for various subregions within a living cell. This determination is made with frame-by-frame frequency and amplitude analysis of photon emissions during each experiment. This is a computationally intensive and voluminous extension of the approaches for data gathering, and relies on the supercomputer and available graphical hardware and software.
One may also perform a number of evaluations, including Fourier analysis of integrated patterns, and chi-squared analysis of each subsampled region within an injected and control cell. In addition, imaging systems can be combined to probe the effects of various agonists and antagonists of various metabolic and structural or other activities, and of Ca²⁺ buffers, on the levels and patterns of Ca²⁺ᵢ, lipid-derived second messengers, protein kinases and phosphatases, proteolytic enzymes, endonucleases, various mechanisms for post-transcriptional and post-translational modifications, and other reactants. Results from these modeling experiences can be compared with in vivo and in vitro probes of the Ca²⁺ᵢ regulatory system to model the regulation of mitotic processes, and to evaluate various models.

The system performs computational comparisons of image information from experiments with living cells and experiments performed on computational models of the system under study, with modeling systems operating on computers and workstations such as the object oriented DEEM environment developed by the Argonne National Laboratory. This system compares output from living cells and computational simulations and indicates likely areas for improvement of the accuracy and precision of the computational model, as well as indicating experimental approaches for studies of living cells. This approach is scalable to specimens other than living cells that may be studied with this system. This approach also provides for establishment of a dialog with the specimen being studied, with the potential outcomes including imposing an observer-deemed manipulation and/or regulation upon the specimen under observation. In such an embodiment, the system would utilize the information obtained through the capture, processing, and analysis of image information, including spatial and temporal patterns of chemical, physical, and other reactions and transformations associated with specific events inherent to the specimen, and utilize that information to direct the regulation of the process occurring, about to occur, or capable of occurring in the specimen. Such an action would take advantage of all elements of the system as described and other components as may be necessary.
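The chi-squared evaluation mentioned above can be sketched as a per-region comparison of photon counts in an aequorin-injected cell against the corresponding regions of a control cell, with the control taken as the expected values. Both the binning and the use of the control as the expectation are assumptions made for illustration.

    import numpy as np

    def chi_squared_map(injected_counts, control_counts, floor=1.0):
        """Per-region chi-squared statistic, (observed - expected)^2 / expected,
        with the control cell's counts taken as the expected values."""
        observed = np.asarray(injected_counts, dtype=float)
        expected = np.maximum(np.asarray(control_counts, dtype=float), floor)
        return (observed - expected) ** 2 / expected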
An example of such an application would be the use of spatially and temporally regulated generation of intracellular calcium signals, such as those detected with aequorin in dividing cells, wherein the localized elevations of intracellular calcium concentration are achieved by flash photolysis of caged calcium or other caged compounds that would elicit the increase in localized intracellular calcium, or by other natural or designed methods. Such an application can be readily scaled upward to include treatment of tissues, organs, organ systems, and entire people, and can be scaled downward to range from single cells to regions of cells and components of cells, including organelles, molecules, and groups of molecules.
While the invention has been described for use with a microscope, it can be used on a long-range basis for tasks such as assessment of crops, or environmental and industrial activity. These activities can be analyzed by obtaining images using two or more different spectral bands, and comparing the data contained in each of the images to develop information relevant to the subject under observation. Such images can be obtained from imaging devices located in aircraft, balloons, spacecraft or satellites, and can include the health of a particular crop, cloud formation and other meteorological information that portend a storm or other condition, the state of disease of a forest, the chemical composition of smoke emitted from a factory, or the growth state of plants located downstream of a site of effluent discharge into a body of water.
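For the long-range, multi-band use just described, one common way to compare two spectral bands is a normalized difference index computed pixel by pixel. The description does not name a specific index, so this is offered only as a generic illustration of comparing data contained in two bands.

    import numpy as np

    def normalized_difference(band_a, band_b, eps=1e-9):
        """Pixelwise (A - B) / (A + B) for two co-registered spectral band
        images; values near +1 or -1 indicate a strong difference between bands."""
        a = np.asarray(band_a, dtype=float)
        b = np.asarray(band_b, dtype=float)
        return (a - b) / (a + b + eps)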
The system can also be used in the field of endoscopy for assessment of the metabolic state of a patient's tissue or organ by measuring photonic spectral properties of that tissue or organ. Specific metabolic parameters of a tissue can be determined directly or indirectly as a function of the absorption of light of particular wavelengths and spectral bands. Similarly, various optically active reporter reagents can be used in conjunction with simple absorption methods to assess other metabolic parameters. In situations of pressing need, such as emergency medical treatment or laparoscopic surgery, an imaging system according to the present invention can provide real-time assessments while visualizing the tissue in question, even during the performance of a surgical procedure.
Having described embodiments of and examples of applications for the present invention, it should be apparent that modifications can be made without departing from the scope of the appended claims. Other applications of the system may include on-line computational steering of biomedical, chemical, and physics experiments, machine directed ophthalmic laser surgery, intrauterine fetal surgery, telesurgery, distant evaluation of tissue and sample studies for diagnostic pathology and other practices of telemedicine, evaluation of toxic conditions in an environment, the use of medical imaging technologies such as MRI, PET, and CAT scans for new forms of non-invasive surgical procedures, high energy particle physics, improved real time machine vision for robotics and air traffic control, and inspection, fabrication, and modification of 2D and 3D integrated circuits. The ability to observe cells and tissues can allow processes to be observed and recorded from living bodies, including, for example, effects of toxins or radiation on tissues or cells. The observation of such cells is not limited to animal cells, but can also apply to plant cells, and to other entities.

What is claimed is:

Claims
1. A method for analyzing a specimen during an activity, the method comprising the steps of:
(a) recording visual images of the specimen to provide a spatial and temporal recording of the specimen during the activity;
(b) recording photon images representing emissions of a first photon from the specimen during the activity; and
(c) displaying simultaneously the visual images and the photon images.
2. The method of claim 1, wherein step (c) includes displaying the visual images and photon images adjacent one another.
3. The method of claim 1, wherein step (c) includes displaying the visual images and the photon images in a superimposed manner.
4. The method of claim 1, wherein steps (a) and (b) include recording the images as signals, the method further comprising a step of processing the signals recorded in steps (a) and (b) to determine correlations between the visual images and the photon images.
5. The method of claim 1, further comprising a step, performed simultaneously with steps (a) and (b), of recording space and time varying signals representing emissions of a second photon from the specimen during the activity.
6. The method of claim 1, further comprising a step, performed simultaneously with steps (a) and (b), of recording space and time varying signals representative of thermal emissions from the specimen during the activity.
7. The method of claim 1, wherein the recorded photon images represent calcium ions.
8. The method of claim 1, wherein steps (a), (b), and (c) are performed in real time.
9. The method of claim 1, further comprising the step of filtering recorded data to visually display selected data.
10. The method of claim 1, further comprising a step of storing the recorded images in a database.
11. The method of claim 10, further comprising a step of comparing an image received in real-time to images stored in the database to determine if one or more of the previously stored images has a correlation with the currently received image.
12. An apparatus for capturing images of a specimen during an activity of a specimen comprising: a first receiver that receives visual images of the specimen during the activity and provides a spatial and temporal recording of the specimen during the activity; a second receiver that receives space and time varying signals representing emissions of predetermined photons from the specimen during the activity simultaneously with the first receiver; and a display that displays the visual images and the signals representative of emissions of predetermined photons at the same time.
13. The apparatus of claim 12, wherein the display displays the visual images and the photon images in a superimposed manner.
14. The apparatus of claim 12, wherein the display displays the visual images and the photon images adjacent one another.
15. The apparatus of claim 12, further comprising a microscope and a beam splitter for receiving images from the microscope, the beam splitter providing first and second signals to the respective first and second recorders.
16. The apparatus of claim 15, further comprising a processor that correlates the visual images and the photon emission signals.
17. The apparatus of claim 12, further comprising a third receiver that simultaneously receives and records space and time varying signals representative of emissions of a second predetermined photon from the specimen during the activity.
18. The apparatus of claim 12, further comprising a thermal detector receiving and recording space and time varying signals representative of thermal emissions from the specimen during the activity, the thermal detector receiving at the same time as the first and second recorders.
19. The apparatus of claim 12, wherein the photon signals are photons of calcium ions.
20. The apparatus of claim 12, further comprising means for visually coding the emission signals to represent their elapsed time from initial emission.
21. The apparatus of claim 12, further comprising means for filtering recorded data to visually display selected data.
22. The invention of claim 12, further comprising a storage device that stores previously received images, and a processor for comparing the stored previously received images with images received in real time.
23. The invention of claim 12, wherein the first recorder includes a video camera and a video cassette recorder.
24. The invention of claim 12, wherein the second recorder includes a photon counting camera.
25. An apparatus comprising: a microscope for imaging a cellular activity; a beam splitter for dividing the image taken by the microscope between two outputs; a video camera for receiving one output from the beam splitter; a photon counting camera for receiving another output from the beam splitter; a first video recorder for recording visual images received by the first video camera; and a second video recorder for recording images from the photon counting camera.
26. The apparatus of claim 25, further comprising a time code synchronizer that synchronizes the first and second video recorders with each other.
27. The apparatus of claim 25, further comprising an image processing system for comparing the images from the video camera and the images from the photon camera.
28. The apparatus of claim 27, wherein the comparison of the outputs of the two cameras forms a composite video recording of the outputs of the two cameras superimposed on one another.
PCT/US1997/003467 1996-03-18 1997-03-17 Analytical imaging system and process WO1997034995A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU23183/97A AU2318397A (en) 1996-03-18 1997-03-17 Analytical imaging system and process

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US08/618,246 US6078681A (en) 1996-03-18 1996-03-18 Analytical imaging system and process
US08/618,246 1996-03-18

Publications (1)

Publication Number Publication Date
WO1997034995A1 true WO1997034995A1 (en) 1997-09-25

Family

ID=24476922

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US1997/003467 WO1997034995A1 (en) 1996-03-18 1997-03-17 Analytical imaging system and process

Country Status (3)

Country Link
US (1) US6078681A (en)
AU (1) AU2318397A (en)
WO (1) WO1997034995A1 (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1999013360A2 (en) * 1997-09-10 1999-03-18 Bellsouth Intellectual Property Corporation Digital telepathology imaging system with bandwidth optimization and virtual focussing
EP1803806A2 (en) * 2005-12-28 2007-07-04 Fujitsu Limited Injection apparatus and injection method
RU2692825C2 (en) * 2017-10-23 2019-06-28 Ирлан Витальевич Шабельников Method of spectral laser scanning of composite materials in accordance with optical density of its matrix and composite components

Families Citing this family (127)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6272235B1 (en) 1997-03-03 2001-08-07 Bacus Research Laboratories, Inc. Method and apparatus for creating a virtual microscope slide
US6404906B2 (en) 1997-03-03 2002-06-11 Bacus Research Laboratories,Inc. Method and apparatus for acquiring and reconstructing magnified specimen images from a computer-controlled microscope
US5836877A (en) * 1997-02-24 1998-11-17 Lucid Inc System for facilitating pathological examination of a lesion in tissue
US20040083085A1 (en) * 1998-06-01 2004-04-29 Zeineh Jack A. Integrated virtual slide and live microscope system
US6606413B1 (en) * 1998-06-01 2003-08-12 Trestle Acquisition Corp. Compression packaged image transmission for telemicroscopy
US6310619B1 (en) * 1998-11-10 2001-10-30 Robert W. Rice Virtual reality, tissue-specific body model having user-variable tissue-specific attributes and a system and method for implementing the same
US20030228565A1 (en) * 2000-04-26 2003-12-11 Cytokinetics, Inc. Method and apparatus for predictive cellular bioinformatics
US6876760B1 (en) * 2000-12-04 2005-04-05 Cytokinetics, Inc. Classifying cells based on information contained in cell images
US7151847B2 (en) * 2001-02-20 2006-12-19 Cytokinetics, Inc. Image analysis of the golgi complex
US7088391B2 (en) * 1999-09-01 2006-08-08 Florida Atlantic University Color video camera for film origination with color sensor and luminance sensor
US6558623B1 (en) * 2000-07-06 2003-05-06 Robodesign International, Inc. Microarray dispensing with real-time verification and inspection
WO2001074440A2 (en) * 2000-03-21 2001-10-11 Bechtel Bwxt Idaho, Llc Methods and computer readable medium for improved radiotherapy dosimetry planning
US7518652B2 (en) * 2000-05-03 2009-04-14 Aperio Technologies, Inc. Method and apparatus for pre-focus in a linear array based slide scanner
US7668362B2 (en) 2000-05-03 2010-02-23 Aperio Technologies, Inc. System and method for assessing virtual slide image quality
US6711283B1 (en) 2000-05-03 2004-03-23 Aperio Technologies, Inc. Fully automatic rapid microscope slide scanner
US7738688B2 (en) * 2000-05-03 2010-06-15 Aperio Technologies, Inc. System and method for viewing virtual slides
US6597799B1 (en) * 2000-06-19 2003-07-22 Scientech, Inc. Optical digital environment compliance system
US6656683B1 (en) 2000-07-05 2003-12-02 Board Of Regents, The University Of Texas System Laser scanning cytology with digital image capture
US7025933B2 (en) * 2000-07-06 2006-04-11 Robodesign International, Inc. Microarray dispensing with real-time verification and inspection
WO2002009483A1 (en) * 2000-07-26 2002-01-31 The Regents Of The University Of California Manipulation of live cells and inorganic objects with optical micro beam arrays
US7194118B1 (en) 2000-11-10 2007-03-20 Lucid, Inc. System for optically sectioning and mapping surgically excised tissue
US6833542B2 (en) * 2000-11-13 2004-12-21 Genoptix, Inc. Method for sorting particles
US20020123112A1 (en) * 2000-11-13 2002-09-05 Genoptix Methods for increasing detection sensitivity in optical dielectric sorting systems
US6936811B2 (en) * 2000-11-13 2005-08-30 Genoptix, Inc. Method for separating micro-particles
US20030007894A1 (en) * 2001-04-27 2003-01-09 Genoptix Methods and apparatus for use of optical forces for identification, characterization and/or sorting of particles
US20020160470A1 (en) * 2000-11-13 2002-10-31 Genoptix Methods and apparatus for generating and utilizing linear moving optical gradients
US6744038B2 (en) 2000-11-13 2004-06-01 Genoptix, Inc. Methods of separating particles using an optical gradient
US6784420B2 (en) * 2000-11-13 2004-08-31 Genoptix, Inc. Method of separating particles using an optical gradient
US6778724B2 (en) * 2000-11-28 2004-08-17 The Regents Of The University Of California Optical switching and sorting of biological samples and microparticles transported in a micro-fluidic device, including integrated bio-chip devices
US7171030B2 (en) * 2000-11-30 2007-01-30 University Of Medicine & Denistry Of New Jersey Systems for analyzing microtissue arrays
US7027633B2 (en) * 2000-11-30 2006-04-11 Foran David J Collaborative diagnostic systems
US7079673B2 (en) * 2002-02-05 2006-07-18 University Of Medicine & Denistry Of Nj Systems for analyzing microtissue arrays
US7218764B2 (en) * 2000-12-04 2007-05-15 Cytokinetics, Inc. Ploidy classification method
CA2434174A1 (en) * 2000-12-15 2002-06-20 Omnicorder Technologies, Inc. Method and apparatus for measuring physiology by means of infrared detector
US6466690C1 (en) * 2000-12-19 2008-11-18 Bacus Res Lab Inc Method and apparatus for processing an image of a tissue sample microarray
US7016787B2 (en) * 2001-02-20 2006-03-21 Cytokinetics, Inc. Characterizing biological stimuli by response curves
US6956961B2 (en) * 2001-02-20 2005-10-18 Cytokinetics, Inc. Extracting shape information contained in cell images
JP2002263063A (en) * 2001-03-12 2002-09-17 Asahi Optical Co Ltd Endoscope system
US6526363B2 (en) * 2001-03-16 2003-02-25 Progeny Systems, Llc System and method for calibration and verification of a sample analysis instrument
JP2002287997A (en) * 2001-03-23 2002-10-04 Kinji Mori Multiple system processing method
US7907765B2 (en) 2001-03-28 2011-03-15 University Of Washington Focal plane tracking for optical microtomography
US6989900B1 (en) * 2001-04-02 2006-01-24 Advanced Micro Devices, Inc. Method of measuring implant profiles using scatterometric techniques
US20040009540A1 (en) * 2001-04-27 2004-01-15 Genoptix, Inc Detection and evaluation of cancer cells using optophoretic analysis
US20030194755A1 (en) * 2001-04-27 2003-10-16 Genoptix, Inc. Early detection of apoptotic events and apoptosis using optophoretic analysis
EP1273928A1 (en) * 2001-07-06 2003-01-08 Leica Geosystems AG Method and device for suppressing electromagnetic background radiation in an image
US20050143791A1 (en) * 2001-07-09 2005-06-30 Stuart Hameroff Process of treating a cell
US20030008287A1 (en) * 2001-07-09 2003-01-09 Sarah Black Phenotypic correlation process
US6636623B2 (en) * 2001-08-10 2003-10-21 Visiongate, Inc. Optical projection imaging system and method for automatically detecting cells with molecular marker compartmentalization associated with malignancy and disease
US6880387B2 (en) * 2001-08-22 2005-04-19 Sonoscan, Inc. Acoustic micro imaging method providing improved information derivation and visualization
US20040071328A1 (en) * 2001-09-07 2004-04-15 Vaisberg Eugeni A. Classifying cells based on information contained in cell images
DE10206979A1 (en) * 2002-02-20 2003-08-21 Leica Microsystems Method for user training for a scanning microscope, scanning microscope and software for user training for a scanning microscope
US7596249B2 (en) * 2002-02-22 2009-09-29 Olympus America Inc. Focusable virtual microscopy apparatus and method
US7260253B2 (en) 2002-04-19 2007-08-21 Visiongate, Inc. Method for correction of relative object-detector motion between successive views
US7738945B2 (en) 2002-04-19 2010-06-15 University Of Washington Method and apparatus for pseudo-projection formation for optical tomography
US20040033539A1 (en) * 2002-05-01 2004-02-19 Genoptix, Inc Method of using optical interrogation to determine a biological property of a cell or population of cells
US20030211461A1 (en) * 2002-05-01 2003-11-13 Genoptix, Inc Optophoretic detection of drugs exhibiting inhibitory effect on Bcr-Abl positive tumor cells
US6891363B2 (en) 2002-09-03 2005-05-10 Credence Systems Corporation Apparatus and method for detecting photon emissions from transistors
US6943572B2 (en) * 2002-09-03 2005-09-13 Credence Systems Corporation Apparatus and method for detecting photon emissions from transistors
US20040053209A1 (en) * 2002-09-12 2004-03-18 Genoptix, Inc Detection and evaluation of topoisomerase inhibitors using optophoretic analysis
US20040067167A1 (en) * 2002-10-08 2004-04-08 Genoptix, Inc. Methods and apparatus for optophoretic diagnosis of cells and particles
US7606403B2 (en) * 2002-10-17 2009-10-20 Intel Corporation Model-based fusion of scanning probe microscopic images for detection and identification of molecular structures
US6996264B2 (en) * 2002-10-18 2006-02-07 Leco Corporation Indentation hardness test system
US20040121474A1 (en) * 2002-12-19 2004-06-24 Genoptix, Inc Detection and evaluation of chemically-mediated and ligand-mediated t-cell activation using optophoretic analysis
US20040121307A1 (en) * 2002-12-19 2004-06-24 Genoptix, Inc Early detection of cellular differentiation using optophoresis
US7257268B2 (en) * 2003-02-28 2007-08-14 Aperio Technologies, Inc. Systems and methods for image pattern recognition
US7116440B2 (en) 2003-02-28 2006-10-03 Aperio Technologies, Inc. Image processing and analysis framework
US20060226374A1 (en) * 2003-08-06 2006-10-12 Gnothis Holding S.A. Method and device for identifying luminescent molecules according to the fluorescence correlation spectroscopy method
US7745221B2 (en) 2003-08-28 2010-06-29 Celula, Inc. Methods and apparatus for sorting cells using an optical switch in a microfluidic channel network
DE10349649B3 (en) * 2003-10-17 2005-05-19 Karl Storz Gmbh & Co. Kg A method and apparatus for generating an annotated image in a sterile work area of a medical facility
US20050273271A1 (en) * 2004-04-05 2005-12-08 Aibing Rao Method of characterizing cell shape
JP5134365B2 (en) * 2004-05-27 2013-01-30 Aperio Technologies, Inc. System and method for generating and visualizing a three-dimensional virtual slide
DE102004029552A1 (en) * 2004-06-18 2006-01-05 Peter Mäckel Method for visualizing and measuring oscillations of oscillating objects by means of a combination of a synchronized, stroboscopic image recording with image correlation method
US20070031818A1 (en) * 2004-07-15 2007-02-08 Cytokinetics, Inc., A Delaware Corporation Assay for distinguishing live and dead cells
US7323318B2 (en) * 2004-07-15 2008-01-29 Cytokinetics, Inc. Assay for distinguishing live and dead cells
US7792338B2 (en) * 2004-08-16 2010-09-07 Olympus America Inc. Method and apparatus of mechanical stage positioning in virtual microscopy image capture
US20060101072A1 (en) * 2004-10-21 2006-05-11 International Business Machines Corporation System and method for interpreting scan data
WO2006076432A2 (en) * 2005-01-11 2006-07-20 University Of Central Florida Interactive multiple gene expression map system
US8774560B2 (en) * 2005-01-11 2014-07-08 University Of Central Florida Research Foundation, Inc. System for manipulation, modification and editing of images via remote device
US7804981B2 (en) * 2005-01-13 2010-09-28 Sensis Corporation Method and system for tracking position of an object using imaging and non-imaging surveillance devices
JP5336088B2 (en) 2005-01-27 2013-11-06 Aperio Technologies, Inc. System and method for visualizing a three-dimensional virtual slide
US20060204066A1 (en) * 2005-03-14 2006-09-14 Fujifilm Electronic Imaging Ltd. Monitoring image inspection
CA2803828C (en) * 2005-03-31 2015-11-24 Alcon, Inc. Footswitch operable to control a surgical system
CA2604829C (en) * 2005-04-04 2018-05-15 Hypermed, Inc. Hyperspectral imaging in diabetes and peripheral vascular disease
WO2006111965A2 (en) * 2005-04-20 2006-10-26 Visionsense Ltd. System and method for producing an augmented image of an organ of a patient
US8164622B2 (en) * 2005-07-01 2012-04-24 Aperio Technologies, Inc. System and method for single optical axis multi-detector microscope slide scanner
EP1967886A4 (en) * 2005-12-27 2011-01-05 Olympus Corp Device and method for capturing image of a sample originating from organism
US7864996B2 (en) * 2006-02-17 2011-01-04 Lucid, Inc. System for macroscopic and confocal imaging of tissue
US8010555B2 (en) 2006-06-30 2011-08-30 Aperio Technologies, Inc. System and method for managing images over a network
JP5484048B2 (en) 2006-06-30 2014-05-07 Aperio Technologies, Inc. Large image storage and retrieval method via DICOM
US7932034B2 (en) 2006-12-20 2011-04-26 The Board Of Trustees Of The Leland Stanford Junior University Heat and pH measurement for sequencing of DNA
US8465473B2 (en) * 2007-03-28 2013-06-18 Novartis Ag Surgical footswitch with movable shroud
TW200842403A (en) * 2007-04-18 2008-11-01 Ming-Yan Lin Cognition method of plural points in visual space
JP5389016B2 (en) 2007-05-04 2014-01-15 Aperio Technologies, Inc. System and method for quality assurance in pathology
US7835561B2 (en) 2007-05-18 2010-11-16 Visiongate, Inc. Method for image processing and reconstruction of images for optical tomography
US7981109B2 (en) 2007-08-15 2011-07-19 Novartis Ag System and method for a user interface
US7787112B2 (en) * 2007-10-22 2010-08-31 Visiongate, Inc. Depth of field extension for optical tomography
JP2009189654A (en) * 2008-02-15 2009-08-27 Olympus Medical Systems Corp Signal processing system
US8143600B2 (en) 2008-02-18 2012-03-27 Visiongate, Inc. 3D imaging of live cells with ultraviolet radiation
US8090183B2 (en) * 2009-03-12 2012-01-03 Visiongate, Inc. Pattern noise correction for pseudo projections
EP2304555A2 (en) * 2008-06-05 2011-04-06 Alcon Research, Ltd. Wireless network and methods of wireless communication for ophthalmic surgical consoles
JP5643210B2 (en) 2008-10-24 2014-12-17 Aperio Technologies, Inc. Whole slide fluorescent scanner
US8254023B2 (en) * 2009-02-23 2012-08-28 Visiongate, Inc. Optical tomography system with high-speed scanner
EP2406679B1 (en) 2009-03-11 2017-01-25 Sakura Finetek U.S.A., Inc. Autofocus method and autofocus device
US8155420B2 (en) * 2009-05-21 2012-04-10 Visiongate, Inc System and method for detecting poor quality in 3D reconstructions
JP5385028B2 (en) * 2009-07-06 2014-01-08 Olympus Corporation Microscope apparatus
WO2011072211A2 (en) 2009-12-11 2011-06-16 Aperio Technologies, Inc. Improved signal to noise ratio in digital pathology image analysis
TWI522085B (en) 2010-04-14 2016-02-21 Alcon Research, Ltd. Display for ophthalmic surgical console with user-selectable sectors
BR112012029152A8 (en) 2010-05-18 2018-02-06 Koninklijke Philips Electronics Nv Autofocus image formation system for a microscope, method for autofocus image formation of a microscope, computer readable media and program element for autofocus image formation of a microscope
US10139613B2 (en) 2010-08-20 2018-11-27 Sakura Finetek U.S.A., Inc. Digital microscope and method of sensing an image of a tissue sample
JP5738580B2 (en) 2010-12-03 2015-06-24 Olympus Corporation Microscope apparatus and observation method
US9069175B2 (en) * 2011-04-08 2015-06-30 Kairos Instruments, Llc Adaptive phase contrast microscope
US10620118B2 (en) * 2012-02-27 2020-04-14 Steris Instrument Management Services, Inc. Systems and methods for identifying optical materials
US9632168B2 (en) 2012-06-19 2017-04-25 Lockheed Martin Corporation Visual disruption system, method, and computer program product
US9714815B2 (en) 2012-06-19 2017-07-25 Lockheed Martin Corporation Visual disruption network and system, method, and computer program product thereof
US9185161B2 (en) * 2012-12-31 2015-11-10 General Electric Company Systems and methods for synchronizing non-destructive testing devices
US8854361B1 (en) * 2013-03-13 2014-10-07 Cambridgesoft Corporation Visually augmenting a graphical rendering of a chemical structure representation or biological sequence representation with multi-dimensional information
US9146251B2 (en) 2013-03-14 2015-09-29 Lockheed Martin Corporation System, method, and computer program product for indicating hostile fire
US9103628B1 (en) * 2013-03-14 2015-08-11 Lockheed Martin Corporation System, method, and computer program product for hostile fire strike indication
US9196041B2 (en) 2013-03-14 2015-11-24 Lockheed Martin Corporation System, method, and computer program product for indicating hostile fire
DE102013103971A1 (en) 2013-04-19 2014-11-06 Sensovation Ag Method for generating an overall picture of an object composed of several partial images
JP6174908B2 (en) * 2013-05-27 2017-08-02 Canon Inc. Information processing apparatus, information processing method, and computer program
US10007102B2 (en) 2013-12-23 2018-06-26 Sakura Finetek U.S.A., Inc. Microscope with slide clamping assembly
US11069054B2 (en) 2015-12-30 2021-07-20 Visiongate, Inc. System and method for automated detection and monitoring of dysplasia and administration of immunotherapy and chemotherapy
US11280803B2 (en) 2016-11-22 2022-03-22 Sakura Finetek U.S.A., Inc. Slide management system
EP3615916A4 (en) * 2017-04-24 2020-12-30 Huron Technologies International Inc. Scanning microscope for 3D imaging using MSIA
WO2019165480A1 (en) 2018-02-26 2019-08-29 Caliber Imaging & Diagnostics, Inc. System and method for macroscopic and microscopic imaging ex-vivo tissue
JPWO2021060358A1 (en) * 2019-09-26 2021-04-01

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0270251A2 (en) * 1986-11-06 1988-06-08 AMERSHAM INTERNATIONAL plc Imaging method and apparatus
US4755874A (en) * 1987-08-31 1988-07-05 Kla Instruments Corporation Emission microscopy system
EP0404568A2 (en) * 1989-06-22 1990-12-27 Hamamatsu Photonics K.K. Image processing apparatus
US5332905A (en) * 1992-08-26 1994-07-26 Atto Instruments, Inc. Apparatus and method for multiple emission ratio photometry and multiple emission ratio imaging

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4744667A (en) * 1986-02-11 1988-05-17 University Of Massachusetts Microspectrofluorimeter
US5369496A (en) * 1989-11-13 1994-11-29 Research Foundation Of City College Of New York Noninvasive method and apparatus for characterizing biological materials
US5325449A (en) * 1992-05-15 1994-06-28 David Sarnoff Research Center, Inc. Method for fusing images and apparatus therefor
JP2575270B2 (en) * 1992-11-10 1997-01-22 Hamamatsu Photonics K.K. Method for determining base sequence of nucleic acid, method for detecting single molecule, apparatus therefor and method for preparing sample

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1999013360A2 (en) * 1997-09-10 1999-03-18 Bellsouth Intellectual Property Corporation Digital telepathology imaging system with bandwidth optimization and virtual focussing
WO1999013360A3 (en) * 1997-09-10 1999-09-02 Bellsouth Intellectual Property Corporation Digital telepathology imaging system with bandwidth optimization and virtual focussing
EP1803806A2 (en) * 2005-12-28 2007-07-04 Fujitsu Limited Injection apparatus and injection method
JP2007175026A (en) * 2005-12-28 2007-07-12 Fujitsu Ltd Injection apparatus and injection method
EP1803806A3 (en) * 2005-12-28 2008-08-13 Fujitsu Limited Injection apparatus and injection method
RU2692825C2 (en) * 2017-10-23 2019-06-28 Ирлан Витальевич Шабельников Method of spectral laser scanning of composite materials in accordance with optical density of its matrix and composite components

Also Published As

Publication number Publication date
US6078681A (en) 2000-06-20
AU2318397A (en) 1997-10-10

Similar Documents

Publication Publication Date Title
US6078681A (en) Analytical imaging system and process
JP7159216B2 (en) Imaging signal extractor and method of using same
EP3776458B1 (en) Augmented reality microscope for pathology with overlay of quantitative biomarker data
US10552956B2 (en) Reconstruction method of biological tissue image, apparatus therefor, and image display apparatus using the biological tissue image
CN102266219B (en) Spectral imaging of deep tissue
US9002077B2 (en) Visualization of stained samples
US20090091566A1 (en) System and methods for thick specimen imaging using a microscope based tissue sectioning device
EP1892517A2 (en) Fluorescent nanoscopy method
US9230319B2 (en) Method of reconstructing a biological tissue image, and method and apparatus for acquiring a biological tissue image
Morimoto et al. Spatial readout of visual looming in the central brain of Drosophila
JP6860064B2 (en) Cell observation device
US9874437B2 (en) Method for the 3-dimensional measurement of a sample with a measuring system comprising a laser scanning microscope and such measuring system
JP2018078880A (en) Image generation device, image generation method, and program
Parot et al. Compressed hadamard microscopy for high-speed optically sectioned neuronal activity recordings
Thomas et al. Four-dimensional imaging: the exploration of space and time
Perrin et al. EyeTrackUAV2: A large-scale binocular eye-tracking dataset for UAV videos
Tian et al. Blood cell analysis: from traditional methods to super-resolution microscopy
Zeile et al. Combining biosensing technology and virtual environments for improved urban planning
JP6457979B2 (en) Tissue sample analyzer and tissue sample analysis system
US20220113671A1 (en) Portable uv holographic microscope for high-contrast protein crystal imaging
Man et al. The Foreground Bias: Differing impacts across depth on visual search in scenes
Taylor et al. Automated interactive microscopy: measuring and manipulating the chemical and molecular dynamics of cells and tissues
Li et al. Whole Slide Imaging in Cytopathology
WO2022075040A1 (en) Image generation system, microscope system, and image generation method
von Wülfingen Traces: generating what was there

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AU CA JP

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): AT BE CH DE DK ES FI FR GB GR IE IT LU MC NL PT SE

121 Ep: The EPO has been informed by WIPO that EP was designated in this application
DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (PCT application filed before 20040101)
NENP Non-entry into the national phase

Ref country code: JP

Ref document number: 97533501

Format of ref document f/p: F

122 Ep: PCT application non-entry in European phase
NENP Non-entry into the national phase

Ref country code: CA