WO2003063706A1 - Image processing using measures of similarity - Google Patents


Info

Publication number
WO2003063706A1
WO2003063706A1 (PCT/US2003/003007)
Authority
WO
WIPO (PCT)
Prior art keywords
images
tissue
regions
determining
segmentation
Application number
PCT/US2003/003007
Other languages
French (fr)
Inventor
Howard Kaufman
Philippe Schmid
Original Assignee
Medispectra, Inc.
Application filed by Medispectra, Inc. filed Critical Medispectra, Inc.
Priority to AU2003207787A priority Critical patent/AU2003207787A2/en
Priority to EP03706024A priority patent/EP1476076A1/en
Priority to CA002474417A priority patent/CA2474417A1/en
Publication of WO2003063706A1 publication Critical patent/WO2003063706A1/en


Classifications

    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/0059 Measuring for diagnostic purposes using light, e.g. diagnosis by transillumination, diascopy, fluorescence
    • G06T 7/11 Region-based segmentation
    • G06T 7/155 Segmentation; Edge detection involving morphological operators
    • G06T 7/174 Segmentation; Edge detection involving the use of two or more images
    • G06V 20/69 Microscopic objects, e.g. biological cells or cellular parts
    • G06V 20/695 Preprocessing, e.g. image segmentation
    • G06T 2207/10056 Microscopic image
    • G06T 2207/10064 Fluorescence image
    • G06T 2207/20016 Hierarchical, coarse-to-fine, multiscale or multiresolution image processing; Pyramid transform
    • G06T 2207/20152 Watershed segmentation
    • G06T 2207/30016 Brain
    • G06T 2207/30024 Cell structures in vitro; Tissue sections in vitro

Definitions

  • This invention relates generally to image processing. More particularly, in certain embodiments, the invention relates to segmentation of a sequence of colposcopic images based on measures of similarity.
  • colposcopic techniques generally require analysis by a highly trained physician. Colposcopic images may contain complex and confusing patterns. In colposcopic techniques such as aceto-whitening, analysis of a still image does not capture the patterns of change in the appearance of tissue following application of a chemical agent. These patterns of change may be complex and difficult to analyze. Current automated image analysis methods do not allow the capture of the dynamic information available in various colposcopic techniques.
  • Segmentation is a morphological technique that splits an image into different regions according to one or more pre-defined criteria. For example, an image may be divided into regions of similar intensity. It may therefore be possible to determine which sections of a single image have an intensity within a given range. If a given range of intensity indicates suspicion of pathology, the segmentation may be used as part of a diagnostic technique to determine which regions of an image may indicate diseased tissue.
  • a critical factor in discriminating between healthy and diseased tissue may be the manner in which the tissue behaves throughout a diagnostic test, not just at a given time. For example, the rate at which a tissue whitens upon application of a chemical agent may be indicative of disease.
  • the invention provides methods for relating aspects of a plurality of images of a tissue in order to obtain diagnostic information about the tissue.
  • the invention provides methods for image segmentation across a plurality of images instead of only one image at a time.
  • inventive methods enable the compression of a large amount of pertinent information from a sequence of images into a single frame.
  • An important application of methods of the invention is the analysis of a sequence of images of biological tissue in which an agent has been applied to the tissue in order to change its optical properties in a way that is indicative of the physiological state of the tissue. Diagnostic tests which have traditionally required analysis by trained medical personnel may be automatically analyzed using these methods.
  • the invention may be used, for example, in addition to or in place of traditional analysis.
  • the invention provides methods of performing image segmentation using information from a sequence of images, not just from one image at a time. This is important because it allows the incorporation of time effects in image segmentation. For example, according to an embodiment of the invention, an area depicted in a sequence of images is divided into regions based on a measure of similarity of the changes those regions undergo throughout the sequence.
  • inventive segmentation methods incorporate more information and can be more helpful, for example, in determining a characteristic of a tissue, than segmentation performed using one image at a time.
  • the phrases "segmentation of an image” and “segmenting an image,” as used herein, may apply, for example, to dividing an image into different regions, or dividing into different regions an area in space depicted in one or more images (an image plane).
  • Segmentation methods of this invention allow, for example, the automated analysis of a sequence of images using complex criteria for determining a disease state which may be difficult or impossible for a human analyst to perceive by simply viewing the sequence.
  • the invention also allows the very development of these kinds of complex criteria for determining a disease state by permitting the relation of complex behaviors of tissue samples during dynamic diagnostic tests to the known disease states of the tissue samples. Criteria may be developed using the inventive methods described herein to analyze sequences of images for dynamic diagnostic tests that are not yet in existence.
  • One way to relate a plurality of images to each other according to the invention is to create or use a segmentation mask that represents an image plane divided into regions that exhibit similar behavior throughout a test sequence.
  • Another way to relate images is by creating or using graphs or other means of data representation which show mean signal intensities throughout each of a plurality of segmented regions as a function of time. Relating images may also be performed by identifying any particular area represented in an image sequence which satisfies given criteria.
  • the invention is directed to a method of relating a plurality of images of a tissue.
  • the method includes the steps of: obtaining a plurality of images of a tissue; determining a relationship between two or more regions in each of two or more of the images; segmenting at least a subset of the two or more images based on the relationship; and relating two or more images of the subset of images based at least in part on the segmentation.
  • the step of obtaining a plurality of images of a tissue includes collecting an optical signal.
  • the optical signal includes fluorescence illumination from the tissue.
  • the optical signal includes reflectance, or backscatter, illumination from the tissue.
  • the tissue is illuminated by a white light source, a UV light source, or both.
  • the step of obtaining images includes recording visual images of the tissue.
  • the tissue is or includes cervical tissue.
  • the tissue is one of the group consisting of epithelial tissue, colorectal tissue, skin, and uterine tissue.
  • the plurality of images being related are sequential images.
  • the chemical agent is applied to change the tissue's optical properties in a way that is indicative of the physiological state of the tissue.
  • a chemical agent is applied to the tissue.
  • the chemical agent is selected from the group consisting of acetic acid, formic acid, propionic acid, butyric acid, Lugol's iodine, Schiller's iodine, methylene blue, toluidine blue, osmotic agents, ionic agents, and indigo carmine.
  • the method includes filtering two or more of the images.
  • the method includes applying either or both of a temporal filter, such as a morphological filter, and a spatial filter, such as a diffusion filter.
  • the step of determining a relationship between two or more regions in each of two or more of the images includes determining a measure of similarity between at least two of the two or more images.
  • determining the measure of similarity includes computing an N-dimensional dot product of the mean signal intensities of two of the two or more regions.
  • the two regions (of the two or more regions) are neighboring regions.
  • the step of relating images based on the segmentation includes determining a segmentation mask of an image plane, where two or more regions of the image plane are differentiated. In one embodiment, the step of relating images based on the segmentation includes defining one or more data series representing a characteristic of one or more associated segmented regions of the image plane. In one embodiment, this characteristic is mean signal intensity.
  • the step of relating images includes creating or using a segmentation mask that represents an image plane divided into regions that exhibit similar behavior throughout the plurality of images.
  • the step of relating images includes creating or using graphs or other means of data representation which show mean signal intensities throughout each of a plurality of segmented regions as a function of time.
  • the step of relating images includes identifying a particular area represented in the image sequence which satisfies given criteria.
  • the invention is directed to a method of relating a plurality of images of a tissue, where the method includes the steps of: obtaining a plurality of images of a tissue; determining a measure of similarity between two or more regions in each of two or more of the images; and relating at least a subset of the images based at least in part on the measure of similarity.
  • the step of determining a measure of similarity includes computing an N-dimensional dot product of the mean signal intensities of two of the two or more regions.
  • the two regions are neighboring regions.
  • the invention is directed to a method of determining a tissue characteristic.
  • the method includes the steps of: obtaining a plurality of images of a tissue; determining a relationship between two or more regions in each of two or more of the images; segmenting at least a subset of the two or more images based at least in part on the relationship; and determining a characteristic of the tissue based at least in part on the segmentation.
  • the step of obtaining a plurality of images of a tissue includes collecting an optical signal.
  • the optical signal includes fluorescence illumination from the tissue.
  • the optical signal includes reflectance, or backscatter, illumination from the tissue.
  • the tissue is illuminated by a white light source, a UV light source, or both.
  • the step of obtaining images includes recording visual images of the tissue.
  • the tissue is or includes cervical tissue.
  • the tissue is one of the group consisting of epithelial tissue, colorectal tissue, skin, and uterine tissue.
  • the plurality of images being related are sequential images.
  • the chemical agent is applied to change the tissue's optical properties in a way that is indicative of the physiological state of the tissue.
  • a chemical agent is applied to the tissue.
  • the chemical agent is selected from the group consisting of acetic acid, formic acid, propionic acid, butyric acid, Lugol's iodine, Schiller's iodine, methylene blue, toluidine blue, osmotic agents, ionic agents, and indigo carmine.
  • the method includes filtering two or more of the images.
  • the method includes applying either or both of a temporal filter, such as a morphological filter, and a spatial filter, such as a diffusion filter.
  • the method includes processing two or more images to compensate for a relative motion between the tissue and a detection device.
  • the step of determining a relationship between two or more regions in each of two or more of the images includes determining a measure of similarity between at least two of the two or more images. In certain embodiments, determining this measure of similarity includes computing an N-dimensional dot product of the mean signal intensities of two of the two or more regions. In one embodiment, the two regions are neighboring regions.
  • the segmenting step includes analyzing an aceto- whitening signal. In one embodiment, the segmenting step includes analyzing a variance signal. In one embodiment, the segmenting step includes determining a gradient image. [0025] According to one embodiment, the method includes processing one or more optical signals based on the segmentation. In one embodiment, the method includes filtering at least one image based at least in part on the segmentation.
  • the step of determining a characteristic of the tissue includes determining one or more regions of the tissue where there is suspicion of pathology. In certain embodiments, the step of determining a characteristic of the tissue includes classifying a region of tissue as one of the following: normal squamous tissue, metaplasia, Cervical Intraepithelial Neoplasia, Grade I (CIN I), and Cervical Intraepithelial Neoplasia, Grade II or Grade III (CIN II/CIN III).
  • the invention is directed to a method of determining a characteristic of a tissue.
  • the method includes the steps of: (a) for each of a first plurality of reference sequences of images of tissue having a first known characteristic, quantifying one or more features of each of a first plurality of mean signal intensity data series corresponding to segmented regions represented in each of the first plurality of reference sequences of images; (b) for a test sequence of images, quantifying one or more features of each of one or more mean signal intensity data series corresponding to one or more segmented regions represented in the test sequence of images; and (c) determining a characteristic of a tissue represented in the test sequence of images based at least in part on a comparison between the one or more features quantified in step (a) and the one or more features quantified in step (b).
  • step (c) includes repeating step (a) for each of a second plurality of reference sequences of images of tissue having a second known characteristic.
  • step (c) includes applying a classification rule based at least in part on the first plurality of reference sequences and the second plurality of reference sequences.
  • step (c) includes performing a linear discriminant analysis to determine the classification rule.
  • one of the one or more features of step (a) includes the slope of a curve at a given point fitted to one of the plurality of mean signal intensity data series.
  • the method includes determining the segmented regions of the test sequence of images by analyzing an aceto-whitening signal.
  • the first known characteristic is CIN II/CIN III and the second known characteristic is absence of CIN II/CIN III.
  • Figure 1 is a schematic flow diagram depicting steps in the analysis of a sequence of images of tissue according to an illustrative embodiment of the invention.
  • Figure 2A depicts human cervix tissue and shows an area of which a sequence of images is to be obtained according to an illustrative embodiment of the invention.
  • Figure 2B depicts the characterization of a discrete signal from a sequence of images of tissue according to an illustrative embodiment of the invention.
  • Figure 3 shows a series of graphs depicting mean signal intensity of a region as a function of time, as determined from a sequence of images according to an illustrative embodiment of the invention.
  • Figure 4A depicts a "maximum" RGB image representation used in preprocessing a sequence of images of tissue according to an illustrative embodiment of the invention.
  • Figure 4B depicts the image representation of Figure 4A after applying a manual mask, used in preprocessing a sequence of images of tissue according to an illustrative embodiment of the invention.
  • Figure 4C depicts the image representation of Figure 4B after accounting for glare, used in preprocessing a sequence of images of tissue according to an illustrative embodiment of the invention.
  • Figure 4D depicts the image representation of Figure 4C after accounting for chromatic artifacts, used in preprocessing a sequence of images of tissue according to an illustrative embodiment of the invention.
  • Figure 5 shows a graph illustrating the determination of a measure of similarity of time series of mean signal intensity for each of two regions according to an illustrative embodiment of the invention.
  • Figure 6 is a schematic flow diagram depicting a region merging approach of segmentation according to an illustrative embodiment of the invention.
  • Figure 7A represents a segmentation mask produced using a region merging approach according to an illustrative embodiment of the invention.
  • Figure 7B shows a graph depicting mean signal intensities of segmented regions represented in Figure 7A as functions of time according to an illustrative embodiment of the invention.
  • Figure 8 is a schematic flow diagram depicting a robust region merging approach of segmentation according to an illustrative embodiment of the invention.
  • Figure 9A represents a segmentation mask produced using a robust region merging approach according to an illustrative embodiment of the invention.
  • Figure 9B shows a graph depicting mean variance signals of segmented regions represented in Figure 9A as functions of time according to an illustrative embodiment of the invention.
  • Figure 10 is a schematic flow diagram depicting a clustering approach of segmentation according to an illustrative embodiment of the invention.
  • Figure 11 A represents a segmentation mask produced using a clustering approach according to an illustrative embodiment of the invention.
  • Figure 11B shows a graph depicting mean signal intensities of segmented regions represented in Figure 11A as functions of time according to an illustrative embodiment of the invention.
  • Figure 11 C represents a segmentation mask produced using a clustering approach according to an illustrative embodiment of the invention.
  • Figure 11D shows a graph depicting mean signal intensities of segmented regions represented in Figure 11C as functions of time according to an illustrative embodiment of the invention.
  • Figure 12 is a schematic flow diagram depicting a watershed approach of segmentation according to an illustrative embodiment of the invention.
  • Figure 13 represents a gradient image used in a watershed approach of segmentation according to an illustrative embodiment of the invention.
  • Figure 14A represents a segmentation mask produced using a watershed approach according to an illustrative embodiment of the invention.
  • Figure 14B represents a segmentation mask produced using a watershed approach according to an illustrative embodiment of the invention.
  • Figure 15A represents a seed region superimposed on a reference image from a sequence of images, used in a region growing approach of segmentation according to an illustrative embodiment of the invention.
  • Figure 15B represents the completed growth of the "seed region" of Figure 15A using a region growing approach according to an illustrative embodiment of the invention.
  • Figure 16A represents a segmentation mask produced using a combined clustering approach and robust region merging approach according to an illustrative embodiment of the invention.
  • Figure 16B shows a graph depicting mean signal intensities of segmented regions represented in Figure 16A as functions of time according to an illustrative embodiment of the invention.
  • Figure 17A represents a segmentation mask produced using a combined clustering approach and watershed technique according to an illustrative embodiment of the invention.
  • Figure 17B shows a graph depicting mean signal intensities of segmented regions represented in Figure 17A as functions of time according to an illustrative embodiment of the invention.
  • Figure 18A represents a segmentation mask produced using a two-part clustering approach according to an illustrative embodiment of the invention.
  • Figure 18B shows a graph depicting mean signal intensities of segmented regions represented in Figure 18A as functions of time according to an illustrative embodiment of the invention.
  • Figure 19 depicts the human cervix tissue of Figure 2A with an overlay of manual doctor annotations made after viewing an image sequence.
  • Figure 20A is a representation of a segmentation mask produced using a combined clustering approach and robust region merging approach with a correspondingly-aligned overlay of the manual doctor annotations of Figure 19, according to an embodiment of the invention.
  • Figure 20B is a representation of a segmentation mask produced using a combined clustering approach and morphological technique with a correspondingly-aligned overlay of the manual doctor annotations of Figure 19, according to an embodiment of the invention.
  • Figure 21A depicts a reference image of cervical tissue of a patient from a sequence of images obtained during an aceto-whitening test according to an illustrative embodiment of the invention.
  • Figure 21B depicts an image from the sequence of Figure 21A after applying a manual mask, accounting for glare, and accounting for chromatic artifacts according to an illustrative embodiment of the invention.
  • Figure 21C shows a graph depicting mean signal intensities of segmented regions for the sequence of Figure 21A determined using a clustering segmentation approach according to an illustrative embodiment of the invention.
  • Figure 21D represents a map of regions of tissue as segmented in Figure 21C classified as either high grade disease tissue or not high grade disease tissue using a classification algorithm according to an illustrative embodiment of the invention.
  • the invention provides methods for image segmentation across a plurality of images. Segmentation across a plurality of images provides a much more robust analysis than segmentation in a single image. Segmentation across multiple images according to the invention allows incorporation of a temporal element (e.g., the change of tissue over time in a sequence of images) in optics-based disease diagnosis.
  • the invention provides means to analyze changes in tissue over time in response to a treatment. It also provides the ability to increase the resolution of segmented imaging by increasing the number of images over time. This adds a dimension to image-based tissue analysis, which leads to increased sensitivity and specificity of analysis.
  • the following is a detailed description of a preferred embodiment of the invention.
  • the schematic flow diagram 100 of Figure 1 depicts steps in the analysis of a sequence of images of tissue according to an illustrative embodiment of the invention.
  • Figure 1 also serves as a general outline of the contents of this description.
  • Each of the steps of Figure 1 is discussed herein in detail. Briefly, the steps include obtaining a sequence of images of the tissue (step 102), preprocessing the images (step 104), determining a measure of similarity between regions in the images (step 108), segmenting the area represented in the sequence into regions (step 110), and relating the images based on the segmentation (step 112).
  • the steps may be preceded by application of a chemical agent onto the tissue, for example. In other embodiments, a chemical agent is applied during the performance of the steps of the schematic flow diagram 100 of Figure 1.
  • segmentation techniques of the inventive embodiments discussed herein include region merging, robust region merging, clustering, watershed, and region growing techniques, as well as combinations of these techniques.
  • Figures 2A and 2B relate to step 102 of Figure 1, obtaining a sequence of images of the tissue.
  • although embodiments of the invention are not limited to aceto-whitening tests, an exemplary sequence of images from an aceto-whitening test performed on a patient is used herein to illustrate certain embodiments of the invention.
  • Figure 2A depicts a full-frame image 202 of a human cervix after application of acetic acid, at the start of an aceto-whitening test.
  • the inset image 204 depicts an area of interest to be analyzed herein using illustrative methods of the invention. This area of interest may be determined by a technician or may be determined in a semi-automated fashion using a multi-step segmentation approach such as one of those discussed herein below.
  • Figure 2B depicts the characterization 206 of a discrete signal w(i,j;t) from a sequence of images of tissue according to an illustrative embodiment of the invention.
  • the signal could be any type of image signal of interest known in the art.
  • the signal is an intensity signal of an image.
  • images of an area of interest are taken at N time steps.
  • time t0 corresponds to the moment of application of a chemical agent to the tissue, for instance, and time tN−1 corresponds to the end of the test.
  • in other embodiments, time t0 corresponds to a moment following the application of a chemical agent to the tissue.
  • r × c discrete signals w(i,j;t) may be constructed describing the evolution of some optically-detectable phenomenon, such as aceto-whitening, with time.
  • the "whiteness" may be computed from the RGB data of the images.
  • there are various metrics which may be used to define "whiteness."
  • an illustrative embodiment employs an intensity component, CCIR 601, as a measure of "whiteness" of any particular pixel, defined in terms of red (R), green (G), and blue (B) intensities as follows: I(i,j;t) = 0.299·R(i,j;t) + 0.587·G(i,j;t) + 0.114·B(i,j;t).
  • thus, w(i,j;t) = I(i,j;t), for example.
  • in other embodiments, the signal w(i,j;t) is defined in any of a multiplicity of other ways.
  • Figure 2B shows that the intensity signal w(i,j;t) has a value corresponding to each discrete location (i,j) in each of the images taken at N discrete time steps.
  • a location (i,j) in an image corresponds to a single pixel.
  • the whitening signals are background subtracted.
  • each of the signals corresponding to a given location at a particular time step is transformed by subtracting the initial intensity signal at that location, as shown in Equation (2): w(i,j;n) ← w(i,j;n) − w(i,j;n0), ∀ n ∈ T.
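A minimal sketch of how the CCIR 601 intensity and the background subtraction of Equation (2) might be computed, assuming Python with NumPy and an image sequence stored as an (N, rows, cols, 3) array; the function name and array layout are illustrative choices, not prescribed by the patent:

    import numpy as np

    def whitening_signal(rgb_sequence):
        """rgb_sequence: float array of shape (N, rows, cols, 3), one RGB frame per time step."""
        r, g, b = rgb_sequence[..., 0], rgb_sequence[..., 1], rgb_sequence[..., 2]
        w = 0.299 * r + 0.587 * g + 0.114 * b   # CCIR 601 intensity component
        return w - w[0]                         # Equation (2): subtract the signal at t0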
  • Figure 3 relates to part of step 104 of Figure 1, preprocessing the images.
  • Figure 3 shows a series of graphs depicting mean signal intensity 304 of a pixel as a function of time 306, as determined from a sequence of images according to an illustrative embodiment of the invention.
  • the graphs depict application of a morphological filter, application of a diffusion filter, modification of intensity data to account for background intensity, and normalization of intensity data, according to an illustrative embodiment of the invention.
  • a first filter is applied to the time axis, individually for each pixel.
  • the images are then spatially filtered.
  • Graph 302 of Figure 3 depicts the application of both a temporal filter and a spatial filter at a representative pixel.
  • the original data is connected by a series of line segments 308. It is evident from graph 302 that noise makes the signal choppy and adversely affects further analysis if not removed.
  • for temporal filtering, the illustrative embodiment of the invention applies the morphological filter of Equation (3).
  • the structuring element has a half circle shape.
  • the temporally-filtered data is connected by a series of line segments 310 in the graph 302 of Figure 3.
  • the noise is decreased from the series 308 to the series 310.
  • the images are then spatially filtered, for example, with either an isotropic or a Gaussian filter.
  • a diffusion equation implemented by an illustrative isotropic filter may be expressed as Equation (4): ∂w/∂td = div(∇w) = Δw, where ∇ is the gradient operator, Δ is the Laplacian operator, and td is the diffusion time (distinguished from the time component of the whitening signal itself).
  • An isotropic filter is iterative, while a Gaussian filter is an infinite impulse response (IIR) filter.
  • the iterative filter of Equation (4) is much faster than a Gaussian filter, since the iterative filter allows for increasing smoothness by performing successive iterations.
  • the Gaussian filter requires re-applying a more complex filter to the original image for increasing degrees of filtration. According to the illustrative embodiment, the methods of the invention perform two iterations.
  • the method performs one iteration or three or more iterations.
  • the spatially-filtered data for a representative pixel is connected by a series of line segments 312 in graph 302 of Figure 3.
  • the noise is decreased from series 310 to series 312.
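As an illustration of the iterative isotropic filter of Equation (4), the following sketch performs explicit diffusion steps; the step size dt and the use of scipy.ndimage.laplace are assumptions (dt must be at most 0.25 for stability on a 2D grid):

    import numpy as np
    from scipy.ndimage import laplace

    def isotropic_filter(image, iterations=2, dt=0.2):
        """Iterative isotropic diffusion: each pass adds dt times the Laplacian."""
        out = image.astype(float)
        for _ in range(iterations):
            out = out + dt * laplace(out)   # forward-Euler step of dw/dtd = laplacian(w)
        return out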
  • Graph 314 of Figure 3 shows the application of Equation (2), background subtracting the intensity signal 304.
  • Graph 318 of Figure 3 shows the intensity signal data following normalization 320.
  • normalization includes division of values of the intensity signal 304 by a reference value, such as the maximum intensity signal over the sequence of images. Glare and chromatic artifacts can affect selection of the maximum intensity signal; thus, in an illustrative embodiment, normalization is performed subsequent to correcting for glare and chromatic artifacts.
  • the invention masks glare and chromatic artifacts from images prior to normalization.
  • glare may have a negative impact, since glare is visually similar to the tissue whitening that is the object of the analysis.
  • Chromatic artifacts may have a more limited impact on the intensity of pixels and may be removed with the temporal and spatial filters described above.
  • Thresholding may be used to mask out glare and chromatic artifacts.
  • thresholding is performed in the L*u*v* color space.
  • the invention also employs a correlate for hue, expressed as in Equation (5): h = arctan(v*/u*).
  • the illustrative methods of the invention rotate the u-v plane such that the typical reddish color of the cervix correlates to higher values of h. This makes it possible to work with a single threshold for chromatic artifacts.
  • the rotation by a fixed angle θ is given by Equation (6): u′ = u·cos θ − v·sin θ, v′ = u·sin θ + v·cos θ.
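The masking step might be sketched as below; because the threshold values and the rotation angle of Equation (6) are not given in the text, L_GLARE, H_CHROMA, and THETA are placeholders, and the hue correlate follows the arctangent reading of Equation (5):

    import numpy as np
    from skimage.color import rgb2luv

    L_GLARE, H_CHROMA, THETA = 90.0, 0.5, 0.0   # placeholder values, not from the patent

    def artifact_mask(rgb_image):
        """Returns True where pixels should be masked as glare or chromatic artifacts."""
        luv = rgb2luv(rgb_image)
        L, u, v = luv[..., 0], luv[..., 1], luv[..., 2]
        # rotate the u-v plane so the typical reddish cervix color maps to high h
        u_r = u * np.cos(THETA) - v * np.sin(THETA)
        v_r = u * np.sin(THETA) + v * np.cos(THETA)
        h = np.arctan2(v_r, u_r)            # hue correlate, Equation (5)
        glare = L > L_GLARE                 # glare: very bright pixels
        chroma = h < H_CHROMA               # chromatic artifacts: atypical hue
        return glare | chroma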
  • Figures 4A and 4B relate to part of step 104 of Figure 1, preprocessing the images.
  • Figure 4A depicts a "maximum" RGB image representation 402 used in preprocessing a sequence of images of tissue according to an illustrative embodiment of the invention.
  • the maximum RGB image is computed by taking for each pixel the maximum RGB values in the whole sequence of images.
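In NumPy terms, the maximum RGB image reduces to a single reduction over the time axis; the array layout is the same assumption as in the earlier sketches:

    import numpy as np

    def maximum_rgb(rgb_sequence):
        """rgb_sequence: (N, rows, cols, 3) array; per-pixel, per-channel maximum over time."""
        return rgb_sequence.max(axis=0)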
  • Figure 4B depicts the image representation of Figure 4A after applying a manual mask, used in preprocessing a sequence of images of tissue according to an illustrative embodiment of the invention.
  • the method applies the manual mask in addition to the masks for glare and chromatic effects in order to account for obstructions such as hair, foam from the chemical agent, or other obstruction, and/or to narrow analysis to an area of interest.
  • Area 406 of the frame 404 of Figure 4B has been manually masked in accord with the methods of the embodiment.
  • Figure 4C depicts the image representation of Figure 4B after accounting for glare, used in preprocessing a sequence of images of tissue according to an illustrative embodiment of the invention. Note that the areas 408, 410, 412, 414, 416, and 418 of the frame of Figure 4C have been masked for glare.
  • Figure 4D depicts the image representation of Figure 4C after accounting for chromatic artifacts, used in preprocessing a sequence of images of tissue according to an illustrative embodiment of the invention.
  • the areas 420 and 422 of the frame 407 of Figure 4D have been masked for chromatic artifacts using Equation (8).
  • illustrative methods of the invention pre-segment the image plane into grains.
  • the mean grain surface is about 30 pixels. However, in other embodiments, it is between a few pixels and a few hundred pixels.
  • the segmentation methods can be applied starting at either the pixel level or the grain level.
  • One way to "pre-segment" the image plane into grains is to segment each of the images in the sequence using a watershed transform.
  • One goal of the watershed technique is to simplify a gray-level image by viewing it as a three-dimensional surface and by progressively "flooding" the surface from below through “holes” in the surface.
  • the third dimension is the gradient of an intensity signal over the image plane (further discussed herein below).
  • as the surface is flooded from below, the rising water level eventually reaches each local minimum.
  • the flooded minima are called catchment basins, and the borders between neighboring catchment basins are called watersheds.
  • the catchment basins determine the pre-segmented image.
  • Image segmentation with the watershed transform is preferably performed on the image gradient. If the watershed transform is performed on the image itself, and not the gradient, the watershed transform may obliterate important distinctions in the images. Determination of a gradient image is discussed herein below.
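A sketch of the grain pre-segmentation using scikit-image; here the gradient is approximated with a Sobel operator on a single reference image, whereas the patent's own gradient image (Equations (14) and (15)) incorporates the whole sequence:

    from skimage.filters import sobel
    from skimage.segmentation import watershed

    def presegment_grains(reference_image):
        gradient = sobel(reference_image)
        # with no markers given, skimage floods from the local minima of the
        # gradient; each labeled catchment basin becomes one "grain"
        return watershed(gradient)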
  • Segmentation is a process by which an image is split into different regions according to one or more pre-defined criteria. In certain embodiments of the invention, segmentation methods are performed using information from an entire sequence of images, not just from one image at a time. The area depicted in the sequence of images is split into regions based on a measure of similarity of the detected changes those regions undergo throughout the sequence.
  • Segmentation is useful in the analysis of a sequence of images such as in aceto- whitening cervical testing.
  • segmentation is needed because filtering and masking procedures alone are insufficient to adequately relate regions of an image based on the similarity of the signals those regions produce over a sequence of images. Therefore, the illustrative methods of the invention average time-series data over regions made up of pixels whose signals display similar behavior over time.
  • regions of an image are segmented based at least in part upon a measure of similarity of the detected changes those regions undergo. Since a measure of similarity between regions depends on the way regions are defined, and since regions are defined based upon criteria involving the measure of similarity, the illustrative embodiment of the invention employs an iterative process for segmentation of an area into regions. In some embodiments, segmentation begins by assuming each pixel or each grain (as determined above) represents a region. These individual pixels or grains are then grouped into regions according to criteria defined by the segmentation method. These regions are then merged together to form new, larger regions, again according to criteria defined by the segmentation method.
  • a problem that arises when processing raw image data is its high dimension.
  • since a typical whitening signal for a single pixel is described by, for example, a sixty-or-more-dimensional vector, it is often necessary to reduce data dimensionality prior to processing.
  • the invention obtains a scalar that quantifies a leading characteristic of two vectors. More particularly, illustrative methods of the invention take the N-dimensional inner (dot) product of two vectors corresponding to two pixel coordinates. A fitting function based on this dot product is shown in Equation (9). This fitting function quantifies the similarity between the signals at two locations.
  • Figure 5 relates to step 108 of Figure 1, determining a measure of similarity between regions in each of a series of images.
  • Figure 5 shows a graph 502 illustrating the determination of a measure of similarity of a time series of mean signal intensity 504 for each of two regions k and l according to an illustrative embodiment of the invention.
  • Figure 5 represents one step in an illustrative segmentation method in which the similarity between the time-series signals of two neighboring regions is compared against criteria to determine whether those regions should be merged together.
  • the type of measure of similarity chosen may vary depending on the segmentation method employed.
  • Curve 506 in Figure 5 represents the mean signal intensity 504 of region k in each of a sequence of images and is graphed versus time 505.
  • Mean signal intensity 504 of region k is expressed as in Equation (10): w̄(k;t) = (1/Nk) Σ(i,j)∈k w(i,j;t), where Nk is the number of pixels in region k.
  • Curve 508 of Figure 5 represents the mean signal intensity 504 of region l and is graphed versus time 505.
  • Mean signal intensity 504 of region l is expressed as in Equation (10), replacing "k" with "l" in appropriate locations.
  • the shaded area 510 of Figure 5 represents dissimilarity between mean signals over region k and region l.
  • the chosen measure of similarity, η(k,l), also referred to herein as the fitting function, between regions k and l may depend on the segmentation method employed. For the region merging segmentation technique, discussed in more detail below, as well as for other segmentation techniques, the measure of similarity used is shown in Equation (11): η(k,l) = ⟨w̄(k;·), w̄(l;·)⟩ / max(‖w̄(k;·)‖², ‖w̄(l;·)‖²).
  • the numerator represents the N-dimensional dot product of the background-subtracted mean signal intensities 504 of region k and region l; the denominator represents the greater of the energies of the signals corresponding to regions k and l.
  • note that the numerator of Equation (11) is normalized by the higher signal energy, and not by the square root of the product of both energies.
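Equation (11) translates directly into a few lines of Python; the vectors are the background-subtracted mean signal intensities of two regions over the N time steps:

    import numpy as np

    def fitting_value(w_k, w_l):
        """Fitting function of Equation (11) for two 1-D mean-intensity time series."""
        energy_k = np.dot(w_k, w_k)         # signal energy of region k
        energy_l = np.dot(w_l, w_l)         # signal energy of region l
        return np.dot(w_k, w_l) / max(energy_k, energy_l)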
  • the fitting function of Equation (9) can be used to obtain a gradient image representing the variation of whitening values in x-y space.
  • the gradient of an image made up of intensity signals is the approximation of the amplitude of the local gradient of signal intensity at every pixel location.
  • the watershed transform is then applied to the gradient image. This may be done when pre-segmenting images into "grains" as discussed above, as well as when performing the hierarchical watershed segmentation approach and combined method segmentation approaches discussed below.
  • a gradient image representing a sequence of images is calculated for an individual image by computing the fitting value η(i,j;i0,j0) between a pixel (i0,j0) and all of its neighbors (i,j).
  • a sign is assigned to each comparison as in Equation (13): s = 1 if η(i0,j0) > η(i,j), and s = −1 if η(i0,j0) ≤ η(i,j).
  • the norm of the gradient vector is then calculated from the approximations of Equations (14) and (15).
  • because the fitting values include information from the entire sequence of images, the gradient image may be used in the watershed pre-segmentation technique discussed herein above and the hierarchical watershed technique discussed herein below. Had the gradient image been obtained from a single reference image, less detail would be included, and the application of a watershed segmentation method to the gradient image would segment the image plane based on less data. However, by using a gradient image as determined from Equations (14) and (15), the invention enables a watershed segmentation technique to be applied which divides an image plane into regions based on an entire sequence of data, not just a single reference image.
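Because Equations (13) through (15) are only partially preserved here, the following sketch uses one plausible reading: the dissimilarity 1 − η between a pixel's time series and each of its right and lower neighbors, combined into a gradient norm. It reuses the fitting_value function from the sketch above and should be treated as an illustration, not the patent's exact formula:

    import numpy as np

    def gradient_image(w):
        """w: (N, rows, cols) whitening signal; returns a (rows, cols) gradient norm."""
        n_steps, rows, cols = w.shape
        grad = np.zeros((rows, cols))
        for i in range(rows - 1):
            for j in range(cols - 1):
                gx = 1.0 - fitting_value(w[:, i, j], w[:, i, j + 1])  # horizontal dissimilarity
                gy = 1.0 - fitting_value(w[:, i, j], w[:, i + 1, j])  # vertical dissimilarity
                grad[i, j] = np.hypot(gx, gy)                         # norm of the gradient vector
        return grad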
  • Figure 6 relates to step 110 of Figure 1, segmenting the area represented in a sequence of images into regions based on measures of similarity between regions over the sequence.
  • Various techniques may be used to perform the segmentation step 110 of Figure 1.
  • Figure 6 shows a schematic flow diagram 602 depicting a region merging technique of segmentation according to an illustrative embodiment of the invention. In this technique, each grain or pixel is initially a region, and the method merges neighboring regions according to a predefined criterion in an iterative fashion. The criterion is based on a measure of similarity, also called a fitting function.
  • the segmentation converges to the final result when no pair of neighboring regions satisfies the merging criterion.
  • for aceto-whitening data, for instance, it is desired to merge regions whose whitening data are similar.
  • the fitting function will therefore quantify the similarity over the sequence between two signals corresponding to two neighboring regions.
  • the segmentation begins at step 604 of Figure 6, computing the fitting function to obtain "fitting values" for all pairs of neighboring regions.
  • this is the measure of similarity provided by Equation (11), where mean signal intensity of a region k is defined by Equation (10).
  • This fitting function is equivalent to the minimum of the two normalized projections of one region's mean signal onto the other's, since the dot product is divided by the larger of the two signal energies.
  • in step 606 of Figure 6, the fitting values (measures of similarity) of pairs of neighboring regions that exceed a given threshold are sorted from greatest to least.
  • in step 608, sorted pairs are merged according to best fit, keeping in mind that each region can be merged only once during one iteration. For instance, if neighboring regions k and l have a fitting value of 0.80 and neighboring regions k and m have a fitting value of 0.79, regions k and l are merged together, not k and m. However, region m may be merged with another of its neighboring regions during this iteration, depending on the fitting values computed.
  • in step 609, the method recalculates fitting values for all pairs of neighboring regions containing an updated (newly merged) region. In this embodiment, fitting values are not recalculated for pairs of neighboring regions whose regions are unchanged.
  • in step 610 of Figure 6, it is determined whether the fitting values of all pairs of neighboring regions are below a given threshold.
  • the fitting function is a measure of similarity between regions; thus, the higher the threshold, the more similar regions have to be in order to be merged, resulting in fewer iterations and, therefore, more regions that are ultimately defined. If the fitting values of all the regions are below the given threshold, the merging is complete 612.
  • if not, step 606 repeats, and fitting values of the pairs of neighboring regions as newly defined are sorted.
  • the segmentation method of Figure 6, or any of the other segmentation methods discussed herein, is performed, for example, where each pixel has up to four neighbors: above, below, left, and right. However, in other illustrative embodiments, segmentation is performed where each pixel can have up to eight neighbors or more, including diagonal neighbors. It should also be noted that images in a given sequence may be sub-sampled to reduce computation time. For instance, a sequence of 100 images may be reduced to 50 images by eliminating every other image from the sequence to be segmented.
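Taken together, steps 604 through 612 might be sketched as follows, reusing the fitting_value function above. The data structures (dictionaries of mean signals and region sizes plus a set of adjacency pairs) are illustrative choices, and the area-weighted mean update is an assumption, since the text does not specify how a merged region's signal is recomputed:

    def region_merge(signals, sizes, neighbors, threshold=0.7):
        """signals: {region_id: 1-D mean signal}; sizes: {region_id: pixel count};
        neighbors: set of frozenset({k, l}) pairs of adjacent regions."""
        while True:
            # steps 604/606: compute fitting values for neighboring pairs, keep
            # those above the threshold, and sort from greatest to least
            scored = sorted(
                ((fitting_value(signals[k], signals[l]), k, l)
                 for k, l in (sorted(p) for p in neighbors)),
                reverse=True)
            scored = [s for s in scored if s[0] > threshold]
            if not scored:
                return signals, neighbors       # step 612: merging is complete
            merged = set()
            for _, k, l in scored:              # step 608: merge best fits first;
                if k in merged or l in merged:  # each region merges at most once
                    continue                    # per iteration
                total = sizes[k] + sizes[l]     # area-weighted update of the mean signal
                signals[k] = (sizes[k] * signals[k] + sizes[l] * signals[l]) / total
                sizes[k] = total
                del signals[l], sizes[l]
                # step 609 happens implicitly: l's neighbors are rewired to k, and
                # fitting values are recomputed at the top of the next pass
                neighbors = {frozenset(k if r == l else r for r in p)
                             for p in neighbors if p != frozenset((k, l))}
                merged |= {k, l}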
  • Figures 7A and 7B illustrate step 112 of Figure 1, relating images after segmentation.
  • Figure 7A depicts a segmentation mask 702 produced using the region merging segmentation technique discussed above for an exemplary aceto-whitening sequence.
  • the threshold used in step 610 of Figure 6 to produce this segmentation mask 702 is 0.7.
  • Each region has a different label, and is represented by a different color in the mask 702 in order to improve the contrast between neighboring regions.
  • Other illustrative embodiments use other kinds of display techniques known in the art, in order to relate images of the sequence based on segmentation and/or based on the measure of similarity.
  • Figure 7B shows a graph 750 depicting mean signal intensities 752 of segmented regions represented in Figure 7A as functions of a time index 754 according to an illustrative embodiment of the invention.
  • the color of each data series in Figure 7B corresponds to the same-colored segment depicted in Figure 7A.
  • This is one way to visually relate a sequence of images using the results of segmentation. For instance, according to the illustrative embodiment, regions having a high initial rate of increase of signal intensity 752 are identified by observing data series 756 and 758 in Figure 7B, whose signal intensities 752 increase more quickly than the other data series. The location of the two regions corresponding to these two data series is found in Figure 7A.
  • kinetic rate constants are derived from each of the data series determined in Figure 7B, and the regions having data series most closely matching kinetic rate constants of interest are identified.
  • one or more data series are curve fit to obtain a characterization of the mean signal intensities 752 of each data series as functions of time.
  • Mean signal intensity may have a negative value after background subtraction. This is evident, for example, in the first part of data series 760 of Figure 7B. In some examples, this is due to the choice of the reference frame for background subtraction. In other examples, it is due to negative intensities corresponding to regions corrupted by glare that is not completely masked from the analysis.
  • Figure 8 relates to step 110 of Figure 1, segmenting the area represented in a sequence of images into regions based on measures of similarity between regions over the sequence.
  • Figure 8 shows a schematic flow diagram 802 depicting a robust region merging approach of segmentation according to an illustrative embodiment of the invention.
  • One objective of the robust region merging approach is to take into account the "homogeneity" of data inside the different regions. While the region merging approach outlined in Figure 6 relies essentially on the mean signal of each region to decide subsequent merging, the robust region merging approach outlined in Figure 8 controls the maximum variability allowed inside each region.
  • the variance signal, σw(k;t), associated with each region k is computed as in Equation (17), where w̄(k;t) is the mean signal intensity of region k as expressed in Equation (10).
  • the merging criterion is then the energy of the standard deviation signal, computed as in Equation (18).
  • the variance signal, σw(k;t), of Equation (17) is computed for each region k.
  • the variance signal energy, or the energy of the standard deviation signal as shown in Equation (18), is calculated for each region k.
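The bodies of Equations (17) and (18) are not reproduced in the text, so the sketch below assumes the standard definitions: the per-time-step variance of the signal inside region k, and the energy (sum of squares) of the resulting standard-deviation signal:

    import numpy as np

    def variance_signal(w, region_mask):
        """w: (N, rows, cols) signal; region_mask: boolean (rows, cols) mask of region k."""
        pixels = w[:, region_mask]          # shape (N, number of pixels in region k)
        return pixels.var(axis=1)           # assumed Equation (17): variance at each time step

    def variance_energy(w, region_mask):
        sigma = np.sqrt(variance_signal(w, region_mask))
        return np.sum(sigma ** 2)           # assumed Equation (18): energy of the std-dev signal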
  • in step 806 of Figure 8, the values of variance signal energy that are larger than a given threshold are sorted. This determines which regions can be merged, but not the order in which the regions may be merged.
  • in step 808 of Figure 8, the sorted pairs are merged according to the increase in variance each merged pair would create, Δσ(k,l), given by Equation (19).
  • the methods of the invention apply an additional criterion as shown in step 807 of Figure 8 prior to merging sorted pairs in step 808.
  • in step 807, fitting values corresponding to the pairs of neighboring regions are checked against a threshold. The fitting values are determined as shown in Equation (11), as used in the region-merging approach.
  • a candidate pair of regions is not merged if its fitting value is below the threshold.
  • other threshold values for η(k,l) may be used without deviating from the scope of the invention.
  • in step 809 of Figure 8, values of the variance signal are recalculated for pairs of neighboring regions containing an updated (newly-merged) region. According to an embodiment of the invention, variance signal values are not recalculated for pairs of neighboring regions whose regions are unchanged.
  • in step 810 of Figure 8, the illustrative method of the invention determines whether the values of the variance signal energy for all regions are below a given variance threshold. If all values are below the threshold, the merging is complete 812. If not, the process beginning at step 806 is repeated, and values of variance signal energy of neighboring pairs of regions above the threshold are sorted.
  • Figures 9 A and 9B illustrate step 112 of Figure 1, relating images after segmentation.
  • Figure 9A depicts a segmentation mask 902 produced using the robust region merging segmentation technique discussed above for an exemplary aceto-whitening sequence.
  • the variance threshold used in step 810 of Figure 8 to produce the segmentation mask 902 is 70.
  • Other variance thresholds may be employed without deviating from the scope of the invention.
  • Each region has a different label, and is represented by a different color in the mask 902 to improve the contrast between neighboring regions.
  • Other display techniques may be used to relate images of the sequence based on segmentation and/or based on the measure of similarity.
  • Figure 9B shows a graph 950 depicting mean signal intensity 952 of segmented regions represented in Figure 9A as functions of a time index 954 according to an illustrative embodiment of the invention.
  • the color of each curve in Figure 9B corresponds to the same- colored segment depicted in Figure 9A. This is one way to visually relate a sequence of images using the results of segmentation.
  • the method observes data series 956, 958, 960, and 962 in Figure 9B, whose signal intensities 952 increase more quickly than the other data series.
  • the locations of the four regions corresponding to these four data series are found in Figure 9A.
  • the method derives kinetic rate constants from each of the data series determined in Figure 9B, and the regions having data series most closely matching kinetic rate constants of interest are identified.
  • the method curve fits one or more data series to obtain a characterization of the mean signal intensities 952 of each data series as functions of time.
  • Figure 10 relates to step 110 of Figure 1, segmenting an area represented in a sequence of images into regions based on measures of similarity between regions over the sequence, according to one embodiment of the invention.
  • Figure 10 shows a schematic flow diagram 1002 depicting a segmentation approach based on the clustering of data, or more precisely, a "fuzzy c-means" clustering segmentation approach, used in an embodiment of the invention.
  • An objective of this clustering approach is to group pixels into clusters with similar values.
  • the method does not merge regions based on their spatial relation to each other. In other words, two non-neighboring regions may be merged, depending on the criterion used.
  • in Equation (20), vi is the "center" of the i-th cluster, and ‖·‖ is any inner-product-induced norm; the distance ‖xk − vi‖ measures the dissimilarity between a data point xk and the cluster center vi.
  • the method sets the initial value of vi randomly.
  • the method calculates values of uik, the fuzzy membership of xk to vi, according to Equation (21).
  • the method updates values of vi according to Equation (22), using the previously determined values of uik.
  • the algorithm converges when the relative decrease of the functional Jm as defined by Equation (20) is below a predefined threshold, for instance, 0.001.
  • in step 1010 of Figure 10, an embodiment of the method determines whether the relative decrease of Jm is below the threshold. If not, then the process beginning at step 1006 is repeated.
  • the segmentation is completed by labeling each pixel according to its highest fuzzy membership, i.e., assigning pixel xk to the cluster i for which uik is greatest.
  • Some embodiments have more regions than clusters, since pixels belonging to different regions with similar signals can contribute to the same cluster.
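Since the bodies of Equations (20) through (22) are not reproduced, the sketch below implements the standard fuzzy c-means updates, which match the description: random initial centers, alternating membership (Equation (21)) and center (Equation (22)) updates, and convergence when the relative decrease of Jm falls below a threshold. The fuzziness exponent m = 2 is an assumed typical value:

    import numpy as np

    def fuzzy_c_means(x, c=3, m=2.0, tol=1e-3, seed=0):
        """x: (n_pixels, N) array of whitening time series; returns labels and centers."""
        rng = np.random.default_rng(seed)
        v = x[rng.choice(len(x), size=c, replace=False)]      # random initial centers v_i
        j_prev = None
        while True:
            d = np.linalg.norm(x[:, None, :] - v[None, :, :], axis=2) + 1e-12
            u = 1.0 / d ** (2.0 / (m - 1.0))                  # memberships u_ik, Equation (21)
            u /= u.sum(axis=1, keepdims=True)
            um = u.T ** m
            v = (um @ x) / um.sum(axis=1, keepdims=True)      # center update, Equation (22)
            j = np.sum((u ** m) * d ** 2)                     # objective J_m, Equation (20)
            if j_prev is not None and (j_prev - j) / j_prev < tol:
                break                                         # step 1010: converged
            j_prev = j
        return u.argmax(axis=1), v                            # hard label = highest membership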
  • Figures 11A and 11B show an illustrative embodiment of the method at step 112 of Figure 1, relating images after segmentation.
  • Figure 11A depicts a segmentation mask 1102 produced using the clustering technique discussed above for an exemplary aceto-whitening sequence, according to the embodiment.
  • the number of clusters, c, chosen in this embodiment is 3. There are more regions than clusters, since portions of some clusters are non-contiguous.
  • the threshold for the relative decrease of Jm in step 1010 of Figure 10 is chosen as 0.001 in this embodiment.
  • Each cluster has a different label and is represented by a different color in the mask 1102.
  • Other embodiments employ other kinds of display techniques to relate images of the sequence based on segmentation and/or based on the measure of similarity.
  • Figure 11B shows a graph 1120 depicting mean signal intensities 1122 of segmented regions represented in Figure 11A as functions of a time index 1124 according to an illustrative embodiment of the invention.
  • the color of each data series in Figure 11B corresponds to the same-colored cluster depicted in Figure 11A.
  • the graph 1120 of Figure 11B is one way to visually relate a sequence of images using the results of segmentation according to this embodiment.
  • the method identifies a cluster having a high initial rate of increase of signal intensity 1122 by observing data series 1126 in Figure 11B, whose signal intensity increases more quickly than the other data series. Regions belonging to the same cluster have data series 1126 of the same color.
  • the method derives kinetic rate constants from each of the data series determined in Figure 11B, and the regions having data series most closely matching kinetic rate constants of interest are identified.
  • the method curve fits one or more data series to obtain a characterization of the mean signal intensities 1122 of each data series as functions of time.
  • Figures 11C and 11D illustrate an embodiment of the method at step 112 of Figure 1, relating images after segmentation.
  • Figure 11C depicts a segmentation mask 1140 produced using the clustering technique for the exemplary aceto-whitening sequence of Figure 11A, according to the embodiment.
  • the number of clusters, c, chosen is 2. Again, there are more regions than clusters, since portions of the same clusters are noncontiguous.
  • the threshold for the relative decrease of Jm in step 1010 of Figure 10 is 0.001 for this embodiment.
  • Figure 12 relates to step 110 of Figure 1, segmenting the area represented in a sequence of images into regions based on measures of similarity between regions over the sequence, according to an illustrative embodiment of the invention.
  • morphological techniques of segmentation are continued beyond filtering and pre-segmentation.
  • Figure 12 shows a schematic flow diagram 1202 depicting a hierarchical watershed approach of segmentation according to the illustrative embodiment.
  • at step 1204 of Figure 12, the method computes a gradient image from Equations (14) and (15) according to the embodiment, which incorporates data from the entire sequence of images, as discussed above.
  • the method segments data based on information from the entire sequence of images, not just one image.
  • a watershed transform is applied to this gradient image in step 1206 of Figure 12.
  • Some embodiments apply first-in-first-out (FIFO) queues, sorted data, and other techniques to speed the performance of the watershed transform.
  • the method applies a sigmoidal scaling function to the gradient image, prior to performing the watershed transform, to enhance the contrast between whitish and reddish (dark) regions, which is particularly useful when analyzing images of a cervix.
  • the catchment basins resulting from application of the watershed transform represent the segmented regions in this embodiment.
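A minimal sketch of this step, assuming scikit-image's watershed and a logistic scaling function; the gain and midpoint values are hypothetical, and markers are taken at local minima of the scaled gradient:

```python
import numpy as np
from scipy import ndimage as ndi
from skimage.segmentation import watershed

def watershed_segment(gradient, gain=10.0, midpoint=0.5):
    """Apply a sigmoidal contrast stretch to the gradient image, then a
    watershed transform; returns a label image of catchment basins."""
    g = 1.0 / (1.0 + np.exp(-gain * (gradient - midpoint)))   # sigmoidal scaling
    # Markers: one seed per local minimum of the scaled gradient.
    minima = (g == ndi.minimum_filter(g, size=3))
    markers, _ = ndi.label(minima)
    return watershed(g, markers)   # catchment basins = segmented regions
```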
  • Figure 13 shows a gradient image 1302 calculated from Equations (5) and (6) for an exemplary sequence of images, according to an embodiment of the invention.
  • the method at step 1208 of Figure 12 constructs a new gradient image using geodesic reconstruction.
  • Two different techniques of geodesic reconstruction which embodiments may employ include erosion and dilation.
  • the method determines whether over-segmentation has been reduced sufficiently, according to this embodiment. If so, the segmentation may be considered complete, or one or more additional segmentation techniques may be applied, such as a region merging or robust region merging technique, both of which are discussed above. If over-segmentation has not been reduced sufficiently, the method calculates the watershed transform of the reconstructed gradient image as in step 1206, and the process is continued.
  • Certain embodiment methods use the hierarchical watershed to segment larger areas, such as large lesions or the background cervix.
  • the number of iterations is less than about 4 such that regions do not become too large, obscuring real details.
  • the method performs one iteration of the hierarchical watershed, and continues merging regions using the robust region merging technique.
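One plausible sketch of this hierarchical loop, using geodesic reconstruction by erosion to suppress shallow minima between watershed passes; the reconstruction depth h and the stopping test are assumptions, and the iteration cap follows the guidance above that the number of iterations stay below about 4:

```python
from scipy import ndimage as ndi
from skimage.morphology import reconstruction
from skimage.segmentation import watershed

def hierarchical_watershed(gradient, h=0.05, max_iter=4):
    """Iteratively build a new gradient image by geodesic reconstruction
    (erosion), re-applying the watershed until the segmentation is coarse
    enough; returns the final label image."""
    g = gradient.astype(float)
    labels = None
    for _ in range(max_iter):
        minima = (g == ndi.minimum_filter(g, size=3))
        markers, n = ndi.label(minima)
        labels = watershed(g, markers)
        if n < 50:   # hypothetical "over-segmentation reduced" criterion
            break
        # Remove minima shallower than h; this is reconstruction by erosion.
        g = reconstruction(g + h, g, method='erosion')
    return labels
```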
  • Figures 14A and 14B show an illustrative embodiment of the method at step 112 of Figure 1, relating images after segmentation.
  • Figure 14A depicts a segmentation mask 1402 produced using one iteration of the hierarchical watershed technique discussed above for an exemplary aceto-whitening sequence, according to the embodiment. Each region has a different label, and is represented by a different color in the mask 1402.
  • Figure 14B depicts a segmentation mask 1430 produced using two iterations of the hierarchical watershed technique discussed above for the exemplary aceto-whitening sequence, according to the embodiment. The segmentation mask 1430 in Figure 14B, produced using two iterations, has fewer regions and is more simplified than the segmentation mask 1402 in Figure 14A, produced using one iteration.
  • the region growing technique is different from the region merging and robust region merging techniques in that one or more initial regions, called seed regions, grow by merging with neighboring regions.
  • the region merging, robust region merging, and region growing methods are each iterative.
  • the region merging and region growing techniques each use the same fitting function to evaluate the similarity between signals from neighboring regions.
  • the user manually selects seed regions.
  • the seed regions are selected in an automatic fashion. Hard criteria may be used to select areas that are of high interest and/or which behave in a certain way.
  • a user selects seed regions based on that user's experience.
  • the region growing algorithm then proceeds by detecting similar regions and adding them to the set of seeds.
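A sketch of such a region growing loop; the adjacency and mean-signal data structures are hypothetical, the similarity function follows Equation (11) below, and the 0.7 acceptance threshold is borrowed from the robust region merging criterion mentioned later:

```python
import numpy as np

def similarity(w_k, w_l):
    """Fitting function of Equation (11): dot product of two background-
    subtracted mean intensity series, normalized by the larger energy."""
    return np.dot(w_k, w_l) / max(np.dot(w_k, w_k), np.dot(w_l, w_l))

def grow_region(seeds, neighbors, mean_signal, threshold=0.7):
    """Grow the seed set by repeatedly absorbing neighboring regions whose
    mean signals fit the growing region well enough.

    seeds       : set of region labels selected manually or automatically
    neighbors   : dict mapping each region label to its adjacent labels
    mean_signal : dict mapping each region label to its mean intensity series
    """
    grown = set(seeds)
    changed = True
    while changed:
        changed = False
        for k in list(grown):
            for l in neighbors[k]:
                if l not in grown and similarity(mean_signal[k], mean_signal[l]) > threshold:
                    grown.add(l)   # merge the similar neighbor into the seed set
                    changed = True
    return grown
```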
  • One embodiment of the invention is a combined technique using the region growing algorithm, starting with a grain image, followed by performing one iteration of the hierarchical watershed technique, and then growing the selected region according to the robust region merging algorithm.
  • the segmentation techniques discussed herein are combined in various ways.
  • the method processes and analyzes data from a sequence of images in an aceto-whitening test, for instance, using a coarse-to-fine approach.
  • a first segmentation reveals large whitening regions, called background lesions, which are then considered as regions of interest and are masked for additional segmentation.
  • a second segmentation step of the embodiment may outline smaller regions, called foreground lesions. Segmentation steps subsequent to the second step may also be considered.
  • the term "lesion” does not necessarily refer to any diagnosed area, but to an area of interest, such as an area displaying a certain whitening characteristic during the sequence. From the final segmentation, regions are selected for diagnosis, preliminary or otherwise; for further analysis; or for biopsy, for example. Additionally, the segmentation information may be combined with manually drawn biopsy locations for which a pathology report may be generated.
  • the method still applies the pre-processing procedures discussed herein above before performing the multi-step segmentation techniques.
  • Figure 16A shows a segmentation mask produced using a combined clustering approach and robust region merging approach for an exemplary aceto-whitening sequence, according to an illustrative embodiment of the invention.
  • the method selects a boomerang-shaped background lesion, corresponding to a large whitening region.
  • the method masks out, or eliminates from further analysis, the remaining areas of the image frame.
  • a robust region merging procedure is applied, as shown in Figure 8, to the background lesion, according to the embodiment.
  • the method uses a similarity criterion of ψkl = 0.7 in step 807 of Figure 8, and a variance threshold of 120 in step 810 of Figure 8.
  • regions less than 16 pixels large are removed.
  • the resulting segmentation is shown in frame 1602 of Figure 16A.
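The 16-pixel cleanup might look like the following sketch; relabeling removed regions as background (label 0) is an assumption about how removed pixels are treated:

```python
import numpy as np

def remove_small_regions(labels, min_size=16):
    """Drop segmented regions smaller than min_size pixels from a label
    image by relabeling them as background (label 0)."""
    counts = np.bincount(labels.ravel())
    small = np.flatnonzero(counts < min_size)   # labels of undersized regions
    out = labels.copy()
    out[np.isin(out, small)] = 0
    return out
```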
  • Figure 16B shows a graph 1604 depicting mean signal intensity 1606 of segmented regions represented in Figure 16A as functions of a time index 1608 according to an illustrative embodiment of the invention.
  • the color of each curve in Figure 16B corresponds to the same-colored segment depicted in Figure 16A.
  • Regions having a high initial rate of increase of signal intensity 1606 include regions 1610 and 1612, shown in Figure 16A.
  • Figure 17A represents a segmentation mask produced using a combined clustering approach and watershed approach for the exemplary aceto-whitening sequence of Figure 16A, according to an illustrative embodiment of the invention.
  • the method applies a hierarchical watershed segmentation procedure, as shown in Figure 12.
  • the method computes one iteration of the watershed transform, then a region merging technique as per Figure 6, using a fitting value threshold of 0.85 in step 610 of Figure 6. Regions smaller than 16 pixels are removed. The resulting segmentation is shown in frame 1702 of Figure 17A.
  • Figure 17B shows a graph 1720 depicting mean signal intensity 1722 of segmented regions represented in Figure 17A as functions of a time index 1724 according to an illustrative embodiment of the invention.
  • the color of each curve in Figure 17B corresponds to the same-colored segment depicted in Figure 17A.
  • Regions having a high initial rate of increase of signal intensity 1722 include regions 1726 and 1728, shown in Figure 17A.
  • Figure 18A represents a segmentation mask produced using a two-step clustering approach for the exemplary aceto-whitening sequence of Figure 16A, according to an illustrative embodiment of the invention. The method performs pre-processing steps, including pre-segmenting pixels into grains using a watershed transform as discussed herein above.
  • Figure 18B shows a graph 1820 depicting mean signal intensity 1822 of segmented regions represented in Figure 18A as functions of a time index 1824 according to an illustrative embodiment of the invention.
  • the color of each curve in Figure 18B corresponds to the same-colored cluster depicted in Figure 18A.
  • Regions having a high initial rate of increase of signal intensity 1822 include regions 1828 and 1826, shown in Figure 18A.
  • Figure 19 depicts the human cervix tissue of Figure 2A with an overlay of manual doctor annotations made after viewing the exemplary aceto-whitening image sequence discussed herein above.
  • Based on her viewing of the sequence and on her experience with the aceto-whitening procedure, the doctor annotated regions with suspicion of pathology 1904, 1906, 1908, 1910, and 1912. The doctor did not examine results of any segmentation analysis prior to making the annotations. Regions 1910 and 1912 were singled out by the doctor as regions with the highest suspicion of pathology.
  • Figure 20A is a representation of a segmentation mask produced using a combined clustering approach and robust region merging approach as discussed above and as shown in Figure 16A, according to an illustrative embodiment of the invention.
  • the segmentation mask in Figure 20A is shown with a correspondingly-aligned overlay of the manual doctor annotations of Figure 19.
  • Figure 20B is a representation of a segmentation mask produced using a combined clustering approach and watershed technique as discussed above and shown in Figure 18A.
  • the segmentation mask in Figure 20B is shown with a correspondingly-aligned overlay of the manual doctor annotations of Figure 19, according to an embodiment of the invention.
  • Areas 1912 and 1910 in Figures 20A and 20B correspond to the doctor's annotations of areas of high suspicion of pathology.
  • these areas (1912 and 1910) correspond to regions of rapid, intense whitening.
  • areas of rapid, intense whitening correspond to areas of suspicion of pathology.
  • the techniques discussed herein provide a method of determining a tissue characteristic, namely, the presence or absence of a suspicion of pathology. Certain embodiments of the invention use the techniques in addition to a doctor's analysis or in place of a doctor's analysis. Certain embodiments use combinations of the methods described herein to produce similar results.
  • Some embodiments of the invention for applications other than the analysis of acetowhitening tests of cervical tissue also use various inventive analysis techniques as described herein.
  • a practitioner may customize elements of an analysis technique disclosed herein, based on the attributes of her particular application, according to embodiments of the invention. For instance, the practitioner may choose among the segmentation techniques disclosed herein, depending on the application for which she intends to practice embodiments of the inventive methods.
  • Certain embodiments of the inventive methods analyze more complex behavior.
  • Some embodiments segment the image plane of a sequence of images, then feature-extract the resulting mean intensity signals to characterize the signals of each segmented region.
  • Examples of feature extraction procedures include any number of curve fitting techniques or functional analysis techniques used to mathematically and/or statistically describe characteristics of one or more data series. In some embodiments, these features are then used in a manual, automated, or semi-automated method for the classification of tissue.
  • the method classifies a region of cervical tissue either as "high grade disease” tissue, which includes Cervical Intraepithelial Neoplasia II/III (CIN II/III), or as "not high grade disease” tissue, which includes normal squamous (NED - no evidence of disease), metaplasia, and CIN I tissue.
  • the classification for a segmented region may be made within a predicted degree of certainty using features extracted from the mean signal intensity curve corresponding to the region. In one embodiment, this classification is performed for each segmented region in an image plane to produce a map of regions of tissue classified as high grade disease tissue.
  • Other embodiments make more specific classifications and distinctions between tissue characteristics, such as distinction between NED, metaplasia, and CIN I tissue.
  • Figures 21A, 21B, 21C, and 21D depict steps in the classification of regions of tissue in a sequence of images obtained during an acetowhitening procedure performed on a patient with high grade disease according to an illustrative embodiment of the invention.
  • Figure 21A depicts a reference image 2102 of cervical tissue of the patient from a sequence of images obtained during the acetowhitening test.
  • Figure 21B is a representation 2106 of the reference image 2102 of Figure 21A after applying a manual mask, accounting for glare, and accounting for chromatic artifacts as discussed herein according to an illustrative embodiment of the invention.
  • Figure 21C shows a graph 2120 depicting mean signal intensities 2122 of segmented regions for the sequence of Figure 21A as functions of a time index 2124 and as determined using the clustering segmentation approach depicted in the schematic flow diagram 1002 of Figure 10.
  • a classification algorithm was designed using results of a clinical study. In the study, mean signal intensity curves were determined using sequences of images from acetowhitening tests performed on over 200 patients. The classification algorithm may be updated according to an embodiment of the invention upon conducting further or different clinical testing. The present algorithm was based upon two feature parameters extracted from each of certain mean signal intensity data series corresponding to segmented regions of the image sequences for which biopsies were performed. These two feature parameters are as follows:
  • X is the slope of a curve (here, a polynomial curve) fitted to the mean signal intensity data series of a segmented region at the time corresponding to 235 seconds after application of the acetic acid (M235); and
  • Y is the slope of the polynomial curve at the time corresponding to an intensity that is -16 dB from the maximum intensity (about 45% of the maximum mean signal intensity) on the decaying side of the polynomial curve (the -16 dB slope).
  • DF Function
  • a jackknifed classification matrix linear discriminant analysis was performed on the extracted features X and Y corresponding to certain of the mean signal intensity curves from each of the clinical tests. The curves used were those corresponding to regions for which tissue biopsies were performed. From the linear discriminant analysis, it was determined that a classification algorithm using the discriminant line shown in Equation (24) results in a diagnostic sensitivity of 88% and a specificity of 88% for the separation of CIN II/III (high grade disease) from the group consisting of normal squamous (NED), metaplasia, and CIN I tissue (not high grade disease).
  • Figure 21D represents a map 2130 of regions of tissue as segmented in Figure 21C classified as either high grade disease tissue or not high grade disease tissue using the classification algorithm of Equation (24).
  • This embodiment determined this classification for each of the segmented regions by calculating X and Y for each region and determining whether the point (X,Y) falls below the line of Equation (24), in which case the region was classified as high grade disease, or whether the point (X,Y) falls above the line of Equation (24), in which case the region was classified as not high grade disease.
  • the embodiment draws further distinction depending on how far above or below the line of Equation (24) the point (X,Y) falls.
  • the map 2130 of Figure 21D indicates segmented regions of high grade disease as red, orange, and yellow, and regions not classifiable as high grade disease as blue.
  • the index 2132 reflects how far the point (X,Y) of a given segment falls below the line of Equation (24).
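A sketch of this decision rule; the coefficients of Equation (24) are not reproduced in this excerpt, so the line below is a hypothetical stand-in:

```python
# Hypothetical coefficients a, b for a discriminant line Y = a*X + b standing
# in for Equation (24); the actual values come from the clinical-study LDA.
a, b = -1.0, 0.05

def classify_region(X, Y):
    """Classify a segmented region from its two extracted features: regions
    whose point (X, Y) falls below the discriminant line are called high
    grade disease; the margin below the line serves as the severity index."""
    margin = (a * X + b) - Y   # positive when (X, Y) lies below the line
    label = "high grade disease" if margin > 0 else "not high grade disease"
    return label, margin
```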
  • Other embodiments include those employing other segmentation techniques as described herein.
  • Still other embodiments include those employing different classification algorithms, including those using feature parameters other than X and Y, extracted from mean signal data series corresponding to segmented regions.

Abstract

The invention provides methods of relating a plurality of images (104) based on measures of similarity (108). The methods of the invention are useful in the segmentation (110) of a sequence of colposcopic images of tissue, for example. The methods may be applied in the determination of tissue characteristics (114) in acetowhitening testing of cervical tissue, for example.

Description

IMAGE PROCESSING USING MEASURES OF SIMILARITY
Prior Applications [0001] This application claims the benefit of U.S. Provisional Patent Application Serial Number 60/353,978, filed January 31, 2002, the contents of which are hereby incorporated by reference.
Government Rights [0002] This invention was made with government support under Grant No. 1-R44-CA-91618-01 awarded by the U.S. Department of Health and Human Services. The government has certain rights in the invention.
Field of the Invention [0003] This invention relates generally to image processing. More particularly, in certain embodiments, the invention relates to segmentation of a sequence of colposcopic images based on measures of similarity.
Background of the Invention [0004] It is common in the medical field to perform visual examination to diagnose disease. For example, visual examination of the cervix can discern areas where there is a suspicion of pathology. However, direct visual observation alone is often inadequate for identification of abnormalities in a tissue.
[0005] In some instances, when tissues of the cervix are examined in vivo, chemical agents such as acetic acid are applied to enhance the differences in appearance between normal and pathological areas. Aceto-whitening techniques may aid a colposcopist in the determination of areas where there is a suspicion of pathology. [0006] However, colposcopic techniques generally require analysis by a highly trained physician. Colposcopic images may contain complex and confusing patterns. In colposcopic techniques such as aceto-whitening, analysis of a still image does not capture the patterns of change in the appearance of tissue following application of a chemical agent. These patterns of change may be complex and difficult to analyze. Current automated image analysis methods do not allow the capture of the dynamic information available in various colposcopic techniques.
[0007] Traditional image analysis methods include segmentation of individual images.
Segmentation is a morphological technique that splits an image into different regions according to one or more pre-defined criteria. For example, an image may be divided into regions of similar intensity. It may therefore be possible to determine which sections of a single image have an intensity within a given range. If a given range of intensity indicates suspicion of pathology, the segmentation may be used as part of a diagnostic technique to determine which regions of an image may indicate diseased tissue.
[0008] However, standard segmentation techniques do not take into account dynamic information, such as a change of intensity over time. This kind of dynamic information is important to consider in various diagnostic techniques such as aceto-whitening colposcopy. A critical factor in discriminating between healthy and diseased tissue may be the manner in which the tissue behaves throughout a diagnostic test, not just at a given time. For example, the rate at which a tissue whitens upon application of a chemical agent may be indicative of disease.
Traditional segmentation techniques do not take into account time-dependent behavior, such as rate of whitening.
Summary of the Invention
[0009] The invention provides methods for relating aspects of a plurality of images of a tissue in order to obtain diagnostic information about the tissue. In particular, the invention provides methods for image segmentation across a plurality of images instead of only one image at a time. In a sense, inventive methods enable the compression of a large amount of pertinent information from a sequence of images into a single frame. An important application of methods of the invention is the analysis of a sequence of images of biological tissue in which an agent has been applied to the tissue in order to change its optical properties in a way that is indicative of the physiological state of the tissue. Diagnostic tests which have traditionally required analysis by trained medical personnel may be automatically analyzed using these methods. The invention may be used, for example, in addition to or in place of traditional analysis.
[0010] The invention provides methods of performing image segmentation using information from a sequence of images, not just from one image at a time. This is important because it allows the incorporation of time effects in image segmentation. For example, according to an embodiment of the invention, an area depicted in a sequence of images is divided into regions based on a measure of similarity of the changes those regions undergo throughout the sequence.
In this way, inventive segmentation methods incorporate more information and can be more helpful, for example, in determining a characteristic of a tissue, than segmentation performed using one image at a time. The phrases "segmentation of an image" and "segmenting an image," as used herein, may apply, for example, to dividing an image into different regions, or dividing into different regions an area in space depicted in one or more images (an image plane).
[0011] Segmentation methods of this invention allow, for example, the automated analysis of a sequence of images using complex criteria for determining a disease state which may be difficult or impossible for a human analyst to perceive by simply viewing the sequence. The invention also allows the very development of these kinds of complex criteria for determining a disease state by permitting the relation of complex behaviors of tissue samples during dynamic diagnostic tests to the known disease states of the tissue samples. Criteria may be developed using the inventive methods described herein to analyze sequences of images for dynamic diagnostic tests that are not yet in existence. [0012] One way to relate a plurality of images to each other according to the invention is to create or use a segmentation mask that represents an image plane divided into regions that exhibit similar behavior throughout a test sequence. Another way to relate images is by creating or using graphs or other means of data representation which show mean signal intensities throughout each of a plurality of segmented regions as a function of time. Relating images may also be performed by identifying any particular area represented in an image sequence which satisfies given criteria.
[0013] In one aspect, the invention is directed to a method of relating a plurality of images of a tissue. The method includes the steps of: obtaining a plurality of images of a tissue; determining a relationship between two or more regions in each of two or more of the images; segmenting at least a subset of the two or more images based on the relationship; and relating two or more images of the subset of images based at least in part on the segmentation.
[0014] According to one embodiment, the step of obtaining a plurality of images of a tissue includes collecting an optical signal. In one embodiment, the optical signal includes fluorescence illumination from the tissue. In one embodiment, the optical signal includes reflectance, or backscatter, illumination from the tissue. In one embodiment, the tissue is illuminated by a white light source, a UV light source, or both. According to one embodiment, the step of obtaining images includes recording visual images of the tissue.
[0015] According to one embodiment, the tissue is or includes cervical tissue. In another embodiment, the tissue is one of the group consisting of epithelial tissue, colorectal tissue, skin, and uterine tissue. In one embodiment, the plurality of images being related are sequential images. In one embodiment, the chemical agent is applied to change the tissue's optical properties in a way that is indicative of the physiological state of the tissue. According to one embodiment, a chemical agent is applied to the tissue. In one embodiment, the chemical agent is selected from the group consisting of acetic acid, formic acid, propionic acid, butyric acid, Lugol's iodine, Schiller's iodine, methylene blue, toluidine blue, osmotic agents, ionic agents, and indigo carmine. In certain embodiments, the method includes filtering two or more of the images. In one embodiment, the method includes applying either or both of a temporal filter, such as a morphological filter, and a spatial filter, such as a diffusion filter.
[0016] In one embodiment, the step of determining a relationship between two or more regions in each of two or more of the images includes determining a measure of similarity between at least two of the two or more images. In one embodiment, determining the measure of similarity includes computing an N-dimensional dot product of the mean signal intensities of two of the two or more regions. In one embodiment, the two regions (of the two or more regions) are neighboring regions.
[0017] According to one embodiment, the step of relating images based on the segmentation includes determining a segmentation mask of an image plane, where two or more regions of the image plane are differentiated. In one embodiment, the step of relating images based on the segmentation includes defining one or more data series representing a characteristic of one or more associated segmented regions of the image plane. In one embodiment, this characteristic is mean signal intensity.
[0018] According to one embodiment, the step of relating images includes creating or using a segmentation mask that represents an image plane divided into regions that exhibit similar behavior throughout the plurality of images. In one embodiment, the step of relating images includes creating or using graphs or other means of data representation which show mean signal intensities throughout each of a plurality of segmented regions as a function of time. In one embodiment, the step of relating images includes identifying a particular area represented in the image sequence which satisfies given criteria.
[0019] In another aspect, the invention is directed to a method of relating a plurality of images of a tissue, where the method includes the steps of: obtaining a plurality of images of a tissue; determining a measure of similarity between two or more regions in each of two or more of the images; and relating at least a subset of the images based at least in part on the measure of similarity. In one embodiment, the step of determining a measure of similarity includes computing an N-dimensional dot product of the mean signal intensities of two of the two or more regions. In one embodiment, the two regions are neighboring regions.
[0020] In another aspect, the invention is directed to a method of determining a tissue characteristic. The method includes the steps of: obtaining a plurality of images of a tissue; determining a relationship between two or more regions in each of two or more of the images; segmenting at least a subset of the two or more images based at least in part on the relationship; and determining a characteristic of the tissue based at least in part on the segmentation.
[0021] According to one embodiment, the step of obtaining a plurality of images of a tissue includes collecting an optical signal. In one embodiment, the optical signal includes fluorescence illumination from the tissue. In one embodiment, the optical signal includes reflectance, or backscatter, illumination from the tissue. In one embodiment, the tissue is illuminated by a white light source, a UV light source, or both. According to one embodiment, the step of obtaining images includes recording visual images of the tissue.
[0022] According to one embodiment, the tissue is or includes cervical tissue. In another embodiment, the tissue is one of the group consisting of epithelial tissue, colorectal tissue, skin, and uterine tissue. In one embodiment, the plurality of images being related are sequential images. In one embodiment, the chemical agent is applied to change the tissue's optical properties in a way that is indicative of the physiological state of the tissue. According to one embodiment, a chemical agent is applied to the tissue. In one embodiment, the chemical agent is selected from the group consisting of acetic acid, formic acid, propionic acid, butyric acid, Lugol's iodine, Schiller's iodine, methylene blue, toluidine blue, osmotic agents, ionic agents, and indigo carmine. In certain embodiments, the method includes filtering two or more of the images. In one embodiment, the method includes applying either or both of a temporal filter, such as a morphological filter, and a spatial filter, such as a diffusion filter. In one embodiment, the method includes processing two or more images to compensate for a relative motion between the tissue and a detection device.
[0023] According to one embodiment, the step of determining a relationship between two or more regions in each of two or more of the images includes determining a measure of similarity between at least two of the two or more images. In certain embodiments, determining this measure of similarity includes computing an N-dimensional dot product of the mean signal intensities of two of the two or more regions. In one embodiment, the two regions are neighboring regions.
[0024] According to one embodiment, the segmenting step includes analyzing an aceto- whitening signal. In one embodiment, the segmenting step includes analyzing a variance signal. In one embodiment, the segmenting step includes determining a gradient image. [0025] According to one embodiment, the method includes processing one or more optical signals based on the segmentation. In one embodiment, the method includes filtering at least one image based at least in part on the segmentation.
[0026] In certain embodiments, the step of determining a characteristic of the tissue includes determining one or more regions of the tissue where there is suspicion of pathology. In certain embodiments, the step of determining a characteristic of the tissue includes classifying a region of tissue as one of the following: normal squamous tissue, metaplasia, Cervical Intraepithelial Neoplasia, Grade I (CIN I), and Cervical Intraepithelial Neoplasia, Grade II or Grade III (CIN II/CIN III).
[0027] In another aspect, the invention is directed to a method of determining a characteristic of a tissue. The method includes the steps of: (a) for each of a first plurality of reference sequences of images of tissue having a first known characteristic, quantifying one or more features of each of a first plurality of mean signal intensity data series corresponding to segmented regions represented in each of the first plurality of reference sequences of images; (b) for a test sequence of images, quantifying one of more features of each of one or more mean signal intensity data series corresponding to one or more segmented regions represented in the test sequence of images; and (c) determining a characteristic of a tissue represented in the test sequence of images based at least in part on a comparison between the one or more features quantified in step (a) and the one or more features quantified in step (b).
[0028] According to one embodiment, step (c) includes repeating step (a) for each of a second plurality of reference sequences of images of tissue having a second known characteristic. In one embodiment, step (c) includes applying a classification rule based at least in part on the first plurality of reference sequences and the second plurality of reference sequences. In one embodiment, step (c) includes performing a linear discriminant analysis to determine the classification rule. In one embodiment, one of the one or more features of step (a) includes the slope of a curve at a given point fitted to one of the plurality of mean signal intensity data series. According to one embodiment, the method includes determining the segmented regions of the test sequence of images by analyzing an acetowhitening signal. In one embodiment, the first known characteristic is CIN II/CIN III and the second known characteristic is absence of CIN II/CIN III.
Brief Description of the Drawings [0029] The objects and features of the invention can be better understood with reference to the drawings described below, and the claims. The drawings are not necessarily to scale, emphasis instead generally being placed upon illustrating the principles of the invention. In the drawings, like numerals are used to indicate like parts throughout the various views. [0030] The patent or application file contains at least one drawing executed in color. Copies of this patent or patent application publication with color drawing(s) will be provided by the U.S.
Patent and Trademark Office upon request and payment of the necessary fee.
[0031] Figure 1 is a schematic flow diagram depicting steps in the analysis of a sequence of images of tissue according to an illustrative embodiment of the invention.
[0032] Figure 2A depicts human cervix tissue and shows an area of which a sequence of images are to be obtained according to an illustrative embodiment of the invention.
[0033] Figure 2B depicts the characterization of a discrete signal from a sequence of images of tissue according to an illustrative embodiment of the invention.
[0034] Figure 3 shows a series of graphs depicting mean signal intensity of a region as a function of time, as determined from a sequence of images according to an illustrative embodiment of the invention.
[0035] Figure 4A depicts a "maximum" RGB image representation used in preprocessing a sequence of images of tissue according to an illustrative embodiment of the invention.
[0036] Figure 4B depicts the image representation of Figure 4A after applying a manual mask, used in preprocessing a sequence of images of tissue according to an illustrative embodiment of the invention.
[0037] Figure 4C depicts the image representation of Figure 4B after accounting for glare, used in preprocessing a sequence of images of tissue according to an illustrative embodiment of the invention.
[0038] Figure 4D depicts the image representation of Figure 4C after accounting for chromatic artifacts, used in preprocessing a sequence of images of tissue according to an illustrative embodiment of the invention. [0039] Figure 5 shows a graph illustrating the determination of a measure of similarity of time series of mean signal intensity for each of two regions according to an illustrative embodiment of the invention.
[0040] Figure 6 is a schematic flow diagram depicting a region merging approach of segmentation according to an illustrative embodiment of the invention.
[0041] Figure 7A represents a segmentation mask produced using a region merging approach according to an illustrative embodiment of the invention.
[0042] Figure 7B shows a graph depicting mean signal intensities of segmented regions represented in Figure 7A as functions of time according to an illustrative embodiment of the invention.
[0043] Figure 8 is a schematic flow diagram depicting a robust region merging approach of segmentation according to an illustrative embodiment of the invention.
[0044] Figure 9A represents a segmentation mask produced using a robust region merging approach according to an illustrative embodiment of the invention.
[0045] Figure 9B shows a graph depicting mean variance signals of segmented regions represented in Figure 9A as functions of time according to an illustrative embodiment of the invention.
[0046] Figure 10 is a schematic flow diagram depicting a clustering approach of segmentation according to an illustrative embodiment of the invention.
[0047] Figure 11A represents a segmentation mask produced using a clustering approach according to an illustrative embodiment of the invention.
[0048] Figure 11B shows a graph depicting mean signal intensities of segmented regions represented in Figure 11A as functions of time according to an illustrative embodiment of the invention. [0049] Figure 11C represents a segmentation mask produced using a clustering approach according to an illustrative embodiment of the invention.
[0050] Figure 11D shows a graph depicting mean signal intensities of segmented regions represented in Figure 11C as functions of time according to an illustrative embodiment of the invention.
[0051] Figure 12 is a schematic flow diagram depicting a watershed approach of segmentation according to an illustrative embodiment of the invention.
[0052] Figure 13 represents a gradient image used in a watershed approach of segmentation according to an illustrative embodiment of the invention.
[0053] Figure 14A represents a segmentation mask produced using a watershed approach according to an illustrative embodiment of the invention.
[0054] Figure 14B represents a segmentation mask produced using a watershed approach according to an illustrative embodiment of the invention.
[0055] Figure 15A represents a seed region superimposed on a reference image from a sequence of images, used in a region growing approach of segmentation according to an illustrative embodiment of the invention.
[0056] Figure 15B represents the completed growth of the "seed region" of Figure 15A using a region growing approach according to an illustrative embodiment of the invention.
[0057] Figure 16A represents a segmentation mask produced using a combined clustering approach and robust region merging approach according to an illustrative embodiment of the invention.
[0058] Figure 16B shows a graph depicting mean signal intensities of segmented regions represented in Figure 16A as functions of time according to an illustrative embodiment of the invention. [0059] Figure 17A represents a segmentation mask produced using a combined clustering approach and watershed technique according to an illustrative embodiment of the invention.
[0060] Figure 17B shows a graph depicting mean signal intensities of segmented regions represented in Figure 17A as functions of time according to an illustrative embodiment of the invention.
[0061] Figure 18A represents a segmentation mask produced using a two-part clustering approach according to an illustrative embodiment of the invention.
[0062] Figure 18B shows a graph depicting mean signal intensities of segmented regions represented in Figure 18A as functions of time according to an illustrative embodiment of the invention.
[0063] Figure 19 depicts the human cervix tissue of Figure 2A with an overlay of manual doctor annotations made after viewing an image sequence.
[0064] Figure 20A is a representation of a segmentation mask produced using a combined clustering approach and robust region merging approach with a correspondingly-aligned overlay of the manual doctor annotations of Figure 19, according to an embodiment of the invention.
[0065] Figure 20B is a representation of a segmentation mask produced using a combined clustering approach and morphological technique with a correspondingly-aligned overlay of the manual doctor annotations of Figure 19, according to an embodiment of the invention.
[0066] Figure 21A depicts a reference image of cervical tissue of a patient from a sequence of images obtained during an acetowhitening test according to an illustrative embodiment of the invention.
[0067] Figure 21B depicts an image from the sequence of Figure 21A after applying a manual mask, accounting for glare, and accounting for chromatic artifacts according to an illustrative embodiment of the invention. [0068] Figure 21C shows a graph depicting mean signal intensities of segmented regions for the sequence of Figure 21A determined using a clustering segmentation approach according to an illustrative embodiment of the invention.
[0069] Figure 21D represents a map of regions of tissue as segmented in Figure 21C classified as either high grade disease tissue or not high grade disease tissue using a classification algorithm according to an illustrative embodiment of the invention.
Description of the Illustrative Embodiment
[0070] In general, the invention provides methods for image segmentation across a plurality of images. Segmentation across a plurality of images provides a much more robust analysis than segmentation in a single image. Segmentation across multiple images according to the invention allows incorporation of a temporal element (e.g., the change of tissue over time in a sequence of images) in optics-based disease diagnosis. The invention provides means to analyze changes in tissue over time in response to a treatment. It also provides the ability to increase the resolution of segmented imaging by increasing the number of images over time. This allows an additional dimension to image-based tissue analysis, which leads to increased sensitivity and specificity of analysis. The following is a detailed description of a preferred embodiment of the invention.
[0071] The schematic flow diagram 100 of Figure 1 depicts steps in the analysis of a sequence of images of tissue according to an illustrative embodiment of the invention. Figure 1 also serves as a general outline of the contents of this description. Each of the steps of Figure 1 is discussed herein in detail. Briefly, the steps include obtaining a sequence of images of the tissue
102, preprocessing the images 104, determining a measure of similarity between regions in each of the images 108, segmenting the images 110, relating the images 112, and finally, determining a tissue characteristic 114. Though not pictured in Figure 1, the steps may be preceded by application of a chemical agent onto the tissue, for example. In other embodiments, a chemical agent is applied during the performance of the steps of the schematic flow diagram 100 of Figure 1. [0072] Among the key steps of the inventive embodiments discussed here are determining a measure of similarity between regions of tissue represented in a sequence of images and segmenting the images based on the measure of similarity. Much of the mathematical complexity presented in this description regards various methods of performing these key steps.
As will become evident, different segmentation methods have different advantages. The segmentation techniques of the inventive embodiments discussed herein include region merging, robust region merging, clustering, watershed, and region growing techniques, as well as combinations of these techniques.
[0073] Figures 2 A and 2B relate to step 102 of Figure 1, obtaining a sequence of images of the tissue. Although embodiments of the invention are not limited to aceto-whitening tests, an exemplary sequence of images from an aceto-whitening test performed on a patient is used herein to illustrate certain embodiments of the invention. Figure 2A depicts a full-frame image 202 of a human cervix after application of acetic acid, at the start of an aceto-whitening test. The inset image 204 depicts an area of interest to be analyzed herein using embodiment methods of the invention. This area of interest may be determined by a technician or may be determined in a semi-automated fashion using a multi-step segmentation approach such as one of those discussed herein below.
[0074] Figure 2B depicts the characterization 206 of a discrete signal w(ij;t) from a sequence of images of tissue according to an illustrative embodiment of the invention. The signal could be any type of image signal of interest known in the art. In the illustrative embodiment, the signal is an intensity signal of an image.
[0075] In the illustrative embodiment, images of an area of interest are taken at N time steps
{t0, t1, ..., tN-1}. In one embodiment, time t0 corresponds to the moment of application of a chemical agent to the tissue, for instance, and time tN-1 corresponds to the end of the test. In another embodiment, time t0 corresponds to a moment following the application of a chemical agent to the tissue. For example, let the image domain be {0, ..., r-1} x {0, ..., c-1} and the time domain be T = {t0, ..., tN-1}, where r is the number of rows and c is the number of columns.
Then, r x c discrete signals w(i,j;t) may be constructed describing the evolution of some optically-detectable phenomena, such as aceto-whitening, with time. For an aceto-whitening example, the "whiteness" may be computed from RGB data of the images. There are any number of metrics which may be used to define "whiteness." For instance, an illustrative embodiment employs an intensity component, CCIR 601, as a measure of "whiteness" of any particular pixel, defined in terms of red (R), green (G), and blue (B) intensities as follows:
I = 0.299R + 0.587G + 0.114B . (1)
[0076] The "whitening" data is then given by w(i,j;tn) = I(i,j;tn), for example. Alternatively, the signal w(i,j;t) is defined in any of a multiplicity of other ways. The characterization 206 of
Figure 2B shows that the intensity signal w(i,j;t) has a value corresponding to each discrete location (i,j) in each of the images taken at N discrete time steps. According to the illustrative embodiment, a location (i,j) in an image corresponds to a single pixel. In an aceto-whitening example, since it is the whitening of the cervix that is of interest and not the absolute intensity of the cervix surface, the whitening signals are background subtracted. In one example of background subtraction, each of the signals corresponding to a given location at a particular time step are transformed by subtracting the initial intensity signal at that location as shown in Equation (2):
w(i,j;tn) -> w(i,j;tn) - w(i,j;t0), for all tn in T . (2)
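A compact sketch of Equations (1) and (2) applied to a whole image sequence; the array shapes are assumptions:

```python
import numpy as np

def whitening_signal(rgb_sequence):
    """Compute the background-subtracted whitening signal w(i,j;tn) from an
    (N, rows, cols, 3) RGB image sequence, per Equations (1) and (2)."""
    R = rgb_sequence[..., 0].astype(float)
    G = rgb_sequence[..., 1].astype(float)
    B = rgb_sequence[..., 2].astype(float)
    I = 0.299 * R + 0.587 * G + 0.114 * B   # CCIR 601 intensity (Eq. 1)
    return I - I[0]                          # subtract the initial frame (Eq. 2)
```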
[0077] Noise, glare, and sometimes chromatic artifacts may corrupt images in a sequence. Signal noise due to misaligned image pairs and local deformations of the tissue may be taken into account as well. Alignment functions and image restoration techniques often do not adequately reduce this type of noise. Therefore, it may be necessary to apply temporal and spatial filters. [0078] Figure 3 relates to part of step 104 of Figure 1, preprocessing the images. Figure 3 shows a series of graphs depicting mean signal intensity 304 of a pixel as a function of time 306, as determined from a sequence of images according to an illustrative embodiment of the invention. The graphs depict application of a morphological filter, application of a diffusion filter, modification of intensity data to account for background intensity, and normalization of intensity data, according to an illustrative embodiment of the invention.
[0079] According to the illustrative embodiment, a first filter is applied to the time axis, individually for each pixel. The images are then spatially filtered. Graph 302 of Figure 3 depicts the application of both a temporal filter and a spatial filter at a representative pixel. The original data is connected by a series of line segments 308. It is evident from graph 302 that noise makes the signal choppy and adversely affects further analysis if not removed.
[0080] For temporal filtering, the illustrative embodiment of the invention applies the morphological filter of Equation (3):
w ⊙ b = (1/2)[(w ∘ b) • b + (w • b) ∘ b] , (3) where b is the structuring element, ∘ is the opening operator, and • is the closing operator. According to the illustrative embodiment, the structuring element has a half circle shape. The temporally-filtered data is connected by a series of line segments 310 in the graph 302 of Figure 3. The noise is decreased from the series 308 to the series 310.
[0081] Illustratively, the images are then spatially filtered, for example, with either an isotropic or a Gaussian filter. A diffusion equation implemented by an illustrative isotropic filter may be expressed as Equation (4):
∂w/∂τ = κ∇·∇w = κΔw , (4) where ∇ is the gradient operator, Δ is the Laplacian operator, and τ is the diffusion time (distinguished from the time component of the whitening signal itself). An isotropic filter is iterative, while a Gaussian filter is an infinite impulse response (IIR) filter. The iterative filter of Equation (4) is much faster than a Gaussian filter, since the iterative filter allows for increasing smoothness by performing successive iterations. The Gaussian filter requires re-applying a more complex filter to the original image for increasing degrees of filtration. According to the illustrative embodiment, the methods of the invention perform two iterations. However, in other embodiments, the method performs one iteration or three or more iterations. The spatially-filtered data for a representative pixel is connected by a series of line segments 312 in graph 302 of Figure 3. The noise is decreased from series 310 to series 312.
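A sketch of the iterative isotropic filter of Equation (4); the step size kappa is a hypothetical value chosen for numerical stability of the explicit scheme:

```python
import numpy as np
from scipy.ndimage import laplace

def isotropic_diffusion(image, kappa=0.2, iterations=2):
    """Smooth an image by explicit iteration of the diffusion equation (4):
    w <- w + kappa * Laplacian(w). Two iterations, as in the illustrative
    embodiment."""
    w = image.astype(float)
    for _ in range(iterations):
        w = w + kappa * laplace(w)
    return w
```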
[0082] Graph 314 of Figure 3 shows the application of Equation (2), background subtracting the intensity signal 304. Graph 318 of Figure 3 shows the intensity signal data following normalization 320. In the illustrative embodiment, as explained below in further detail, normalization includes division of values of the intensity signal 304 by a reference value, such as the maximum intensity signal over the sequence of images. Glare and chromatic artifacts can affect selection of the maximum intensity signal; thus, in an illustrative embodiment, normalization is performed subsequent to correcting for glare and chromatic artifacts. [0083] In the illustrative embodiment, the invention masks glare and chromatic artifacts from images prior to normalization. In the case of whitening data, glare may have a negative impact, since glare is visually similar to the tissue whitening that is the object of the analysis. Chromatic artifacts may have a more limited impact on the intensity of pixels and may be removed with the temporal and spatial filters described above.
[0084] Thresholding may be used to mask out glare and chromatic artifacts. In the illustrative embodiment, thresholding is performed in the L*u*v* color space. Preferably, the invention also employs a correlate for hue, expressed as in Equation (5):
h* = arctan(v*/u*) . (5)
Since the hue h* is a periodic function, the illustrative methods of the invention rotate the u*-v* plane such that the typical reddish color of the cervix correlates to higher values of h*. This makes it possible to work with a single threshold for chromatic artifacts. The rotation is given by Equation (6):
[Equation (6), the rotation of the u*-v* plane, is rendered as an image in the original document.]
The masks for glare and chromatic artifacts are then respectively obtained using Equations (7) and (8):
mask_glare = L* > 90 , (7)
mask_chrom = h* < π/5 , (8)
where L* ∈ [0, 100] and h* ∈ [0, 2π]. According to the illustrative embodiment, the masks are eroded to create a safety margin, such that they are slightly larger than the corrupted areas. [0085] Figures 4A and 4B relate to part of step 104 of Figure 1, preprocessing the images. Figure 4A depicts a "maximum" RGB image representation 402 used in preprocessing a sequence of images of tissue according to an illustrative embodiment of the invention. In the illustrative embodiment, the maximum RGB image is computed by taking for each pixel the maximum RGB values in the whole sequence of images.
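A sketch of the glare and chromatic-artifact masking of Equations (7) and (8); the hue rotation of Equation (6) is omitted and the hue threshold is an assumption, so only the structure of the computation is illustrative:

```python
import numpy as np
from scipy.ndimage import binary_dilation
from skimage.color import rgb2luv

def artifact_masks(rgb, glare_thresh=90.0, hue_thresh=np.pi / 5, margin=2):
    """Threshold in L*u*v* space to mask glare (Eq. 7) and chromatic
    artifacts (Eq. 8), then grow each mask slightly so it covers a safety
    margin around the corrupted areas."""
    luv = rgb2luv(rgb)
    L, u, v = luv[..., 0], luv[..., 1], luv[..., 2]
    h = np.mod(np.arctan2(v, u), 2 * np.pi)   # hue correlate in [0, 2*pi)
    mask_glare = binary_dilation(L > glare_thresh, iterations=margin)
    mask_chrom = binary_dilation(h < hue_thresh, iterations=margin)
    return mask_glare, mask_chrom
```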
[0086] Figure 4B depicts the image representation of Figure 4A after applying a manual mask, used in preprocessing a sequence of images of tissue according to an illustrative embodiment of the invention. According to the illustrative embodiment, the method applies the manual mask in addition to the masks for glare and chromatic effects in order to account for obstructions such as hair, foam from the chemical agent, or other obstruction, and/or to narrow analysis to an area of interest. Area 406 of the frame 404 of Figure 4B has been manually masked in accord with the methods of the embodiment.
[0087] Figure 4C, which depicts the image representation of Figure 4B after accounting for glare, is used in preprocessing a sequence of images of tissue according to an illustrative embodiment of the invention. Note that the areas 408, 410, 412, 414, 416, and 418 of the frame
405 of Figure 4C have been masked for glare using Equation (7).
[0088] Figure 4D, which depicts the image representation of Figure 4C after accounting for chromatic artifacts, is used in preprocessing a sequence of images of tissue according to an illustrative embodiment of the invention. The areas 420 and 422 of the frame 407 of Figure 4D have been masked for chromatic artifacts using Equation (8).
[0089] To reduce the amount of data to process and to improve the signal-to-noise ratio of the signals used in the segmentation techniques discussed below, illustrative methods of the invention pre-segment the image plane into grains. Illustratively, the mean grain surface is about 30 pixels. However, in other embodiments, it is between about a few pixels and about a few hundred pixels. The segmentation methods can be applied starting at either the pixel level or the grain level.
[0090] One way to "pre-segment" the image plane into grains is to segment each of the images in the sequence using a watershed transform. One goal of the watershed technique is to simplify a gray-level image by viewing it as a three-dimensional surface and by progressively "flooding" the surface from below through "holes" in the surface. In one embodiment, the third dimension is the gradient of an intensity signal over the image plane (further discussed herein below). A
"hole" is located at each minimum of the surface, and areas are progressively flooded as the
"water level" reaches each minimum. The flooded minima are called catchment basins, and the borders between neighboring catchment basins are called watersheds. The catchment basins determine the pre-segmented image.
[0091] Image segmentation with the watershed transform is preferably performed on the image gradient. If the watershed transform is performed on the image itself, and not the gradient, the watershed transform may obliterate important distinctions in the images. Determination of a gradient image is discussed herein below. [0092] Segmentation is a process by which an image is split into different regions according to one or more pre-defined criteria. In certain embodiments of the invention, segmentation methods are performed using information from an entire sequence of images, not just from one image at a time. The area depicted in the sequence of images is split into regions based on a measure of similarity of the detected changes those regions undergo throughout the sequence. [0093] Segmentation is useful in the analysis of a sequence of images such as in aceto-whitening cervical testing. In an illustrative embodiment, since the analysis of time-series data with a one-pixel resolution is not possible unless motion, artifacts, and noise are absent or can be precisely identified, segmentation is needed. Often, filtering and masking procedures are insufficient to adequately relate regions of an image based on the similarity of the signals those regions produce over a sequence of images. Therefore, the illustrative methods of the invention average time-series data over regions made up of pixels whose signals display similar behavior over time.
[0094] In the illustrative embodiment, regions of an image are segmented based at least in part upon a measure of similarity of the detected changes those regions undergo. Since a measure of similarity between regions depends on the way regions are defined, and since regions are defined based upon criteria involving the measure of similarity, the illustrative embodiment of the invention employs an iterative process for segmentation of an area into regions. In some embodiments, segmentation begins by assuming each pixel or each grain (as determined above) represents a region. These individual pixels or grains are then grouped into regions according to criteria defined by the segmentation method. These regions are then merged together to form new, larger regions, again according to criteria defined by the segmentation method. [0095] A problem that arises when processing raw image data is its high dimension. With a typical whitening signal for a single pixel described by, for example, a sixty-or-more-dimensional vector, it is often necessary to reduce data dimensionality prior to processing. In the illustrative embodiment, the invention obtains a scalar that quantifies a leading characteristic of two vectors. More particularly, illustrative methods of the invention take the N-dimensional inner (dot) product of two vectors corresponding to two pixel coordinates. A fitting function based on this dot product is shown in Equation (9). This fitting function quantifies the similarity between the signals at two locations.
$$\varphi(x_1, x_2) = \frac{\langle w(x_1;t),\, w(x_2;t)\rangle}{\max\big(\Omega(x_1),\,\Omega(x_2)\big)} \tag{9}$$

where $x_1$ and $x_2$ are two pixel coordinates, and $\Omega(x_i) = \langle w(x_i;t),\, w(x_i;t)\rangle$ is the energy of the signal at location $x_i$.
[0096] Figure 5 relates to step 108 of Figure 1, determining a measure of similarity between regions in each of a series of images. Figure 5 shows a graph 502 illustrating the determination of a measure of similarity of a time series of mean signal intensity 504 for each of two regions k and l according to an illustrative embodiment of the invention. Figure 5 represents one step in an illustrative segmentation method in which the similarity between the time-series signals of two neighboring regions is compared against criteria to determine whether those regions should be merged together. The type of measure of similarity chosen may vary depending on the segmentation method employed.
[0097] Curve 506 in Figure 5 represents the mean signal intensity 504 of region k in each of a sequence of images and is graphed versus time 505. Mean signal intensity 504 of region k is expressed as in Equation (10):
$$w(k;t) = \frac{1}{N_k} \sum_{(i,j)\in S_k} w(i,j;t) \tag{10}$$

where $S_k \subset \mathbb{N}^2$ is the set of all pixels that belong to the $k$th region and $N_k$ is the size of $S_k$.
[0098] Curve 508 of Figure 5 represents the mean signal intensity 504 of region l and is graphed versus time 505. Mean signal intensity 504 of region l is expressed as in Equation (10), replacing "k" with "l" in appropriate locations. The shaded area 510 of Figure 5 represents dissimilarity between mean signals over region k and region l. The chosen measure of similarity, ψ_kl, also referred to herein as the fitting function, between regions k and l may depend on the segmentation method employed. For the region merging segmentation technique, discussed in more detail below, as well as for other segmentation techniques, the measure of similarity used is shown in Equation (11):

$$\psi_{kl} = \frac{\langle w(k;t),\, w(l;t)\rangle}{\max\big(\Omega(k),\,\Omega(l)\big)} \tag{11}$$

where the numerator represents the N-dimensional dot product of the background-subtracted mean signal intensities 504 of region k and region l; the denominator represents the greater of the energies of the signals corresponding to regions k and l, Ω(k) and Ω(l); and −1 ≤ ψ_kl ≤ 1. In this embodiment, the numerator of Equation (11) is normalized by the higher signal energy and not by the square root of the product of both energies.
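As a concrete illustration, the following is a minimal numpy sketch (not part of the original disclosure; the array shapes and the `labels` segmentation input are assumptions) of the mean signal intensity of Equation (10) and the fitting function of Equation (11):

```python
import numpy as np

def mean_signal(seq, labels, k):
    """Mean signal intensity w(k;t) of region k, per Equation (10).

    seq    -- array of shape (T, H, W): one background-subtracted image per time step
    labels -- integer array of shape (H, W): region label of each pixel
    """
    mask = labels == k
    return seq[:, mask].mean(axis=1)       # average over the N_k pixels, frame by frame

def energy(signal):
    """Signal energy Omega = <signal, signal>, the N-dimensional dot product."""
    return float(np.dot(signal, signal))

def fitting_value(sig_k, sig_l):
    """Fitting function psi_kl of Equation (11): dot product normalized by the
    higher of the two signal energies, so that -1 <= psi_kl <= 1 (assumes at
    least one signal has nonzero energy)."""
    return float(np.dot(sig_k, sig_l)) / max(energy(sig_k), energy(sig_l))
```

The pixel-level fitting function of Equation (9) is the same formula applied to per-pixel signals rather than region means.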
[0099] In the case of whitening signals, for example, the fitting function defined by Equation
(9) can be used to obtain a gradient image representing the variation of whitening values in x-y space. The gradient image of a set of intensity signals approximates the magnitude of the local gradient of signal intensity at every pixel location. The watershed transform is then applied to the gradient image. This may be done when pre-segmenting images into "grains" as discussed above, as well as when performing the hierarchical watershed segmentation approach and the combined segmentation approaches discussed below.
[0100] A gradient image representing a sequence of images is calculated for an individual image by computing the fitting value φ(i,j; i₀,j₀) between a pixel (i₀, j₀) and all its neighbors (i,j) ∈ N(i₀,j₀), where N(i₀,j₀) = {(i₀−1, j₀), (i₀, j₀−1), (i₀+1, j₀), (i₀, j₀+1)}.

[0101] Since the best fit corresponds to a null gradient, the derivative of the fitting value, φ∇(i,j; i₀,j₀), is computed as in Equation (12) (the equation image is not reproduced in this text), where φ∇(i,j; i₀,j₀) ∈ (−∞, ∞). The sign of φ∇(i,j; i₀,j₀) is given by Equation (13):

$$s(i,j;i_0,j_0) = \begin{cases} 1 & \text{if } \Omega(i_0,j_0) > \Omega(i,j) \\ -1 & \text{if } \Omega(i_0,j_0) < \Omega(i,j) \end{cases} \tag{13}$$

[0102] Then, the derivatives of the signals are approximated as the mean of the forward and backward differences shown in Equations (14) and (15):

$$\frac{\partial}{\partial x}\, w(i_0,j_0) = \frac{\varphi_\nabla(i_0-1,j_0;\,i_0,j_0) - \varphi_\nabla(i_0+1,j_0;\,i_0,j_0)}{2} \tag{14}$$

$$\frac{\partial}{\partial y}\, w(i_0,j_0) = \frac{\varphi_\nabla(i_0,j_0-1;\,i_0,j_0) - \varphi_\nabla(i_0,j_0+1;\,i_0,j_0)}{2} \tag{15}$$
The norm of the gradient vector is then calculated from the approximations of Equations (14) and (15).
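A hedged numpy sketch of this computation follows (an illustration only; in particular, the mapping `s * (1.0 - phi)` is a stand-in, since the exact mapping of Equation (12) is not reproduced above):

```python
import numpy as np

def gradient_image(seq):
    """Norm of the local gradient of signal intensity at every pixel, built
    from per-neighbor fitting values over the whole sequence.

    seq -- array of shape (T, H, W) of background-subtracted signals.
    Borders wrap around here; a full implementation would treat them explicitly.
    """
    energy = np.einsum('thw,thw->hw', seq, seq)          # Omega at each pixel

    def phi_grad(di, dj):
        """Derivative-like fitting value against the neighbor at (i+di, j+dj)."""
        shifted = np.roll(seq, (-di, -dj), axis=(1, 2))  # neighbor signals
        e_shift = np.roll(energy, (-di, -dj), axis=(0, 1))
        dot = np.einsum('thw,thw->hw', seq, shifted)
        phi = dot / np.maximum(np.maximum(energy, e_shift), 1e-12)
        s = np.where(energy >= e_shift, 1.0, -1.0)       # sign, per Equation (13)
        return s * (1.0 - phi)    # stand-in for Equation (12): perfect fit -> 0

    dx = (phi_grad(-1, 0) - phi_grad(+1, 0)) / 2.0       # Equation (14)
    dy = (phi_grad(0, -1) - phi_grad(0, +1)) / 2.0       # Equation (15)
    return np.hypot(dx, dy)                              # norm of the gradient vector
```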
[0103] Since the fitting values include information from the entire sequence of images, one may obtain a gradient image which includes information from the entire sequence of images and which, therefore, shows details that are not visible in every one of the images. The gradient image may be used in the watershed pre-segmentation technique discussed herein above and the hierarchical watershed technique discussed herein below. Had the gradient image been obtained from a single reference image, less detail would be included, and the application of a watershed segmentation method to the gradient image would segment the image plane based on less data. However, by using a gradient image as determined from Equations (14) and (15), the invention enables a watershed segmentation technique to be applied which divides an image plane into regions based on an entire sequence of data, not just a single reference image.

[0104] Figure 6 relates to step 110 of Figure 1, segmenting the area represented in a sequence of images into regions based on measures of similarity between regions over the sequence. Various techniques may be used to perform the segmentation step 110 of Figure 1. Figure 6 shows a schematic flow diagram 602 depicting a region merging technique of segmentation according to an illustrative embodiment of the invention. In this technique, each grain or pixel is initially a region, and the method iteratively merges neighboring regions according to a predefined criterion. The criterion is based on a measure of similarity, also called a fitting function.
The segmentation converges to the final result when no pair of neighboring regions satisfies the merging criterion. In the case of aceto-whitening data, for instance, it is desired to merge regions whose whitening data is similar. The fitting function will therefore quantify the similarity over the sequence between two signals corresponding to two neighboring regions.
[0105] Thus, the segmentation begins at step 604 of Figure 6, computing the fitting function to obtain "fitting values" for all pairs of neighboring regions. In the region merging approach, this is the measure of similarity provided by Equation (11), where the mean signal intensity of a region k is defined by Equation (10). Maximizing this fitting function is equivalent to minimizing the normalized Euclidean distance, δ_kl, between the mean signal intensities of regions k and l shown in Equation (16):

$$\delta_{kl}^{2} = \frac{\lVert w(k;t) - w(l;t)\rVert^{2}}{\max\big(\Omega(k),\,\Omega(l)\big)} = 1 + \frac{\min\big(\Omega(k),\,\Omega(l)\big)}{\max\big(\Omega(k),\,\Omega(l)\big)} - 2\,\frac{\langle w(k;t),\, w(l;t)\rangle}{\max\big(\Omega(k),\,\Omega(l)\big)} \tag{16}$$

This notation reveals the effect of normalizing by the higher energy of the two signals instead of normalizing each signal by its L2 norm. The method using Equation (16) or Equation (11) applies an additional "penalty" when the two signals have different energies; therefore, fitting values are below 1.0 when the scaled versions of the two signals are the same but their absolute values are different.
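To make the equivalence explicit, Equation (16) follows by expanding the squared norm using the definitions of Ω and ψ_kl (a short derivation added here for clarity):

$$\lVert w(k;t)-w(l;t)\rVert^{2} = \Omega(k) + \Omega(l) - 2\,\langle w(k;t),\,w(l;t)\rangle,$$

and dividing through by $\max(\Omega(k),\Omega(l))$ gives

$$\delta_{kl}^{2} = \frac{\Omega(k)+\Omega(l)}{\max(\Omega(k),\Omega(l))} - 2\,\psi_{kl} = 1 + \frac{\min(\Omega(k),\Omega(l))}{\max(\Omega(k),\Omega(l))} - 2\,\psi_{kl}.$$

In particular, when the two energies are equal, $\delta_{kl}^{2} = 2(1-\psi_{kl})$; and for a scaled pair $w(l;t) = \alpha\, w(k;t)$ with $0 < \alpha < 1$, one finds $\psi_{kl} = \alpha < 1$, which is the "penalty" for unequal energies noted above.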
[0106] In step 606 of Figure 6, the fitting values (measures of similarity) corresponding to pairs of neighboring regions that are larger than a given threshold are sorted from greatest to least. In step 608, sorted pairs are merged according to best fit, keeping in mind that each region can only be merged once during one iteration. For instance, if neighboring regions k and l have a fitting value of 0.80 and neighboring regions k and m have a fitting value of 0.79, regions k and l are merged together, not k and m. However, region m may be merged with another of its neighboring regions during this iteration, depending on the fitting values computed.

[0107] In step 609, the method recalculates fitting values for all pairs of neighboring regions containing an updated (newly merged) region. In this embodiment, fitting values are not recalculated for pairs of neighboring regions whose regions are unchanged.
[0108] In step 610 of Figure 6, it is determined whether the fitting values of all pairs of neighboring regions are below a given threshold. The fitting function is a measure of similarity between regions; thus, the higher the threshold, the more similar regions have to be in order to be merged, resulting in fewer merges and, therefore, more regions that are ultimately defined. If the fitting values of all the regions are below the given threshold, the merging is complete 612.
If not, the process beginning at step 606 repeats, and fitting values of the pairs of neighboring regions as newly defined are sorted.
[0109] Once merging is complete 612, a size rule is applied that forces each region whose size is below a given value to be merged with its best-fitting neighboring region, even if the corresponding fitting value is below the threshold. In this way, very small regions, no larger than a few pixels, are avoided.
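A compact sketch of the region merging loop of Figure 6 follows (added for illustration; the `neighbors` adjacency bookkeeping is an assumption, `fitting_value` is the function sketched after Equation (11), and the equal-weight averaging is a simplification of the size-weighted mean of Equation (10)):

```python
def merge_regions(signals, neighbors, threshold=0.7):
    """Iterative region merging (steps 604-612 of Figure 6).

    signals   -- dict: region label -> mean signal intensity vector w(k;t)
    neighbors -- dict: region label -> set of adjacent region labels
    """
    merged_any = True
    while merged_any:                  # loop until step 610 finds no pair above threshold
        merged_any = False
        # Steps 604/606: fitting values of all neighboring pairs, sorted greatest first.
        pairs = sorted(((fitting_value(signals[k], signals[l]), k, l)
                        for k in signals for l in neighbors[k] if k < l),
                       reverse=True)
        touched = set()                # a region may merge only once per iteration
        for psi, k, l in pairs:
            if psi < threshold:
                break
            if k in touched or l in touched:
                continue
            # Step 608: merge l into k.  (Equal-weight averaging here; the
            # method proper weights by the region sizes N_k and N_l.)
            signals[k] = (signals[k] + signals[l]) / 2.0
            for n in neighbors.pop(l):
                neighbors[n].discard(l)
                if n != k:
                    neighbors[n].add(k)
                    neighbors[k].add(n)
            del signals[l]
            touched |= {k, l}
            merged_any = True          # step 609: affected fits recomputed next pass
    return signals, neighbors
```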
[0110] The segmentation method of Figure 6, or any of the other segmentation methods discussed herein, is performed, for example, where each pixel has up to four neighbors: above, below, left, and right. However, in other illustrative embodiments, segmentation is performed where each pixel can have up to eight or more neighbors, including diagonally adjacent pixels. It should also be noted that images in a given sequence may be sub-sampled to reduce computation time. For instance, a sequence of 100 images may be reduced to 50 images by eliminating every other image from the sequence to be segmented.
[0111] Figures 7A and 7B illustrate step 112 of Figure 1, relating images after segmentation.
Figure 7A depicts a segmentation mask 702 produced using the region merging segmentation technique discussed above for an exemplary aceto-whitening sequence. In an illustrative embodiment, the threshold used in step 610 of Figure 6 to produce this segmentation mask 702 is 0.7. Each region has a different label, and is represented by a different color in the mask 702 in order to improve the contrast between neighboring regions. Other illustrative embodiments use other kinds of display techniques known in the art, in order to relate images of the sequence based on segmentation and/or based on the measure of similarity.
[0112] Figure 7B shows a graph 750 depicting mean signal intensities 752 of segmented regions represented in Figure 7A as functions of a time index 754 according to an illustrative embodiment of the invention. The color of each data series in Figure 7B corresponds to the same-colored segment depicted in Figure 7A. This is one way to visually relate a sequence of images using the results of segmentation. For instance, according to the illustrative embodiment, regions having a high initial rate of increase of signal intensity 752 are identified by observing data series 756 and 758 in Figure 7B, whose signal intensities 752 increase more quickly than the other data series. The location of the two regions corresponding to these two data series is found in Figure 7A. In another example, kinetic rate constants are derived from each of the data series determined in Figure 7B, and the regions having data series most closely matching kinetic rate constants of interest are identified. In another example, one or more data series are curve fit to obtain a characterization of the mean signal intensities 752 of each data series as functions of time.
[0113] Mean signal intensity may have a negative value after background subtraction. This is evident, for example, in the first part of data series 760 of Figure 7B. In some examples, this is due to the choice of the reference frame for background subtraction. In other examples, it is due to negative intensities corresponding to regions corrupted by glare that is not completely masked from the analysis.
[0114] Figure 8 relates to step 110 of Figure 1, segmenting the area represented in a sequence of images into regions based on measures of similarity between regions over the sequence.
Figure 8 shows a schematic flow diagram 802 depicting a robust region merging approach of segmentation according to an illustrative embodiment of the invention. One objective of the robust region merging approach is to take into account the "homogeneity" of data inside the different regions. While the region merging approach outlined in Figure 6 relies essentially on the mean signal of each region to decide subsequent merging, the robust region merging approach outlined in Figure 8 controls the maximum variability allowed inside each region.
More specifically, the variance signal, σ²_w(k;t), associated with each region, k, is computed as in Equation (17):

$$\sigma_w^{2}(k;t) = \frac{1}{N_k} \sum_{(i,j)\in S_k} \big(w(i,j;t) - w(k;t)\big)^{2} \tag{17}$$

where w(k;t) is the mean signal intensity of region k as expressed in Equation (10). The merging criterion is then the energy of the standard deviation signal, computed as in Equation (18):

$$\Omega\big(\sigma_w(k)\big) = \big\langle \sigma_w(k;t),\, \sigma_w(k;t) \big\rangle \tag{18}$$
[0115] Segmentation using the illustrative robust region merging approach begins at step 804 of the schematic flow diagram 802 of Figure 8. In step 804 of Figure 8, the variance signal, σ²_w(k;t), of Equation (17) is computed for each region k. Then the variance signal energy, or the energy of the standard deviation signal as shown in Equation (18), is calculated for each region k. In step 806 of Figure 8, the values of variance signal energy that are larger than a given threshold are sorted. This determines which regions can be merged, but not the order in which the regions may be merged. In step 808 of Figure 8, the sorted pairs are merged according to the increase in variance each merged pair would create, Δσ(k,l), given by Equation (19):

$$\Delta\sigma(k,l) = \sum_{t} \Big( \sigma_w^{2}(k \cup l;\,t) - \big[\sigma_w^{2}(k;t) + \sigma_w^{2}(l;t)\big] \Big) \tag{19}$$
where k and l represent two neighboring regions to be merged. Thus, if a region can merge with more than one of its neighbors, it merges with the one that increases the variance of Equation (19) the least. Another neighbor may merge with the region in the next iteration, provided it still meets the fitting criterion with the updated region.
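The following numpy sketch (illustrative only; the array layout is assumed, and Equation (19) is used as reconstructed above) computes the variance signal of Equation (17), its energy per Equation (18), and the merge cost of Equation (19):

```python
import numpy as np

def variance_signal(seq, labels, k):
    """sigma_w^2(k;t) of Equation (17): per-frame variance over region k's pixels."""
    pix = seq[:, labels == k]                 # shape (T, N_k)
    return ((pix - pix.mean(axis=1, keepdims=True)) ** 2).mean(axis=1)

def variance_energy(var_sig):
    """Equation (18): energy of the standard deviation signal."""
    std = np.sqrt(var_sig)
    return float(np.dot(std, std))            # equals var_sig.sum()

def merge_cost(seq, labels, k, l):
    """Delta sigma(k,l) of Equation (19): variance increase if k and l merge."""
    pix = seq[:, np.isin(labels, (k, l))]     # pixels of the union k U l
    var_union = ((pix - pix.mean(axis=1, keepdims=True)) ** 2).mean(axis=1)
    return float(np.sum(var_union - (variance_signal(seq, labels, k)
                                     + variance_signal(seq, labels, l))))
```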
[0116] According to the illustrative embodiment, it is possible that a large region neighboring a small region will absorb the small region even when the regions have different signals. This results from the illustrative merging criterion being size-dependent: the change in variance is small if the smaller region is merged into the larger region. According to a further embodiment, the methods of the invention apply an additional criterion, as shown in step 807 of Figure 8, prior to merging sorted pairs in step 808. In step 807, fitting values corresponding to the pairs of neighboring regions are checked against a threshold. The fitting values are determined as shown in Equation (11), used in the region merging approach. According to the illustrative embodiment, a candidate pair of regions is not merged if its fitting value is below the threshold (i.e., if the two regions are too dissimilar). In the illustrative embodiment, the invention employs a fixed similarity criterion threshold of about ψ_kl = 0.7. This value is low enough not to become the main criterion, yet high enough to avoid the merging of regions with very different signals. However, other ψ_kl values may be used without deviating from the scope of the invention.
[0117] In step 809 of Figure 8, values of the variance signal are recalculated for pairs of neighboring regions containing an updated (newly-merged) region. According to an embodiment of the invention, variance signal values are not recalculated for pairs of neighboring regions whose regions are unchanged.
[0118] In step 810 of Figure 8, the illustrative method of the invention determines whether the values of the variance signal energy for all regions are below a given variance threshold. If all values are below the threshold, the merging is complete 812. If not, the process beginning at step 806 is repeated, and values of variance signal energy of neighboring pairs of regions above the threshold are sorted.

[0119] Figures 9A and 9B illustrate step 112 of Figure 1, relating images after segmentation.
Figure 9A depicts a segmentation mask 902 produced using the robust region merging segmentation technique discussed above for an exemplary aceto-whitening sequence. In the illustrative embodiment, the variance threshold used in step 810 of Figure 8 to produce the segmentation mask 902 is 70. However, other variance thresholds may be employed without deviating from the scope of the invention. Each region has a different label, and is represented by a different color in the mask 902 to improve the contrast between neighboring regions. Other display techniques may be used to relate images of the sequence based on segmentation and/or based on the measure of similarity.
[0120] Figure 9B shows a graph 950 depicting mean signal intensity 952 of segmented regions represented in Figure 9A as functions of a time index 954 according to an illustrative embodiment of the invention. The color of each curve in Figure 9B corresponds to the same- colored segment depicted in Figure 9A. This is one way to visually relate a sequence of images using the results of segmentation.
[0121] According to the illustrative embodiment, the method observes data series 956, 958, 960, and 962 in Figure 9B, whose signal intensities 952 increase more quickly than the other data series. The locations of the four regions corresponding to these four data series are shown in Figure 9A. In another embodiment, the method derives kinetic rate constants from each of the data series determined in Figure 9B, and the regions having data series most closely matching kinetic rate constants of interest are identified. In another example, the method curve fits one or more data series to obtain a characterization of the mean signal intensities 952 of each data series as functions of time.
[0122] Figure 10 relates to step 110 of Figure 1, segmenting an area represented in a sequence of images into regions based on measures of similarity between regions over the sequence, according to one embodiment of the invention. Figure 10 shows a schematic flow diagram 1002 depicting a segmentation approach based on the clustering of data, or more precisely, a "fuzzy c-means" clustering segmentation approach, used in an embodiment of the invention. An objective of this clustering approach is to group pixels into clusters with similar values. Unlike the region merging and robust region merging approaches above, this method does not merge regions based on their spatial relation to each other. In other words, two non-neighboring regions may be merged, depending on the criterion used.
[0123] Let X = {x₁, ..., x_n} ⊂ ℝ^d be a set of n d-dimensional vectors. An objective of clustering is to split X into c subsets, called partitions, that minimize a given functional, J_m. In the case of the fuzzy c-means, this functional is given by Equation (20):

$$J_m = \sum_{k=1}^{n} \sum_{i=1}^{c} u_{ik}^{m}\, \lVert x_k - v_i \rVert^{2} \tag{20}$$

where $v_i$ is the "center" of the $i$th cluster, $u_{ik} \in [0,1]$ is called the fuzzy membership of $x_k$ to $v_i$, with $\sum_{i=1}^{c} u_{ik} = 1$, and $m \in [1,\infty]$ is a weighting exponent. The inventors have used m = 2 in exemplary embodiments. The distance ‖·‖ is any inner-product-induced norm on ℝ^d. The minimization of J_m as defined by Equation (20) leads to the following iterative system:

$$u_{ik} = \left[ \sum_{j=1}^{c} \left( \frac{\lVert x_k - v_i \rVert}{\lVert x_k - v_j \rVert} \right)^{2/(m-1)} \right]^{-1} \tag{21}$$

$$v_i = \frac{\sum_{k=1}^{n} u_{ik}^{m}\, x_k}{\sum_{k=1}^{n} u_{ik}^{m}} \tag{22}$$

[0124] The distance, ‖x_k − v_i‖, is based on the fitting function given in Equation (11). If it is assumed that similar signals are at a short distance from each other, then Equation (23) results:

$$\lVert x_k - v_i \rVert^{2} = 1 - \psi_{ki} \tag{23}$$

where ψ_ki is given by Equation (11).
[0125] Thus, in this embodiment, the segmentation begins at step 1004 of Figure 10, and the method initializes the values of v_i, where i = 1 to c and c is the total number of clusters. In one embodiment, the method sets the initial values of v_i randomly. In step 1006 of Figure 10, the method calculates the values of u_ik, the fuzzy membership of x_k to v_i, according to Equation (21). In step 1008 of Figure 10, the method updates the values of v_i according to Equation (22), using the previously determined values of u_ik. The algorithm converges when the relative decrease of the functional J_m as defined by Equation (20) falls below a predefined threshold, for instance, 0.001. Thus, in step 1010 of Figure 10, an embodiment of the method determines whether the relative decrease of J_m is below the threshold. If not, the process beginning at step 1006 is repeated. If the relative decrease of J_m is below the threshold, the segmentation is completed by labeling each pixel according to its highest fuzzy membership, i.e., arg max_i {u_ik}. Some embodiments have more regions than clusters, since pixels belonging to different regions with similar signals can contribute to the same cluster.
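A minimal fuzzy c-means sketch follows (added for illustration; it uses the Euclidean distance directly, whereas the embodiment substitutes the fitting-function distance of Equation (23)):

```python
import numpy as np

def fuzzy_c_means(X, c, m=2.0, tol=1e-3, seed=None):
    """Fuzzy c-means per Equations (20)-(22).

    X -- array of shape (n, d): one signal vector per pixel or grain.
    Returns hard labels (argmax membership) and memberships u of shape (c, n).
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    v = X[rng.choice(n, size=c, replace=False)]    # step 1004: initialize centers
    j_prev = np.inf
    while True:
        dist = np.linalg.norm(X[None, :, :] - v[:, None, :], axis=2) + 1e-12  # (c, n)
        # Step 1006, Equation (21): memberships from distance ratios.
        ratio = (dist[:, None, :] / dist[None, :, :]) ** (2.0 / (m - 1.0))
        u = 1.0 / ratio.sum(axis=1)                # (c, n); each column sums to 1
        # Step 1008, Equation (22): membership-weighted center update.
        w = u ** m
        v = (w @ X) / w.sum(axis=1, keepdims=True)
        j = float((w * dist ** 2).sum())           # functional J_m, Equation (20)
        if j_prev - j < tol * j_prev:              # step 1010: relative decrease
            break
        j_prev = j
    return u.argmax(axis=0), u                     # label each point by highest membership
```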
[0126] Figures 11A and 11B show an illustrative embodiment of the method at step 112 of Figure 1. Figure 11A depicts a segmentation mask 1102 produced using the clustering technique discussed above for an exemplary aceto-whitening sequence, according to the embodiment. The number of clusters, c, chosen in this embodiment is 3. There are more regions than clusters, since portions of some clusters are non-contiguous. The threshold for the relative decrease of J_m in step 1010 of Figure 10 is chosen as 0.001 in this embodiment. Each cluster has a different label and is represented by a different color in the mask 1102. Other embodiments employ other kinds of display techniques to relate images of the sequence based on segmentation and/or based on the measure of similarity.
[0127] Figure 11B shows a graph 1120 depicting mean signal intensities 1122 of segmented regions represented in Figure 11A as functions of a time index 1124 according to an illustrative embodiment of the invention. The color of each data series in Figure 11B corresponds to the same-colored cluster depicted in Figure 11A. The graph 1120 of Figure 11B is one way to visually relate a sequence of images using the results of segmentation according to this embodiment. In this embodiment, the method identifies a cluster having a high initial rate of increase of signal intensity 1122 by observing data series 1126 in Figure 11B, whose signal intensity increases more quickly than the other data series. Regions belonging to the same cluster have data series 1126 of the same color. In another embodiment, the method derives kinetic rate constants from each of the data series determined in Figure 11B, and the regions having data series most closely matching kinetic rate constants of interest are identified. In another example, the method curve fits one or more data series to obtain a characterization of the mean signal intensities 1122 of each data series as functions of time.
[0128] Similarly, Figures 11C and 11D illustrate an embodiment of the method at step 112 of Figure 1, relating images after segmentation. Figure 11C depicts a segmentation mask 1140 produced using the clustering technique for the exemplary aceto-whitening sequence of Figure 11A, according to the embodiment. In Figure 11C, however, the number of clusters, c, chosen is 2. Again, there are more regions than clusters, since portions of the same clusters are non-contiguous. The threshold for the relative decrease of J_m in step 1010 of Figure 10 is 0.001 for this embodiment.
[0129] Figure 12 relates to step 110 of Figure 1, segmenting the area represented in a sequence of images into regions based on measures of similarity between regions over the sequence, according to an illustrative embodiment of the invention. In this embodiment, morphological techniques of segmentation are continued beyond filtering and pre-segmentation. Figure 12 shows a schematic flow diagram 1202 depicting a hierarchical watershed approach of segmentation according to the illustrative embodiment.
[0130] In step 1204 of Figure 12, the method computes a gradient image from Equations (14) and (15) according to the embodiment, which incorporates data from the entire sequence of images, as discussed above. The method segments data based on information from the entire sequence of images, not just one image. A watershed transform is applied to this gradient image in step 1206 of Figure 12. Some embodiments apply first-in-first-out (FIFO) queues, sorted data, and other techniques to speed the performance of the watershed transform. Also, in some embodiments, the method applies a sigmoidal scaling function to the gradient image prior to performing the watershed transform to enhance the contrast between whitish and reddish (dark) regions, which is particularly useful when analyzing images of a cervix. The catchment basins resulting from application of the watershed transform represent the segmented regions in this embodiment.
[0131] Figure 13 shows a gradient image 1302 calculated from Equations (14) and (15) for an exemplary sequence of images, according to an embodiment of the invention.
[0132] According to an embodiment, the method at step 1208 of Figure 12 constructs a new gradient image using geodesic reconstruction. Two different techniques of geodesic reconstruction which embodiments may employ are erosion and dilation. In step 1210 of
Figure 12, the method determines whether over-segmentation has been reduced sufficiently, according to this embodiment. If so, the segmentation may be considered complete, or one or more additional segmentation techniques may be applied, such as a region merging or robust region merging technique, both of which are discussed above. If over-segmentation has not been reduced sufficiently, the method calculates the watershed transform of the reconstructed gradient image as in step 1206, and the process continues.
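One hedged sketch of this loop, assuming scikit-image (the reconstruction height `h`, the iteration cap, and the region-count sufficiency criterion are assumptions, not values from the disclosure):

```python
import numpy as np
from skimage.morphology import reconstruction
from skimage.segmentation import watershed

def hierarchical_watershed(gradient, h=0.05, max_iters=3, max_regions=50):
    """Reduce over-segmentation by alternating geodesic reconstruction of the
    gradient image with the watershed transform (steps 1206-1210 of Figure 12).

    gradient    -- 2-D gradient image, e.g. from Equations (14) and (15)
    h           -- reconstruction height: minima shallower than h are filled
    max_regions -- stand-in sufficiency criterion for step 1210
    """
    g = gradient.astype(float)
    labels = watershed(g)                       # step 1206: flood from the minima
    for _ in range(max_iters):
        if labels.max() <= max_regions:         # step 1210: over-segmentation reduced?
            break
        # Step 1208: geodesic reconstruction by dilation of (g - h) under g
        # removes catchment basins shallower than h before the next flooding.
        g = reconstruction(g - h, g, method='dilation')
        labels = watershed(g)
    return labels
```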
[0133] Certain embodiments of the method use the hierarchical watershed to segment larger areas, such as large lesions or the background cervix. In some embodiments, the number of iterations is kept below about 4 so that regions do not become too large and obscure real details.
[0134] In some embodiments, the method performs one iteration of the hierarchical watershed, and continues merging regions using the robust region merging technique.
[0135] Figures 14A and 14B show an illustrative embodiment of the method at step 112 of
Figure 1, relating images after segmentation. Figure 14A depicts a segmentation mask 1402 produced using one iteration of the hierarchical watershed technique discussed above for an exemplary aceto-whitening sequence, according to the embodiment. Each region has a different label, and is represented by a different color in the mask 1402. Figure 14B depicts a segmentation mask 1430 produced using two iterations of the hierarchical watershed technique discussed above for the exemplary aceto-whitening sequence, according to the embodiment. The segmentation mask 1430 in Figure 14B, produced using two iterations, has fewer regions and is more simplified than the segmentation mask 1402 in Figure 14A, produced using one iteration.
[0136] Other embodiments employ a "region growing" technique to perform step 110 of
Figure 1, segmenting an area represented in a sequence of images into regions based on measures of similarity between regions over the sequence. The region growing technique is different from the region merging and robust region merging techniques in that one or more initial regions, called seed regions, grow by merging with neighboring regions. The region merging, robust region merging, and region growing methods are each iterative. The region merging and region growing techniques each use the same fitting function to evaluate the similarity between signals from neighboring regions. In some embodiments of the region growing algorithm, the user manually selects seed regions. In other embodiments, the seed regions are selected in an automatic fashion. Hard criteria may be used to select areas that are of high interest and/or which behave in a certain way. In some embodiments, a user selects seed regions based on that user's experience. The region growing algorithm then proceeds by detecting similar regions and adding them to the set of seeds.
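A short sketch of seeded region growing along these lines follows (illustrative only; seed selection and the helper `fitting_value` from the Equation (11) sketch are assumed):

```python
def region_grow(signals, neighbors, seeds, threshold=0.7):
    """Grow seed regions by absorbing similar neighbors (region growing variant).

    signals   -- dict: region label -> mean signal intensity vector
    neighbors -- dict: region label -> set of adjacent region labels
    seeds     -- iterable of region labels chosen manually or automatically
    """
    grown = set(seeds)
    added = True
    while added:                       # iterate until no similar neighbor remains
        added = False
        for s in list(grown):
            for n in list(neighbors[s] - grown):
                # Add the neighbor to the seed set if its signal fits well enough.
                if fitting_value(signals[s], signals[n]) >= threshold:
                    grown.add(n)
                    added = True
    return grown
```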
[0137] One embodiment of the invention is a combined technique using the region growing algorithm, starting with a grain image, followed by performing one iteration of the hierarchical watershed technique, and then growing the selected region according to the robust region merging algorithm.

[0138] In some embodiments, the segmentation techniques discussed herein are combined in various ways. In some embodiments, the method processes and analyzes data from a sequence of images in an aceto-whitening test, for instance, using a coarse-to-fine approach. In one embodiment, a first segmentation reveals large whitening regions, called background lesions, which are then considered as regions of interest and are masked for additional segmentation.
[0139] A second segmentation step of the embodiment may outline smaller regions, called foreground lesions. Segmentation steps subsequent to the second step may also be considered.
As used here, the term "lesion" does not necessarily refer to any diagnosed area, but to an area of interest, such as an area displaying a certain whitening characteristic during the sequence. From the final segmentation, regions are selected for diagnosis, preliminary or otherwise; for further analysis; or for biopsy, for example. Additionally, the segmentation information may be combined with manually drawn biopsy locations for which a pathology report may be generated.
In one illustrative embodiment, the method still applies the pre-processing procedures discussed herein above before performing the multi-step segmentation techniques.
[0140] Figure 16A shows a segmentation mask produced using a combined clustering approach and robust region merging approach for an exemplary aceto-whitening sequence, according to an illustrative embodiment of the invention. In the embodiment, the method performs pre-processing steps, including pre-segmenting pixels into grains using a watershed transform as discussed herein above. Then, the method applies the clustering technique to the sequence as discussed in Figure 10, using c = 3 clusters and a threshold of 0.001 for the relative decrease of J_m. This produces a "coarse" segmentation. From this coarse segmentation, the method selects a boomerang-shaped background lesion, corresponding to a large whitening region. The method masks out, or eliminates from further analysis, the remaining areas of the image frame.
[0141] Then, a robust region merging procedure is applied, as shown in Figure 8, to the background lesion, according to the embodiment. Here, the method uses a similarity criterion of ψ_kl = 0.7 in step 807 of Figure 8, and a variance threshold of 120 in step 810 of Figure 8. In this and other embodiments, regions less than 16 pixels large are removed. The resulting segmentation is shown in frame 1602 of Figure 16A.
[0142] Figure 16B shows a graph 1604 depicting mean signal intensity 1606 of segmented regions represented in Figure 16A as functions of a time index 1608 according to an illustrative embodiment of the invention. The color of each curve in Figure 16B corresponds to the same-colored segment depicted in Figure 16A. Regions having a high initial rate of increase of signal intensity 1606 include regions 1610 and 1612, shown in Figure 16A.

[0143] Figure 17A represents a segmentation mask produced using a combined clustering approach and watershed approach for the exemplary aceto-whitening sequence of Figure 16A, according to an illustrative embodiment of the invention. The method performs pre-processing steps, including the pre-segmenting of pixels into grains using a watershed transform as discussed herein above. Then, the method applies the clustering technique to the sequence as discussed in Figure 10, using c = 3 clusters and a threshold of 0.001 for the relative decrease of J_m. This produces a "coarse" segmentation. From this coarse segmentation, the method selects a boomerang-shaped background lesion, corresponding to a large whitening region. The method masks out the remaining areas of the image frame.
[0144] Then, the method applies a hierarchical watershed segmentation procedure, as shown in Figure 12. In this embodiment, the method computes one iteration of the watershed transform, then a region merging technique as per Figure 6, using a fitting value threshold of 0.85 in step 610 of Figure 6. Regions smaller than 16 pixels are removed. The resulting segmentation is shown in frame 1702 of Figure 17A.
[0145] Figure 17B shows a graph 1720 depicting mean signal intensity 1722 of segmented regions represented in Figure 17A as functions of a time index 1724 according to an illustrative embodiment of the invention. The color of each curve in Figure 17B corresponds to the same-colored segment depicted in Figure 17A. Regions having a high initial rate of increase of signal intensity 1722 include regions 1726 and 1728, shown in Figure 17A.

[0146] Figure 18A represents a segmentation mask produced using a two-step clustering approach for the exemplary aceto-whitening sequence of Figure 16A, according to an illustrative embodiment of the invention. The method performs pre-processing steps, including pre-segmenting pixels into grains using a watershed transform as discussed herein above. Then, the method applies a clustering technique to the sequence as discussed in Figure 10, using c = 3 clusters and a threshold of 0.001 for the relative decrease of J_m. This produces a "coarse" segmentation. From this coarse segmentation, the method selects a boomerang-shaped background lesion, corresponding to a large whitening region. The method masks out the remaining areas of the image frame from further analysis.

[0147] Then, the method applies a second clustering procedure, as shown in Figure 10, to the background lesion. Here again, the method uses c = 3 clusters and a threshold of 0.001 for the relative decrease of J_m. Regions less than 16 pixels large are removed. This produces a foreground lesion, shown in frame 1802 of Figure 18A.
[0148] Figure 18B shows a graph 1820 depicting mean signal intensity 1822 of segmented regions represented in Figure 18A as functions of a time index 1824 according to an illustrative embodiment of the invention. The color of each curve in Figure 18B corresponds to the same-colored cluster depicted in Figure 18A. Regions having a high initial rate of increase of signal intensity 1822 include regions 1828 and 1826, shown in Figure 18A.

[0149] Figure 19 depicts the human cervix tissue of Figure 2A with an overlay of manual doctor annotations made after viewing the exemplary aceto-whitening image sequence discussed herein above. Based on her viewing of the sequence and on her experience with the aceto-whitening procedure, the doctor annotated regions with suspicion of pathology 1904, 1906, 1908, 1910, and 1912. The doctor did not examine results of any segmentation analysis prior to making the annotations. Regions 1910 and 1912 were singled out by the doctor as regions with the highest suspicion of pathology.
[0150] Figure 20A is a representation of a segmentation mask produced using a combined clustering approach and robust region merging approach as discussed above and as shown in Figure 16A, according to an illustrative embodiment of the invention. The segmentation mask in Figure 20A, however, is shown with a correspondingly-aligned overlay of the manual doctor annotations of Figure 19. Figure 20B is a representation of a segmentation mask produced using a combined clustering approach and watershed technique as discussed above and shown in Figure 18A. The segmentation mask in Figure 20B, however, is shown with a correspondingly-aligned overlay of the manual doctor annotations of Figure 19, according to an embodiment of the invention.
[0151] Areas 1912 and 1910 in Figures 20A and 20B correspond to the doctor's annotations of areas of high suspicion of pathology. In the segmentation masks produced from the combined techniques of both Figures 20A and 20B, these areas (1912 and 1910) correspond to regions of rapid, intense whitening. In the doctor's experience, areas of rapid, intense whitening correspond to areas of suspicion of pathology. Thus, the techniques discussed herein provide a method of determining a tissue characteristic, namely, the presence or absence of a suspicion of pathology. Certain embodiments of the invention use the techniques in addition to a doctor's analysis or in place of a doctor's analysis. Certain embodiments use combinations of the methods described herein to produce similar results.
[0152] Some embodiments of the invention, for applications other than the analysis of acetowhitening tests of cervical tissue, also use various inventive analysis techniques as described herein. A practitioner may customize elements of an analysis technique disclosed herein based on the attributes of her particular application, according to embodiments of the invention. For instance, the practitioner may choose among the segmentation techniques disclosed herein, depending on the application for which she intends to practice embodiments of the inventive methods. By using the techniques described herein, it is possible to visually capture all the frames of a sequence at once and relate regions according to their signals over a period of time.

[0153] Certain embodiments of the inventive methods analyze more complex behavior. Some embodiments segment the image plane of a sequence of images, then feature-extract the resulting mean intensity signals to characterize the signals of each segmented region. Examples of feature extraction procedures include any number of curve fitting techniques or functional analysis techniques used to mathematically and/or statistically describe characteristics of one or more data series. In some embodiments, these features are then used in a manual, automated, or semi-automated method for the classification of tissue.
[0154] For example, in certain embodiments, the method classifies a region of cervical tissue either as "high grade disease" tissue, which includes Cervical Intraepithelial Neoplasia II/III (CIN II/III), or as "not high grade disease" tissue, which includes normal squamous (NED, no evidence of disease), metaplasia, and CIN I tissue. The classification for a segmented region may be made to within a predicted degree of certainty using features extracted from the mean signal intensity curve corresponding to the region. In one embodiment, this classification is performed for each segmented region in an image plane to produce a map of regions of tissue classified as high grade disease tissue. Other embodiments make more specific classifications and distinctions between tissue characteristics, such as distinguishing between NED, metaplasia, and CIN I tissue.
[0155] Figures 21A, 21B, 21C, and 21D depict steps in the classification of regions of tissue in a sequence of images obtained during an acetowhitening procedure performed on a patient with high grade disease, according to an illustrative embodiment of the invention. Figure 21A depicts a reference image 2102 of cervical tissue of the patient from a sequence of images obtained during the acetowhitening test. Figure 21B is a representation 2106 of the reference image 2102 of Figure 21A after applying a manual mask, accounting for glare, and accounting for chromatic artifacts as discussed herein, according to an illustrative embodiment of the invention. For example, areas such as areas 2110 and 2112 of Figure 21B have been masked for glare and chromatic effects, respectively, using techniques as discussed herein. Figure 21C shows a graph 2120 depicting mean signal intensities 2122 of segmented regions for the sequence of Figure 21A as functions of a time index 2124, as determined using the clustering segmentation approach depicted in the schematic flow diagram 1002 of Figure 10 and as discussed herein, according to an illustrative embodiment of the invention.
[0156] It was desired to classify each of the segmented regions as either "indicative of high grade disease" or "not indicative of high grade disease." Thus, an embodiment of the invention extracted specific features from each of the mean signal intensity data series depicted in the graph 2120 of Figure 21C, and used these features in a classification algorithm.
[0157] A classification algorithm was designed using results of a clinical study. In the study, mean signal intensity curves were determined using sequences of images from acetowhitening tests performed on over 200 patients. The classification algorithm may be updated according to an embodiment of the invention upon conducting further or different clinical testing. The present algorithm was based upon two feature parameters extracted from each of certain mean signal intensity data series corresponding to segmented regions of the image sequences for which biopsies were performed. These two feature parameters are as follows:
1. X is the slope of a curve (here, a polynomial curve) fitted to the mean signal intensity data series of a segmented region at the time corresponding to 235 seconds after application of the acetic acid (M235); and
2. Y is the slope of the polynomial curve at the time corresponding to an intensity that is −16 dB from the maximum intensity (about 45% of the maximum mean signal intensity) on the decaying side of the polynomial curve (the −16 dB slope).

The choice of the feature parameters X and Y above was made by conducting a Discrimination
Function (DF) analysis of the data sets from the clinical study. A wide range of candidate feature parameters, including X and Y, were tested. X and Y provided a classification algorithm having the best accuracy.
[0158] A jackknifed classification matrix linear discriminant analysis was performed on the extracted features X and Y corresponding to certain of the mean signal intensity curves from each of the clinical tests. The curves used were those corresponding to regions for which tissue biopsies were performed. From the linear discriminant analysis, it was determined that a classification algorithm using the discriminant line shown in Equation (24) results in a diagnostic sensitivity of 88% and a specificity of 88% for the separation of CIN II/III (high grade disease) from the group consisting of normal squamous (NED), metaplasia, and CIN I tissue (not high grade disease):
$$Y = -0.9282\,X - 0.1348 \tag{24}$$
Varying the classification model parameters by as much as 10% yields very similar model outcomes, suggesting the model features are highly stable.
[0159] Figure 21D represents a map 2130 of regions of tissue as segmented in Figure 21C and classified as either high grade disease tissue or not high grade disease tissue using the classification algorithm of Equation (24). This embodiment determined the classification for each of the segmented regions by calculating X and Y for each region and determining whether the point (X, Y) falls below the line of Equation (24), in which case the region was classified as high grade disease, or whether the point (X, Y) falls above the line, in which case the region was classified as not high grade disease. The embodiment draws further distinction depending on how far above or below the line of Equation (24) the point (X, Y) falls. The map 2130 of Figure 21D indicates segmented regions of high grade disease as red, orange, and yellow, and regions not classifiable as high grade disease as blue. The index 2132 reflects how far the point (X, Y) of a given segment falls below the line of Equation (24). Other embodiments include those employing other segmentation techniques as described herein. Still other embodiments include those employing different classification algorithms, including those using feature parameters other than X and Y, extracted from mean signal data series corresponding to segmented regions.
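As an illustration of how such a rule might be applied (a sketch only; the polynomial degree, the amplitude dB convention, and the slope-extraction details are assumptions, not taken from the disclosure):

```python
import numpy as np

def classify_region(times, mean_signal, t_slope=235.0, degree=5):
    """Classify one segmented region's mean intensity curve via Equation (24).

    Returns a signed margin: negative values fall below the discriminant line
    (classified here as high grade disease); the magnitude maps to the index.
    """
    times = np.asarray(times, float)
    poly = np.polynomial.Polynomial.fit(times, mean_signal, degree)
    x_feat = poly.deriv()(t_slope)                 # X: slope at t = 235 s (M235)
    # Y: slope where the decaying side of the curve falls to -16 dB of its maximum.
    t_grid = np.linspace(times.min(), times.max(), 1000)
    vals = poly(t_grid)
    t_peak = t_grid[vals.argmax()]
    target = vals.max() * 10 ** (-16 / 20)         # -16 dB (amplitude convention assumed)
    decay = t_grid[t_grid > t_peak]
    t_y = decay[np.abs(poly(decay) - target).argmin()]
    y_feat = poly.deriv()(t_y)
    # Equation (24): margin of (X, Y) relative to Y = -0.9282 X - 0.1348.
    return y_feat - (-0.9282 * x_feat - 0.1348)
```

A negative return value places (X, Y) below the discriminant line, i.e., in the high grade disease class under this rule.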
Equivalents

[0160] While the invention has been particularly shown and described with reference to specific preferred embodiments, it should be understood by those skilled in the art that various changes in form and detail may be made therein without departing from the spirit and scope of the invention as defined by the appended claims.

What is claimed is:

Claims

1. A method of relating a plurality of images of a tissue, said method comprising: obtaining a plurality of images of a tissue; determining a relationship between two or more regions in each of two or more of said images; segmenting at least a subset of said two or more images based at least in part on said relationship; and relating two or more images of said subset of images based at least in part on said segmenting.
2. The method of claim 1 , wherein said determining of said relationship comprises determining a measure of similarity between at least two of said two or more regions in each of said two or more of said images.
3. The method of claim 2, wherein said determining of said measure of similarity comprises computing an N-dimensional dot product of mean signal intensities of two of said two or more regions.
4. The method of claim 1 , wherein said tissue comprises cervical tissue.
5. The method of claim 1 , wherein said plurality of images comprises sequential images of said tissue.
6. The method of claim 1 , further comprising filtering said subset of said two or more images.
7. The method of claim 6, wherein said filtering comprises applying at least one of a temporal filter and a spatial filter.
8. The method of claim 1, further comprising applying a chemical agent to said tissue.
9. The method of claim 8, wherein said chemical agent is selected from the group consisting of acetic acid, formic acid, propionic acid, butyric acid, Lugol's iodine, Shiller's iodine, methylene blue, toluidine blue, and indigo carmine.
10. The method of claim 1, wherein said obtaining step comprises collecting an optical signal.
11. The method of claim 10, wherein said optical signal comprises fluorescence illumination.
12. The method of claim 10, wherein said optical signal comprises reflectance illumination.
13. The method of claim 1 , wherein said obtaining of said plurality of images comprises recording visual images of said tissue.
14. The method of claim 1, wherein said relating step comprises determining a segmentation mask of an image plane wherein two or more regions of said image plane are differentiated.
15. The method of claim 1, wherein said relating step comprises defining one or more data series representing a characteristic of one or more associated segmented regions of an image plane.
16. A method of relating a plurality of images of a tissue, said method comprising: obtaining a plurality of images of a tissue; determining a measure of similarity between two or more regions in each of two or more of said images; and relating at least a subset of said two or more images based at least in part on said measure of similarity.
17. The method of claim 16, wherein determining said measure of similarity comprises computing an N-dimensional dot product of mean signal intensities of two of said two or more regions.
18. A method of determining a tissue characteristic, said method comprising: obtaining a plurality of images of a tissue; determining a relationship between two or more regions in each of two or more of said images; segmenting at least a subset of said two or more images based at least in part on said relationship; and determining a characteristic of said tissue based at least in part on said segmenting.
19. The method of claim 18, further comprising applying a chemical agent to said tissue.
20. The method of claim 19, wherein said chemical agent is selected from the group consisting of acetic acid, formic acid, propionic acid, butyric acid, Lugol's iodine, Shiller's iodine, methylene blue, toluidine blue, and indigo carmine.
21. The method of claim 18, further comprising filtering said two or more images.
22. The method of claim 21, wherein said filtering comprises applying at least one of a temporal filter and a spatial filter.
23. The method of claim 18, further comprising processing said two or more images to compensate for a relative motion between said tissue and a detection device.
24. The method of claim 18, wherein said tissue comprises cervical tissue.
25. The method of claim 18, wherein said segmenting comprises analyzing an acetowhitening signal.
26. The method of claim 18, wherein said plurality of images comprises sequential images of said tissue.
27. The method of claim 18, wherein said segmenting comprises analyzing a variance signal.
28. The method of claim 18, wherein said segmenting comprises determining a gradient image.
29. The method of claim 18, further comprising processing one or more optical signals based at least in part on said segmenting.
30. The method of claim 18, further comprising filtering at least one image based at least in part on said segmenting.
31. The method of claim 18, wherein said determining a characteristic of said tissue comprises determining one or more regions of said tissue with a suspicion of pathology.
32. The method of claim 18, wherein said determining a characteristic of said tissue comprises classifying a region of tissue as one of the group consisting of normal squamous tissue, metaplasia, CIN I, and CIN II/CIN III.
33. A method of determining a characteristic of a tissue comprising the steps of:
(a) for each of a first plurality of reference sequences of images of tissue having a first known characteristic, quantifying one or more features of each of a plurality of mean signal intensity data series corresponding to segmented regions represented in said each of said first plurality of reference sequences of images;
(b) for a test sequence of images, quantifying one or more features of each of one or more mean signal intensity data series corresponding to one or more segmented regions represented in said test sequence of images; and
(c) determining a characteristic of a tissue represented in said test sequence of images based at least in part on a comparison between said one or more features quantified in step (a) and said one or more features quantified in step (b).
34. The method of claim 33, wherein step (c) further comprises repeating step (a) for each of a second plurality of reference sequences of images of tissue having a second known characteristic.
35. The method of claim 34, wherein step (c) further comprises applying a classification rule based at least in part on said first plurality of reference sequences and said second plurality of reference sequences.
36. The method of claim 35, wherein step (c) comprises performing a linear discriminant analysis to determine said classification rule.
37. The method of claim 33, wherein one of said one or more features quantified in step (a) comprises the slope of a curve at a given point fitted to one of said plurality of mean signal intensity data series.
38. The method of claim 33, further comprising determining said segmented regions of said test sequence of images by analyzing an acetowhitening signal.
39. The method of claim 34, wherein said first known characteristic is CIN II/CIN III and said second known characteristic is absence of CIN II/CIN III.
PCT/US2003/003007 2002-01-31 2003-01-31 Image processing using measures of similarity WO2003063706A1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
AU2003207787A AU2003207787A2 (en) 2002-01-31 2003-01-31 Image processing using measures of similarity
EP03706024A EP1476076A1 (en) 2002-01-31 2003-01-31 Image processing using measures of similarity
CA002474417A CA2474417A1 (en) 2002-01-31 2003-01-31 Image processing using measures of similarity

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US35397802P 2002-01-31 2002-01-31
US60/353,978 2002-01-31
US10/099,881 2002-03-15
US10/099,881 US7260248B2 (en) 1999-12-15 2002-03-15 Image processing using measures of similarity

Publications (1)

Publication Number Publication Date
WO2003063706A1 true WO2003063706A1 (en) 2003-08-07

Family

ID=27616078

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2003/003007 WO2003063706A1 (en) 2002-01-31 2003-01-31 Image processing using measures of similarity

Country Status (5)

Country Link
US (1) US7260248B2 (en)
EP (1) EP1476076A1 (en)
AU (1) AU2003207787A2 (en)
CA (1) CA2474417A1 (en)
WO (1) WO2003063706A1 (en)

Families Citing this family (53)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB0028491D0 (en) * 2000-11-22 2001-01-10 Isis Innovation Detection of features in images
US6839661B2 (en) * 2000-12-15 2005-01-04 Medispectra, Inc. System for normalizing spectra
US7263538B2 (en) * 2002-04-19 2007-08-28 City University Of Hong Kong Curve tracing system
WO2004057439A2 (en) * 2002-05-31 2004-07-08 University Of Utah Research Foundation System and method for visual annotation and knowledge representation
US6818903B2 (en) 2002-07-09 2004-11-16 Medispectra, Inc. Method and apparatus for identifying spectral artifacts
US6768918B2 (en) 2002-07-10 2004-07-27 Medispectra, Inc. Fluorescent fiberoptic probe for tissue health discrimination and method of use thereof
EP1611548A1 (en) * 2003-03-27 2006-01-04 Koninklijke Philips Electronics N.V. Medical imaging system and a method for segmenting an object of interest.
US7983446B2 (en) * 2003-07-18 2011-07-19 Lockheed Martin Corporation Method and apparatus for automatic object identification
US20080013814A1 (en) * 2004-05-06 2008-01-17 Koninklijke Philips Electronics, N.V. Pharmacokinetic Image Registration
US8795195B2 (en) * 2004-11-29 2014-08-05 Senorx, Inc. Graphical user interface for tissue biopsy system
US7607079B2 (en) * 2005-07-08 2009-10-20 Bruce Reiner Multi-input reporting and editing tool
US9101282B2 (en) * 2005-09-22 2015-08-11 Brainlab Ag Brain tissue classification
WO2007048853A2 (en) * 2005-10-28 2007-05-03 Basf Se Method for the synthesis of aromatic hydrocarbons from c1-c4 alkanes, and utilization of a c1-c4 alkane-containing product flow
US8509965B2 (en) * 2006-12-12 2013-08-13 American Gnc Corporation Integrated collision avoidance system for air vehicle
DE102007009485A1 (en) * 2007-02-22 2008-08-28 Perner, Petra, Dr.-Ing. Method and data processing system for modeling the image segmentation
US8335358B2 (en) * 2007-03-29 2012-12-18 Palodex Group Oy Method and system for reconstructing a medical image of an object
US8073277B2 (en) * 2007-06-21 2011-12-06 The University Of Southern Mississippi Apparatus and methods for image restoration
JP2010535579A (en) * 2007-08-03 2010-11-25 エスティーアイ・メディカル・システムズ・エルエルシー Computer image analysis of acetic acid processed cervical intraepithelial neoplasia
WO2009078957A1 (en) 2007-12-14 2009-06-25 Flashfoto, Inc. Systems and methods for rule-based segmentation for objects with full or partial frontal view in color images
JP5291955B2 (en) * 2008-03-10 2013-09-18 富士フイルム株式会社 Endoscopy system
US9581723B2 (en) 2008-04-10 2017-02-28 Schlumberger Technology Corporation Method for characterizing a geological formation traversed by a borehole
US8725477B2 (en) 2008-04-10 2014-05-13 Schlumberger Technology Corporation Method to generate numerical pseudocores using borehole images, digital rock samples, and multi-point statistics
US8437570B2 (en) * 2008-05-23 2013-05-07 Microsoft Corporation Geodesic image and video processing
US20100008576A1 (en) * 2008-07-11 2010-01-14 Robinson Piramuthu System and method for segmentation of an image into tuned multi-scaled regions
US8306302B2 (en) * 2008-09-29 2012-11-06 Carestream Health, Inc. Noise suppression in diagnostic images
EP2344982A4 (en) * 2008-10-10 2012-09-19 Sti Medical Systems Llc Methods for tissue classification in cervical imagery
WO2010110138A1 (en) * 2009-03-24 2010-09-30 オリンパス株式会社 Fluorescence observation device, fluorescence observation system, and fluorescence image processing method
US8311788B2 (en) * 2009-07-01 2012-11-13 Schlumberger Technology Corporation Method to quantify discrete pore shapes, volumes, and surface areas using confocal profilometry
US8351654B2 (en) * 2009-04-28 2013-01-08 Microsoft Corporation Image processing using geodesic forests
US8670615B2 (en) * 2009-09-30 2014-03-11 Flashfoto, Inc. Refinement of segmentation markup
US9311567B2 (en) 2010-05-10 2016-04-12 Kuang-chih Lee Manifold learning and matting
WO2012123881A2 (en) 2011-03-16 2012-09-20 Koninklijke Philips Electronics N.V. Medical instrument for examining the cervix
JP5926909B2 (en) 2011-09-07 2016-05-25 オリンパス株式会社 Fluorescence observation equipment
US8781173B2 (en) 2012-02-28 2014-07-15 Microsoft Corporation Computing high dynamic range photographs
US9262834B2 (en) * 2012-07-30 2016-02-16 General Electric Company Systems and methods for performing segmentation and visualization of images
WO2014074399A1 (en) * 2012-11-07 2014-05-15 Sony Corporation Method and apparatus for tissue region identification
US8942447B2 (en) 2012-11-07 2015-01-27 Sony Corporation Method and apparatus for tissue region identification
WO2014168734A1 (en) 2013-03-15 2014-10-16 Cedars-Sinai Medical Center Time-resolved laser-induced fluorescence spectroscopy systems and uses thereof
EP2856926A1 (en) 2013-10-04 2015-04-08 Tidi Products, LLC Sheath for a medical or dental instrument
US9852354B2 (en) * 2014-05-05 2017-12-26 Dako Denmark A/S Method and apparatus for image scoring and analysis
US10898079B2 (en) * 2016-03-04 2021-01-26 University Of Manitoba Intravascular plaque detection in OCT images
WO2017173315A1 (en) 2016-04-01 2017-10-05 Black Light Surgical, Inc. Systems, devices, and methods for time-resolved fluorescent spectroscopy
CN108171712B (en) * 2016-12-07 2022-02-11 Fujitsu Limited Method and device for determining image similarity
US10721477B2 (en) 2018-02-07 2020-07-21 Netflix, Inc. Techniques for predicting perceptual video quality based on complementary perceptual quality models
US10887602B2 (en) * 2018-02-07 2021-01-05 Netflix, Inc. Techniques for modeling temporal distortions when predicting perceptual video quality
CN111868783B (en) * 2019-02-14 2021-03-23 China Institute of Water Resources and Hydropower Research Region merging image segmentation algorithm based on boundary extraction
CN109977955B (en) * 2019-04-03 2021-11-30 Nanchang Hangkong University Cervical carcinoma pre-lesion identification method based on deep learning
US11654635B2 (en) 2019-04-18 2023-05-23 The Research Foundation for SUNY Enhanced non-destructive testing in directed energy material processing
US11527329B2 (en) 2020-07-28 2022-12-13 Xifin, Inc. Automatically determining a medical recommendation for a patient based on multiple medical images from multiple different medical imaging modalities
US11688517B2 (en) 2020-10-30 2023-06-27 Guerbet Multiple operating point false positive removal for lesion identification
US11688063B2 (en) 2020-10-30 2023-06-27 Guerbet Ensemble machine learning model architecture for lesion detection
US11436724B2 (en) 2020-10-30 2022-09-06 International Business Machines Corporation Lesion detection artificial intelligence pipeline computing system
US11749401B2 (en) * 2020-10-30 2023-09-05 Guerbet Seed relabeling for seed-based segmentation of a medical image

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5690106A (en) * 1995-06-30 1997-11-25 Siemens Corporate Research, Inc. Flexible image registration for rotational angiography
US5995645A (en) * 1993-08-18 1999-11-30 Applied Spectral Imaging Ltd. Method of cancer cell detection
US6058322A (en) * 1997-07-25 2000-05-02 Arch Development Corporation Methods for improving the accuracy in differential diagnosis on radiologic examinations
US6312385B1 (en) * 2000-05-01 2001-11-06 Ge Medical Systems Global Technology Company, Llc Method and apparatus for automatic detection and sizing of cystic objects
US6317617B1 (en) * 1997-07-25 2001-11-13 Arch Development Corporation Method, computer program product, and system for the automated analysis of lesions in magnetic resonance, mammogram and ultrasound images

Family Cites Families (163)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3013467A (en) 1957-11-07 1961-12-19 Minsky Marvin Microscopy apparatus
US3632865A (en) 1969-12-23 1972-01-04 Bell Telephone Labor Inc Predictive video encoding using measured subject velocity
US3809072A (en) 1971-10-07 1974-05-07 Med General Inc Sterile sheath apparatus for fiber optic illuminator with compatible lens
IT985204B (en) 1972-05-26 1974-11-30 Adelman Stuart Lee IMPROVEMENT IN ENDOSCOPES AND THE LIKE
US3890462A (en) 1974-04-17 1975-06-17 Bell Telephone Labor Inc Speed and direction indicator for video systems
US3963019A (en) 1974-11-25 1976-06-15 Quandt Robert S Ocular testing method and apparatus
US4017192A (en) 1975-02-06 1977-04-12 Neotec Corporation Optical analysis of biomedical specimens
US4071020A (en) 1976-06-03 1978-01-31 Xienta, Inc. Apparatus and methods for performing in-vivo measurements of enzyme activity
GB1595422A (en) 1977-04-28 1981-08-12 Nat Res Dev Scanning microscopes
FR2430754A1 (en) 1978-07-13 1980-02-08 Groux Jean ULTRAVIOLET ENDOSCOPE
US4218703A (en) 1979-03-16 1980-08-19 Bell Telephone Laboratories, Incorporated Technique for estimation of displacement and/or velocity of objects in video scenes
US4357075A (en) 1979-07-02 1982-11-02 Hunter Thomas M Confocal reflector system
US4349510A (en) 1979-07-24 1982-09-14 Seppo Kolehmainen Method and apparatus for measurement of samples by luminescence
US4254421A (en) 1979-12-05 1981-03-03 Communications Satellite Corporation Integrated confocal electromagnetic wave lens and feed antenna system
DE2951459C2 (en) 1979-12-20 1984-03-29 Heimann Gmbh, 6200 Wiesbaden Optical arrangement for a smoke detector based on the light scattering principle
US4515165A (en) 1980-02-04 1985-05-07 Energy Conversion Devices, Inc. Apparatus and method for detecting tumors
ATE23752T1 (en) 1980-08-21 1986-12-15 Oriel Scient Ltd OPTICAL ANALYZER.
US4396579A (en) 1981-08-06 1983-08-02 Miles Laboratories, Inc. Luminescence detection device
AU556742B2 (en) 1982-02-01 1986-11-20 Sony Corporation Digital tape jitter compensation
JPS5952359A (en) 1982-09-02 1984-03-26 Hitachi Medical Corp Automatic corrector for picture distortion during inter-picture operation
US5139025A (en) 1983-10-14 1992-08-18 Somanetics Corporation Method and apparatus for in vivo optical spectroscopic examination
SE455736B (en) 1984-03-15 1988-08-01 Sarastro Ab Method and device for microphotometry and subsequent image composition
US4641352A (en) 1984-07-12 1987-02-03 Paul Fenster Misregistration correction
US4662360A (en) 1984-10-23 1987-05-05 Intelligent Medical Systems, Inc. Disposable speculum
US5179936A (en) * 1984-10-23 1993-01-19 Intelligent Medical Systems, Inc. Disposable speculum with membrane bonding ring
US4646722A (en) 1984-12-10 1987-03-03 Opielab, Inc. Protective endoscope sheath and method of installing same
US4803049A (en) 1984-12-12 1989-02-07 The Regents Of The University Of California pH-sensitive optrode
US5199431A (en) 1985-03-22 1993-04-06 Massachusetts Institute Of Technology Optical needle for spectroscopic diagnosis
US5042494A (en) 1985-11-13 1991-08-27 Alfano Robert R Method and apparatus for detecting cancerous tissue using luminescence excitation spectra
US4930516B1 (en) 1985-11-13 1998-08-04 Laser Diagnostic Instr Inc Method for detecting cancerous tissue using visible native luminescence
GB8529889D0 (en) 1985-12-04 1986-01-15 Cardiff Energy & Resources Luminometer construction
JPS62138819A (en) 1985-12-13 1987-06-22 Hitachi Ltd Scanning type laser microscope
JPS62247232A (en) 1986-04-21 1987-10-28 Agency Of Ind Science & Technol Fluorescence measuring apparatus
US4852955A (en) 1986-09-16 1989-08-01 Laser Precision Corporation Microscope for use in modular FTIR spectrometer system
US5011243A (en) 1986-09-16 1991-04-30 Laser Precision Corporation Reflectance infrared microscope having high radiation throughput
US4741326A (en) 1986-10-01 1988-05-03 Fujinon, Inc. Endoscope disposable sheath
US4891829A (en) 1986-11-19 1990-01-02 Exxon Research And Engineering Company Method and apparatus for utilizing an electro-optic detector in a microtomography system
NL8603108A (en) 1986-12-08 1988-07-01 Philips Nv Microscope
CA1300369C (en) 1987-03-24 1992-05-12 Timothy P. Dabbs Distance measuring device
US5235457A (en) 1987-09-24 1993-08-10 Washington University Kit for converting a standard microscope into a single aperture confocal scanning epi-illumination microscope
US4945478A (en) 1987-11-06 1990-07-31 Center For Innovative Technology Noninvasive medical imaging system and method for the identification and 3-D display of atherosclerosis and the like
US4800571A (en) 1988-01-11 1989-01-24 Tektronix, Inc. Timing jitter measurement display
US4844617A (en) 1988-01-20 1989-07-04 Tencor Instruments Confocal measuring microscope with automatic focusing
FR2626383B1 (en) 1988-01-27 1991-10-25 Commissariat Energie Atomique EXTENDED FIELD SCAN AND DEPTH CONFOCAL OPTICAL MICROSCOPY AND DEVICES FOR CARRYING OUT THE METHOD
US4997242A (en) 1988-03-07 1991-03-05 Medical Research Council Achromatic scanning system
US5032720A (en) 1988-04-21 1991-07-16 White John G Confocal imaging system
US4877033A (en) 1988-05-04 1989-10-31 Seitz Jr H Michael Disposable needle guide and examination sheath for transvaginal ultrasound procedures
DE8808299U1 (en) 1988-06-25 1989-07-20 Effner Gmbh, 1000 Berlin, De
EP1245987B1 (en) 1988-07-13 2008-01-23 Optiscan Pty Ltd Scanning confocal microscope
CA1325537C (en) 1988-08-01 1993-12-28 Timothy Peter Dabbs Confocal microscope
US5036853A (en) 1988-08-26 1991-08-06 Polartechnics Ltd. Physiological probe
US4972258A (en) 1989-07-31 1990-11-20 E. I. Du Pont De Nemours And Company Scanning laser microscope system and methods of use
US5101825A (en) 1988-10-28 1992-04-07 Blackbox, Inc. Method for noninvasive intermittent and/or continuous hemoglobin, arterial oxygen content, and hematocrit determination
US5205291A (en) 1988-11-08 1993-04-27 Health Research, Inc. In vivo fluorescence photometer
US5022757A (en) 1989-01-23 1991-06-11 Modell Mark D Heterodyne system and method for sensing a target substance
US4878485A (en) 1989-02-03 1989-11-07 Adair Edwin Lloyd Rigid video endoscope with heat sterilizable sheath
US5003979A (en) 1989-02-21 1991-04-02 University Of Virginia System and method for the noninvasive identification and display of breast lesions and the like
US5201318A (en) 1989-04-24 1993-04-13 Rava Richard P Contour mapping of spectral diagnostics
JPH0378720A (en) 1989-08-22 1991-04-03 Nikon Corp Confocal laser scanning microscope
US5267179A (en) * 1989-08-30 1993-11-30 The United States Of America As Represented By The United States Department Of Energy Ferroelectric optical image comparator
US5065008A (en) 1989-10-18 1991-11-12 Fuji Photo Film Co., Ltd. Scanning microscope and scanning mechanism for the same
DE8912757U1 (en) 1989-10-27 1989-12-07 Fa. Carl Zeiss, 7920 Heidenheim, De
US4979498A (en) 1989-10-30 1990-12-25 Machida Incorporated Video cervicoscope system
US5034613A (en) 1989-11-14 1991-07-23 Cornell Research Foundation, Inc. Two-photon laser microscopy
US5257617A (en) 1989-12-25 1993-11-02 Asahi Kogaku Kogyo Kabushiki Kaisha Sheathed endoscope and sheath therefor
US5028802A (en) 1990-01-11 1991-07-02 Eye Research Institute Of Retina Foundation Imaging apparatus and methods utilizing scannable microlaser source
US5274240A (en) 1990-01-12 1993-12-28 The Regents Of The University Of California Capillary array confocal fluorescence scanner and method
US5091652A (en) 1990-01-12 1992-02-25 The Regents Of The University Of California Laser excited confocal microscope fluorescence scanner and method
EP0448931B1 (en) * 1990-01-26 1996-04-03 Canon Kabushiki Kaisha Method for measuring a specimen by the use of fluorescence light
JPH0742401Y2 (en) 1990-02-01 1995-10-04 Machida Endoscope Co., Ltd. Endoscope cover
JPH03101903U (en) 1990-02-01 1991-10-23
US5074306A (en) 1990-02-22 1991-12-24 The General Hospital Corporation Measurement of burn depth in skin
US5083220A (en) 1990-03-22 1992-01-21 Tandem Scanning Corporation Scanning disks for use in tandem scanning reflected light microscopes and other optical systems
JP2613118B2 (en) 1990-04-10 1997-05-21 Fuji Photo Film Co., Ltd. Confocal scanning microscope
US5048946A (en) 1990-05-15 1991-09-17 Phoenix Laser Systems, Inc. Spectral division of reflected light in complex optical diagnostic and therapeutic systems
GB9014263D0 (en) 1990-06-27 1990-08-15 Dixon Arthur E Apparatus and method for spatially- and spectrally-resolved measurements
GB9016587D0 (en) 1990-07-27 1990-09-12 Isis Innovation Infra-red scanning microscopy
US6671540B1 (en) * 1990-08-10 2003-12-30 Daryl W. Hochman Methods and systems for detecting abnormal tissue using spectroscopic techniques
US5239178A (en) 1990-11-10 1993-08-24 Carl Zeiss Optical device with an illuminating grid and detector grid arranged confocally to an object
US5168157A (en) 1990-11-20 1992-12-01 Fuji Photo Film Co., Ltd. Scanning microscope with means for detecting a first and second polarized light beams along first and second optical receiving paths
US5193525A (en) 1990-11-30 1993-03-16 Vision Sciences Antiglare tip in a sheath for an endoscope
JP3103894B2 (en) 1991-02-06 2000-10-30 Sony Corporation Apparatus and method for correcting camera shake of video data
US5261410A (en) 1991-02-07 1993-11-16 Alfano Robert R Method for determining if a tissue is a malignant tumor tissue, a benign tumor tissue, or a normal or benign tissue using Raman spectroscopy
US5162641A (en) 1991-02-19 1992-11-10 Phoenix Laser Systems, Inc. System and method for detecting, correcting and measuring depth movement of target tissue in a laser surgical system
US5303026A (en) 1991-02-26 1994-04-12 The Regents Of The University Of California Los Alamos National Laboratory Apparatus and method for spectroscopic analysis of scattering media
US5260578A (en) 1991-04-10 1993-11-09 Mayo Foundation For Medical Education And Research Confocal imaging system for visible and ultraviolet light
DK0540714T3 (en) * 1991-05-24 1998-09-07 British Broadcasting Corp Video imaging
JP2975719B2 (en) 1991-05-29 1999-11-10 Olympus Optical Co., Ltd. Confocal optics
US5201908A (en) 1991-06-10 1993-04-13 Endomedical Technologies, Inc. Sheath for protecting endoscope from contamination
US5313567A (en) * 1991-06-13 1994-05-17 At&T Bell Laboratories Arrangement for determining and displaying volumetric data in an imaging system
US5237984A (en) * 1991-06-24 1993-08-24 Xomed-Treace Inc. Sheath for endoscope
US5203328A (en) 1991-07-17 1993-04-20 Georgia Tech Research Corporation Apparatus and methods for quantitatively measuring molecular changes in the ocular lens
US5162941A (en) 1991-07-23 1992-11-10 The Board Of Governors Of Wayne State University Confocal microscope
JPH0527177A (en) 1991-07-25 1993-02-05 Fuji Photo Film Co Ltd Scanning type microscope
JP3082346B2 (en) 1991-09-12 2000-08-28 Nikon Corporation Fluorescence confocal microscope
US5383874A (en) * 1991-11-08 1995-01-24 Ep Technologies, Inc. Systems for identifying catheters and monitoring their use
US5253071A (en) 1991-12-20 1993-10-12 Sony Corporation Of America Method and apparatus for stabilizing an image produced in a video camera
US5398685A (en) * 1992-01-10 1995-03-21 Wilk; Peter J. Endoscopic diagnostic system and associated method
US5284149A (en) 1992-01-23 1994-02-08 Dhadwal Harbans S Method and apparatus for determining the physical characteristics of ocular tissue
US5248876A (en) 1992-04-21 1993-09-28 International Business Machines Corporation Tandem linear scanning confocal imaging system with focal volumes at different heights
GB9213978D0 (en) * 1992-07-01 1992-08-12 Skidmore Robert Medical devices
US5609560A (en) * 1992-08-19 1997-03-11 Olympus Optical Co., Ltd. Medical operation device control system for controlling operation devices accessed respectively by ID codes
US5402768A (en) * 1992-09-01 1995-04-04 Adair; Edwin L. Endoscope with reusable core and disposable sheath with passageways
US5306902A (en) 1992-09-01 1994-04-26 International Business Machines Corporation Confocal method and apparatus for focusing in projection lithography
US5704892A (en) * 1992-09-01 1998-01-06 Adair; Edwin L. Endoscope with reusable core and disposable sheath with passageways
US5285490A (en) * 1992-10-22 1994-02-08 Eastman Kodak Company Imaging combination for detecting soft tissue anomalies
US5418797A (en) * 1993-01-15 1995-05-23 The United States Of America As Represented By The Secretary Of The Navy Time gated imaging through scattering material using polarization and stimulated raman amplification
US5294799A (en) 1993-02-01 1994-03-15 Aslund Nils R D Apparatus for quantitative imaging of multiple fluorophores
US5415157A (en) * 1993-02-05 1995-05-16 Welcome; Steven Damage preventing endoscope head cover
US5419311A (en) * 1993-02-18 1995-05-30 Olympus Optical Co., Ltd. Endoscope apparatus of a type having cover for covering the endoscope
US5413108A (en) * 1993-04-21 1995-05-09 The Research Foundation Of City College Of New York Method and apparatus for mapping a tissue sample for and distinguishing different regions thereof based on luminescence measurements of cancer-indicative native fluorophor
US5421339A (en) * 1993-05-12 1995-06-06 Board Of Regents, The University Of Texas System Diagnosis of dysplasia using laser induced fluorescence
US5596992A (en) * 1993-06-30 1997-01-28 Sandia Corporation Multivariate classification of infrared spectra of cell and tissue samples
US5496259A (en) * 1993-09-13 1996-03-05 Welch Allyn, Inc. Sterile protective sheath and drape for video laparoscope and method of use
US5412563A (en) * 1993-09-16 1995-05-02 General Electric Company Gradient image segmentation method
US5406939A (en) * 1994-02-14 1995-04-18 Bala; Harry Endoscope sheath
US5493444A (en) * 1994-04-28 1996-02-20 The United States Of America As Represented By The Secretary Of The Air Force Photorefractive two-beam coupling nonlinear joint transform correlator
US5599717A (en) * 1994-09-02 1997-02-04 Martin Marietta Energy Systems, Inc. Advanced synchronous luminescence system
JP3732865B2 (en) * 1995-01-18 2006-01-11 Pentax Corporation Endoscope device
US5894340A (en) * 1995-02-17 1999-04-13 The Regents Of The University Of California Method for quantifying optical properties of the human lens
JP3490817B2 (en) * 1995-03-13 2004-01-26 Pentax Corporation Endoscope tip
US5735276A (en) * 1995-03-21 1998-04-07 Lemelson; Jerome Method and apparatus for scanning and evaluating matter
US5612540A (en) * 1995-03-31 1997-03-18 Board Of Regents, The University Of Texas Systems Optical method for the detection of cervical neoplasias using fluorescence spectroscopy
US5713364A (en) * 1995-08-01 1998-02-03 Medispectra, Inc. Spectral volume microprobe analysis of materials
US5730701A (en) * 1995-09-12 1998-03-24 Olympus Optical Co., Ltd. Endoscope
JPH0998938A (en) * 1995-10-04 1997-04-15 Fuji Photo Optical Co Ltd Protector of insertion part of endoscope
US5865726A (en) * 1996-03-27 1999-02-02 Asahi Kogaku Kogyo Kabushiki Kaisha Front end structure of side-view type endoscope
AU706862B2 (en) * 1996-04-03 1999-06-24 Applied Biosystems, Llc Device and method for multiple analyte detection
US5717209A (en) * 1996-04-29 1998-02-10 Petrometrix Ltd. System for remote transmission of spectral information through communication optical fibers for real-time on-line hydrocarbons process analysis by near infra red spectroscopy
US5860913A (en) * 1996-05-16 1999-01-19 Olympus Optical Co., Ltd. Endoscope whose distal cover can be freely detachably attached to main distal part thereof with high positioning precision
WO1997050003A1 (en) * 1996-06-26 1997-12-31 Morphometrix Technologies Inc. Confocal ultrasonic imaging system
US5685822A (en) * 1996-08-08 1997-11-11 Vision-Sciences, Inc. Endoscope with sheath retaining device
US6101408A (en) * 1996-08-22 2000-08-08 Western Research Company, Inc. Probe and method to obtain accurate area measurements from cervical lesions
ATE270449T1 (en) * 1996-09-16 2004-07-15 Univ Syddansk METHOD AND COMPUTER PROGRAM FOR IMAGE ANALYSIS
CA2192036A1 (en) * 1996-12-04 1998-06-04 Harvey Lui Fluorescence scope system for dermatologic diagnosis
US6847490B1 (en) * 1997-01-13 2005-01-25 Medispectra, Inc. Optical probe accessory device for use in vivo diagnostic procedures
US6466687B1 (en) * 1997-02-12 2002-10-15 The University Of Iowa Research Foundation Method and apparatus for analyzing CT images to determine the presence of pulmonary tissue pathology
JP3654325B2 (en) * 1997-02-13 2005-06-02 Fuji Photo Film Co., Ltd. Fluorescence detection device
US5855551A (en) * 1997-03-17 1999-01-05 Polartechnics Limited Integral sheathing apparatus for tissue recognition probes
US6277067B1 (en) * 1997-04-04 2001-08-21 Kerry L. Blair Method and portable colposcope useful in cervical cancer detection
FR2763721B1 (en) * 1997-05-21 1999-08-06 Inst Nat Rech Inf Automat ELECTRONIC IMAGE PROCESSING DEVICE FOR DETECTING DIMENSIONAL VARIATIONS
CN1302210A (en) * 1997-10-20 2001-07-04 Board of Regents, The University of Texas System Acetic acid as a signal enhancing contrast agent in fluorescence spectroscopy
WO1999047041A1 (en) * 1998-03-19 1999-09-23 Board Of Regents, The University Of Texas System Fiber-optic confocal imaging apparatus and methods of use
US6377842B1 (en) * 1998-09-22 2002-04-23 Aurora Optics, Inc. Method for quantitative measurement of fluorescent and phosphorescent drugs within tissue utilizing a fiber optic probe
US6169817B1 (en) * 1998-11-04 2001-01-02 University Of Rochester System and method for 4D reconstruction and visualization
US6697666B1 (en) * 1999-06-22 2004-02-24 Board Of Regents, The University Of Texas System Apparatus for the characterization of tissue of epithelial lined viscus
US6208887B1 (en) * 1999-06-24 2001-03-27 Richard H. Clarke Catheter-delivered low resolution Raman scattering analyzing system for detecting lesions
US6633657B1 (en) * 1999-07-15 2003-10-14 General Electric Company Method and apparatus for controlling a dynamic range of a digital diagnostic image
US6717668B2 (en) * 2000-03-07 2004-04-06 Chemimage Corporation Simultaneous imaging and spectroscopy apparatus
DE60045146D1 (en) * 1999-11-02 2010-12-09 Fujifilm Corp Device for displaying fluorescence
US20020007122A1 (en) * 1999-12-15 2002-01-17 Howard Kaufman Methods of diagnosing disease
AU5116401A (en) * 2000-03-28 2001-10-08 Univ Texas Methods and apparatus for diagnostic multispectral digital imaging
GR1004180B (en) * 2000-03-28 2003-03-11 Method and system for characterization and mapping of tissue lesions
US6839661B2 (en) * 2000-12-15 2005-01-04 Medispectra, Inc. System for normalizing spectra
USD453832S1 (en) * 2001-02-09 2002-02-19 Medispectra, Inc. Sheath for cervical optical probe
USD453964S1 (en) * 2001-02-09 2002-02-26 Medispectra, Inc. Sheath for cervical optical probe
USD453963S1 (en) * 2001-02-09 2002-02-26 Medispectra, Inc. Sheath for cervical optical probe
USD453962S1 (en) * 2001-02-09 2002-02-26 Medispectra, Inc. Sheath for cervical optical probe
US7282723B2 (en) * 2002-07-09 2007-10-16 Medispectra, Inc. Methods and apparatus for processing spectral data for use in tissue characterization
US6818903B2 (en) * 2002-07-09 2004-11-16 Medispectra, Inc. Method and apparatus for identifying spectral artifacts
US6933154B2 (en) * 2002-07-09 2005-08-23 Medispectra, Inc. Optimal windows for obtaining optical data for characterization of tissue samples
US7103401B2 (en) * 2002-07-10 2006-09-05 Medispectra, Inc. Colonic polyp discrimination by tissue fluorescence and fiberoptic probe
US6768918B2 (en) * 2002-07-10 2004-07-27 Medispectra, Inc. Fluorescent fiberoptic probe for tissue health discrimination and method of use thereof

Also Published As

Publication number Publication date
EP1476076A1 (en) 2004-11-17
US7260248B2 (en) 2007-08-21
AU2003207787A2 (en) 2003-09-02
US20030144585A1 (en) 2003-07-31
CA2474417A1 (en) 2003-08-07

Similar Documents

Publication Publication Date Title
US7260248B2 (en) Image processing using measures of similarity
Sazzad et al. Development of automated brain tumor identification using MRI images
EP1568307B1 (en) Image processing device and image processing method
Yang-Mao et al. Edge enhancement nucleus and cytoplast contour detector of cervical smear images
Mahendran et al. Investigation of the severity level of diabetic retinopathy using supervised classifier algorithms
CN111798425B (en) Intelligent detection method for mitotic image in gastrointestinal stromal tumor based on deep learning
EP2300988B1 (en) Method of reconstituting cellular spectra useful for detecting cellular disorders
Lin et al. Detection and segmentation of cervical cell cytoplast and nucleus
Abuared et al. Skin cancer classification model based on VGG 19 and transfer learning
CN116645384B (en) Stem cell area rapid segmentation method based on artificial intelligence
Abdullah et al. Brain tumor extraction approach in MRI images based on soft computing techniques
KR102103280B1 (en) Assistance diagnosis method for large intestine disease based on deep learning
Ratheesh et al. Advanced algorithm for polyp detection using depth segmentation in colon endoscopy
Waheed et al. Computer aided histopathological classification of cancer subtypes
Chitra et al. Detection of aml in blood microscopic images using local binary pattern and supervised classifier
Kumar et al. Artificial intelligence based real-time skin cancer detection
CN111798426A (en) Deep learning and detecting system for mitotic image in gastrointestinal stromal tumor of moving end
Lynn et al. Segmentation of skin lesion towards melanoma skin cancer classification
Imtiaz et al. Segmentation of skin lesion using harris corner detection and region growing
Hussein Automatic nuclei segmentation based on fuzzy C-Mean
Abinaya et al. A systematic review: Intellectual detection and prediction of cancer using dl techniques
Reddy et al. A Composite Feature Set Based Blood Vessel Segmentation in Retinal Images through Supervised Learning
Jitender Lung cancer detection using digital image processing and artificial neural networks
Kachare et al. Advancements in Melanoma Skin Cancer Detection Using Deep Learning: A Comprehensive Review
Karthikeyan et al. A Thorough Investigation on Automated Diagnosis of Glaucoma.

Legal Events

Date Code Title Description
AK Designated states. Kind code of ref document: A1. Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ OM PH PL PT RO RU SC SD SE SG SK SL TJ TM TN TR TT TZ UA UG UZ VC VN YU ZA ZM ZW
AL Designated countries for regional patents. Kind code of ref document: A1. Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PT SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG
121 EP: the EPO has been informed by WIPO that EP was designated in this application
WWE WIPO information: entry into national phase. Ref document number: 2474417. Country of ref document: CA
WWE WIPO information: entry into national phase. Ref document number: 2003207787. Country of ref document: AU
WWE WIPO information: entry into national phase. Ref document number: 2003706024. Country of ref document: EP
WWP WIPO information: published in national office. Ref document number: 2003706024. Country of ref document: EP
NENP Non-entry into the national phase. Ref country code: JP
WWW WIPO information: withdrawn in national office. Country of ref document: JP
WWW WIPO information: withdrawn in national office. Ref document number: 2003706024. Country of ref document: EP