WO1992013308A1 - Morphological classification system and method - Google Patents

Morphological classification system and method

Info

Publication number
WO1992013308A1
Authority
WO
WIPO (PCT)
Prior art keywords
image, objects, representation, images, cell
Application number
PCT/US1992/000660
Other languages
French (fr)
Inventor
Randall L. Luck
Richard Scott
Original Assignee
Neuromedical Systems, Inc.
Family has litigation: first worldwide family litigation filed (Darts-ip, https://patents.darts-ip.com/?family=24596994)
Application filed by Neuromedical Systems, Inc. filed Critical Neuromedical Systems, Inc.
Publication of WO1992013308A1 publication Critical patent/WO1992013308A1/en

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00: Scenes; Scene-specific elements
    • G06V20/60: Type of objects
    • G06V20/69: Microscopic objects, e.g. biological cells or cellular parts
    • G06V20/693: Acquisition
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01N: INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N15/00: Investigating characteristics of particles; Investigating permeability, pore-volume, or surface-area of porous materials
    • G01N15/10: Investigating individual particles
    • G01N15/14: Electro-optical investigation, e.g. flow cytometers
    • G01N15/1468: Electro-optical investigation, e.g. flow cytometers, with spatial resolution of the texture or inner structure of the particle
    • G01N15/1433
    • G: PHYSICS
    • G16: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H: HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H30/00: ICT specially adapted for the handling or processing of medical images
    • G16H30/40: ICT specially adapted for the handling or processing of medical images for processing medical images, e.g. editing
    • G: PHYSICS
    • G16: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H: HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H50/00: ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H50/20: ICT specially adapted for medical diagnosis, medical simulation or medical data mining, for computer-aided diagnosis, e.g. based on medical expert systems
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y10: TECHNICAL SUBJECTS COVERED BY FORMER USPC
    • Y10S: TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y10S706/00: Data processing: artificial intelligence
    • Y10S706/902: Application using ai with detail of the ai system
    • Y10S706/924: Medical


Abstract

A method of classifying objects in a specimen includes the steps of obtaining a first digital representation of at least part of the cytological specimen, storing the first digital representation, performing a first filtering operation to filter out images in the first representation that are the approximate size of a malignant or premalignant cell or smaller to produce a second digital representation, removing the images in the second representation from the images in the first representation to produce a third representation, performing a second filtering operation to filter out images in the first representation that are smaller than the approximate size of a premalignant or malignant cell to produce a fourth representation, and eliminating the images in the fourth representation from the images in the third representation to produce a representation having substantially only images the approximate size of a premalignant or malignant cell.

Description

MORPHOLOGICAL CLASSIFICATION SYSTEM AND METHOD

TECHNICAL FIELD OF THE INVENTION
This invention relates generally to classification, particularly to cytology, and, more particularly, to a method and apparatus for quickly and accurately classifying
cells based on cellular morphology.
BACKGROUND OF THE INVENTION
In the medical industry there is often the need for an experienced laboratory technician to review a specimen of biological matter for the presence of cells of a certain cellular type. An example of this is the need to review a pap smear slide for the presence of malignant or premalignant cells. A pap smear often contains as many as 100,000 to 200,000 or more cells and other objects, each of which a technician must individually inspect in order to determine the possible presence of very few
malignant or premalignant cells. Pap smear tests, as well as other tests requiring equally exhausting cell inspection techniques, have therefore suffered from a high
false negative rate due to the tedium and fatigue imposed upon the technician.
Several thousand women die each year in the United States alone from cervical cancer, a cancer from which a woman theoretically has a high probability of survival
if detected in its early in situ stages. If not detected early, however, the chances of survival may decrease drastically. If a malignant cell in a pap smear is missed, by the time the woman has another pap smear performed the cancer may have advanced to its invasive stage from which a woman has a much smaller chance of survival.
Consequently, the importance of detecting the presence of only one or a few
malignant or premalignant cells among the hundreds of thousands of cells in a smear cannot be overstated. Unfortunately, present manual screening methods are
inaccurate. In fact, recently some laboratories have been found to have incorrectly classified as benign up to 30% of the specimens containing malignant or premalignant cells. Also unfortunate is that many prior attempts to automate the cell inspection or
classification have been unsuccessful.
Predominately, these prior attempts at automation have relied on feature
extraction, template matching and other statistical or algorithmic methods alone. These attempts have required expensive and time-consuming cell preparations to distribute the cells and other objects over a slide so that none of the cells or objects overlap. However, even then these attempts have been unsuccessful at accurately classifying specimens in a reasonable time frame.
These difficulties have been overcome by combining an algorithmic or statistical primary classifier with a neural network based secondary classifier as disclosed in U.S. Patent No. 4,965,725, and U.S. Patent Application Nos. 07/420,105, 07/425,665, 07/520,611 and 07/610,423, which are incorporated in their entireties by this reference. A commercially available automated pap smear screener, using a primary classifier in conjunction with a neurocomputer based secondary
classifier is produced by Neuromedical Systems, Inc. of Suffern, New York under
the trademark PAPNET™.
SUMMARY OF THE INVENTION

The present invention provides a method and apparatus for automating a cell classification process using at least primary and secondary classifications.
Specifically, the primary classification step includes performing morphological
filtering on an image of a biological specimen to detect cells having certain morphological features, such as a generally round shape or a certain size. The
secondary classification step then further classifies these objects using an implementation of a neural network. The present invention preferably performs at least two scans of the specimen to produce one image suitable for analysis by the primary and secondary classifiers and a second image suitable for display on a high
resolution color monitor to facilitate inspection by a cytotechnician.
In accordance with the present invention, a method of classifying cells based
upon their morphology includes a method of classifying objects in a cytological
specimen, including the steps of obtaining a first image of at least part of a cytological specimen, classifying objects in the first image on the basis of a
predetermined criteria, obtaining a second image of at least one of the objects most likely to have a predetermined criteria, and displaying at least part of the second image to produce a visual display of at least one of the objects most likely to have a predetermined criteria.
In accordance with another aspect of the invention, a method of classifying objects in a specimen includes the steps of obtaining a first digital representation of at least part of the cytological specimen, storing the first digital representation,
performing a first filtering operation to filter out images in the representation that are
the approximate size of a malignant or premalignant cell or smaller to produce a second digital representation, removing the images in the second representation from
the images in the first representation to produce a third representation, performing a second filtering operation to filter out images in the first representation that are smaller than the approximate size of a premalignant or malignant cell to produce a
fourth representation, and eliminating the images in the fourth representation from the
images in the third representation to produce a representation having substantially
only images the approximate size of a premalignant or malignant cell.
In accordance with a further aspect of the invention an apparatus for classifying objects in a cytological specimen includes a device for obtaining a first image of at least part of the cytological specimen, a processor for classifying objects in the first image on the basis of a predetermined criteria, a device for obtaining a second image of at least one of the objects most likely to have a predetermined
criteria, and a monitor for displaying at least part of the second image to produce a visual display of at least one of the objects most likely to have the predetermined criteria.
These and other objects, advantages, features and aspects of the present invention will become apparent as the following description proceeds.
To the accomplishment of the foregoing and related ends, the invention, then, comprises the features hereinafter fully described in the specification and particularly pointed out in the claims, the following description and the annexed drawings setting forth in detail a certain illustrative embodiment of the invention, this being indicative, however, of but one of the various ways in which the principles of the invention may be employed. It will be appreciated that the scope of the invention is to be determined by the claims and the equivalents thereof.
BRIEF DESCRIPTION OF THE DRAWINGS

In the annexed drawings:
Figure 1 is a schematic illustration of a cytological classification or screening
device in accordance with the present invention;
Figure 2 is a diagrammatic illustration of the scanning passes which the screening device performs;
Figure 3 is a schematic illustration of the screening device of Figure 1 with particular emphasis on the processing system;
Figure 4 is an illustration of the various image components representing areas of the specimen;
Figures 5a through 5c are flowcharts illustrating the primary classification
function of the present invention; and
Figures 6a through 6d are graphical representations of a morphological closing
operation performed during the primary classification function.
DETAILED DESCRIPTION OF THE INVENTION
With reference to the several figures in which like reference numerals depict like items, and initially to Figure 1, there is shown an automated cell classification device 10 in accordance with the present invention. Briefly, the device 10 includes
an automated optical microscope 12 having a motorized stage 14 for the movement of a slide 16 relative to the viewing region of the viewing portion 18 of the
microscope, a camera 20 for obtaining electronic images from the optical microscope, a processing system 22 for classifying objects in the images as likely to be of a predetermined cell type, and a memory 24 and a high resolution color monitor 26 for
the storage and display respectively of objects identified by the processing system as being likely to be of that predetermined cell type.
In its preferred embodiment the classification device 10 is completely, or
nearly completely, automated. As such the microscope 12 will preferably include, in addition to the motorized stage 14, automated apparatus for focusing, for changing
lens objectives between high and low power, and for adjustment of the light incident on the slide, as well as circuitry for controlling the movement of the motorized stage, typically in response to a command from the processing system. The microscope may also include an automated slide transport system for moving the slides containing the specimen to be classified on to and off of the motorized stage, a cell dotter for
marking relevant areas of the slide, and a bar code reader for reading encoded information from the slide. An example of an automated microscope performing at least some of these functions is manufactured by McBain Instruments of California.

In accordance with the invention the automated microscope 12 preferably performs three scans of the slide having the specimen disposed thereon, as shown diagrammatically in Figure 2. The first scan of the slide is performed at a relatively low magnification, for example 50 power, and is called the low resolution scan (30). The second scan is performed at a higher magnification, for example 200 power, and is called the high resolution scan (35). The third scan is referred to as the high resolution rescan and is also performed at a high magnification (40).
During the first scan (30) of the slide, approximate focal planes for the
specific areas-of the slide are found andit is determined whether that area of the slide contains a portion of the specimen. Once a low resolution scan (30) has been
performed of the whole slide, and the focal planes and areas of the slide containing the specimen have been logged, the high resolution scan (35) is performed.
The high resolution scan (35) is performed only on the areas of the slide found in the low resolution scan (30) to contain a portion of the specimen. Consequently,
the comparatively long high resolution scan (35) is performed only on relevant areas of the slide and the processing time is greatly reduced. During the high resolution
scan (35), the automated microscope 12 scans the relevant areas of the slide, and the camera 20 takes electronic images of these areas and sends the images to the processing system 22. The processing system performs a primary classification of the image which finds the centroids of biological objects having attributes typical of
the cell class for which screening is being performed, such as malignant cells. Using a smaller sub-image centered around these centroids, the processing system 22 performs a secondary classification which assigns each centroid a value indicative of
the possibility that the object having that centroid is a cell of the type for which classification is being performed. Simultaneously, the centroids are also ranked based
on the value assigned through the secondary classification.
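The ranking step reduces to a simple sort. The sketch below (Python, with hypothetical names, since the patent describes the behavior but gives no code) orders the secondarily classified centroids by their assigned values and keeps the 64 highest for the rescan:

    from typing import List, Tuple

    Centroid = Tuple[int, int]  # (row, col) pixel coordinates of a suspect object

    def select_for_rescan(scored: List[Tuple[Centroid, float]],
                          limit: int = 64) -> List[Centroid]:
        """Rank objects by descending secondary-classifier value; keep the top `limit`."""
        ranked = sorted(scored, key=lambda item: item[1], reverse=True)
        return [centroid for centroid, _value in ranked[:limit]]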
Upon completion of the high resolution scan (35), the high resolution rescan (40) is performed for the highest 64 ranked objects. During the rescan (40) the automated microscope 12 will move to each of the highest 64 ranked centroids and the camera 20 will obtain a high resolution color image of the object having that centroid. These 64 high resolution images, called color tiles, are then stored in the
memory 24, which may be a removable device, such as an optical disk or a tape, or a fixed storage device such as a hard disk. Alternatively, the sixty-four color tiles may be transferred to another computer via a network or through transportation of the data on removable storage media.
The sixty-four color tiles make up a summary screen which is preferably an 8 x 8 matrix of high resolution color tiles featuring a suspect cell in the center of each
tile. It will be appreciated, however, that other numbers of color tiles may be displayed concurrently to produce a summary screen, such as a 4 x 4 matrix. These summary screens are displayed on the high resolution color monitor 26 for tertiary
analysis and classification by a cytotechnician. This analysis may take place at any time after the highest sixty-four have been secondarily classified and ranked. Further, through the use of a removable memory device or a network connection, the images and tiles of the summary screen may be transferred to a workstation remote from the microscope 18, camera 20 and processing system 22 for display and analysis. In such an instance a separate graphics processor 41 (Figure 3) may be employed to drive the high resolution color monitor 26 and provide a suitable interface with the cytotechnician.
A cytotechnician or cytotechnologist (hereinafter cytotechnician) can easily scan the summary screen in search of an object having the attributes of the cell type for which classification is being performed. If the system is being used to screen a pap smear for the presence of cervical cancer, the cytotechnician would typically look for cells having attributes of malignant or premalignant cervical cells, such as a comparatively large, dark nucleus. The cytotechnician may also make a determination from the summary screen as to whether a specimen having no
malignant or premalignant cells was taken properly. This can be done by ascertaining
the presence of endocervical cells. Endocervical cells make up the lining of the transitional zone of the cervix where most cervical cancers start; consequently, their presence in a pap smear specimen indicates that the test swab must have made contact with the critical transitional zone of the cervix. Since endocervical cells have more characteristics in common with a malignant cell than a vaginal or other benign cell has, in the absence of any true premalignant or malignant cells an endocervical cell will be ranked above the other cells and displayed.
Herein the screening method and apparatus of the present invention will be described as used in screening a pap smear for the presence of cervical cancer cells. However, it will be apparent to a person of ordinary skill in the art that this is only
an illustrative use and that the present invention may be used in screening samples of
other biological matter taken by a variety of cell sampling techniques, such as aspiration and exfoliation to name but two. Further it will be apparent that while the
illustrative example screens for malignant or premalignant cells, the screening may be performed for the detection of other cell classes or types.
Turning now to a more in-depth discussion of the present invention with
specific reference to Figure 3, the screening device 10 is shown with particular emphasis on the classification elements embodied in the processing system 22. The processing system 22 preferably includes an image processor and digitizer 42, a neurocomputer 44, and a general processor 46 with peripherals for printing, storage, etc.
The general processor 46 is preferably an Intel® 80386 microprocessor or faster microprocessor based microcomputer, although it may be another computer-type device suitable for efficient execution of the functions described herein. The general
processor 46 controls the functioning of and the flow of data between components of
the device 10, may cause execution of additional primary feature extraction algorithms and handles the storage of image and classification information. The general processor 46 additionally controls peripheral devices such as a printer 48, a storage device 24 such as an optical or magnetic hard disk, a tape drive, etc., as well as other devices such as a bar code reader 50, a slide marker 52, autofocus circuitry, a robotic slide handler, and the stage 14.
The image processor and digitizer 42 performs the primary cell classification functions described more fully below. In the preferred embodiment, the image processor and digitizer 42 is a commercially available low level morphological feature extraction image classifier such as the ASPEX Incorporated PIPE® image processor which includes among other things an image digitization function and an ISMAP (Iconic to Symbolic Mapping) board. The PIPE® image processor is described more fully in U.S. Patent No. 4,601,055, the entire disclosure of which is incorporated by this reference. Alternatively, the image processing and digitization functions could be separated into two or more components. Below, the image processor and digitizer will be conjunctively referred to as the image processor 42.
Secondary cell classification is performed by the neurocomputer 44. The neurocomputer 44 is a computer embodiment of a neural network trained to identify suspect cells. In this embodiment the parallel structure of a two or three-layer
backpropagation neural network is emulated with pipelined serial processing techniques executed on one of a host of commercially available neurocomputer accelerator boards. The operation of these neurocomputers is discussed in Hecht-Nielsen, Robert, "Neurocomputing: Picking the Human Brain", IEEE Spectrum, March 1988, pp. 36-41. The neural network is preferably implemented on an Anza Plus™ processor, which is a commercially available neurocomputer of Hecht-Nielsen
Neurocomputers. Such a neurocomputer could be easily configured to operate in a manner suitable to perform the secondary classification functions by one of ordinary
skill in the art through reference to corresponding manuals, etc. Alternatively, secondary cell classification functions could be performed using a template matching
algorithm designed to identify shapes known to be typical of a pathological cell. A template matching or other group processing algorithm could be efficiently implemented in a parallel distributed processing network, for example. Another alternative secondary classification embodiment is a holographic image processor
designed to perform group based classification.
The image processor 42, the neurocomputer 44, and the general computer 46 may each access read-only and/or random access memory, as would be readily apparent to one skilled in the art, for the storage and execution of software necessary to perform the functions described relative to that processing component. Further, each component 42, 44, 46 includes circuitry, chips, etc. for the control of
communication or data transfer over the data bus 54 as well as other functions typical of similar processors as would be appreciated.
Returning to a discussion of the operation of the device 10 and with reference to Figure 4, the area of the slide 16 possibly containing the biological matter of the specimen is segmented into a plurality of rows and columns, for example, 20 rows
and 50 columns of equal sized blocks 60. Each block 60 occupies an area of the slide, for example, approximately 2000 microns x 1600 microns, and corresponds to an individual image to be viewed one at a time.
Each block 60 is subdivided, for example, into sixteen equally sized analysis fields 62. Each field 62 is thus approximately 500 microns by 400 microns in size. Once digitized by the
image processor 42, each analysis field 62 will be represented by a 256 by 242 matrix or array of pixels which corresponds to a resolution of approximately two microns per pixel during a low resolution scan (30) or high resolution scan (35), or a 512 by
484 array of pixels corresponding to a one micron per pixel resolution during a high
resolution rescan pass (40). Each pixel then represents the brightness, or gray scale
density of a discrete area of the analysis field 62 image. The gray scale density of each pixel is further represented by an 8 bit digital value. Consequently, each pixel will represent an area of the analysis field image 62 by a gray scale level ranging from zero to 255.

In operation, the screening device will perform a low resolution scan (30) on each analysis field 62 to determine if that field contains biological matter, and a high resolution scan (35) on each of the analysis fields 62 having biological matter to detect objects contained therein which are likely to be malignant
or premalignant cells. A third scan (40), the high resolution rescan, may also be performed on an analysis field 62, or a portion of an analysis field, if during the high resolution scan (35) the processing system found an object within the field which is likely to be a malignant or premalignant cell.
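The segmentation arithmetic above can be checked in a few lines. The following sketch simply restates the example numbers from the text (they are illustrative values, not fixed parameters of the invention):

    # Block/field geometry from the example values in the text.
    BLOCK_UM = (2000, 1600)          # block size in microns (width, height)
    FIELDS = (4, 4)                  # 16 analysis fields per block
    FIELD_UM = (BLOCK_UM[0] // FIELDS[0], BLOCK_UM[1] // FIELDS[1])
    FIELD_PX = (256, 242)            # pixel array per field during scans (30) and (35)
    UM_PER_PX = (FIELD_UM[0] / FIELD_PX[0], FIELD_UM[1] / FIELD_PX[1])
    print(FIELD_UM)                  # (500, 400), as stated
    print(UM_PER_PX)                 # about (1.95, 1.65), i.e. roughly two microns per pixel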
During the low resolution scan (30) the objective of the microscope 12 is set, for example, at its 50 magnification power, and the microscope begins scanning the individual blocks 60 of the slide 16. For each block 60 the microscope 12 will automatically determine the approximate focal plane for that area of the slide 16. As
the cover slip covering the specimen tends to be somewhat wavy or possibly angled,
such as due to air bubbles contained under the cover slip, the focal plane may vary from block 60 to block. Once the focus is determined for the block 60 being viewed, the camera 20 will capture the image of the block and send that image to the image processor 42 through a suitable digitizer. The image processor 42 then subdivides the block 60 into analysis fields 62 and determines whether there are areas of interest
in each analysis field corresponding to objects which may be biological material. If a field 62 contains material which may be biological, the block 60 is identified along
with its approximate focal plane and stored in memory for future analysis during the high resolution scan (35). This low resolution scan (30) should be performed for all
blocks 60 on the slide 16.
Once the low resolution scan (30) has been completed and all of the blocks 60 containing objects which are possibly biological material have been identified in
memory, the high resolution scan (35) is begun. Initially, a scan path is determined which will allow the microscope 12 to view each block 60 possibly containing
biological matter preferably with the least amount of movement of the slide 16. For the high resolution scan (35), the objective corresponding, for example, to a 200 power magnification is inserted into the viewing path of the microscope 12, and the
scan is begun at the first block 60 in the scan path. The microscope 12, via the motorized stage 14 will move the slide 16 into a position such that the first block 60,
which was identified as having biological material during the low resolution scan, will be in the field of view of the microscope. The microscope 12 will then, based initially on the focal plane determined during the low resolution scan (30), focus the block 60 under the high resolution magnification level. The block 60 is digitized and again subdivided into 16 analysis fields 62. The image processor 42 will then
perform the primary classification of the objects in each analysis field 62 as discussed
more fully below. This primary classification finds the centroids of objects in each field that have the correct size and gray scale density characteristics.
When an object in an analysis field 62 has been identified as having the size and gray scale density characteristics of a premalignant or malignant cell, a 24 x 24 array of pixels surrounding the object centroid, called a net image 64, is transferred to the secondary classifier for further classification. A net image 64 is approximately
48 x 48 microns in size at a resolution of 2 microns per pixel. As a malignant or
premalignant cell nucleus tends to range between 10 and 40 microns in diameter, the net image 64 is sufficiently large to contain a complete image of a suspect cell.
The highest 64 ranked objects are displayed on the summary screen 66. As discussed above, the summary screen may be an 8 x 8 matrix of 64 discrete images, called color tiles 68, a 4 x 4 arrangement of 16 color tiles, or some other arrangement. Color tiles 68 are obtained during the rescan pass (40). Each color tile represents an approximately 128 x 104 micron area surrounding the centroid of a suspect cell, with a resolution of one micron per pixel. Each color tile produces a high resolution color image of a suspect cell and surrounding cells and biological matter, with the suspect cell centered in the tile. By reviewing the summary screen 66, the cytotechnician can relatively easily classify the high resolution color images
of the suspect cells.
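Assembling such a summary screen is a straightforward tiling operation. A minimal sketch, assuming the 64 tiles are equally sized RGB arrays (the 128 x 104 tile size matches the stated one micron per pixel coverage):

    import numpy as np

    def summary_screen(tiles, grid=8):
        """Lay out grid*grid equally sized RGB tiles as one summary image."""
        assert len(tiles) == grid * grid
        rows = [np.hstack(tiles[r * grid:(r + 1) * grid]) for r in range(grid)]
        return np.vstack(rows)

    # Placeholder tiles: 104 x 128 pixel RGB images, one per suspect cell.
    tiles = [np.zeros((104, 128, 3), dtype=np.uint8) for _ in range(64)]
    screen = summary_screen(tiles)   # shape (832, 1024, 3)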
It will be appreciated by a person of ordinary skill in the art that while particular resolutions and image sizes were described above, these particular values are exemplary. It will further be appreciated that different resolutions, image
parameters, magnification levels, etc., can be employed to accomplish the same or similar results as the particular embodiment of the invention described above, and that all such differing resolutions, image parameters, etc. are within the scope of the present invention.
Turning to a more in-depth discussion of the primary classification routine, and referring to Figures 5a-5c, there is shown a flowchart of the primary
classification functions which the image processor 42 performs. Numbers contained within parentheses below correspond to like numbered steps in the flowchart.
Similarly, letters contained within parentheses denote the flow of image data at
various steps in the flowchart. Once a block image 60 has been focused and taken by the camera 20, the image processor 42 will digitize an area of the block 60 corresponding to an analysis field 62 and grab the 8 bit gray scale red and green
images from the camera (100). The red and green 8 bit gray scale images are then
combined to produce a monochrome image (105) which is stored in a segment of memory of the image processor 42, called the frame buffer. The Papanicolaou stain used in treating a pap smear dyes the nuclei of biological cells within the smear a
purple color. Since the colors red and green when combined approximately equally make a yellowish green color which is directly opposite the color purple on a color
space triangle, combining red and green, without blue, creates an image wherein the purple stained nuclei appear very dark and the areas of other colors appear brighter.
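A minimal sketch of that red-plus-green combination, assuming 8 bit channel images held as numpy arrays; the patent says the channels are combined approximately equally, so an equal-weight average is used here:

    import numpy as np

    def to_monochrome(red, green):
        """Combine the red and green 8 bit images; purple-stained nuclei come out dark."""
        mono = (red.astype(np.uint16) + green.astype(np.uint16)) // 2  # avoid uint8 overflow
        return mono.astype(np.uint8)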
Briefly, the primary classifier performs a morphological "well" algorithm which filters out objects that are the size of a premalignant or malignant cell or smaller. (A "well" algorithm is the inverse of a morphological "top hat" algorithm.)
The resulting image, containing only objects which are too large to be a cancerous cell, is then subtracted from the original image containing all of the objects.
Consequently, what is left are objects of the correct size or smaller. A separate
image is then prepared from the original image which will contain only objects which are too small to be of concern. When this latter image, containing only objects which
are too small, is then subtracted from the image having objects of the size of a possible cancerous cell or smaller, the resultant image will thus contain only images being of the size of a possible cancerous cell. The centroids of the objects in this image are then determined and the images centered around those centroids are sent to the secondary classifier for further classification. The image in the frame buffer, which corresponds to an analysis field 62 and is referred to herein as the frame image, is spatially filtered such as with a gaussian low pass filter to remove random noise from the image (110). Preferably, the gaussian filter has a convolution mask of:
1 2 1
2 4 2
1 2 1
This convolution mask is moved across every pixel in the frame image. To explain, the convolution mask will be initially centered on the first pixel in the frame image pixel matrix. Consequently, the 8 bit gray value of this pixel will be multiplied by 4 while the 8 bit gray values of the pixels immediately above and below and on either side will be multiplied by 2, and the adjacent diagonal pixels will have their 8 bit gray scale values multiplied by 1. All nine of these results are summed and the result is divided by 16 and placed in a pixel location in a result frame corresponding to the location of the center pixel. The convolution mask is then shifted to the next pixel in the frame image pixel matrix and the operation is repeated. This continues for all pixels in the frame image.
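As an illustration of the convolution just described, the following sketch applies the 3 x 3 mask with the divide-by-16 normalization; `scipy.ndimage.convolve` is used purely as a stand-in for the PIPE hardware.

```python
import numpy as np
from scipy import ndimage

# The 3 x 3 gaussian convolution mask from the text, normalized by 16.
GAUSSIAN_MASK = np.array([[1, 2, 1],
                          [2, 4, 2],
                          [1, 2, 1]]) / 16.0

def gaussian_low_pass(frame_image: np.ndarray) -> np.ndarray:
    """Step (110): slide the mask over every pixel, writing each weighted
    sum into the corresponding pixel of a result frame."""
    return ndimage.convolve(frame_image.astype(float), GAUSSIAN_MASK)
```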
A morphological closing is then performed on the frame image in the result
frame to filter out all objects smaller than a certain diameter which is somewhat larger than the size of a malignant cell nucleus. Consequently, what is left are only those objects which are too large to correspond to malignant cell nuclei. This morphological closing is performed using a series of gray scale dilation operations followed by a series of gray scale erosion operations. Gray scale dilation is a mathematical morphology term used in image processing to denote an operation wherein a mask is centered on a pixel and a corresponding pixel in a corresponding result frame pixel is replaced by the largest value of its neighboring pixels added to their corresponding mask values or itself added to its corresponding mask value. Erosion is a similar term wherein the center pixel is replaced by the minimum value
of its neighbors added to their corresponding mask values or itself with its corresponding mask value added to it.
In the preferred embodiment of the invention, gray scale dilation operations
are performed using an 8 neighboring pixel mask wherein the center pixel and its 8 neighboring pixels are each allocated equal weights, followed by a gray scale dilation
using a 4 neighboring pixel mask wherein the pixels diagonally related to the center pixel are ignored and the remaining 4 neighboring pixels are allocated equal weights with the center pixel. The respective masks for these dilation operations would appear as:
0 0 0        -∞  0 -∞
0 0 0         0  0  0
0 0 0        -∞  0 -∞
For the erosion operations, a gray scale erosion operation is first performed using a 4 neighboring pixel mask followed by an erosion using an 8 neighboring pixel mask
where all the neighboring pixels are taken into account. The respective masks for the erosion operation would appear as:
+∞  0 +∞      0 0 0
 0  0  0      0 0 0
+∞  0 +∞      0 0 0
The combination of operations using these masks morphologically approximates an octagon, which is digitally analogous to the morphology of a nucleus which tends to be generally round.
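For illustration, the two primitives might be sketched as below. An all-zero additive mask reduces dilation or erosion to a plain maximum or minimum over the selected neighbors, and the -∞ or +∞ entries merely exclude the diagonals, so boolean footprints capture the masks shown above; `scipy.ndimage` again stands in for the PIPE hardware.

```python
import numpy as np
from scipy import ndimage

# Boolean footprints equivalent to the additive masks shown in the text:
# all nine neighbors, or the center plus its four non-diagonal neighbors.
EIGHT_NEIGHBOR = np.ones((3, 3), dtype=bool)
FOUR_NEIGHBOR = np.array([[0, 1, 0],
                          [1, 1, 1],
                          [0, 1, 0]], dtype=bool)

def dilate(image, footprint):
    """Replace each pixel by the largest gray value in its masked neighborhood."""
    return ndimage.grey_dilation(image, footprint=footprint)

def erode(image, footprint):
    """Replace each pixel by the smallest gray value in its masked neighborhood."""
    return ndimage.grey_erosion(image, footprint=footprint)
```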
The manner in which this morphological closing filters small objects out of the image is graphically represented in Figures 6a-d. Figure 6a illustrates two objects;
the object on the left 70 is a large dark nucleus and the object on the right 72 is a smaller, less dark cell, such as a leukocyte. The horizontal line 74 represents a row of pixels passing through the objects 70, 72. The gray scale values for this row of pixels are shown in Figure 6b. The large dark nucleus 70 forms a wide and deep gray scale rectangle 76 due to its relatively large size and darkness. The leukocyte 72, being smaller and less dark, forms a narrower, shallower rectangle 78.
As dilation operations are performed, these gray scale rectangles are gradually filled in as indicated in Figure 6c. After one such dilation operation, the large dark
nucleus may now be represented by the relatively wide depression 80 while the leukocyte may be represented by the narrow spike 82. Subsequent dilation operations
will continue to fill in these depressions. Consequently, a relatively small object will be completely filled in while larger, darker objects, such as malignant nuclei, will remain as somewhat narrower depressions.
When the image is then subjected to a series of erosion operations, the narrow depressions which are the gray scale representations of large, dark objects will be expanded to approximately their original size and shape, whereas the completely filled in gray scale representations of small objects will not be eroded. The morphological closing will thus yield a row of pixels having gray scales, as shown by the solid line 84 in Figure 6d, which represent the large dark object, but have the object which is
too small to be a malignant or premalignant cell filtered out.
A discussion of mathematical morphology can be found in Serra, "Introduction to Mathematical Morphology", Computer Vision, Graphics, and Image Processing, 35, 283-305 (1986), and Sternberg, "Grayscale Morphology", Computer Vision, Graphics, and Image Processing, 35, 333-355 (1986).
Returning to the flowchart illustrated in Figures 5a-5c, a gray scale dilation is performed on the gaussian low pass filtered frame image using an 8 neighboring pixel mask (115). A gray scale dilation is then performed on this dilated frame image using a 4 neighboring pixel mask (120). This series of gray scale dilations is then performed one additional time (125, 130).
The dilated image is then eroded using a 4 neighboring pixel mask erosion followed by an 8 neighboring pixel mask erosion (160). Again, this series of erosions
is performed one additional time (165, 170).
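Using the helper functions sketched earlier, the order of operations in steps (115) through (170) might read as follows; this is an illustrative sketch of the sequence, not the actual PIPE implementation.

```python
def morphological_closing(frame_image):
    """Steps (115)-(170): the 8/4-neighbor dilation pair applied twice,
    then the 4/8-neighbor erosion pair applied twice."""
    img = dilate(frame_image, EIGHT_NEIGHBOR)   # (115)
    img = dilate(img, FOUR_NEIGHBOR)            # (120)
    img = dilate(img, EIGHT_NEIGHBOR)           # (125)
    img = dilate(img, FOUR_NEIGHBOR)            # (130)
    img = erode(img, FOUR_NEIGHBOR)             # (160), first erosion
    img = erode(img, EIGHT_NEIGHBOR)            # (160), second erosion
    img = erode(img, FOUR_NEIGHBOR)             # (165)
    img = erode(img, EIGHT_NEIGHBOR)            # (170)
    return img
```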
It will be appreciated by a person skilled in the art that other filtering functions could be performed to achieve similar results, such as a rolling ball or cone
filter.
The resulting dilated-eroded frame image has now had the smaller objects
filtered out and consists only of objects which are larger than a malignant cell nucleus. This dilated eroded frame image is then subtracted from the frame image (B) which has been gaussian filtered but not dilated and eroded (195). The resultant
frame image thus consists only of objects which generally correspond to the size of a malignant or premalignant nucleus or smaller.
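A sketch of the subtraction at step (195): the sign convention chosen here, closing minus original, makes the removed dark objects come out bright so the threshold at step (200) can select them. That convention is an assumption about how the hardware difference is taken; the text states the same difference with the operands in the opposite order.

```python
def well_residue(gaussian_filtered):
    """Step (195): difference the closed image against the frame image (B).

    Closing only fills in (raises) dark depressions, so the difference is
    nonzero exactly where nucleus-sized-or-smaller dark objects were removed.
    """
    closed = morphological_closing(gaussian_filtered)
    return closed - gaussian_filtered
```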
A threshold operation is then performed on this image, with pixels having a gray scale value above a certain threshold being assigned a binary '1' and those pixels having gray scale values below that threshold being assigned a binary '0' (200). The threshold is chosen to filter out objects which are not dark enough to be nuclei. A binary 3 x 3 erosion is then performed on the image to remove the outer pixel
boundary of the remaining objects, thus making the objects narrower by one pixel in all directions (205). This binary erosion is accomplished by performing a boolean AND on the center pixel and the eight neighboring pixels. Representing the mask as below:
A B C
D E F
G H I

Mathematically, this will be represented as A & B & C & D & E & F & G & H &
I, where each letter indicates a binary value at certain pixel locations. As discussed
above relative to the gaussian low pass filter, the mask slides across the frame so that the binary erosion is performed with the mask eventually centered on each pixel in the frame, thus creating an eroded result frame image.
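The boolean AND formulation translates directly; a minimal sketch, again using `scipy.ndimage` as a stand-in:

```python
import numpy as np
from scipy import ndimage

def binary_erode_3x3(binary_image: np.ndarray) -> np.ndarray:
    """Step (205): A & B & ... & I. A pixel survives only if it and all
    eight of its neighbors are 1, trimming one pixel off every object."""
    return ndimage.binary_erosion(binary_image,
                                  structure=np.ones((3, 3), dtype=bool))
```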
The resulting image is then boolean ANDed (210) with the original frame
image (C) which has been thresholded (215) as described above to produce a binary image. Boolean ANDing these images masks off objects that are of the appropriate size but are too bright to be malignant or premalignant cervical cells. An example of the objects which would be eliminated from the frame image by this operation are clumps of red blood cells.
The frame image now contains objects of the appropriate gray scale density that are of a size corresponding to a malignant or premalignant cervical cell and
smaller. These smaller objects typically represent neutrophils which are white blood cells. Consequently, these objects which are smaller than a cancerous cervical cell nucleus must be removed from the frame image. Since the frame image is now in
a binary format, a binary mask is needed to mask off the smaller objects. Accordingly, the mask would have binary 'l's in all locations of the frame image pixel array except for those locations occupied by a small image, where there would be zeros. Consequently, by boolean ANDing the binary frame image with the binary
mask the small objects are removed and predominantly what is left are objects of the appropriate gray scale density and size for a cervical cancer cell nucleus (220).
This mask is obtained by taking the untreated frame image (C) stored earlier
and treating the image with a well-known Sobel operator to find areas within the image (225) having relatively large arithmetic gradients. Areas of high detail in the
image, or areas bounding relatively small objects, will have large gradients. The Sobel
operator, which is essentially a high-pass image filter, will pass those areas having large gradients and reject the areas of large objects which have smaller gradients. By
choosing the correct filter parameters, the edges of objects are found. The image is then thresholded to convert the gray scale image into a binary image (230). A 3 x
3 binary dilation is then performed on the image to slightly expand or thicken the edges in the image (235). The binary dilation operation is performed as a boolean 'OR' operation. Using the mask notation discussed earlier relative to binary erosion,
mathematically the binary dilation would be expressed as A|B|C|D|E|F|G|H|I. The dilation operation is then repeated, preferably using only the adjacent pixels. Mathematically this second dilation would be represented as B|D|E|F|H (240). After the edge thickening operation, objects smaller than a certain diameter will have become solid "blobs" and objects larger than this diameter will remain open in the
center. This function could also be performed with a gray-scale morphological well or top hat filter as described above with its parameters set to find smaller objects.
A number of the outermost rows and columns of pixels, for example, eight, forming a border around the frame are then removed, as they contain artifacts and other irrelevant information introduced by the operations performed above (245). The complement of the resultant image is then taken by a boolean 'NOT' operation (250). Consequently, the binary image will consist of binary 'l's in all locations except in the slightly enlarged areas encompassing objects too small to be malignant cells. This complemented image thus forms the binary mask used to subtract the small objects from the earlier developed frame image. When boolean ANDed with the frame
image having objects of the appropriate size and smaller (220), the small objects will be eliminated.
Once the small objects have been removed and the frame image contains objects having a gray scale density and size compatible with the gray scale density and size of a cervical cancer cell nucleus, an operation is performed to suppress isolated white pixels in the frame image (255). These isolated white pixels (binary '1') constitute noise introduced during any of the previous operations and are not of a sufficient size to be a cell nucleus. Using the binary mask notation discussed
above relative to binary erosion and dilation operations, the suppression operation can
be expressed mathematically as (E & A) | (E & B) | (E & C) | (E & D) | (E & F) | (E & G) | (E & H) | (E & I).
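In other words, a set pixel survives only if at least one of its eight neighbors is also set. A sketch, counting set neighbors with a convolution (an equivalent formulation, not the PIPE expression itself):

```python
import numpy as np
from scipy import ndimage

NEIGHBOR_MASK = np.array([[1, 1, 1],
                          [1, 0, 1],
                          [1, 1, 1]])

def suppress_isolated_pixels(binary_image: np.ndarray) -> np.ndarray:
    """Step (255): (E & A) | (E & B) | ... | (E & I). Keep a set pixel
    only if some neighbor is also set; binary_image is a boolean array."""
    neighbors = ndimage.convolve(binary_image.astype(int), NEIGHBOR_MASK,
                                 mode='constant')
    return binary_image & (neighbors > 0)
```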
Once the isolated pixels have been suppressed, a "shrinking" operation is performed four times (260, 265, 270, 275). "Shrinking" is a boolean operation well-
known in the field of image processing. The shrinking operation successively removes layers of pixels around the object in the image frame until, for an object of the appropriate size, only one pixel or no pixels remain. In the case where the object is completely removed, the last pixel removed is replaced with a binary 1. Since the outer layers of pixels of the objects were successively removed progressing inwardly,
the remaining or replaced pixel will represent the approximate center or centroid of the object. Any objects remaining in the image which are larger than one pixel are
removed, as they correspond to something which is larger than a cervical cancer cell nucleus would be (280).
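The four shrink passes and the rejection at step (280) together reduce each appropriately sized object to its approximate centroid. Connected-component labeling, sketched below, is an equivalent and more conventional way to obtain the same per-object centroids; it is a stand-in, not the patent's shrinking operation, and `max_area` is a hypothetical parameter.

```python
import numpy as np
from scipy import ndimage

def object_centroids(binary_image: np.ndarray, max_area: int):
    """Return (row, col) centroids of objects no larger than max_area pixels,
    standing in for shrink passes (260)-(275) and the rejection at (280)."""
    labels, count = ndimage.label(binary_image)
    centroids = []
    for index in range(1, count + 1):
        if (labels == index).sum() <= max_area:   # oversized objects rejected, as in (280)
            centroids.append(ndimage.center_of_mass(binary_image, labels, index))
    return centroids
```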
The locations of the remaining single pixels, as defined by their X and Y coordinates on the screen, are then listed and recorded in memory (285). The ISMAP board of the PIPE* image processor conveniently provides these coordinates. The identification of the centroids of objects which correspond to the same size and gray scale density that a typical cervical cancer cell would be expected to have marks the end of the primary classification phase of operation.
Given the discussion above and a general knowledge of image processors, such as the preferred PIPE* image processor, the primary classification functions could be
reduced to the appropriate software code for execution on a suitable image processor. As would be apparent, the code would be loaded, burned, or otherwise encoded into memory accessible by the image processor 42 for execution by the image processor.
A 24 x 24 array of pixels surrounding each centroid identified by the primary classification function of the image processor 42, referred to earlier as a net image 64, is then transferred to the general processor 46 for storage. A secondary classification of the objects represented by these net images 64 may then be performed.
The general processor 46 individually transfers each net image 64 to the
neurocomputer 44 for secondary classification. The task of the secondary classification is to distinguish the premalignant and malignant cells from other objects of the same size which may pass the primary classifier, such as cell clumps, debris, clumps of leukocytes and mucus.
Based on training performed with a training set of several hundred or several thousand known benign and premalignant or malignant cells, as described more fully above, the neurocomputer 44 will assign each net image 64 a value, called a net value, ranging from .1 to .9, as determined by the likelihood that the object is a premalignant or malignant cell. One major advantage of the present invention over prior known cell classifiers resides in the fact that each net image 64 presented to the secondary classifier is precentered through the primary classification on the centroid of the suspect cell nucleus, as described above. In previous known attempts to utilize neural networks and other high level template matching pattern classifiers for image recognition, difficulty has been encountered in consistently presenting the classifier with the centroid of the image requiring classification. To use an example from another application domain, back propagation networks are excellent at reading handwritten zip code digits but have difficulty in finding where the zip code is on the envelope. The present invention overcomes this difficulty in the domain of cytology. Another advantage of the present invention is that during actual classification operations, the secondary classifier is presented with precisely the same type of net images on which it was trained. These images are also centered on the centroid of the suspect nucleus by the primary classifier in a manner identical to that used to prepare the training set images. This makes the generalization task of the secondary
classifier far easier and much more successful than anything previously known.
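Schematically, the secondary classification might look as below. Everything here beyond the 24 x 24 input and the .1 to .9 output range, including the single hidden layer, the sigmoid scaling, and the weight shapes, is an illustrative assumption; the weights themselves come from training on the known benign and malignant net images.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def net_value(net_image: np.ndarray, w1, b1, w2, b2) -> float:
    """Map one centered 24 x 24 net image 64 to a net value in (.1, .9).

    w1, b1, w2, b2 are trained weights with hypothetical shapes:
    w1 is (hidden, 576) and w2 is (1, hidden).
    """
    x = net_image.reshape(-1) / 255.0       # flatten and scale the 8 bit pixels
    hidden = sigmoid(w1 @ x + b1)           # single hidden layer (an assumption)
    raw = sigmoid(w2 @ hidden + b2).item()  # raw score in (0, 1)
    return 0.1 + 0.8 * raw                  # rescale into the .1 to .9 net value range
```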
It should be recognized that while the image processor and digitizer 42, the general processor 46, and the neurocomputer 44 are described operating in a serial manner, in actual practice as many functions will be performed in parallel as is possible. Consequently, the components 42, 44, 46 may process different slide segments or different areas of a segment concurrently, greatly reducing the time required to screen a slide.
As noted above, the secondary classifier is trained to associate a known benign image with an output of .1 and a known pathological image with an output of .9. Such outputs represent, for example, the degree of certainty that a cell is normal or abnormal, respectively. When the secondary classifier is presented with new, unknown cells, it generalizes from its training and attaches a net value to the image. The more readily the secondary classifier is able to categorize the unknown image into the benign category, the closer its net value is to .1. Conversely, the more closely that the unknown image appears to resemble the nonbenign images of its training set, the closer the net value assigned to that image is to .9.
Once all objects classified by the primary classifier to be possibly premalignant or malignant have been classified by the secondary classifier, the net values assigned to those objects by the secondary classifier are ranked from closest to .9 down to .1.
The highest ranked 64 objects are then stored. This completes the high resolution scan (35).
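The ranking itself is a straightforward sort; a sketch, assuming `net_results` pairs each object with its assigned net value:

```python
def top_ranked(net_results, count=64):
    """Rank (object_id, net_value) pairs from closest to .9 down toward .1
    and keep the highest ranked objects for the rescan pass (40)."""
    return sorted(net_results, key=lambda item: item[1], reverse=True)[:count]
```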
Once the 64 highest ranked objects in the specimen, those most likely to be malignant or premalignant, have been identified, the high resolution rescan (40) is begun at the 200 power magnification. During the rescan (40) the stage 14 will move the slide relative to the microscope 12 so that one of the 64 highest ranked objects is in the viewing path of the microscope. The image is then focused according to the previously determined high resolution focus parameters, and the camera grabs, at 512 x 484 resolution, the 128 x 104 micron red, green and blue component image around the centroid location. This high resolution color tile 68 is then stored in the memory 24, such as on an optical disk or tape. These operations are then performed for the next cell until all 64 of the highest ranked cells have been rescanned and their high resolution color images have been stored in the memory 24. This completes the
rescan pass (40). The automated classifier 10 may then remove the slide and replace it with another slide for further classification.
Once stored, the 64 color tiles 68 may be displayed as a summary screen 66 in their descending order of ranking, their positional relation to each other in the
specimen, or in some other arrangement. The tertiary classification performed by a cytotechnician can then take place at any time.
STATEMENT OF INDUSTRIAL APPLICATION
The present invention is applicable to cell classification in general and is particularly applicable to the classification of cells in a pap smear.

Claims

What is claimed is:
1. A method of classifying objects in a cytological specimen, comprising the steps of:
a) obtaining a first image of at least part of such cytological specimen;
b) classifying objects in such first image on the basis of a predetermined criteria;
c) obtaining a second image of at least one of such objects most likely to have such a predetermined criteria; and
d) displaying at least part of such second image to produce a visual display of at least one of such objects most likely to have such predetermined criteria.
2. The method of claim 1, wherein such first image is of a lower resolution than such second image.
3. The method of claim 1, including the step of further classifying such objects in such visual display.
4. The method of claim 1, wherein such visual display represents plural objects.
5. The method of claim 4, including the step of arranging such plural objects in such visual display in an eight by eight array of sixty-four elements.
6. The method of claim 4, including the step of arranging such plural
objects in such visual display in a four by four array of sixteen elements.
7. The method of claim 5, wherein a centroid of each object displayed is approximately centered in one of such elements.
8. The method of claim 6, wherein a centroid of each object displayed is approximately centered in one of such elements.
9. The method of claim 5, wherein such plural objects are arranged in an order related to the likelihood that each object has such predetermined criteria.
10. The method of claim 5, wherein such plural objects are arranged according to their relative positional locations in such cytological specimen.
11. A method of classifying objects in a specimen, comprising the steps of:
a) obtaining a first digital representation of at least part of such cytological specimen;
b) storing such first digital representation;
c) performing a first filtering operation to filter out images in such first representation that are the approximate size of a malignant or premalignant cell or smaller to produce a second digital representation;
d) removing the images in such second representation from the images in such first representation to produce a third representation;
e) performing a second filtering operation to filter out images in such first representation that are smaller than the approximate size of a premalignant or malignant cell to produce a fourth representation; and
f) eliminating the images in such fourth representation from the images in such third representation to produce a further representation having substantially only images the approximate size of a premalignant or malignant cell.
12. The method of claim 11, wherein said step of performing a first filtering operation includes performing a morphological well function.
13. The method of claim 11, wherein said step of performing a first filtering operation includes performing a morphological top hat function.
14. The method of claim 11 wherein said step of performing a second filtering operation includes performing an arithmetic gradient operation on such first digital representation.
15. The method of claim 11 wherein said step of performing a second
filtering operation includes performing a morphological well function on such first digital representation.
16. The method of claim 11 wherein said step of performing a second
filtering operation includes performing a morphological top hat function on such first digital representation.
17. The method of claim 11, wherein said step of removing includes masking out images which are too bright to be malignant or premalignant cells.
18. The method of claim 11, wherein said step of eliminating includes taking the complement of said fourth representation and boolean ANDing the resulting representation with such third representation.
19. The method of claim 18, wherein said step of taking the complement includes performing a boolean NOT.
20. The method of claim 13, wherein said step of performing an arithmetic gradient operation includes performing a Sobel operation.
21. An apparatus for classifying objects in a cytological specimen, comprising:
means for obtaining a first image of at least part of such cytological specimen;
means for classifying objects in such first image on the basis of a predetermined criteria;
means for obtaining a second image of at least one of such objects most likely to have such a predetermined criteria; and
means for displaying at least part of such second image to produce a visual display of at least one of such objects most likely to have such predetermined criteria.
PCT/US1992/000660 1991-01-29 1992-01-28 Morphological classification system and method WO1992013308A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US07647438 US5257182B1 (en) 1991-01-29 1991-01-29 Morphological classification system and method
US647,438 1991-01-29

Publications (1)

Publication Number Publication Date
WO1992013308A1 true WO1992013308A1 (en) 1992-08-06

Family

ID=24596994

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US1992/000660 WO1992013308A1 (en) 1991-01-29 1992-01-28 Morphological classification system and method

Country Status (3)

Country Link
US (1) US5257182B1 (en)
AU (1) AU1375692A (en)
WO (1) WO1992013308A1 (en)

Cited By (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1995010036A2 (en) * 1993-10-07 1995-04-13 Tafas Triantafillos P Cytological screening method
EP0660103A2 (en) * 1993-12-22 1995-06-28 Hitachi, Ltd. Particle image analyzing apparatus
EP0660104A2 (en) * 1993-12-27 1995-06-28 Hitachi, Ltd. Urinary sediment examining apparatus
WO1997011350A2 (en) * 1995-09-19 1997-03-27 Morphometrix Technologies Inc. A neural network assisted multi-spectral segmentation system
US5677966A (en) * 1993-10-12 1997-10-14 Autocyte, Inc. Interactive automated cytology method incorporating both manual and automatic determinations
GB2318637A (en) * 1996-10-25 1998-04-29 Accumed International Inc Cytological specimen analyser
US5889880A (en) * 1995-06-07 1999-03-30 Autocyte, Inc. Interactive automated cytology method incorporating both manual and automatic determinations
US6136540A (en) * 1994-10-03 2000-10-24 Ikonisys Inc. Automated fluorescence in situ hybridization detection of genetic abnormalities
US6148096A (en) * 1995-09-15 2000-11-14 Accumed International, Inc. Specimen preview and inspection system
EP1078257A1 (en) * 1998-05-09 2001-02-28 Iconisys Inc. Method and apparatus for computer controlled rare cell, including fetal cell, based diagnosis
WO2002056256A2 (en) * 2001-01-11 2002-07-18 Interscope Technologies, Inc. A system and method for finding regions of interest for microscopic digital montage imaging
US6535626B1 (en) 2000-01-14 2003-03-18 Accumed International, Inc. Inspection system with specimen preview
EP1474684A2 (en) * 2002-01-15 2004-11-10 VYSIS, Inc. Method and/or system for analyzing biological samples using a computer system
EP1548480A1 (en) * 2002-09-06 2005-06-29 Celestar Lexico-Sciences, Inc. Microscope image processing system, microscope image processing method, program, and recording medium
US6956695B2 (en) 2001-03-19 2005-10-18 Ikonisys, Inc. System and method for increasing the contrast of an image produced by an epifluorescence microscope
US7155049B2 (en) 2001-01-11 2006-12-26 Trestle Acquisition Corp. System for creating microscopic digital montage images
WO2008034721A1 (en) * 2006-09-18 2008-03-27 Robert Bosch Gmbh Method for processing an intensity image of a microscope
WO2009023101A2 (en) 2007-08-07 2009-02-19 Nextslide Imaging Llc Network review in clinical hematology
CN102349018A (en) * 2009-03-17 2012-02-08 索尼公司 Image creating device and image creating method

Families Citing this family (159)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5544650A (en) * 1988-04-08 1996-08-13 Neuromedical Systems, Inc. Automated specimen classification system and method
US5740270A (en) * 1988-04-08 1998-04-14 Neuromedical Systems, Inc. Automated cytological specimen classification system and method
US5655029A (en) * 1990-11-07 1997-08-05 Neuromedical Systems, Inc. Device and method for facilitating inspection of a specimen
CA2130340C (en) * 1992-02-18 2000-06-06 Shih-Jong James Lee Method for identifying objects using data processing techniques
US5757516A (en) * 1993-01-11 1998-05-26 Canon Inc. Noise quenching method and apparatus for a colour display system
US5426010A (en) * 1993-02-26 1995-06-20 Oxford Computer, Inc. Ultra high resolution printing method
US5479526A (en) * 1993-03-23 1995-12-26 Martin Marietta Pixel designator for small objects
US5587833A (en) * 1993-07-09 1996-12-24 Compucyte Corporation Computerized microscope specimen encoder
DE69429145T2 (en) * 1993-08-19 2002-07-18 Hitachi Ltd Classification and test device for particles in a liquid
US5797130A (en) * 1993-11-16 1998-08-18 Neopath, Inc. Method for testing proficiency in screening images of biological slides
DK0745243T3 (en) * 1994-02-14 2004-04-19 Autocyte North Carolina Llc Methods for automated classification of cytological specimens
US5740266A (en) * 1994-04-15 1998-04-14 Base Ten Systems, Inc. Image processing system and method
US5625705A (en) * 1994-06-03 1997-04-29 Neuromedical Systems, Inc. Intensity texture based classification system and method
CA2195565A1 (en) * 1994-07-26 1996-02-08 Robert Tjon-Fo-Sang Inspection device and method
US5740269A (en) * 1994-09-20 1998-04-14 Neopath, Inc. Method and apparatus for robust biological specimen classification
AU3544995A (en) * 1994-09-20 1996-04-09 Neopath, Inc. Apparatus for identification and integration of multiple cell patterns
WO1996010237A1 (en) * 1994-09-20 1996-04-04 Neopath, Inc. Biological specimen analysis system processing integrity checking apparatus
US5638459A (en) * 1994-09-20 1997-06-10 Neopath, Inc. Method and apparatus for detecting a microscope slide coverslip
AU3586195A (en) * 1994-09-20 1996-04-09 Neopath, Inc. Apparatus for automated identification of cell groupings on a biological specimen
CA2200455A1 (en) * 1994-09-20 1996-03-28 Louis R. Piloco Apparatus for illumination stabilization and homogenization
US5692066A (en) * 1994-09-20 1997-11-25 Neopath, Inc. Method and apparatus for image plane modulation pattern recognition
EP0791205A4 (en) * 1994-09-20 1999-04-21 Neopath Inc Biological analysis system self-calibration apparatus
US5566249A (en) * 1994-09-20 1996-10-15 Neopath, Inc. Apparatus for detecting bubbles in coverslip adhesive
US5757954A (en) * 1994-09-20 1998-05-26 Neopath, Inc. Field prioritization apparatus and method
US5715327A (en) * 1994-09-20 1998-02-03 Neopath, Inc. Method and apparatus for detection of unsuitable conditions for automated cytology scoring
WO1996009594A1 (en) * 1994-09-20 1996-03-28 Neopath, Inc. Apparatus for automated identification of thick cell groupings on a biological specimen
US5978497A (en) * 1994-09-20 1999-11-02 Neopath, Inc. Apparatus for the identification of free-lying cells
WO1996009598A1 (en) * 1994-09-20 1996-03-28 Neopath, Inc. Cytological slide scoring apparatus
US5627908A (en) * 1994-09-20 1997-05-06 Neopath, Inc. Method for cytological system dynamic normalization
US5647025A (en) * 1994-09-20 1997-07-08 Neopath, Inc. Automatic focusing of biomedical specimens apparatus
WO1996010801A1 (en) * 1994-09-30 1996-04-11 Neopath, Inc. Method and apparatus for highly efficient computer aided screening
USRE43097E1 (en) 1994-10-13 2012-01-10 Illumina, Inc. Massively parallel signature sequencing by ligation of encoded adaptors
WO1996020456A1 (en) * 1994-12-23 1996-07-04 International Remote Imaging Systems, Inc. Method and apparatus of analyzing particles in a fluid sample and displaying same
US5848177A (en) * 1994-12-29 1998-12-08 Board Of Trustees Operating Michigan State University Method and system for detection of biological materials using fractal dimensions
US5619428A (en) * 1995-05-31 1997-04-08 Neopath, Inc. Method and apparatus for integrating an automated system to a laboratory
DE69614449T2 (en) * 1995-05-31 2002-05-16 Horiba Ltd Method of measuring microbial activity
US5625706A (en) * 1995-05-31 1997-04-29 Neopath, Inc. Method and apparatus for continously monitoring and forecasting slide and specimen preparation for a biological specimen population
US5671288A (en) * 1995-05-31 1997-09-23 Neopath, Inc. Method and apparatus for assessing slide and specimen preparation quality
US5787208A (en) * 1995-06-07 1998-07-28 Neopath, Inc. Image enhancement method and apparatus
US6252979B1 (en) * 1995-06-07 2001-06-26 Tripath Imaging, Inc. Interactive method and apparatus for sorting biological specimens
JP3504030B2 (en) * 1995-07-04 2004-03-08 シスメックス株式会社 Method and apparatus for determining particle criterion, and particle analyzer using the criterion
US5642433A (en) * 1995-07-31 1997-06-24 Neopath, Inc. Method and apparatus for image contrast quality evaluation
US5745601A (en) * 1995-07-31 1998-04-28 Neopath, Inc. Robustness of classification measurement apparatus and method
US5621519A (en) * 1995-07-31 1997-04-15 Neopath, Inc. Imaging system transfer function control method and apparatus
US6430309B1 (en) 1995-09-15 2002-08-06 Monogen, Inc. Specimen preview and inspection system
US6718053B1 (en) * 1996-11-27 2004-04-06 Chromavision Medical Systems, Inc. Method and apparatus for automated image analysis of biological specimens
US6151405A (en) * 1996-11-27 2000-11-21 Chromavision Medical Systems, Inc. System and method for cellular specimen grading
AU724393B2 (en) * 1995-11-30 2000-09-21 Chromavision Medical Systems, Inc. Method and apparatus for automated image analysis of biological specimens
US5835620A (en) * 1995-12-19 1998-11-10 Neuromedical Systems, Inc. Boundary mapping system and method
US5798514A (en) * 1996-01-11 1998-08-25 Accumed Inc. Circular bar code
US5841890A (en) * 1996-05-06 1998-11-24 Northrop Grumman Corporation Multi-dimensional wavelet tomography
US6031930A (en) * 1996-08-23 2000-02-29 Bacus Research Laboratories, Inc. Method and apparatus for testing a progression of neoplasia including cancer chemoprevention testing
US6396941B1 (en) 1996-08-23 2002-05-28 Bacus Research Laboratories, Inc. Method and apparatus for internet, intranet, and local viewing of virtual microscope slides
US6404906B2 (en) 1997-03-03 2002-06-11 Bacus Research Laboratories,Inc. Method and apparatus for acquiring and reconstructing magnified specimen images from a computer-controlled microscope
US6272235B1 (en) 1997-03-03 2001-08-07 Bacus Research Laboratories, Inc. Method and apparatus for creating a virtual microscope slide
FR2754346B1 (en) * 1996-10-07 1999-02-05 Hycel Groupe Lisabio METHOD FOR IDENTIFYING PARTICLE GROUPS AND CORRESPONDING IDENTIFICATION DEVICE
US6738529B1 (en) * 1996-10-09 2004-05-18 Symyx Technologies, Inc. Analysis of chemical data from images
US5937103A (en) * 1997-01-25 1999-08-10 Neopath, Inc. Method and apparatus for alias free measurement of optical transfer function
US6753161B2 (en) * 1997-03-27 2004-06-22 Oncosis Llc Optoinjection methods
AU736321B2 (en) 1997-05-23 2001-07-26 Lynx Therapeutics, Inc. System and apparatus for sequential processing of analytes
US5959726A (en) * 1997-07-25 1999-09-28 Neopath, Inc. Modulation transfer function test compensation for test pattern duty cycle
US6181811B1 (en) 1998-01-13 2001-01-30 Neopath, Inc. Method and apparatus for optimizing biological and cytological specimen screening and diagnosis
US6526167B1 (en) * 1998-05-26 2003-02-25 Sony Corporation Image processing apparatus and method and provision medium
WO1999063320A2 (en) 1998-05-29 1999-12-09 The Brigham And Women's Hospital, Inc. Computer system and computer-implemented process for analyzing results of cytology tests for performance evaluation of cytologists
US6606413B1 (en) * 1998-06-01 2003-08-12 Trestle Acquisition Corp. Compression packaged image transmission for telemicroscopy
US20040083085A1 (en) * 1998-06-01 2004-04-29 Zeineh Jack A. Integrated virtual slide and live microscope system
EP2045334A1 (en) 1998-06-24 2009-04-08 Illumina, Inc. Decoding of array sensors with microspheres
CN1325287A (en) 1998-07-23 2001-12-05 奥乐斯堪/崔龙合资企业公司 Apparatus and method for obtaining transepithelia L specimen of a body surface using a non-lacerating technique
US6143512A (en) * 1998-08-17 2000-11-07 Markovic; Nenad Cap-pap test
US6091843A (en) * 1998-09-03 2000-07-18 Greenvision Systems Ltd. Method of calibration and real-time analysis of particulates
US20060257884A1 (en) * 2004-05-20 2006-11-16 Amnis Corporation Methods for preparing and analyzing cells having chromosomal abnormalities
US8131053B2 (en) 1999-01-25 2012-03-06 Amnis Corporation Detection of circulating tumor cells using imaging flow cytometry
US8885913B2 (en) 1999-01-25 2014-11-11 Amnis Corporation Detection of circulating tumor cells using imaging flow cytometry
US8005314B2 (en) 2005-12-09 2011-08-23 Amnis Corporation Extended depth of field imaging for high speed object analysis
US8406498B2 (en) 1999-01-25 2013-03-26 Amnis Corporation Blood and cell analysis using an imaging flow cytometer
US7450229B2 (en) * 1999-01-25 2008-11-11 Amnis Corporation Methods for analyzing inter-cellular phenomena
US6284482B1 (en) * 1999-04-23 2001-09-04 Oralscan Laboratories, Inc. Method for detection of abnormal keratinization in epithelial tissue
US6297044B1 (en) * 1999-02-23 2001-10-02 Oralscan Laboratories, Inc. Minimally invasive apparatus for testing lesions of the oral cavity and similar epithelium
JP4748628B2 (en) * 1999-02-23 2011-08-17 シーディーエックス ラボラトリーズ インコーポレーテッド A minimally invasive device for examining oral and similar epithelial lesions
EP1177523B1 (en) 1999-04-13 2013-08-21 Chromavision Medical Systems, Inc. Histological reconstruction and automated image analysis
US7369304B2 (en) * 1999-10-29 2008-05-06 Cytyc Corporation Cytological autofocusing imaging systems and methods
US6348325B1 (en) 1999-10-29 2002-02-19 Cytyc Corporation Cytological stain composition
US6593102B2 (en) 1999-10-29 2003-07-15 Cytyc Corporation Cytological stain composition
US6665060B1 (en) 1999-10-29 2003-12-16 Cytyc Corporation Cytological imaging system and method
WO2001033196A2 (en) * 1999-10-29 2001-05-10 Veracel Inc. Controlled review of medical sample
US6661501B1 (en) 1999-10-29 2003-12-09 Cytyc Corporation Cytological stain composition including verification characteristic
WO2001036939A2 (en) * 1999-11-04 2001-05-25 Meltec Multi-Epitope-Ligand-Technologies Gmbh Method for the automatic analysis of microscope images
JP5087192B2 (en) * 1999-11-30 2012-11-28 インテレクソン コーポレイション Method and apparatus for selectively aiming a specific cell in a cell group
US7738688B2 (en) * 2000-05-03 2010-06-15 Aperio Technologies, Inc. System and method for viewing virtual slides
AUPQ849200A0 (en) * 2000-06-30 2000-07-27 Cea Technologies Inc. Unsupervised scene segmentation
WO2002031583A1 (en) 2000-10-12 2002-04-18 Amnis Corporation System and method for high numeric aperture imaging systems
JP2004512845A (en) 2000-10-24 2004-04-30 オンコシス リミテッド ライアビリティ カンパニー Methods and devices for selectively targeting cells in a three-dimensional specimen
US6466690C1 (en) * 2000-12-19 2008-11-18 Bacus Res Lab Inc Method and apparatus for processing an image of a tissue sample microarray
US6816606B2 (en) 2001-02-21 2004-11-09 Interscope Technologies, Inc. Method for maintaining high-quality focus during high-throughput, microscopic digital montage imaging
US6798571B2 (en) 2001-01-11 2004-09-28 Interscope Technologies, Inc. System for microscopic digital montage imaging using a pulse light illumination system
DE10143441A1 (en) * 2001-09-05 2003-03-27 Leica Microsystems Process and microscope system for observing dynamic processes
GB2397423B (en) * 2001-09-17 2005-06-01 Ca Minister Agriculture & Food A method and apparatus for identifying and quantifying characteristics of seeds and other small objects
US10156501B2 (en) 2001-11-05 2018-12-18 Life Technologies Corporation Automated microdissection instrument for determining a location of a laser beam projection on a worksurface area
US8722357B2 (en) 2001-11-05 2014-05-13 Life Technologies Corporation Automated microdissection instrument
US8715955B2 (en) 2004-09-09 2014-05-06 Life Technologies Corporation Laser microdissection apparatus and method
US8346483B2 (en) * 2002-09-13 2013-01-01 Life Technologies Corporation Interactive and automated tissue image analysis with global training database and variable-abstraction processing in cytological specimen classification and laser capture microdissection applications
WO2003042788A2 (en) 2001-11-13 2003-05-22 Chromavision Medical Systems, Inc. A system for tracking biological samples
US7291456B2 (en) 2002-01-24 2007-11-06 The Regents Of The University Of California Method for determining differences in molecular interactions and for screening a combinatorial library
CA2436043C (en) * 2002-02-22 2006-01-10 Bacus Research Laboratories, Inc. Focusable virtual microscopy apparatus and method
US6889154B2 (en) * 2002-04-18 2005-05-03 Infineon Technologies Ag Method and apparatus for calibrating data-dependent noise prediction
US7522678B2 (en) * 2002-04-18 2009-04-21 Infineon Technologies Ag Method and apparatus for a data-dependent noise predictive viterbi
AU2003231827A1 (en) * 2002-05-23 2003-12-12 Invitrogen Corporation Pseudo-tissues and uses thereof
CA2390056A1 (en) * 2002-06-07 2003-12-07 Du Pont Canada Inc. Method and system for managing commodity information in a supply chain of production
US7272252B2 (en) 2002-06-12 2007-09-18 Clarient, Inc. Automated system for combining bright field and fluorescent microscopy
DE10234404B4 (en) * 2002-07-29 2021-10-14 Leica Microsystems Cms Gmbh Method, arrangement and software for monitoring and controlling a microscope
US20040114829A1 (en) * 2002-10-10 2004-06-17 Intelligent System Solutions Corp. Method and system for detecting and correcting defects in a digital image
US7302096B2 (en) * 2002-10-17 2007-11-27 Seiko Epson Corporation Method and apparatus for low depth of field image segmentation
GB0224626D0 (en) * 2002-10-23 2002-12-04 Qinetiq Ltd Tubule grading
US7200252B2 (en) * 2002-10-28 2007-04-03 Ventana Medical Systems, Inc. Color space transformations for use in identifying objects of interest in biological specimens
DE10250503A1 (en) * 2002-10-29 2004-05-19 Leica Microsystems Heidelberg Gmbh Microscope system and method for the detection and compensation of changes in a recorded image content
US8712118B2 (en) * 2003-04-10 2014-04-29 Carl Zeiss Microimaging Gmbh Automated measurement of concentration and/or amount in a biological sample
US20040202357A1 (en) * 2003-04-11 2004-10-14 Perz Cynthia B. Silhouette image acquisition
US20050095578A1 (en) * 2003-10-31 2005-05-05 Koller Manfred R. Method and apparatus for cell permeabilization
US7425426B2 (en) * 2004-03-15 2008-09-16 Cyntellect, Inc. Methods for purification of cells based on product secretion
US8953866B2 (en) 2004-03-16 2015-02-10 Amnis Corporation Method for imaging and differential analysis of cells
US8103080B2 (en) 2004-03-16 2012-01-24 Amnis Corporation Method for imaging and differential analysis of cells
ATE538138T1 (en) 2004-03-16 2012-01-15 Amnis Corp IMAGING-BASED QUANTIFICATION OF MOLECULAR TRANSLOCATION
US7653260B2 (en) * 2004-06-17 2010-01-26 Carl Zeis MicroImaging GmbH System and method of registering field of view
US7316904B1 (en) 2004-06-30 2008-01-08 Chromodynamics, Inc. Automated pap screening using optical detection of HPV with or without multispectral imaging
US8582924B2 (en) 2004-06-30 2013-11-12 Carl Zeiss Microimaging Gmbh Data structure of an image storage and retrieval system
US7792338B2 (en) * 2004-08-16 2010-09-07 Olympus America Inc. Method and apparatus of mechanical stage positioning in virtual microscopy image capture
US20070019854A1 (en) * 2005-05-10 2007-01-25 Bioimagene, Inc. Method and system for automated digital image analysis of prostrate neoplasms using morphologic patterns
US20070031043A1 (en) 2005-08-02 2007-02-08 Perz Cynthia B System for and method of intelligently directed segmentation analysis for automated microscope systems
US20070091109A1 (en) 2005-09-13 2007-04-26 Roscoe Atkinson Image quality
US7933435B2 (en) * 2005-11-21 2011-04-26 Vala Sciences, Inc. System, method, and kit for processing a magnified image of biological material to identify components of a biological object
US20070140543A1 (en) * 2005-12-19 2007-06-21 Cytyc Corporation Systems and methods for enhanced cytological specimen review
JP4684959B2 (en) * 2006-07-04 2011-05-18 キヤノン株式会社 Image processing apparatus, image processing method, and program
JP4637063B2 (en) * 2006-07-04 2011-02-23 キヤノン株式会社 Image processing apparatus, image processing method, and program
DE102006042157B4 (en) 2006-09-06 2013-03-21 Leica Microsystems Cms Gmbh Method and microscope system for scanning a sample
US8795197B2 (en) 2007-07-17 2014-08-05 Histologics, LLC Frictional trans-epithelial tissue disruption collection apparatus and method of inducing an immune response
EP2166965B1 (en) 2007-07-17 2017-05-17 Neal Marc Lonky Frictional trans-epithelial tissue disruption and collection apparatus
US20100284618A1 (en) * 2007-12-31 2010-11-11 Dimitrios Ioannou Method and system for identifying an object
US8346574B2 (en) 2008-02-29 2013-01-01 Dako Denmark A/S Systems and methods for tracking and providing workflow information
JP5079552B2 (en) * 2008-03-13 2012-11-21 オリンパス株式会社 Image processing apparatus, imaging apparatus, and image processing method
US8135202B2 (en) * 2008-06-02 2012-03-13 Nec Laboratories America, Inc. Automated method and system for nuclear analysis of biopsy images
US8090177B2 (en) * 2008-08-01 2012-01-03 Sti Medical Systems, Llc Methods for detection and characterization of atypical vessels in cervical imagery
US8483454B2 (en) * 2008-10-10 2013-07-09 Sti Medical Systems, Llc Methods for tissue classification in cervical imagery
KR20110106436A (en) * 2009-01-09 2011-09-28 신텔렉트 인코포레이티드 Genetic analysis of cells
WO2010081171A2 (en) 2009-01-12 2010-07-15 Cyntellect, Inc. Laser mediated sectioning and transfer of cell colonies
US8451524B2 (en) 2009-09-29 2013-05-28 Amnis Corporation Modifying the output of a laser to achieve a flat top in the laser's Gaussian beam intensity profile
US9607202B2 (en) 2009-12-17 2017-03-28 University of Pittsburgh—of the Commonwealth System of Higher Education Methods of generating trophectoderm and neurectoderm from human embryonic stem cells
US9044213B1 (en) 2010-03-26 2015-06-02 Histologics, LLC Frictional tissue sampling and collection method and device
US8817115B1 (en) 2010-05-05 2014-08-26 Amnis Corporation Spatial alignment of image data from a multichannel detector using a reference image
US8699813B2 (en) * 2010-11-19 2014-04-15 Analog Devices, Inc Adaptive filter for low-light noise reduction
JP5644447B2 (en) * 2010-12-06 2014-12-24 ソニー株式会社 Microscope, region determination method, and program
IN2014CN01734A (en) 2011-09-13 2015-05-29 Koninkl Philips Nv
WO2013106842A2 (en) * 2012-01-13 2013-07-18 The Charles Stark Draper Laboratory, Inc. Stem cell bioinformatics
US10201332B1 (en) 2012-12-03 2019-02-12 Healoe Llc Device and method of orienting a biopsy device on epithelial tissue
JP6515317B2 (en) * 2014-08-06 2019-05-22 パナソニックIpマネジメント株式会社 Particle detector
US11013466B2 (en) 2016-01-28 2021-05-25 Healoe, Llc Device and method to control and manipulate a catheter
CN110100198B (en) * 2016-12-23 2022-03-22 生物辐射实验室股份有限公司 Reduction of background signal in print image
US10402623B2 (en) 2017-11-30 2019-09-03 Metal Industries Research & Development Centre Large scale cell image analysis method and system
EP3951386B1 (en) * 2019-03-26 2024-04-03 Osaka University Homology-based detection of cancer in biological tissue
WO2023112002A1 (en) * 2021-12-18 2023-06-22 Imageprovision Technology Private Limited Photomicrographic image-processing method for automatic scanning, detection and classification of particles

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3908078A (en) * 1971-10-06 1975-09-23 Object Recognition Systems Method and apparatus for digital recognition of objects particularly biological materials
US4097845A (en) * 1976-11-01 1978-06-27 Rush-Presbyterian-St. Luke's Medical Center Method of and an apparatus for automatic classification of red blood cells
US4199748A (en) * 1976-11-01 1980-04-22 Rush-Presbyterian-St. Luke's Medical Center Automated method and apparatus for classification of cells with application to the diagnosis of anemia
US4513438A (en) * 1982-04-15 1985-04-23 Coulter Electronics, Inc. Automated microscopy system and method for locating and re-locating objects in an image
US4523278A (en) * 1979-02-01 1985-06-11 Prof. Dr.-Ing. Werner H. Bloss Method of automatic detection of cells and determination of cell features from cytological smear preparations
US5068906A (en) * 1988-04-22 1991-11-26 Toa Medical Electronics Co., Ltd. Processor for extracting and memorizing cell images

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CA1081364A (en) * 1976-09-28 1980-07-08 Shuichi Samejima Differential detection systems with non-redundant error correction
JPS5661650A (en) * 1979-10-24 1981-05-27 Omron Tateisi Electronics Co Analyzing device of cell
US4965725B1 (en) * 1988-04-08 1996-05-07 Neuromedical Systems Inc Neural network based automated cytological specimen classification system and method

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3908078A (en) * 1971-10-06 1975-09-23 Object Recognition Systems Method and apparatus for digital recognition of objects particularly biological materials
US4097845A (en) * 1976-11-01 1978-06-27 Rush-Presbyterian-St. Luke's Medical Center Method of and an apparatus for automatic classification of red blood cells
US4199748A (en) * 1976-11-01 1980-04-22 Rush-Presbyterian-St. Luke's Medical Center Automated method and apparatus for classification of cells with application to the diagnosis of anemia
US4523278A (en) * 1979-02-01 1985-06-11 Prof. Dr.-Ing. Werner H. Bloss Method of automatic detection of cells and determination of cell features from cytological smear preparations
US4513438A (en) * 1982-04-15 1985-04-23 Coulter Electronics, Inc. Automated microscopy system and method for locating and re-locating objects in an image
US5068906A (en) * 1988-04-22 1991-11-26 Toa Medical Electronics Co., Ltd. Processor for extracting and memorizing cell images

Cited By (46)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1995010036A3 (en) * 1993-10-07 1995-06-01 Triantafillos P Tafas Cytological screening method
WO1995010036A2 (en) * 1993-10-07 1995-04-13 Tafas Triantafillos P Cytological screening method
US6221607B1 (en) 1993-10-07 2001-04-24 Ikonisys Inc. Automated fluorescence in situ hybridization detection of genetic abnormalities
AU694144B2 (en) * 1993-10-07 1998-07-16 Triantafillos P. Tafas Cytological screening method
US5677966A (en) * 1993-10-12 1997-10-14 Autocyte, Inc. Interactive automated cytology method incorporating both manual and automatic determinations
EP0660103A3 (en) * 1993-12-22 1996-03-06 Hitachi Ltd Particle image analyzing apparatus.
US6229912B1 (en) 1993-12-22 2001-05-08 Hitachi, Ltd. Particle image analyzing apparatus
EP0660103A2 (en) * 1993-12-22 1995-06-28 Hitachi, Ltd. Particle image analyzing apparatus
EP0660104A2 (en) * 1993-12-27 1995-06-28 Hitachi, Ltd. Urinary sediment examining apparatus
EP0660104A3 (en) * 1993-12-27 1995-08-09 Hitachi Ltd Urinary sediment examining apparatus.
US6136540A (en) * 1994-10-03 2000-10-24 Ikonisys Inc. Automated fluorescence in situ hybridization detection of genetic abnormalities
US5889880A (en) * 1995-06-07 1999-03-30 Autocyte, Inc. Interactive automated cytology method incorporating both manual and automatic determinations
US6148096A (en) * 1995-09-15 2000-11-14 Accumed International, Inc. Specimen preview and inspection system
WO1997011350A2 (en) * 1995-09-19 1997-03-27 Morphometrix Technologies Inc. A neural network assisted multi-spectral segmentation system
AU726049B2 (en) * 1995-09-19 2000-10-26 Veracel Inc. A neural network assisted multi-spectral segmentation system
US6463425B2 (en) 1995-09-19 2002-10-08 Morphometrix Technologies Inc. Neural network assisted multi-spectral segmentation system
WO1997011350A3 (en) * 1995-09-19 1997-05-22 Morphometrix Techn Inc A neural network assisted multi-spectral segmentation system
US6091842A (en) * 1996-10-25 2000-07-18 Accumed International, Inc. Cytological specimen analysis system with slide mapping and generation of viewing path information
GB2318637B (en) * 1996-10-25 1999-06-16 Accumed International Inc Cytological speciman analysis system with prescreening and generation of viewing path information
GB2318637A (en) * 1996-10-25 1998-04-29 Accumed International Inc Cytological specimen analyser
EP1078257A1 (en) * 1998-05-09 2001-02-28 Iconisys Inc. Method and apparatus for computer controlled rare cell, including fetal cell, based diagnosis
EP1078257A4 (en) * 1998-05-09 2007-07-18 Iconisys Inc Method and apparatus for computer controlled rare cell, including fetal cell, based diagnosis
US6535626B1 (en) 2000-01-14 2003-03-18 Accumed International, Inc. Inspection system with specimen preview
WO2002056256A3 (en) * 2001-01-11 2003-02-13 Interscope Technologies Inc A system and method for finding regions of interest for microscopic digital montage imaging
US7212660B2 (en) 2001-01-11 2007-05-01 Clarient, Inc. System and method for finding regions of interest for microscopic digital montage imaging
US7876948B2 (en) 2001-01-11 2011-01-25 Carl Zeiss Microimaging Gmbh System for creating microscopic digital montage images
US7421102B2 (en) 2001-01-11 2008-09-02 Carl Zeiss Microimaging Ais, Inc. System and method for finding regions of interest for microscopic digital montage imaging
US6993169B2 (en) 2001-01-11 2006-01-31 Trestle Corporation System and method for finding regions of interest for microscopic digital montage imaging
US7155049B2 (en) 2001-01-11 2006-12-26 Trestle Acquisition Corp. System for creating microscopic digital montage images
WO2002056256A2 (en) * 2001-01-11 2002-07-18 Interscope Technologies, Inc. A system and method for finding regions of interest for microscopic digital montage imaging
US7869641B2 (en) 2001-01-11 2011-01-11 Carl Zeiss Microimaging Gmbh System and method for finding regions of interest for microscopic digital montage imaging
US7330309B2 (en) 2001-03-19 2008-02-12 Ikonisys, Inc. System and method for increasing the contrast of an image produced by an epifluorescence microscope
US6956695B2 (en) 2001-03-19 2005-10-18 Ikonisys, Inc. System and method for increasing the contrast of an image produced by an epifluorescence microscope
EP1474684A2 (en) * 2002-01-15 2004-11-10 VYSIS, Inc. Method and/or system for analyzing biological samples using a computer system
EP1474684A4 (en) * 2002-01-15 2008-01-09 Vysis Inc Method and/or system for analyzing biological samples using a computer system
US8594944B2 (en) 2002-01-15 2013-11-26 Vysis, Inc. Method and/or system for analyzing biological samples using a computer system
EP3182121A1 (en) * 2002-01-15 2017-06-21 Abbott Molecular Inc. Method and/or system for analyzing biological samples using a computer system
EP1548480A4 (en) * 2002-09-06 2007-02-14 Celestar Lexico Sciences Inc Microscope image processing system, microscope image processing method, program, and recording medium
EP1548480A1 (en) * 2002-09-06 2005-06-29 Celestar Lexico-Sciences, Inc. Microscope image processing system, microscope image processing method, program, and recording medium
GB2454857B (en) * 2006-09-18 2010-06-09 Bosch Gmbh Robert Method for processing a microscope intensity image
GB2454857A (en) * 2006-09-18 2009-05-27 Bosch Gmbh Robert Method for processing an intensity image of a microscope
WO2008034721A1 (en) * 2006-09-18 2008-03-27 Robert Bosch Gmbh Method for processing an intensity image of a microscope
WO2009023101A2 (en) 2007-08-07 2009-02-19 Nextslide Imaging Llc Network review in clinical hematology
EP2183570B1 (en) * 2007-08-07 2013-12-25 Nextslide Imaging LLC Network review in clinical hematology
CN102349018A (en) * 2009-03-17 2012-02-08 索尼公司 Image creating device and image creating method
CN102349018B (en) * 2009-03-17 2014-01-29 索尼公司 Image creating device and image creating method

Also Published As

Publication number Publication date
US5257182B1 (en) 1996-05-07
AU1375692A (en) 1992-08-27
US5257182A (en) 1993-10-26

Similar Documents

Publication Publication Date Title
US5257182A (en) Morphological classification system and method
US5625705A (en) Intensity texture based classification system and method
US5544650A (en) Automated specimen classification system and method
US5987158A (en) Apparatus for automated identification of thick cell groupings on a biological specimen
AU703072B2 (en) Apparatus for detecting bubbles in coverslip adhesive
US5757954A (en) Field prioritization apparatus and method
US5287272A (en) Automated cytological specimen classification system and method
US5933519A (en) Cytological slide scoring apparatus
US6327377B1 (en) Automated cytological specimen classification system and method
US5333207A (en) Inspection apparatus and method with inspection auditing for images presented on a display
EP0479977B1 (en) Automated cytological specimen classification system and method
CA2200457C (en) Biological analysis system self-calibration apparatus
EP0745243B9 (en) Automated cytological specimen classification methods
JP4391527B2 (en) A system that organizes multiple objects of interest within a field of interest
JP4864709B2 (en) A system for determining the staining quality of slides using a scatter plot distribution
WO1991006911A1 (en) Automated cytological specimen classification system and method
GB2329014A (en) Automated identification of tubercle bacilli
US11790673B2 (en) Method for detection of cells in a cytological sample having at least one anomaly
EP0995102B1 (en) Inspection system with specimen preview
EP4312151A1 (en) Artificial intelligence training system for medical applications and method
CN1153564A (en) Intensity texture based classification system and method
WO2000062241A1 (en) Method and apparatus for determining microscope specimen preparation type
CA2182793C (en) Automated cytological specimen classification system and method
Seit et al. Performance Comparison Of Extracted Features In Automated Classification Of Cervical Smears
Luck et al. PAPNET TM: an automated cytology screener using image processing and neural networks

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AT AU BB BG BR CA CH CS DE DK ES FI GB HU JP KP KR LK LU MG MN MW NL NO PL RO RU SD SE

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): AT BE BF BJ CF CG CH CI CM DE DK ES FR GA GB GN GR IT LU MC ML MR NL SE SN TD TG

DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

122 Ep: pct application non-entry in european phase
NENP Non-entry into the national phase

Ref country code: CA