CA1323700C - Neural network based automated cytological specimen classification system and method - Google Patents
Neural network based automated cytological specimen classification system and method
- Publication number
- CA1323700C (application CA000595659A)
- Authority
- CA
- Canada
- Prior art keywords
- classifier
- automated
- primary
- specimen
- cells
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Expired - Fee Related
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N15/00—Investigating characteristics of particles; Investigating permeability, pore-volume, or surface-area of porous materials
- G01N15/10—Investigating individual particles
- G01N15/14—Electro-optical investigation, e.g. flow cytometers
- G01N15/1468—Electro-optical investigation, e.g. flow cytometers with spatial resolution of the texture or inner structure of the particle
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/60—Type of objects
- G06V20/69—Microscopic objects, e.g. biological cells or cellular parts
- G06V20/693—Acquisition
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/60—Type of objects
- G06V20/69—Microscopic objects, e.g. biological cells or cellular parts
- G06V20/698—Matching; Classification
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H30/00—ICT specially adapted for the handling or processing of medical images
- G16H30/40—ICT specially adapted for the handling or processing of medical images for processing medical images, e.g. editing
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H50/00—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
- G16H50/20—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for computer-aided diagnosis, e.g. based on medical expert systems
-
- G01N15/1433—
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N15/00—Investigating characteristics of particles; Investigating permeability, pore-volume, or surface-area of porous materials
- G01N15/10—Investigating individual particles
- G01N15/14—Electro-optical investigation, e.g. flow cytometers
- G01N2015/1488—Methods for deciding
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y10—TECHNICAL SUBJECTS COVERED BY FORMER USPC
- Y10S—TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y10S128/00—Surgery
- Y10S128/92—Computer assisted medical diagnostics
- Y10S128/925—Neural network
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y10—TECHNICAL SUBJECTS COVERED BY FORMER USPC
- Y10S—TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y10S706/00—Data processing: artificial intelligence
- Y10S706/902—Application using ai with detail of the ai system
- Y10S706/924—Medical
Abstract
ABSTRACT OF THE DISCLOSURE
An automated screening system and method for cytological specimen classification in which a neural network is utilized in performance of the classification function. Also included is an automated microscope and associated image processing circuitry.
Description
Title: NEURAL NETWORK BASED AUTOMATED CYTOLOGICAL SPECIMEN CLASSIFICATION SYSTEM AND METHOD
TECHNICAL FIELD
This invention relates generally, as indicated, to cell classification and, more particularly, to the use of neural networks and/or neurocomputers for increasing the speed and accuracy of cell classification.
The cervical smear (Pap test) is the only mass screening cytological examination which requires visual inspection of virtually every cell on the slide. The test suffers from a high false negative rate due to the tedium and fatigue associated with its current manual mode of performance. Cell classification is typically performed on a "piece-work" basis by "cytotechnicians" employed by pathology laboratories and in some circumstances by salaried technicians. Due to the clearly life threatening nature of the false negative problem, with its resultant possibility of undiagnosed cervical cancer, the American Cancer Society is considering doubling the frequency of recommended Pap smears. This, however, will certainly overload an already overburdened cervical screening industry, as increasingly fewer individuals are willing to enter the tedious and stressful field of manual cervical smear classification. An American Cancer Society recommendation to increase Pap smear frequency may only serve to increase the false negative rate by decreasing the amount of time spent on manual examination of each slide. A thorough manual examination should take no less than fifteen minutes per slide, although a cytotechnician, especially one under a heavy workload, may spend less than half this amount of time. The College of American Pathologists is well aware of this problem and would rapidly embrace an automated solution to cervical smear screening.
Due to the clear commercial potential for automated cervical smear analysis, several attempts to this end have been made in the prior art. These attempts have proven to be unsuccessful since they have relied exclusively on classical pattern recognition technology (geometric, syntactic, template, statistical) or artificial intelligence (AI) based pattern recognition, i.e., rule-based expert systems. There is, however, no clear algorithm or complete and explicit set of rules by which the human cytotechnician or pathologist uses his experience to combine a multitude of features to make a classification in gestalt manner. Cervical smear classification is, therefore, an excellent application for neural network based pattern recognition.
An example of the limitations of the prior art can be found in the 1987 reference entitled "Automated Cervical Smear Classification" by Tien et al, identified further below.
Background references of interest are as follows:
Rumelhart, David E. and McClelland, James L., "Parallel Distributed Processing," MIT Press, 1986, Volume 1;
Tien, D. et al, "Automated Cervical Smear Classification," Proceedings of the IEEE/Ninth Annual Conference of the Engineering in Medicine and Biology Society, 1987, pp. 1457-1458;
Hecht-Nielsen, Robert, "Neurocomputing: Picking the Human Brain," IEEE Spectrum, March 1988, pp. 36-41; and
Lippmann, Richard P., "An Introduction to Computing with Neural Nets," IEEE ASSP Magazine, April 1987, pp. 4-22.
BRIEF SUMMARY OF THE INVENTION
It is, therefore, a principal object of the present invention to provide an automated system and method for the classification of cytological specimens into categories, for example, categories of diagnostic significance.
Briefly, the invention includes an initial classifier (sometimes referred to as a primary classifier) preliminarily to classify a cytological specimen and a subsequent classifier (sometimes referred to as a secondary classifier) to classify those portions of the cytological specimen selected by the initial classifier for subsequent classification, wherein the subsequent classifier includes a neural computer or neural network.
In one embodiment the primary classifier may include a commercially available automated microscope in the form of a standard cytology microscope with a video camera or CCD array, with the microscope stage controlled for automatic scanning of a slide. Images from the camera are digitized and outputted to the secondary classifier in the form of a computer system.
The computer system includes a neural network, as defined below and as disclosed, too, in several of the references referred to herein, which is utilized in the performance of cell image identification and classification into groups of diagnostic interest. In an alternate embodiment the primary classifier may include a neural network. Other alternate embodiments also are disclosed below.
It is a further object of the present invention that it perform its classification of a group of specimens within the period of time typically consumed for this task by careful manual screening (i.e., approximately 15 minutes/specimen).
It is a further object of the present invention that it perform its classification on cytological specimens which contain the numbers and types of objects other than single layers of cells of interest that are typically found in cervical smears (e.g., clumps of cells, overlapping cells, debris, leucocytes, bacteria, mucus).
It is a further object of the present invention to perform the above-described classification on cervical smears for the detection of pre-malignant and malignant cells.
It is a further object of the present invention that it perform its classification with smaller false negative error rates than typically found in conventional manual cervical smear screening.
An advantage of the cytological classification system of the present invention is that classification of cytological specimens into medically significant diagnostic categories will be more reliable, i.e., with lower false negative error rates.
A further advantage of the cytological classification system of the present invention is that it does not require a modification in the procedure by which cellular specimens are obtained from the patient.
A further advantage of the cytological classification system of the present invention is that it will permit reliable classification within processing time constraints that permit economically viable operation.
These and other objects, advantages and features of the present invention will become evident to those of ordinary skill in the art after having read the following detailed description of the preferred embodiment.
In a broad aspect, therefore, the present invention relates to an automated cytological specimen classifier, comprising: (a) an automated microscope; (b) a video camera or charge coupled device; (c) an image digitizer; (d) a primary statistical classifier for detection of objects in a cytological specimen which exceed a threshold integrated optical density; and (e) a secondary classifier based on a neural network for detection of pre-malignant and malignant cells among the objects identified by the primary classifier.
In another broad aspect, the present invention relates to a method of classifying cytological specimens, comprising primarily classifying a specimen using a first classifier to determine locations of interest and secondarily classifying such locations of interest using a neural network.
In another broad aspect, the present invention relates to an automated cytological specimen classifier for classifying cells contained in a smear on a slide to identify cells that are likely to be malignant or pre-malignant, comprising: (a) microscope means for obtaining a view of at least part of a cytological specimen including cells and other material located generally randomly on a slide in an arrangement which can include other than a single layer of cells; (b) camera means for creating an image of such view; (c) image digitizing means for producing a digital representation of such image; (d) a primary classifier means for detecting objects in a digital representation of a cytological specimen based on a detectable feature, said primary classifier means comprising a classifier means for detecting cells that are likely to be malignant or pre-malignant as well as other cells and material that initially appear to have characteristics of a malignant cell or a pre-malignant cell based on integrated optical density; and (e) a secondary classifier for distinguishing pre-malignant and malignant cells from other cells and material among the objects detected by the primary classifier, said secondary classifier means comprising a neural computer apparatus means for effecting such distinguishing as a function of training thereof.
In yet another broad aspect, the present invention relates to a method of classifying cytological specimens, comprising using a primary classifier apparatus primarily classifying a specimen, which is generally randomly arranged and can include other than a single layer, to determine locations of interest, and secondarily classifying such locations of interest using a neural network computer apparatus.
In still another broad aspect, the present invention relates to an automated cytological specimen classifier, comprising: (a) microscope means for obtaining a view of at least part of a cytological specimen including cells and other material located generally randomly in an arrangement which can include other than a single layer of cells; (b) camera means for creating an image of such view; (c) image digitizing means for producing a digital representation of such image; (d) primary classifier means for detecting objects in a digital representation of a cytological specimen based on a detectable feature, said primary classification means comprising a classifier for detecting cells that are likely to be of a predetermined cell type as well as other cells and material that initially appear to have characteristics of such predetermined cell type; and (e) secondary classifier means for distinguishing cells of such predetermined cell type from other cells and material among the objects detected by said primary classifier means, said secondary classifier means comprising a neural computer apparatus means for effecting such distinguishing as a function of training thereof.
Moreover, it is noted here that the invention is described herein mainly with respect to classification of cytological specimens in the form of a cervical smear, e.g., as typically is done in connection with a Pap test. However, it will be appreciated that this is but one example of the application of the principles of the invention, which are intended for application to classification of many other cytological specimens.
BRIEF DESCRIPTION OF THE DRAWINGS
In the annexed drawings:
Figure 1 is a block diagram for a neural network based automated cytological specimen screening device in accordance with the present invention;
Figure 2 is a representation of a three-layer neural network of the type utilized in the preferred embodiment;
Figure 3 is a block diagram of an alternate embodiment of the automated screening device in accordance with the present invention;
Figure 4 is a block diagram of an alternate embodiment of the automated screening device in accordance with the present invention;
Figure 5 is a block diagram of an alternate embodiment of the automated screening device in accordance with the present invention;
Figure 6 is a block diagram of an alternate embodiment of the automated screening device in accordance with the present invention; and
Figure 7 is a block diagram of an alternate embodiment of the automated screening device in accordance with the present invention.
DESCRIPTION OF THE PREFERRED AND ALTERNATE EMBODIMENTS
Figure 1 illustrates a neural network based automated cytological specimen screening device in accordance with the present invention and referred to by the general reference numeral 10. The classification device 10 includes an automated microscope 11, a video camera or CCD device 12, an image digitizer 13, and classifier stages 14, 15, and 16.
The automated microscope 11 effects relative movement of the microscope objective and the specimen, and video camera or CCD 12 obtains an image or picture of a specific portion of the cytological specimen. The image is digitized by the image digitizer 13 and the information therefrom is coupled to the classifier 14. In the preferred embodiment, classifier 14 is a commercially available statistical classifier which identifies cell nuclei of interest by measurement of their integrated optical density (nuclear stain density). This is the sum of the pixel grey values for the object, corrected for optical errors. Compared to normal cells, malignant cells tend to possess a larger, more densely staining nucleus. Objects which pass classifier 14 consist of pre-malignant and malignant cells but also include other objects with high integrated optical density such as cell clumps, debris, leucocytes and mucus. The task of the secondary classifier 15 is to distinguish pre-malignant and malignant cells from these other objects.
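By way of illustration only (this sketch is not part of the original disclosure), a primary screen of this kind might be expressed in software as follows. The log-based optical density correction, the function names, and the object representation are assumptions of the sketch, not the patent's specification of classifier 14.

```python
import numpy as np

def integrated_optical_density(image, nucleus_mask, background_level=255.0):
    """Sum the per-pixel optical densities over a segmented nucleus.

    The patent defines IOD as the sum of pixel grey values for the object,
    corrected for optical errors; the -log10(I/I0) form used here is one
    common correction and is an assumption of this sketch.
    """
    pixels = image[nucleus_mask].astype(float)
    od = -np.log10(np.clip(pixels, 1.0, None) / background_level)
    return od.sum()

def primary_screen(objects, threshold):
    """Pass every detected object whose IOD exceeds the threshold,
    mirroring the role of statistical classifier 14."""
    return [obj for obj in objects
            if integrated_optical_density(obj["image"], obj["mask"]) > threshold]
```

Objects passing such a screen would then be handed to the secondary, neural network based stage for discrimination.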
A neural network is utilized to implement secondary classifier 15. Detailed descriptions of the design and operation of neural networks suitable for implementation of secondary classifier 15 can be found in the references cited herein. A brief description of this information is provided below.
Based on the data obtained by the primary classifier for the cytological specimen, the secondary classifier is used to check specific areas of the specimen that are, for example, determined to require further screening or classification. Such further examination by the secondary classifier may be effected by reliance on the already obtained digitized image data for the selected areas of the specimen, or by the taking of additional data by the components 11-13 or by other commercially available optical or other equipment that would provide acceptable data for use and analysis by the secondary classifier 15.
A neural network is a highly parallel distributed system with the topology of a directed graph. The nodes in neural networks are usually referred to as "processing elements" or "neurons" while the links are generally known as "interconnects." Each processing element accepts multiple inputs and generates a single output signal which branches into multiple copies that are in turn distributed to the other processing elements as input signals. Information is stored in the strength of the connections known as weights. In an asynchronous fashion, each processing element computes the sum of products of the weight of each input line multiplied by the signal level (usually 0 or 1) on that input line. If the sum of products exceeds a preset activation threshold, the output of the processing element is set to 1; if less, it is set to 0. Learning is achieved through adjustment of the values of the weights.
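The sum-of-products behavior just described maps directly to a few lines of code. The following sketch is illustrative, not from the disclosure.

```python
def processing_element(inputs, weights, threshold):
    # Sum of products: each input signal level (usually 0 or 1)
    # multiplied by the weight on that input line.
    total = sum(w * x for w, x in zip(weights, inputs))
    # Output is 1 if the sum exceeds the preset activation threshold, else 0.
    return 1 if total > threshold else 0
```

For example, processing_element([1, 0, 1], [0.4, 0.9, 0.3], threshold=0.5) yields 1, since the weighted sum 0.7 exceeds 0.5.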
For the present invention, the preferred embodiment is achieved by utilization of a three-layer neural network of the type described in the Lippmann reference as a "multi-layer perceptron" and discussed in detail in Chapter 8 of the Rumelhart reference. Other types of neural network systems also may be used.
A three-layer neural network consists of an input layer, an output layer, and an intermediate hidden layer. The intermediate layer is required to allow for internal representation of patterns within the network. As shown by Minsky and Papert in their 1969 book entitled "Perceptrons" (MIT Press), simple two-layer associative networks are limited in the types of problems they can solve. A two-layer network with only "input" and "output" processing elements can only represent mappings in which similar input patterns lead to similar output patterns. Whenever the real-world problem is not of this type, a three-layer network is required. It has been shown that with a large enough hidden layer, a three-layer neural network can always find a representation that will map any input pattern to any desired output pattern. A generic three-layer neural network of the type utilized in the preferred embodiment is shown in Figure 2.
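In code, the feed-forward pass of such a three-layer network reduces to two weighted sums with a nonlinearity between them. This sketch is illustrative only; the sigmoid activation is an assumption consistent with the multi-layer perceptrons of the Lippmann and Rumelhart references.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(x, w_hidden, w_output):
    """Input -> hidden -> output pass of a three-layer network.

    w_hidden has shape (n_hidden, n_input); w_output has shape
    (n_output, n_hidden). The hidden layer supplies the internal
    representation that a two-layer network lacks.
    """
    hidden = sigmoid(w_hidden @ x)
    return sigmoid(w_output @ hidden)
```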
Several important features of neural network architectures distinguish them from prior art approaches to the implementation of classifier 15.
1. There is little or no executive function. There are only very simple units, each performing its sum of products calculation. Each processing element's task is thus limited to receiving the inputs from its neighbors and, as a function of these inputs, computing an output value which it sends to its neighbors. Each processing element performs this calculation periodically, in parallel with, but not synchronized to, the activities of any of its neighbors.
2. All knowledge is in the connections. Only very short term storage can occur in the states of the processing elements. All long term storage is represented by the values of the connection strengths or "weights" between the processing elements. It is the rules that establish these weights and modify them for learning that primarily distinguish one neural network model from another. All knowledge is thus implicitly represented in the strengths of the connection weights rather than explicitly represented in the states of the processing elements.
3. In contrast to algorithmic computers and expert systems, the goal of neural net learning is not the formulation of an algorithm or a set of explicit rules. During learning, a neural network self-organizes to establish the global set of weights which will result in its output for a given input most closely corresponding to what it is told is the correct output for that input. It is this adaptive acquisition of connection strengths that allows a neural network to behave as if it knew the rules. Conventional computers excel in applications where the knowledge can be readily represented in an explicit algorithm or an explicit and complete set of rules. Where this is not the case, conventional computers encounter great difficulty. While conventional computers can execute an algorithm much more rapidly than any human, they are challenged to match human performance in non-algorithmic tasks such as pattern recognition, nearest neighbor classification, and arriving at the optimum solution when faced with multiple simultaneous constraints. If N exemplar patterns are to be searched in order to classify an unknown input pattern, an algorithmic system can accomplish this task in approximately order N time. In a neural network, all of the candidate signatures are simultaneously represented by the global set of connection weights of the entire system. A neural network thus automatically arrives at the nearest neighbor to the ambiguous input in order 1 time as opposed to order N time.
For the present invention, the preferred embodiment is achieved by utilization of a three-layer backpropagation network, as described in the Rumelhart reference, for the neural network of classifier stage 15. Backpropagation is described in detail in the Rumelhart reference. Briefly described, it operates as follows. During net training, errors (i.e., the difference between the appropriate output for an exemplar input and the current net output for that input) are propagated backwards from the output layer to the middle layer and then to the input layer. These errors are utilized at each layer by the training algorithm to readjust the interconnection weights so that a future presentation of the exemplar pattern will result in the appropriate output category. Following the net training, during the feed-forward mode, unknown input patterns are classified by the neural network into the exemplar category which most closely resembles it.
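A minimal sketch of one backpropagation training step, under the assumptions of sigmoid units and a squared-error criterion (the Rumelhart formulation), is given below; it is illustrative and not the patent's implementation.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_step(x, target, w_hidden, w_output, rate=0.1):
    """One backpropagation update for a three-layer network."""
    hidden = sigmoid(w_hidden @ x)
    output = sigmoid(w_output @ hidden)

    # Error at the output layer: difference between the appropriate
    # (target) output and the current net output, times the sigmoid slope.
    delta_out = (target - output) * output * (1.0 - output)
    # Propagate the errors backwards from the output layer to the hidden layer.
    delta_hid = (w_output.T @ delta_out) * hidden * (1.0 - hidden)

    # Readjust the interconnection weights so a future presentation of the
    # exemplar lands closer to the correct output category.
    w_output += rate * np.outer(delta_out, hidden)
    w_hidden += rate * np.outer(delta_hid, x)
    return output
```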
The output of neural net classifier 15 indicates the presence or absence of pre-malignant or malignant cells. The location of the cells on the input slide is obtained from X-Y plane position coordinates outputted continually by the automated microscope. This positional information is outputted to printer or video display 17 along with diagnosis and patient identification information so that the classification can be reviewed by a pathologist.
In the preferred embodiment, the parallel structure of the neural network is emulated by execution with pipelined serial processing as performed by one of the commercially available neurocomputer accelerator boards. The operation of these neurocomputers is discussed in the Spectrum reference cited. The neural network preferably is a "Delta" processor, which is a commercially available neurocomputer of Science Applications International Corp. (SAIC) (see the Hecht-Nielsen reference above) that has demonstrated a sustained processing rate of 10^7 interconnects/second in the feed-forward (i.e., non-training) mode. For a typical cervical smear containing 100,000 cells, 1-2% of the cells, or approximately 1,500 images, will require processing by classifier 15. As an example of the data rates which result, assume that following data compression an image 50 x 50 pixels is processed by classifier 15. The input layer for the neural network, therefore, consists of 2,500 processing elements or "neurons." The middle layer consists of approximately 25% of the input layer, or 625 neurons. (The number of output neurons is equal to the number of diagnostic categories of interest. This small number does not significantly affect this calculation.) The number of interconnects is thus (2500)(625) or approximately 1.5 x 10^6. At a processing rate of 10^7 interconnects/second, the processing by classifier 15 of the 1,500 images sent to it by classifier 14 will take less than four minutes. Currently available embodiments of classifier 14 operate at a rate of 50,000 cells/minute (refer to the Tien et al citation). With classifier 14 operating at a rate of 50,000 cells/minute, the four minutes consumed by classifier 15 is added to the two minutes used by classifier 14 for a total of six minutes to analyze the 100,000 cell images on the slide. As discussed above, an accurate manual cervical smear analysis takes approximately 15 minutes/slide. Prior art automated attempts using a non-neural network embodiment of classifier 15 require over one hour/slide. This example is not meant in any way to limit the actual configuration of the present invention, but rather to demonstrate that it is capable of achieving the object of processing cervical smears and other cytological samples within the time period required for commercially feasible operation.
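The timing claim in the preceding paragraph can be checked by direct arithmetic; the following lines simply restate the quoted figures.

```python
interconnects = 2500 * 625    # input-to-hidden links: ~1.56 x 10^6
images = 1500                 # ~1-2% of 100,000 cells pass classifier 14
rate = 1e7                    # interconnects/second, feed-forward mode
print(images * interconnects / rate / 60)  # ~3.9 minutes: "less than four"
```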
In the preferred embodiment, primary classifier 14 is restricted to evaluation of the cellular nucleus while the secondary classifier 15 evaluates both the nucleus and its surrounding cytoplasm. The ratio between the nucleus and cytoplasm is an important indicator for pre-malignant and malignant cell classification. In an alternate embodiment, both classifier 14 and classifier 15 are limited to evaluation of the cellular nuclei.
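For illustration (not from the disclosure), a nuclear-to-cytoplasmic ratio of the kind classifier 15 can exploit might be computed from segmentation masks as follows; the area-based formulation is an assumption of the sketch.

```python
import numpy as np

def nuclear_cytoplasmic_ratio(nucleus_mask, cytoplasm_mask):
    """Ratio of nuclear area to surrounding cytoplasmic area for one cell.

    Malignant and pre-malignant cells tend to show an elevated N/C ratio.
    """
    nuclear_area = np.count_nonzero(nucleus_mask)
    cytoplasm_area = np.count_nonzero(cytoplasm_mask)
    return nuclear_area / cytoplasm_area if cytoplasm_area else float("inf")
```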
Output information from the secondary classifier 15 is directed to an output monitor and printer 17, which may indicate a variety of information including, importantly, whether any cells appear to be malignant or pre-malignant, appear to require further examination, etc.
Figure 3 illustrates an alternate embodiment in which an additional neural net classifier stage 16 is added to pre-process the slide for large areas of artifactual material, i.e., material other than single layer cells of interest. This includes clumps of cells, debris, mucus, leucocytes, etc. Positional information obtained in this pre-screen is stored for use by the remainder of the classification system. The information from classifier stage 16 is utilized to limit the processing required by classifier 15. Classifier stage 14 can ignore all material within the areas defined by the positional coordinates outputted by classifier 16. This will result in less information being sent for processing by classifier 15. A diagnosis is, therefore, made on the basis of classification of only those cells which lie outside of these areas. If an insufficient sample of cells lies outside of these areas for a valid diagnosis, this information will be outputted on 17 as an "insufficient cell sample."
Figure 4 illustrates an alternate embodiment in which the images within the areas identified by classifier 16 are not ignored but are instead processed by a separate classifier 18 which operates in parallel with classifier 15. The training of the neural net which composes classifier 18 is dedicated to the distinction of pre-malignant and malignant cells from said artifactual material.
Figure 5 illustrates an alternate embodiment wherein an additional non-neural net classification of nuclear morphological components, exclusive of integrated optical density, is placed between classifier 14 and classifier 15. This classification is performed by classifier 19.
Figure 6 illustrates an alternate embodiment in which a commercially available SAIC neurocomputer is optimized for feed-forward processing 20. Through deletion of learning-mode capacity, all neurocomputer functions are dedicated to feed-forward operation. Learning is completed on a separate unmodified neurocomputer which contains both the learning and feed-forward functions.
Following the completion of learning, the final interconnection weights are transferred to the optimized feed-forward neurocomputer 20. Dedication of neurocomputer 20 to the feed-forward mode results in a sustained feed-forward operation rate of 10^8 interconnects/second vs. 10^7 interconnects/second for the non-optimized board as commercially supplied. The optimized feed-forward neural network 20 is utilized to perform the functions of classifiers 14 and 16 in Figures 1, 3, 4, and 5. By utilizing neural net classifier 20 to perform the function of statistical classifier 14, cells of interest which are not necessarily malignant cervical cells, and which do not therefore exceed the integrated optical density threshold of classifier 14, would nevertheless be detected. An example would be the detection of endometrial cells which, while not necessarily indicative of cervical malignancy, are indicative of uterine malignancy when found in the Pap smear of a post-menopausal patient.
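The train-elsewhere, run-here division of labor can be pictured with a small sketch (illustrative file names and helpers, not the SAIC interface): the unmodified machine completes learning, and only the frozen weights travel to the feed-forward-only unit.

```python
import numpy as np

def export_weights(w_hidden, w_output, path="weights.npz"):
    # Learning is completed on the unmodified neurocomputer; only the
    # final interconnection weights are transferred.
    np.savez(path, w_hidden=w_hidden, w_output=w_output)

def load_feed_forward(path="weights.npz"):
    # The optimized unit holds the weights fixed and only runs the
    # feed-forward pass.
    w = np.load(path)
    sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
    return lambda x: sigmoid(w["w_output"] @ sigmoid(w["w_hidden"] @ x))
```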
As an example of the data rates that result from this embodiment in Fig. 6, assume outside slide dimensions of 15mm x 45mm, or a total slide area of 675 x 10^6 square micrometers. Neural net 20 processes a sliding window over this area for analysis. This window has dimensions of 20 micrometers x 20 micrometers, or an area of 400 square micrometers. There are, therefore, approximately 1.5 x 10^6 of these windows on the 15mm x 45mm slide. For the primary classification function performed by neural net 20, a resolution of 1 micrometer/pixel is sufficient to detect those objects which must be sent to secondary neural network classifier 15 for further analysis. The input pattern for the image window analyzed by classifier 20 is therefore 20 x 20 pixels, or 400 neurons to the input layer of neural net 20. The middle layer consists of approximately 25% of the input layer, or 100 neurons. (As discussed above in the data rate calculation for classifier 15, the number of output layer neurons is small and does not significantly affect our results.) The number of interconnections in classifier 20 is thus approximately (400)(100) or 40 x 10^3. At a processing rate of 10^8 interconnects/second, each image from the sliding window will take 400 microseconds for neural net 20 to classify. In a 15mm x 45mm slide, there are 1.5 x 10^6 of the 400 square micrometer windows which require classification by neural net 20. Total classification time for neural net 20 is therefore (1.5 x 10^6)(400 x 10^-6) = 600 seconds, or ten minutes. If this ten minutes is added to the approximately four minutes required for secondary neural net classifier 15, a total of 14 minutes/slide results. This example is not meant in any way to limit the actual configuration of the present invention, but rather to demonstrate that it is capable of achieving the object of processing cervical smears and other cytological samples within the time period required for commercially feasible operation.
Speed of processing data can be enhanced, too, by using parallel processing. For example, plural commercially available neurocomputers from SAIC can be coupled to effect parallel processing of data, thus increasing overall operational speed of the classifier using the same.
Figure 7 illustrates an alternate embodiment in which neural net primary classifier 20 is utilized in conjunction with, rather than as a substitute for, morphological classification and area classification. By dedication of classifier 20 to the detection of those few cell types which are of interest, but which cannot be detected by other means, the resolution required of classifier 20 is minimized.
Although the present invention has been described in terms of the presently preferred embodiment, it is to be understood that such disclosure is not to be interpreted as limiting. Various alterations and modifications will no doubt become apparent to those skilled in the art after having read the above disclosure. Accordingly, it is intended that the appended claims be interpreted as covering all alterations and modifications as fall within the true spirit and scope of the invention.
`:
.~
.--.~ .
;
During learning, a neural network self-organizes to establish the global set of weights which will result in its output for a given input most closely corresponding to what it is told is the correct output for that input. It is this adaptive acquisition of connection strengths that allows a neural network to behave as if it knew the rules. Conventional computers excel in applications where the knowledge can be readily represented in an explicit algorithm or an explicit and complete set of rules. Where this is not the case, conventional computers encounter great difficulty. While conventional computers can execute an algorithm much more rapidly than any human, they are challenged to match human performance in non-algorithmic tasks such as pattern recognition, nearest neighbor classification, and arriving at the optimum solution when faced with multiple simultaneous constraints. If N exemplar patterns are to be searched in order to classify an unknown input pattern, an algorithmic system can accomplish this task in approximately order N time.

In a neural network, all of the candidate signatures are simultaneously represented by the global set of connection weights of the entire system. A neural network thus automatically arrives at the nearest neighbor to the ambiguous input in order 1 time, as opposed to order N time.
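Purely by way of illustration (this sketch is no part of the disclosed apparatus), the contrast can be seen in code: the exemplar search below does one distance computation per stored pattern, while the feed-forward pass has a fixed cost set only by the weight-matrix sizes. The shapes and the random placeholder weights are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
exemplars = [(k, rng.standard_normal(400)) for k in range(1000)]  # N stored patterns
x = exemplars[42][1] + 0.01 * rng.standard_normal(400)            # ambiguous input

def nearest_exemplar(x, exemplars):
    """Algorithmic search: one distance computation per exemplar, order N time."""
    return min(exemplars, key=lambda le: float(np.sum((x - le[1]) ** 2)))[0]

def feed_forward(x, w_hidden, w_out):
    """Neural classification: exemplar knowledge lives in the fixed weight
    matrices, so one pass costs the same however many exemplars were learned."""
    return int(np.argmax(w_out @ np.tanh(w_hidden @ x)))

print(nearest_exemplar(x, exemplars))                  # scans all N exemplars
print(feed_forward(x, rng.standard_normal((100, 400)),
                   rng.standard_normal((10, 100))))    # fixed-cost single pass
```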
For the present invention, the preferred embodiment is achieved by utilization of a three-layer backpropagation network, as described in the Rumelhart reference, for the neural network of classifier stage 15. Backpropagation is described in detail in the Rumelhart reference. Briefly described, it operates as follows. During net training, errors (i.e., the difference between the appropriate output for an exemplar input and the current net output for that input) are propagated backwards from the output layer to the middle layer and then to the input layer. These errors are utilized at each layer by the training algorithm to readjust the interconnection weights so that a future presentation of the exemplar pattern will result in the appropriate output category. Following the net training, during the feed-forward mode, each unknown input pattern is classified by the neural network into the exemplar category which it most closely resembles.
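A minimal sketch of one such backpropagation update, assuming a squared-error criterion, sigmoid units, and an arbitrary learning rate (none of which are specified details of the patent), might look as follows:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_step(x, target, w1, w2, lr=0.1):
    """One backpropagation update for a three-layer (input-middle-output) net."""
    # Feed-forward pass.
    h = sigmoid(w1 @ x)                  # middle-layer activations
    y = sigmoid(w2 @ h)                  # output-layer activations

    # Errors propagated backwards: output layer first, then middle layer.
    delta_out = (target - y) * y * (1 - y)
    delta_hid = (w2.T @ delta_out) * h * (1 - h)

    # Readjust interconnection weights so a future presentation of this
    # exemplar lands in the appropriate output category.
    w2 += lr * np.outer(delta_out, h)
    w1 += lr * np.outer(delta_hid, x)
    return w1, w2
```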
The output of neural net classifier 15 indicates the presence or absence of pre-malignant or malignant cells. The location of the cells on the input slide is obtained from X-Y plane position coordinates outputted continually by the automated microscope. This positional information is outputted to printer or video display 17 along with diagnosis and patient identification information so that the classification can be reviewed by a pathologist.
In the preferred embodiment, the parallel structure of the neural network is emulated by execution with pipelined serial processing as performed by one of the commercially available neurocomputer accelerator boards. The operation of these neurocomputers is discussed in the Spectrum reference cited. The neural network preferably is a "Delta" processor, which is a commercially available neurocomputer of Science Application International Corp. (SAIC) (see the Hecht-Nielsen reference above) that has demonstrated a sustained processing rate of 10^7 interconnects/second in the feed-forward (i.e., non-training) mode. For a typical cervical smear containing 100,000 cells, 1-2% of the cells, or approximately 1,500 images, will require processing by classifier 15. As an example of the data rates which result, assume that following data compression an image of 50 x 50 pixels is processed by classifier 15. The input layer for the neural network therefore consists of 2,500 processing elements or "neurons." The middle layer consists of approximately 25% of the input layer, or 625 neurons. (The number of output neurons is equal to the number of diagnostic categories of interest. This small number does not significantly affect this calculation.) The number of interconnects is thus (2,500)(625), or approximately 1.5 x 10^6. At a processing rate of 10^7 interconnects/second, the processing by classifier 15 of the 1,500 images sent to it by classifier 14 will take less than four minutes. Currently available embodiments of classifier 14 operate at a rate of 50,000 cells/minute (refer to the Tien et al citation). With classifier 14 operating at a rate of 50,000 cells/minute, the four minutes consumed by classifier 15 is added to the two minutes used by classifier 14 for a total of six minutes to analyze the 100,000 cell images on the slide. As discussed above, an accurate manual cervical smear analysis takes approximately 15 minutes/slide. Prior art automated attempts using a non-neural network embodiment of classifier 15 require over one hour/slide. This example is not meant in any way to limit the actual configuration of the present invention, but rather to demonstrate that it is capable of achieving the object of processing cervical smears and other cytological samples within the time period required for commercially feasible operation.
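The throughput arithmetic in this example can be reproduced mechanically from the figures quoted above; the fragment below is purely a check of that arithmetic, not a description of the Delta board's interface.

```python
# Data-rate example for secondary classifier 15, using the figures above.
input_neurons  = 50 * 50              # 50 x 50 pixel image after compression
middle_neurons = input_neurons // 4   # middle layer is ~25% of the input layer
interconnects  = input_neurons * middle_neurons   # ~1.5e6 per image

rate   = 1e7     # sustained feed-forward interconnects/second
images = 1500    # ~1-2% of the 100,000 cells, flagged by classifier 14

seconds = images * interconnects / rate
print(f"classifier 15 time: {seconds/60:.1f} minutes")   # ~3.9 minutes
```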
In the preferred embodiment, primary classifier 14 is restricted to evaluation of the cellular nucleus, while the secondary classifier 15 evaluates both the nucleus and its surrounding cytoplasm. The ratio between the nucleus and cytoplasm is an important indicator for pre-malignant and malignant cell classification. In an alternate embodiment, both classifier 14 and classifier 15 are limited to evaluation of the cellular nuclei.

Output information from the secondary classifier 15 is directed to an output monitor and printer 17, which may indicate a variety of information including, importantly, whether any cells appear to be malignant or pre-malignant, appear to require further examination, etc.
Figure 3 illustrates an alternate embodiment in which an additional neural net classifier stage 16 is added to pre-process the slide for large areas of artifactual material, i.e., material other than single layer cells of interest. This includes clumps of cells, debris, mucus, leucocytes, etc.

Positional information obtained in this pre-screen is stored for use by the remainder of the classification system. The information from classifier stage 16 is utilized to limit the processing required by classifier 15. Classifier stage 14 can ignore all material within the areas defined by the positional coordinates outputted by classifier 16. This will result in less information being sent for processing by classifier 15.
A diagnosis is, therefore, made on the basis of classification of only those cells which lie outside of these areas. If an insufficient sample of cells lies outside of these areas for a valid diagnosis, this information will be outputted on 17 as an "insufficient cell sample."

Figure 4 illustrates an alternate embodiment in which the images within the areas identified by classifier 16 are not ignored but are instead processed by a separate classifier 18 which operates in parallel with classifier 15. The training of the neural net which composes classifier 18 is dedicated to the distinction of pre-malignant and malignant cells from said artifactual material.
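The exclusion logic described here amounts to a point-in-region test against the coordinates reported by the pre-screening classifier. A minimal sketch follows; the rectangular area representation, the names, and the sample-size threshold are assumptions introduced only for illustration.

```python
from typing import List, Tuple

Rect = Tuple[float, float, float, float]   # (x_min, y_min, x_max, y_max)

def outside_artifact_areas(cells: List[Tuple[float, float]],
                           areas: List[Rect]) -> List[Tuple[float, float]]:
    """Keep only cell coordinates falling outside every pre-screened area,
    so diagnosis rests on single-layer cells of interest alone."""
    def inside(pt: Tuple[float, float], r: Rect) -> bool:
        x, y = pt
        return r[0] <= x <= r[2] and r[1] <= y <= r[3]
    return [c for c in cells if not any(inside(c, a) for a in areas)]

MIN_SAMPLE = 100   # hypothetical threshold for a valid diagnosis
kept = outside_artifact_areas([(1.0, 2.0), (50.0, 60.0)], [(40, 50, 80, 90)])
if len(kept) < MIN_SAMPLE:
    print("insufficient cell sample")
```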
Figure 5 illustrates an alternate embodiment wherein an additional non-neural net classification of nuclear morphological components, exclusive of integrated optical density, is placed between classifier 14 and classifier 15. This classification is performed by classifier 19.
Figure 6 illustrates an alternate embodiment in which a commercially available SAIC neurocomputer is optimized for feed-forward processing 20. Through deletion of learning-mode capacity, all neurocomputer functions are dedicated to feed-forward operation. Learning is completed on a separate unmodified neurocomputer which contains both the learning and feed-forward functions.

Following the completion of learning, the final interconnection weights are transferred to the optimized feed-forward neurocomputer 20. Dedication of neurocomputer 20 to the feed-forward mode results in a sustained feed-forward operation rate of 10^8 interconnects/second, vs. 10^7 interconnects/second for the non-optimized board as commercially supplied. The optimized feed-forward neural network 20 is utilized to perform the functions of classifiers 14 and 16 in Figures 1, 3, 4, and 5. By utilizing neural net classifier 20 to perform the function of statistical classifier 14, cells of interest which are not necessarily malignant cervical cells, and which do not therefore exceed the integrated optical density threshold of classifier 14, would nevertheless be detected. An example would be the detection of endometrial cells which, while not necessarily indicative of cervical malignancy, are indicative of uterine malignancy when found in the Pap smear of a post-menopausal patient.
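The train-elsewhere, deploy-frozen workflow described above can be pictured in code: training produces final weight matrices, and the deployed unit holds only those weights plus the feed-forward pass. The sketch below is an illustration of that separation, not the SAIC board's actual interface; the layer sizes and random placeholder weights are assumptions.

```python
import numpy as np

class FeedForwardOnly:
    """Inference-only network: holds transferred weights, no learning machinery."""
    def __init__(self, w1: np.ndarray, w2: np.ndarray):
        self.w1, self.w2 = w1, w2          # final interconnection weights

    def classify(self, window: np.ndarray) -> int:
        h = np.tanh(self.w1 @ window)      # middle layer
        return int(np.argmax(self.w2 @ h)) # winning output category

# Placeholder weights standing in for the result of training completed on a
# separate, unmodified (trainable) neurocomputer.
w1, w2 = np.random.randn(100, 400), np.random.randn(2, 100)
net20 = FeedForwardOnly(w1, w2)
print(net20.classify(np.random.randn(400)))   # classify one 20x20-pixel window
```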
As an example of the data rates that result from this embodiment in Fig. 6, assume outside slide dimensions of 15mm x 45mm, or a total slide area of 675 x 10^6 square micrometers.

Neural net 20 processes a sliding window over this area for analysis. This window has dimensions of 20 micrometers x 20 micrometers, or an area of 400 square micrometers. There are, therefore, 1.5 x 10^6 of these windows on the 15mm x 45mm slide. For the primary classification function performed by neural net 20, a resolution of 1 micrometer/pixel is sufficient to detect those objects which must be sent to secondary neural network classifier 15 for further analysis.

The input pattern for the image window analyzed by classifier 20 is therefore 20 x 20 pixels, or 400 neurons to the input layer of neural net 20. The middle layer consists of approximately 25% of the input layer, or 100 neurons. (As discussed above in the data rate calculation for classifier 15, the number of output layer neurons is small and does not significantly affect our results.) The number of interconnections in classifier 20 is thus approximately (400)(100), or 40 x 10^3. At a processing rate of 10^8 interconnects/second, each image from the sliding window will take 400 microseconds for neural net 20 to classify. In a 15mm x 45mm slide, there are 1.5 x 10^6 of the 400 square micrometer windows which require classification by neural net 20. Total classification time for neural net 20 is therefore (1.5 x 10^6)(400 x 10^-6 seconds) = 600 seconds, or ten minutes. If this ten minutes is added to the approximately four minutes required for secondary neural net classifier 15, a total of 14 minutes/slide results. This example is not meant in any way to limit the actual configuration of the present invention, but rather to demonstrate that it is capable of achieving the object of processing cervical smears and other cytological samples within the time period required for commercially feasible operation.
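As with the earlier example, this arithmetic can be checked directly; the fragment below only reproduces the window timing quoted in the text.

```python
# Data-rate example for primary neural net 20, using the figures above.
windows       = 1.5e6      # 20um x 20um windows on the 15mm x 45mm slide
interconnects = 400 * 100  # 400 input neurons x 100 middle-layer neurons
rate          = 1e8        # optimized feed-forward interconnects/second

per_window = interconnects / rate   # 400 microseconds per window
total      = windows * per_window   # 600 seconds = ten minutes
print(f"{per_window*1e6:.0f} us/window, {total/60:.0f} minutes for net 20")
```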
Speed of processing data can be enhanced, too, by using parallel processing. For example, plural commercially available neurocomputers from SAIC can be coupled to effect parallel processing of data, thus increasing overall operational speed of the classifier using the same.

Figure 7 illustrates an alternate embodiment in which neural net primary classifier 20 is utilized in conjunction with, rather than as a substitute for, morphological classification and area classification. By dedicating classifier 20 to the detection of those few cell types which are of interest, but which cannot be detected by other means, the resolution required of classifier 20 is minimized.
Although the present invention has been described in terms of the presently preferred embodiment, it is to be understood that such disclosure is not to be interpreted as limiting. Various alterations and modifications will no doubt become apparent to those skilled in the art after having read the above disclosure. Accordingly, it is intended that the appended claims be interpreted as covering all alterations and modifications as fall within the true spirit and scope of the invention.
Claims (54)
1. An automated cytological specimen classifier, comprising:
(a) an automated microscope;
(b) a video camera-charge coupled device;
(c) an image digitizer;
(d) a primary statistical classifier for detection of objects in a cytological specimen which exceed a threshold integrated optical density; and (e) a secondary classifier based on a neural network for detection of pre-malignant and malignant cells among the objects identified by the primary classifier.
2. The automated cytological classifier of Claim 1 further comprising a neural network pre-screening classifier for recognition and classification of general areas within a specimen that contain material other than a cellular monolayer.
3. The automated classifier of Claim 2 wherein the output from the pre-screening classifier is utilized to exclude the said identified areas from analysis by the secondary classifier.
4. The automated classifier of Claim 2 wherein the output from the pre-screening classifier is utilized to modify the secondary classification on images found within the areas of the specimen identified by said pre-screening classifier.
5. The automated classifier of Claim 1 wherein the primary statistical classifier is restricted to evaluation of the cellular nucleus while the secondary classifier evaluates both the cellular nucleus and its surrounding cytoplasm.
6. The automated classifier of Claim 1 wherein both the primary statistical classifier and the secondary classifier are restricted to evaluation of the cellular nucleus.
7. The automated classifier of Claim 1 further comprising means for making an additional non-neural net classification of nuclear morphological components in addition to integrated optical density, said means being coupled between the primary and secondary classifier.
8. An automated cytological specimen classifier, comprising a low resolution neural network for performing primary classification of a cytological specimen, a high resolution neural network for performing secondary classification, and means for coupling said low and high resolution neural networks to pass data representing locations of interest from the former to the latter.
9. A method of classifying cytological specimens, comprising primarily classifying a specimen using a first classifier to determine locations of interest and secondarily classifying such locations of interest using a neural network.
10. The method of Claim 9, wherein said primarily classifying comprises using a video camera or charge coupled device (CCD) to obtain images of the specimen, a digitizer to digitize such images and an integrated optical density detector.
11. The method of Claim 9, wherein said primary classifying comprises using a neural network.
12. An automated cytological specimen classifier for classifying cells contained in a smear on a slide to identify cells that are likely to be malignant or pre-malignant, comprising:
(a) microscope means for obtaining a view of at least part of a cytological specimen including cells and other material located generally randomly on a slide in an arrangement which can include other than a single layer of cells;
(b) camera means for creating an image of such view;
(c) image digitizing means for producing a digital representation of such image;
(d) a primary classifier means for detecting objects in a digital representation of a cytological specimen based on a detectable feature, said primary classifier means comprising a classifier means for detecting cells that are likely to be malignant or pre-malignant as well as other cells and material that initially appear to have characteristics of a malignant cell or a pre-malignant cell based on integrated optical density; and (e) a secondary classifier for distinguishing pre-malignant and malignant cells from other cells and material among the objects detected by the primary classifier, said secondary classifier means comprising a neural computer apparatus means for effecting such distinguishing as a function of training thereof.
13. The automated classifier of Claim 12, wherein said primary classifier means comprises a statistical classifier.
14. The automated classifier of Claim 12, wherein such cytological specimen includes overlapping cells.
15. The automated classifier of Claim 12 wherein said secondary classifier means is operable to distinguish pre-malignant and malignant cells among overlapping arrangements of cells and other material.
16. The automated classifier of Claim 12, wherein said camera means is positioned to create an image of a portion of such cytological specimen from such view.
17. The automated classifier of Claim 12, said neural computer apparatus means comprising an electronic neural computer.
18. The automated classifier of Claim 12 wherein the primary classifier means is restricted to evaluation of the cellular nucleus while the secondary classifier means evaluates both the cellular nucleus and its surrounding cytoplasm.
19. The automated classifier of Claim 12 wherein both the primary classifier means and the secondary classifier means are restricted to evaluation of the cellular nucleus.
20. The automated classifier of Claim 12 further comprising means for making an additional non-neural net classification of nuclear morphological components in addition to integrated optical density, said means being coupled between the primary classifier means and the secondary classifier means.
21. The automated classifier of Claim 12, said microscope means comprising an automated microscope.
22. The automated classifier of Claim 12, said camera means comprising a video camera.
23. The automated classifier of Claim 12, said camera means comprising a charge coupled device.
24. The automated classifier of Claim 12, said primary classifier means comprising means for detection of objects in such digital representation of a cytological specimen which has a feature that exceeds a threshold level.
25. The automated classifier of Claim 12, said primary classifier means comprising means for detection of objects in such digital representation of a cytological specimen which has a feature that exceeds a threshold integrated optical density.
26. The automated classifier of Claim 12, said primary classifier means comprising means for detection of objects in such digital representation of a cytological specimen based on morphological criteria.
27. The automated classifier of Claim 12 further comprising a neural network pre-screening classifier means for recognition and classification of general areas within the digital representation of a specimen that contain material other than a cellular monolayer prior to primary classification by said primary classifier means.
28. The automated classifier of Claim 27, wherein the output from the pre-screening classifier means is utilized to exclude such areas from further analysis.
29. The automated classifier of Claim 27, wherein the output from the pre-screening classifier means is utilized to modify further analysis of images found within such areas.
30. A method of classifying cytological specimens, comprising using a primary classifier apparatus primarily classifying a specimen which is generally randomly arranged and can include other than a single layer to determine locations of interest, and secondarily classifying such locations of interest using a neural network computer apparatus.
31. The method of Claim 30, wherein said primary classifying step comprises using a video camera or charge coupled device (CCD) to obtain images of the specimen, a digitizer to digitize such images and an integrated optical density detector.
32. The method of Claim 30, wherein said primary classifying comprises using a neural network computer apparatus.
33. The method of Claim 30, wherein said step of using a primary classifier apparatus primarily classifying a specimen comprises using a statistical classifier.
34. The method of Claim 30, wherein said step of using a primary classifier apparatus primarily classifying a specimen comprises making a classification based on morphology.
35. The method of Claim 30, wherein said step of using a primary classifier apparatus primarily classifying a specimen comprises making such primary classification based on integrated optical density.
36. The method of Claim 30, further comprising training such neural network computer apparatus to identify cytological specimens of interest.
37. An automated cytological specimen classifier, comprising:
(a) microscope means for obtaining a view of at least part of a cytological specimen including cells and other material located generally randomly in an arrangement which can include other than a single layer of cells;
(b) camera means for creating an image of such view;
(c) image digitizing means for producing a digital representation of such image;
(d) primary classifier means for detecting objects in a digital representation of a cytological specimen based on a detectable feature, said primary classification means comprising a classifier for detecting cells that are likely to be of a predetermined cell type as well as other cells and material that initially appear to have characteristics of such predetermined cell type; and (e) secondary classifier means for distinguishing cells of such predetermined cell type from other cells and material among the objects detected by said primary classifier means, said secondary classifier means comprising a neural computer apparatus means for effecting such distinguishing as a function of training thereof.
38. The automated classifier of Claim 37, wherein the primary classifier means is restricted to evaluation of the cellular nucleus while the secondary classifier means evaluates both the cellular nucleus and its surrounding cytoplasm.
39. The automated classifier of Claim 37, wherein both the primary classifier means and the secondary classifier means are restricted to evaluation of the cellular nucleus.
40. The automated classifier of Claim 37, further comprising means for making an additional non-neural net classification of nuclear morphological components, said means being coupled between the primary classifier means and said secondary classifier means.
41. The automated classifier of Claim 37, said microscope means comprising an automated microscope.
42. The automated classifier of Claim 37, said camera means comprising a video camera.
43. The automated classifier of Claim 37, said camera means comprising a charge coupled device.
44. The automated classifier of Claim 37, said primary classifier means comprising means for detection of objects in such digital representation of a cytological specimen which have a feature that exceeds a threshold level.
45. The automated classifier of Claim 37, said primary classifier means comprising means for detection of objects in such digital representation of a cytological specimen which has a feature that exceeds a threshold integrated optical density.
46. The automated classifier of Claim 37, said primary classifier means comprising means for detection of objects in such digital representation of a cytological specimen based on morphological criteria.
47. The automated classifier of Claim 37, wherein said camera means is positioned to create an image of a portion of such cytological specimen from such view.
48. The automated classifier of Claim 37, said neural computer apparatus means comprising an electronic neural computer.
49. The automated classifier of Claim 37, wherein primary classifier means comprises a statistical classifier.
50. The automated classifier of Claim 37, wherein such cytological specimen includes overlapping cells.
51. The automated classifier of Claim 37, wherein said secondary classifier means is operable to distinguish cells of such predetermined cell type among overlapping arrangements of cells and other materials.
52. The automated classifier of Claim 37 further comprising a neural network pre-screening classifier means for identifying general areas within the digital representation of a specimen that contain material other than a cellular monolayer prior to primary classification.
53. The automated classifier of Claim 52, wherein the output from the pre-screening classifier is utilized to exclude such identified areas from further analysis.
54. The automated classifier of Claim 52, wherein the output from the pre-screening classifier means is utilized to modify further analysis of images found within the areas of the specimen identified by said pre-screening classifier means.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US179,060 | 1980-08-18 | ||
US07179060 US4965725B1 (en) | 1988-04-08 | 1988-04-08 | Neural network based automated cytological specimen classification system and method |
Publications (1)
Publication Number | Publication Date |
---|---|
CA1323700C true CA1323700C (en) | 1993-10-26 |
Family
ID=22655067
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CA000595659A Expired - Fee Related CA1323700C (en) | 1988-04-08 | 1989-04-04 | Neural network based automated cytological specimen classification system and method |
Country Status (23)
Country | Link |
---|---|
US (2) | US4965725B1 (en) |
EP (1) | EP0336608B1 (en) |
JP (1) | JPH04501325A (en) |
CN (1) | CN1031811C (en) |
AT (1) | ATE140327T1 (en) |
AU (1) | AU628342B2 (en) |
BG (1) | BG51463A3 (en) |
BR (1) | BR8907355A (en) |
CA (1) | CA1323700C (en) |
DE (1) | DE68926796T2 (en) |
DK (1) | DK262490D0 (en) |
ES (1) | ES2090033T3 (en) |
FI (1) | FI101653B1 (en) |
GR (1) | GR3021252T3 (en) |
HK (1) | HK1003583A1 (en) |
HU (1) | HU208186B (en) |
IL (1) | IL89859A0 (en) |
MC (1) | MC2101A1 (en) |
RO (1) | RO106931B1 (en) |
RU (1) | RU2096827C1 (en) |
SG (1) | SG46454A1 (en) |
WO (1) | WO1989009969A1 (en) |
ZA (1) | ZA892558B (en) |
Families Citing this family (309)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5202231A (en) | 1987-04-01 | 1993-04-13 | Drmanac Radoje T | Method of sequencing of genomes by hybridization of oligonucleotide probes |
US5525464A (en) * | 1987-04-01 | 1996-06-11 | Hyseq, Inc. | Method of sequencing by hybridization of oligonucleotide probes |
US5224175A (en) * | 1987-12-07 | 1993-06-29 | Gdp Technologies, Inc. | Method for analyzing a body tissue ultrasound image |
US5092343A (en) * | 1988-02-17 | 1992-03-03 | Wayne State University | Waveform analysis apparatus and method using neural network techniques |
DE68928484T2 (en) * | 1988-03-25 | 1998-07-23 | Hitachi Ltd | METHOD FOR RECOGNIZING IMAGE STRUCTURES |
US5740270A (en) * | 1988-04-08 | 1998-04-14 | Neuromedical Systems, Inc. | Automated cytological specimen classification system and method |
US5544650A (en) * | 1988-04-08 | 1996-08-13 | Neuromedical Systems, Inc. | Automated specimen classification system and method |
US4965725B1 (en) * | 1988-04-08 | 1996-05-07 | Neuromedical Systems Inc | Neural network based automated cytological specimen classification system and method |
DE68928895T2 (en) * | 1988-10-11 | 1999-05-27 | Oyo Keisoku Kenkyusho Kk | Method and device for universal adaptive learning image measurement and recognition |
US5041916A (en) * | 1989-02-07 | 1991-08-20 | Matsushita Electric Industrial Co., Ltd. | Color image data compression and recovery apparatus based on neural networks |
JPH0821065B2 (en) * | 1989-03-13 | 1996-03-04 | シャープ株式会社 | Character recognition device |
JP2885823B2 (en) * | 1989-04-11 | 1999-04-26 | 株式会社豊田中央研究所 | Visual recognition device |
JP2940933B2 (en) * | 1989-05-20 | 1999-08-25 | 株式会社リコー | Pattern recognition method |
US5547839A (en) | 1989-06-07 | 1996-08-20 | Affymax Technologies N.V. | Sequencing of surface immobilized polymers utilizing microflourescence detection |
US5850465A (en) * | 1989-06-26 | 1998-12-15 | Fuji Photo Film Co., Ltd. | Abnormnal pattern detecting or judging apparatus, circular pattern judging apparatus, and image finding apparatus |
US5086479A (en) * | 1989-06-30 | 1992-02-04 | Hitachi, Ltd. | Information processing system using neural network learning function |
US5140523A (en) * | 1989-09-05 | 1992-08-18 | Ktaadn, Inc. | Neural network for predicting lightning |
WO1991006911A1 (en) * | 1989-10-23 | 1991-05-16 | Neuromedical Systems, Inc. | Automated cytological specimen classification system and method |
JPH03196277A (en) * | 1989-12-25 | 1991-08-27 | Takayama:Kk | Feature data selecting method for data processor |
US5313532A (en) * | 1990-01-23 | 1994-05-17 | Massachusetts Institute Of Technology | Recognition of patterns in images |
WO1991011783A1 (en) * | 1990-01-23 | 1991-08-08 | Massachusetts Institute Of Technology | Recognition of patterns in images |
JPH03223976A (en) * | 1990-01-29 | 1991-10-02 | Ezel Inc | Image collating device |
WO1991014235A1 (en) * | 1990-03-06 | 1991-09-19 | Massachusetts Institute Of Technology | Recognition of patterns in images |
ATE155592T1 (en) * | 1990-03-30 | 1997-08-15 | Neuromedical Systems Inc | AUTOMATIC CELL CLASSIFICATION SYSTEM AND METHOD |
JP2896799B2 (en) * | 1990-04-18 | 1999-05-31 | 富士写真フイルム株式会社 | Radiation image reading condition and / or image processing condition determination device |
US5862304A (en) * | 1990-05-21 | 1999-01-19 | Board Of Regents, The University Of Texas System | Method for predicting the future occurrence of clinically occult or non-existent medical conditions |
WO1991020048A1 (en) * | 1990-06-21 | 1991-12-26 | Applied Electronic Vision, Inc. | Cellular analysis utilizing video processing and neural network |
US5365460A (en) * | 1990-08-27 | 1994-11-15 | Rockwell International Corp. | Neural network signal processor |
US5655029A (en) * | 1990-11-07 | 1997-08-05 | Neuromedical Systems, Inc. | Device and method for facilitating inspection of a specimen |
DE69131102T2 (en) * | 1990-11-07 | 1999-07-29 | Neuromedical Systems Inc | EXAMINATION CONTROL PROCEDURE FOR IMAGES SHOWED ON A DISPLAY |
US5214744A (en) * | 1990-12-14 | 1993-05-25 | Westinghouse Electric Corp. | Method and apparatus for automatically identifying targets in sonar images |
US5257182B1 (en) * | 1991-01-29 | 1996-05-07 | Neuromedical Systems Inc | Morphological classification system and method |
US6018587A (en) * | 1991-02-21 | 2000-01-25 | Applied Spectral Imaging Ltd. | Method for remote sensing analysis be decorrelation statistical analysis and hardware therefor |
US5331550A (en) * | 1991-03-05 | 1994-07-19 | E. I. Du Pont De Nemours And Company | Application of neural networks as an aid in medical diagnosis and general anomaly detection |
US5105468A (en) * | 1991-04-03 | 1992-04-14 | At&T Bell Laboratories | Time delay neural network for printed and cursive handwritten character recognition |
US5260871A (en) * | 1991-07-31 | 1993-11-09 | Mayo Foundation For Medical Education And Research | Method and apparatus for diagnosis of breast tumors |
GB9116562D0 (en) * | 1991-08-01 | 1991-09-18 | British Textile Tech | Sample evaluation |
DE69218912T2 (en) * | 1991-08-28 | 1997-10-09 | Becton Dickinson Co | GRAVITY ATTRACTION MACHINE FOR ADAPTABLE AUTOCLUSTER FORMATION OF N-DIMENSIONAL DATA FLOWS |
US5776709A (en) * | 1991-08-28 | 1998-07-07 | Becton Dickinson And Company | Method for preparation and analysis of leukocytes in whole blood |
US5941832A (en) * | 1991-09-27 | 1999-08-24 | Tumey; David M. | Method and apparatus for detection of cancerous and precancerous conditions in a breast |
US6418424B1 (en) | 1991-12-23 | 2002-07-09 | Steven M. Hoffberg | Ergonomic man-machine interface incorporating adaptive pattern recognition based control system |
US6850252B1 (en) | 1999-10-05 | 2005-02-01 | Steven M. Hoffberg | Intelligent electronic appliance system and method |
US8352400B2 (en) | 1991-12-23 | 2013-01-08 | Hoffberg Steven M | Adaptive pattern recognition based controller apparatus and method and human-factored interface therefore |
US5903454A (en) | 1991-12-23 | 1999-05-11 | Hoffberg; Linda Irene | Human-factored interface corporating adaptive pattern recognition based controller apparatus |
US6400996B1 (en) | 1999-02-01 | 2002-06-04 | Steven M. Hoffberg | Adaptive pattern recognition based control system and method |
US10361802B1 (en) | 1999-02-01 | 2019-07-23 | Blanding Hovenweep, Llc | Adaptive pattern recognition based control system and method |
DE69329554T2 (en) * | 1992-02-18 | 2001-05-31 | Neopath Inc | METHOD FOR IDENTIFYING OBJECTS USING DATA PROCESSING TECHNIQUES |
ES2127810T3 (en) * | 1992-02-18 | 1999-05-01 | Neopath Inc | PROCEDURE TO IDENTIFY NORMAL BIOMEDICAL SPECIMENS. |
US5283418A (en) * | 1992-02-27 | 1994-02-01 | Westinghouse Electric Corp. | Automated rotor welding processes using neural networks |
JP3165247B2 (en) * | 1992-06-19 | 2001-05-14 | シスメックス株式会社 | Particle analysis method and device |
DE4224621C2 (en) * | 1992-07-25 | 1994-05-05 | Boehringer Mannheim Gmbh | Method for analyzing a component of a medical sample using an automatic analysis device |
US5388164A (en) * | 1992-08-19 | 1995-02-07 | Olympus Optical Co., Ltd. | Method for judging particle agglutination patterns using neural networks |
US5742702A (en) * | 1992-10-01 | 1998-04-21 | Sony Corporation | Neural network for character recognition and verification |
US5319722A (en) * | 1992-10-01 | 1994-06-07 | Sony Electronics, Inc. | Neural network for character recognition of rotated characters |
US6026174A (en) * | 1992-10-14 | 2000-02-15 | Accumed International, Inc. | System and method for automatically detecting malignant cells and cells having malignancy-associated changes |
US5733721A (en) * | 1992-11-20 | 1998-03-31 | The Board Of Regents Of The University Of Oklahoma | Cell analysis method using quantitative fluorescence image analysis |
US5719784A (en) * | 1992-12-14 | 1998-02-17 | University Of Washington At Seattle | Order-based analyses of cell and tissue structure |
CN1036118C (en) * | 1992-12-29 | 1997-10-15 | 陈立奇 | Tumor image diagnostic method and system |
EP0610916A3 (en) * | 1993-02-09 | 1994-10-12 | Cedars Sinai Medical Center | Method and apparatus for providing preferentially segmented digital images. |
US5426010A (en) * | 1993-02-26 | 1995-06-20 | Oxford Computer, Inc. | Ultra high resolution printing method |
US5619619A (en) * | 1993-03-11 | 1997-04-08 | Kabushiki Kaisha Toshiba | Information recognition system and control system using same |
JP3265044B2 (en) * | 1993-03-23 | 2002-03-11 | 株式会社コーナン・メディカル | Corneal endothelial cell morphology determination method |
US5479526A (en) * | 1993-03-23 | 1995-12-26 | Martin Marietta | Pixel designator for small objects |
JP3535873B2 (en) * | 1993-04-10 | 2004-06-07 | フラウンホッファー−ゲゼルシャフト ツァ フェルダールング デァ アンゲヴァンテン フォアシュンク エー.ファオ. | Target taxonomy |
US5475768A (en) * | 1993-04-29 | 1995-12-12 | Canon Inc. | High accuracy optical character recognition using neural networks with centroid dithering |
US5587833A (en) * | 1993-07-09 | 1996-12-24 | Compucyte Corporation | Computerized microscope specimen encoder |
EP0644414B1 (en) * | 1993-08-19 | 2001-11-21 | Hitachi, Ltd. | Classification and examination device of particles in fluid |
US6136540A (en) | 1994-10-03 | 2000-10-24 | Ikonisys Inc. | Automated fluorescence in situ hybridization detection of genetic abnormalities |
US5352613A (en) * | 1993-10-07 | 1994-10-04 | Tafas Triantafillos P | Cytological screening method |
CA2132269C (en) * | 1993-10-12 | 2000-02-01 | Rainer Hermann Doerrer | Interactive automated cytology method and system |
US5797130A (en) * | 1993-11-16 | 1998-08-18 | Neopath, Inc. | Method for testing proficiency in screening images of biological slides |
JP3165309B2 (en) * | 1993-12-22 | 2001-05-14 | 株式会社日立製作所 | Particle image analyzer |
KR970006423B1 (en) * | 1993-12-29 | 1997-04-28 | 한국전기통신공사 | Pattern recognition using neural network |
EP0745243B9 (en) * | 1994-02-14 | 2008-01-02 | AutoCyte North Carolina LLC | Automated cytological specimen classification methods |
US5493619A (en) * | 1994-03-11 | 1996-02-20 | Haley; Paul H. | Normalization method for eliminating false detections in side scan sonar images |
US5493539A (en) * | 1994-03-11 | 1996-02-20 | Westinghouse Electric Corporation | Two-stage detection and discrimination system for side scan sonar equipment |
US5486999A (en) * | 1994-04-20 | 1996-01-23 | Mebane; Andrew H. | Apparatus and method for categorizing health care utilization |
US6463438B1 (en) * | 1994-06-03 | 2002-10-08 | Urocor, Inc. | Neural network for cell image analysis for identification of abnormal cells |
US5625705A (en) * | 1994-06-03 | 1997-04-29 | Neuromedical Systems, Inc. | Intensity texture based classification system and method |
US6287850B1 (en) * | 1995-06-07 | 2001-09-11 | Affymetrix, Inc. | Bioarray chip reaction apparatus and its manufacture |
DE69527585T2 (en) * | 1994-06-08 | 2003-04-03 | Affymetrix Inc | Method and device for packaging chips |
WO1996003709A1 (en) * | 1994-07-26 | 1996-02-08 | Neuromedical Systems, Inc. | Inspection device and method |
US5647025A (en) * | 1994-09-20 | 1997-07-08 | Neopath, Inc. | Automatic focusing of biomedical specimens apparatus |
WO1996009594A1 (en) * | 1994-09-20 | 1996-03-28 | Neopath, Inc. | Apparatus for automated identification of thick cell groupings on a biological specimen |
US5757954A (en) * | 1994-09-20 | 1998-05-26 | Neopath, Inc. | Field prioritization apparatus and method |
WO1996009598A1 (en) * | 1994-09-20 | 1996-03-28 | Neopath, Inc. | Cytological slide scoring apparatus |
US5566249A (en) * | 1994-09-20 | 1996-10-15 | Neopath, Inc. | Apparatus for detecting bubbles in coverslip adhesive |
WO1996009600A1 (en) * | 1994-09-20 | 1996-03-28 | Neopath, Inc. | Apparatus for identification and integration of multiple cell patterns |
JPH10506484A (en) * | 1994-09-20 | 1998-06-23 | ネオパス,インク. | Biological analysis self-calibration device |
US5627908A (en) * | 1994-09-20 | 1997-05-06 | Neopath, Inc. | Method for cytological system dynamic normalization |
CA2200455A1 (en) * | 1994-09-20 | 1996-03-28 | Louis R. Piloco | Apparatus for illumination stabilization and homogenization |
US5715327A (en) * | 1994-09-20 | 1998-02-03 | Neopath, Inc. | Method and apparatus for detection of unsuitable conditions for automated cytology scoring |
AU3371395A (en) * | 1994-09-20 | 1996-04-19 | Neopath, Inc. | Biological specimen analysis system processing integrity checking apparatus |
US5638459A (en) * | 1994-09-20 | 1997-06-10 | Neopath, Inc. | Method and apparatus for detecting a microscope slide coverslip |
US5978497A (en) * | 1994-09-20 | 1999-11-02 | Neopath, Inc. | Apparatus for the identification of free-lying cells |
US5740269A (en) * | 1994-09-20 | 1998-04-14 | Neopath, Inc. | Method and apparatus for robust biological specimen classification |
AU3586195A (en) * | 1994-09-20 | 1996-04-09 | Neopath, Inc. | Apparatus for automated identification of cell groupings on a biological specimen |
US5692066A (en) * | 1994-09-20 | 1997-11-25 | Neopath, Inc. | Method and apparatus for image plane modulation pattern recognition |
AU3675495A (en) * | 1994-09-30 | 1996-04-26 | Neopath, Inc. | Method and apparatus for highly efficient computer aided screening |
US5453676A (en) * | 1994-09-30 | 1995-09-26 | Itt Automotive Electrical Systems, Inc. | Trainable drive system for a windshield wiper |
WO1996012187A1 (en) | 1994-10-13 | 1996-04-25 | Horus Therapeutics, Inc. | Computer assisted methods for diagnosing diseases |
US5524631A (en) * | 1994-10-13 | 1996-06-11 | The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration | Passive fetal heart rate monitoring apparatus and method with enhanced fetal heart beat discrimination |
JP3537194B2 (en) * | 1994-10-17 | 2004-06-14 | オリンパス株式会社 | Light microscope |
US6600996B2 (en) | 1994-10-21 | 2003-07-29 | Affymetrix, Inc. | Computer-aided techniques for analyzing biological sequences |
US5795716A (en) | 1994-10-21 | 1998-08-18 | Chee; Mark S. | Computer-aided visualization and analysis system for sequence evaluation |
JP3189608B2 (en) * | 1995-02-01 | 2001-07-16 | 株式会社日立製作所 | Flow type particle image analysis method |
US5708591A (en) * | 1995-02-14 | 1998-01-13 | Akzo Nobel N.V. | Method and apparatus for predicting the presence of congenital and acquired imbalances and therapeutic conditions |
US5884296A (en) * | 1995-03-13 | 1999-03-16 | Minolta Co., Ltd. | Network and image area attribute discriminating device and method for use with said neural network |
FR2733596B1 (en) * | 1995-04-28 | 1997-07-18 | Hycel Groupe Lisabio | METHOD AND DEVICE FOR IDENTIFYING PARTICLES |
US5625706A (en) * | 1995-05-31 | 1997-04-29 | Neopath, Inc. | Method and apparatus for continously monitoring and forecasting slide and specimen preparation for a biological specimen population |
US5671288A (en) * | 1995-05-31 | 1997-09-23 | Neopath, Inc. | Method and apparatus for assessing slide and specimen preparation quality |
US5619428A (en) * | 1995-05-31 | 1997-04-08 | Neopath, Inc. | Method and apparatus for integrating an automated system to a laboratory |
US6252979B1 (en) * | 1995-06-07 | 2001-06-26 | Tripath Imaging, Inc. | Interactive method and apparatus for sorting biological specimens |
US5787208A (en) * | 1995-06-07 | 1998-07-28 | Neopath, Inc. | Image enhancement method and apparatus |
US6898532B1 (en) | 1995-06-07 | 2005-05-24 | Biomerieux, Inc. | Method and apparatus for predicting the presence of haemostatic dysfunction in a patient sample |
US5889880A (en) * | 1995-06-07 | 1999-03-30 | Autocyte, Inc. | Interactive automated cytology method incorporating both manual and automatic determinations |
US6321164B1 (en) | 1995-06-07 | 2001-11-20 | Akzo Nobel N.V. | Method and apparatus for predicting the presence of an abnormal level of one or more proteins in the clotting cascade |
US6242876B1 (en) | 1995-06-07 | 2001-06-05 | Valeo Electrical Systems, Inc. | Intermittent windshield wiper controller |
US6429017B1 (en) | 1999-02-04 | 2002-08-06 | Biomerieux | Method for predicting the presence of haemostatic dysfunction in a patient sample |
US6720149B1 (en) * | 1995-06-07 | 2004-04-13 | Affymetrix, Inc. | Methods for concurrently processing multiple biological chip assays |
US5745601A (en) * | 1995-07-31 | 1998-04-28 | Neopath, Inc. | Robustness of classification measurement apparatus and method |
US5642433A (en) * | 1995-07-31 | 1997-06-24 | Neopath, Inc. | Method and apparatus for image contrast quality evaluation |
US5621519A (en) * | 1995-07-31 | 1997-04-15 | Neopath, Inc. | Imaging system transfer function control method and apparatus |
US6118581A (en) * | 1995-09-15 | 2000-09-12 | Accumed International, Inc. | Multifunctional control unit for a microscope |
US5690892A (en) * | 1995-09-15 | 1997-11-25 | Accumed, Inc. | Cassette for use with automated specimen handling system |
US6148096A (en) * | 1995-09-15 | 2000-11-14 | Accumed International, Inc. | Specimen preview and inspection system |
US5963368A (en) * | 1995-09-15 | 1999-10-05 | Accumed International, Inc. | Specimen management system |
CA2185511C (en) * | 1995-09-15 | 2008-08-05 | Vladimir Dadeshidze | Cytological specimen analysis system with individualized patient data |
US6430309B1 (en) | 1995-09-15 | 2002-08-06 | Monogen, Inc. | Specimen preview and inspection system |
US6091842A (en) * | 1996-10-25 | 2000-07-18 | Accumed International, Inc. | Cytological specimen analysis system with slide mapping and generation of viewing path information |
US5930732A (en) * | 1995-09-15 | 1999-07-27 | Accumed International, Inc. | System for simplifying the implementation of specified functions |
WO1997011350A2 (en) * | 1995-09-19 | 1997-03-27 | Morphometrix Technologies Inc. | A neural network assisted multi-spectral segmentation system |
US5732150A (en) * | 1995-09-19 | 1998-03-24 | Ihc Health Services, Inc. | Method and system for multiple wavelength microscopy image analysis |
JPH0991430A (en) * | 1995-09-27 | 1997-04-04 | Hitachi Ltd | Pattern recognition device |
US20040175718A1 (en) * | 1995-10-16 | 2004-09-09 | Affymetrix, Inc. | Computer-aided visualization and analysis system for sequence evaluation |
IL115985A0 (en) | 1995-11-14 | 1996-01-31 | Elop Electrooptics Ind Ltd | System and method for computerized archiving |
EP0864082B1 (en) * | 1995-11-30 | 2003-04-02 | Chromavision Medical Systems, Inc. | Method for automated image analysis of biological specimens |
US6151405A (en) * | 1996-11-27 | 2000-11-21 | Chromavision Medical Systems, Inc. | System and method for cellular specimen grading |
US6718053B1 (en) * | 1996-11-27 | 2004-04-06 | Chromavision Medical Systems, Inc. | Method and apparatus for automated image analysis of biological specimens |
US5835620A (en) * | 1995-12-19 | 1998-11-10 | Neuromedical Systems, Inc. | Boundary mapping system and method |
US5699794A (en) * | 1995-12-19 | 1997-12-23 | Neopath, Inc. | Apparatus for automated urine sediment sample handling |
WO1997023835A1 (en) * | 1995-12-21 | 1997-07-03 | Erudite Technology Iii, Inc. | A method for the detection, identification and alteration of molecular structure in various media |
US5850464A (en) * | 1996-01-16 | 1998-12-15 | Erim International, Inc. | Method of extracting axon fibers and clusters |
US6678669B2 (en) | 1996-02-09 | 2004-01-13 | Adeza Biomedical Corporation | Method for selecting medical and biochemical diagnostic tests using neural network-related applications |
US6361937B1 (en) | 1996-03-19 | 2002-03-26 | Affymetrix, Incorporated | Computer-aided nucleic acid sequencing |
US5724253A (en) * | 1996-03-26 | 1998-03-03 | International Business Machines Corporation | System and method for searching data vectors such as genomes for specified template vector |
DE19616997A1 (en) * | 1996-04-27 | 1997-10-30 | Boehringer Mannheim Gmbh | Process for automated microscope-assisted examination of tissue or body fluid samples |
WO1998002843A1 (en) * | 1996-07-12 | 1998-01-22 | Erim International, Inc. | Mosaic construction, processing, and review of very large electronic micrograph composites |
US5810747A (en) * | 1996-08-21 | 1998-09-22 | Interactive Remote Site Technology, Inc. | Remote site medical intervention system |
US6396941B1 (en) * | 1996-08-23 | 2002-05-28 | Bacus Research Laboratories, Inc. | Method and apparatus for internet, intranet, and local viewing of virtual microscope slides |
US6272235B1 (en) * | 1997-03-03 | 2001-08-07 | Bacus Research Laboratories, Inc. | Method and apparatus for creating a virtual microscope slide |
US6404906B2 (en) * | 1997-03-03 | 2002-06-11 | Bacus Research Laboratories,Inc. | Method and apparatus for acquiring and reconstructing magnified specimen images from a computer-controlled microscope |
US6031930A (en) * | 1996-08-23 | 2000-02-29 | Bacus Research Laboratories, Inc. | Method and apparatus for testing a progression of neoplasia including cancer chemoprevention testing |
US6122396A (en) * | 1996-12-16 | 2000-09-19 | Bio-Tech Imaging, Inc. | Method of and apparatus for automating detection of microorganisms |
JP3445799B2 (en) * | 1996-12-25 | 2003-09-08 | 株式会社日立製作所 | Pattern recognition apparatus and pattern recognition method |
JP3702978B2 (en) * | 1996-12-26 | 2005-10-05 | ソニー株式会社 | Recognition device, recognition method, learning device, and learning method |
US5937103A (en) * | 1997-01-25 | 1999-08-10 | Neopath, Inc. | Method and apparatus for alias free measurement of optical transfer function |
US6753161B2 (en) * | 1997-03-27 | 2004-06-22 | Oncosis Llc | Optoinjection methods |
GB9714347D0 (en) * | 1997-07-09 | 1997-09-10 | Oxoid Ltd | Image analysis systems and devices for use therewith |
US5959726A (en) * | 1997-07-25 | 1999-09-28 | Neopath, Inc. | Modulation transfer function test compensation for test pattern duty cycle |
WO1999010771A1 (en) * | 1997-08-22 | 1999-03-04 | Lxr Biotechnology, Inc. | Focusing and autofocusing in scanning laser imaging |
US6198839B1 (en) * | 1997-09-05 | 2001-03-06 | Tripath Imaging, Inc. | Dynamic control and decision making method and apparatus |
US6502040B2 (en) | 1997-12-31 | 2002-12-31 | Biomerieux, Inc. | Method for presenting thrombosis and hemostasis assay data |
US6181811B1 (en) | 1998-01-13 | 2001-01-30 | Neopath, Inc. | Method and apparatus for optimizing biological and cytological specimen screening and diagnosis |
US6166142A (en) | 1998-01-27 | 2000-12-26 | E. I. Du Pont De Nemours And Company | Adhesive compositions based on blends of grafted metallocene catalyzed and polar ethylene copolymers |
CA2331508A1 (en) * | 1998-05-09 | 1999-11-18 | Ikonisys, Inc. | Method and apparatus for computer controlled rare cell, including fetal cell, based diagnosis |
US20090111101A1 (en) * | 1998-05-09 | 2009-04-30 | Ikonisys, Inc. | Automated Cancer Diagnostic Methods Using FISH |
US20080241848A1 (en) * | 1998-05-09 | 2008-10-02 | Ikonisys, Inc. | Methods for prenatal diagnosis of aneuploidy |
US7901887B2 (en) * | 1998-05-09 | 2011-03-08 | Ikonisys, Inc. | Automated cancer diagnostic methods using fish |
US6821484B1 (en) | 1998-09-02 | 2004-11-23 | Accip Biotech Aps | Apparatus for isolation of particles, preferably cell clusters |
US6091843A (en) * | 1998-09-03 | 2000-07-18 | Greenvision Systems Ltd. | Method of calibration and real-time analysis of particulates |
US6572824B1 (en) * | 1998-09-18 | 2003-06-03 | Cytyc Corporation | Method and apparatus for preparing cytological specimens |
US6357285B1 (en) | 1998-11-09 | 2002-03-19 | Veeco Instruments Inc. | Method and apparatus for the quantitative and objective correlation of data from a local sensitive force detector |
FR2785713B1 (en) * | 1998-11-10 | 2000-12-08 | Commissariat Energie Atomique | CONTROL SYSTEM FOR LIFT AND TELEMANIPULATION UNITS PLACED IN CONFINED ENCLOSURES |
US7612020B2 (en) | 1998-12-28 | 2009-11-03 | Illumina, Inc. | Composite arrays utilizing microspheres with a hybridization chamber |
US7966078B2 (en) | 1999-02-01 | 2011-06-21 | Steven Hoffberg | Network media appliance system and method |
CA2362055C (en) * | 1999-02-04 | 2010-07-20 | Akzo Nobel N.V. | A method and apparatus for predicting the presence of haemostatic dysfunction in a patient sample |
US6297044B1 (en) * | 1999-02-23 | 2001-10-02 | Oralscan Laboratories, Inc. | Minimally invasive apparatus for testing lesions of the oral cavity and similar epithelium |
WO2000062247A1 (en) * | 1999-04-13 | 2000-10-19 | Chromavision Medical Systems, Inc. | Histological reconstruction and automated image analysis |
US6743576B1 (en) | 1999-05-14 | 2004-06-01 | Cytokinetics, Inc. | Database system for predictive cellular bioinformatics |
US7151847B2 (en) * | 2001-02-20 | 2006-12-19 | Cytokinetics, Inc. | Image analysis of the golgi complex |
US6876760B1 (en) | 2000-12-04 | 2005-04-05 | Cytokinetics, Inc. | Classifying cells based on information contained in cell images |
US6651008B1 (en) | 1999-05-14 | 2003-11-18 | Cytokinetics, Inc. | Database system including computer code for predictive cellular bioinformatics |
DE19937778B4 (en) * | 1999-08-10 | 2011-11-24 | Cellasys Gmbh | Method and device for characterizing the functional state of cells and alterations of cells |
US6593102B2 (en) | 1999-10-29 | 2003-07-15 | Cytyc Corporation | Cytological stain composition |
US7369304B2 (en) * | 1999-10-29 | 2008-05-06 | Cytyc Corporation | Cytological autofocusing imaging systems and methods |
US6348325B1 (en) | 1999-10-29 | 2002-02-19 | Cytyc Corporation | Cytological stain composition |
US6665060B1 (en) | 1999-10-29 | 2003-12-16 | Cytyc Corporation | Cytological imaging system and method |
US6661501B1 (en) | 1999-10-29 | 2003-12-09 | Cytyc Corporation | Cytological stain composition including verification characteristic |
JP2003524775A (en) | 1999-11-04 | 2003-08-19 | メルテック・マルチ−エピトープ−リガンド−テクノロジーズ・ゲーエムベーハー | Automatic analysis of microscope images |
CA2392534A1 (en) | 1999-11-30 | 2001-06-07 | Oncosis | Method and apparatus for selectively targeting specific cells within a cell population |
US6535626B1 (en) | 2000-01-14 | 2003-03-18 | Accumed International, Inc. | Inspection system with specimen preview |
US7016551B1 (en) * | 2000-04-10 | 2006-03-21 | Fuji Xerox Co., Ltd. | Image reader |
US7236623B2 (en) * | 2000-04-24 | 2007-06-26 | International Remote Imaging Systems, Inc. | Analyte recognition for urinalysis diagnostic system |
US6947586B2 (en) * | 2000-04-24 | 2005-09-20 | International Remote Imaging Systems, Inc. | Multi-neural net imaging apparatus and method |
US7179612B2 (en) | 2000-06-09 | 2007-02-20 | Biomerieux, Inc. | Method for detecting a lipoprotein-acute phase protein complex and predicting an increased risk of system failure or mortality |
AU2001270126A1 (en) * | 2000-06-23 | 2002-01-08 | Cytokinetics, Inc. | Image analysis for phenotyping sets of mutant cells |
IL138123A0 (en) * | 2000-08-28 | 2001-10-31 | Accuramed 1999 Ltd | Medical decision support system and method |
AU3289202A (en) * | 2000-10-24 | 2002-05-21 | Oncosis Llp | Method and device for selectively targeting cells within a three -dimensional specimen |
US7027628B1 (en) | 2000-11-14 | 2006-04-11 | The United States Of America As Represented By The Department Of Health And Human Services | Automated microscopic image acquisition, compositing, and display |
WO2002044695A1 (en) * | 2000-11-16 | 2002-06-06 | Burstein Technologies, Inc. | Methods and apparatus for detecting and quantifying lymphocytes with optical biodiscs |
US7099510B2 (en) * | 2000-11-29 | 2006-08-29 | Hewlett-Packard Development Company, L.P. | Method and system for object detection in digital images |
US7218764B2 (en) * | 2000-12-04 | 2007-05-15 | Cytokinetics, Inc. | Ploidy classification method |
US20040262318A1 (en) * | 2000-12-08 | 2004-12-30 | Ardais Corporation | Container, method and system for cryptopreserved material |
US6599694B2 (en) | 2000-12-18 | 2003-07-29 | Cytokinetics, Inc. | Method of characterizing potential therapeutics by determining cell-cell interactions |
US6466690C1 (en) * | 2000-12-19 | 2008-11-18 | Bacus Res Lab Inc | Method and apparatus for processing an image of a tissue sample microarray |
US7155049B2 (en) * | 2001-01-11 | 2006-12-26 | Trestle Acquisition Corp. | System for creating microscopic digital montage images |
US6816606B2 (en) | 2001-02-21 | 2004-11-09 | Interscope Technologies, Inc. | Method for maintaining high-quality focus during high-throughput, microscopic digital montage imaging |
US6993169B2 (en) * | 2001-01-11 | 2006-01-31 | Trestle Corporation | System and method for finding regions of interest for microscopic digital montage imaging |
US6798571B2 (en) | 2001-01-11 | 2004-09-28 | Interscope Technologies, Inc. | System for microscopic digital montage imaging using a pulse light illumination system |
US7016787B2 (en) | 2001-02-20 | 2006-03-21 | Cytokinetics, Inc. | Characterizing biological stimuli by response curves |
US6956961B2 (en) * | 2001-02-20 | 2005-10-18 | Cytokinetics, Inc. | Extracting shape information contained in cell images |
US20020165839A1 (en) * | 2001-03-14 | 2002-11-07 | Taylor Kevin M. | Segmentation and construction of segmentation classifiers |
EP1405058A4 (en) * | 2001-03-19 | 2007-07-04 | Ikonisys Inc | System and method for increasing the contrast of an image produced by an epifluorescence microscope |
US20020186875A1 (en) * | 2001-04-09 | 2002-12-12 | Burmer Glenna C. | Computer methods for image pattern recognition in organic material |
WO2002097714A1 (en) * | 2001-04-09 | 2002-12-05 | Lifespan Biosciences, Inc. | Computer method for image pattern recognition in organic material |
US20030143637A1 (en) * | 2001-08-31 | 2003-07-31 | Selvan Gowri Pyapali | Capture layer assemblies for cellular assays including related optical analysis discs and methods |
US20040071328A1 (en) * | 2001-09-07 | 2004-04-15 | Vaisberg Eugeni A. | Classifying cells based on information contained in cell images |
JP2005502872A (en) * | 2001-09-07 | 2005-01-27 | Burstein Technologies, Inc. | Identification and quantification of leukocyte types based on nuclear morphology using an optical biodisc system
US6767733B1 (en) | 2001-10-10 | 2004-07-27 | Pritest, Inc. | Portable biosensor apparatus with controlled flow |
US8346483B2 (en) * | 2002-09-13 | 2013-01-01 | Life Technologies Corporation | Interactive and automated tissue image analysis with global training database and variable-abstraction processing in cytological specimen classification and laser capture microdissection applications |
US8722357B2 (en) * | 2001-11-05 | 2014-05-13 | Life Technologies Corporation | Automated microdissection instrument |
US10156501B2 (en) | 2001-11-05 | 2018-12-18 | Life Technologies Corporation | Automated microdissection instrument for determining a location of a laser beam projection on a worksurface area |
US8715955B2 (en) | 2004-09-09 | 2014-05-06 | Life Technologies Corporation | Laser microdissection apparatus and method |
US8676509B2 (en) | 2001-11-13 | 2014-03-18 | Dako Denmark A/S | System for tracking biological samples |
JP2005509882A (en) * | 2001-11-20 | 2005-04-14 | Burstein Technologies, Inc. | Optical biodisc and fluid circuit for cell analysis and related methods
US20040010481A1 (en) * | 2001-12-07 | 2004-01-15 | Whitehead Institute For Biomedical Research | Time-dependent outcome prediction using neural networks |
US20040020993A1 (en) * | 2001-12-28 | 2004-02-05 | Green Larry R. | Method for luminescent identification and calibration |
US7764821B2 (en) * | 2002-02-14 | 2010-07-27 | Veridex, LLC | Methods and algorithms for cell enumeration in a low-cost cytometer
WO2003069421A2 (en) * | 2002-02-14 | 2003-08-21 | Immunivest Corporation | Methods and algorithms for cell enumeration in a low-cost cytometer |
EP1428169B1 (en) * | 2002-02-22 | 2017-01-18 | Olympus America Inc. | Focusable virtual microscopy apparatus and method |
US20040060987A1 (en) * | 2002-05-07 | 2004-04-01 | Green Larry R. | Digital image analysis method for enhanced and optimized signals in fluorophore detection |
US7469056B2 (en) * | 2002-05-14 | 2008-12-23 | GE Healthcare Niagara Inc. | System and methods for rapid and automated screening of cells
US20040126837A1 (en) * | 2002-05-23 | 2004-07-01 | Matthew Baker | Pseudo-tissues and uses thereof |
US20050037406A1 (en) * | 2002-06-12 | 2005-02-17 | De La Torre-Bueno Jose | Methods and apparatus for analysis of a biological specimen |
US7272252B2 (en) * | 2002-06-12 | 2007-09-18 | Clarient, Inc. | Automated system for combining bright field and fluorescent microscopy |
US7200252B2 (en) * | 2002-10-28 | 2007-04-03 | Ventana Medical Systems, Inc. | Color space transformations for use in identifying objects of interest in biological specimens |
US8712118B2 (en) * | 2003-04-10 | 2014-04-29 | Carl Zeiss Microimaging Gmbh | Automated measurement of concentration and/or amount in a biological sample |
US20040202357A1 (en) | 2003-04-11 | 2004-10-14 | Perz Cynthia B. | Silhouette image acquisition |
US7324694B2 (en) * | 2003-05-23 | 2008-01-29 | International Remote Imaging Systems, Inc. | Fluid sample analysis using class weights |
US20040241659A1 (en) * | 2003-05-30 | 2004-12-02 | Applera Corporation | Apparatus and method for hybridization and SPR detection |
WO2005010677A2 (en) * | 2003-07-18 | 2005-02-03 | Cytokinetics, Inc. | Characterizing biological stimuli by response curves |
US20050014217A1 (en) * | 2003-07-18 | 2005-01-20 | Cytokinetics, Inc. | Predicting hepatotoxicity using cell based assays |
US7235353B2 (en) * | 2003-07-18 | 2007-06-26 | Cytokinetics, Inc. | Predicting hepatotoxicity using cell based assays |
US7042639B1 (en) * | 2003-08-21 | 2006-05-09 | The United States Of America As Represented By The Administrator Of NASA | Identification of cells with a compact microscope imaging system with intelligent controls
CN1296699C (en) * | 2003-12-19 | 2007-01-24 | Wuhan University | Microscopic multispectral automatic analysis instrument and method for bone marrow and peripheral blood cells
US7425426B2 (en) * | 2004-03-15 | 2008-09-16 | Cyntellect, Inc. | Methods for purification of cells based on product secretion |
US20050273271A1 (en) * | 2004-04-05 | 2005-12-08 | Aibing Rao | Method of characterizing cell shape |
US7653260B2 (en) | 2004-06-17 | 2010-01-26 | Carl Zeiss MicroImaging GmbH | System and method of registering field of view
US8582924B2 (en) | 2004-06-30 | 2013-11-12 | Carl Zeiss Microimaging Gmbh | Data structure of an image storage and retrieval system |
US7316904B1 (en) | 2004-06-30 | 2008-01-08 | Chromodynamics, Inc. | Automated pap screening using optical detection of HPV with or without multispectral imaging |
US7323318B2 (en) * | 2004-07-15 | 2008-01-29 | Cytokinetics, Inc. | Assay for distinguishing live and dead cells |
US20070031818A1 (en) * | 2004-07-15 | 2007-02-08 | Cytokinetics, Inc., A Delaware Corporation | Assay for distinguishing live and dead cells |
US8189899B2 (en) * | 2004-07-30 | 2012-05-29 | Veridex, LLC | Methods and algorithms for cell enumeration in a low-cost cytometer
US7792338B2 (en) * | 2004-08-16 | 2010-09-07 | Olympus America Inc. | Method and apparatus of mechanical stage positioning in virtual microscopy image capture |
US7328198B1 (en) * | 2004-12-31 | 2008-02-05 | Cognitech, Inc. | Video demultiplexing based on meaningful modes extraction |
GB0503629D0 (en) * | 2005-02-22 | 2005-03-30 | Durand Technology Ltd | Method and apparatus for automated analysis of biological specimen |
JP4214124B2 (en) * | 2005-03-14 | 2009-01-28 | Bio Echo Net Inc. | Ear thermometer
CN100351057C (en) * | 2005-03-14 | 2007-11-28 | Nankai University | Method and equipment for depth information extraction for micro-operation tools based on microscopic image processing
US20060246576A1 (en) | 2005-04-06 | 2006-11-02 | Affymetrix, Inc. | Fluidic system and method for processing biological microarrays in personal instrumentation |
US20070031043A1 (en) | 2005-08-02 | 2007-02-08 | Perz Cynthia B | System for and method of intelligently directed segmentation analysis for automated microscope systems |
US20070091109A1 (en) * | 2005-09-13 | 2007-04-26 | Roscoe Atkinson | Image quality |
JP4915071B2 (en) * | 2005-09-22 | 2012-04-11 | Nikon Corporation | Microscope and virtual slide creation system
US7783092B2 (en) * | 2006-01-17 | 2010-08-24 | Illinois Institute Of Technology | Method for enhancing diagnostic images using vessel reconstruction |
US20080003667A1 (en) * | 2006-05-19 | 2008-01-03 | Affymetrix, Inc. | Consumable elements for use with fluid processing and detection systems |
US8311310B2 (en) * | 2006-08-11 | 2012-11-13 | Koninklijke Philips Electronics N.V. | Methods and apparatus to integrate systematic data scaling into genetic algorithm-based feature subset selection |
EP2166965B1 (en) | 2007-07-17 | 2017-05-17 | Neal Marc Lonky | Frictional trans-epithelial tissue disruption and collection apparatus |
US8795197B2 (en) | 2007-07-17 | 2014-08-05 | Histologics, LLC | Frictional trans-epithelial tissue disruption collection apparatus and method of inducing an immune response |
EP2255310B1 (en) | 2008-02-29 | 2019-02-20 | Dako Denmark A/S | Systems and methods for tracking and providing workflow information |
US8135202B2 (en) * | 2008-06-02 | 2012-03-13 | Nec Laboratories America, Inc. | Automated method and system for nuclear analysis of biopsy images |
JP2012514477A (en) * | 2009-01-09 | 2012-06-28 | Intrexon Corporation | Genetic analysis of cells
KR20110108390A (en) | 2009-01-12 | 2011-10-05 | Cyntellect, Inc. | Laser mediated sectioning and transfer of cell colonies
US9310598B2 (en) | 2009-03-11 | 2016-04-12 | Sakura Finetek U.S.A., Inc. | Autofocus method and autofocus device |
US9607202B2 (en) | 2009-12-17 | 2017-03-28 | University of Pittsburgh - Of the Commonwealth System of Higher Education | Methods of generating trophectoderm and neurectoderm from human embryonic stem cells
US9044213B1 (en) | 2010-03-26 | 2015-06-02 | Histologics, LLC | Frictional tissue sampling and collection method and device |
EP2585578A4 (en) | 2010-05-08 | 2014-01-08 | Univ Twente | A simple and affordable method for immunophenotyping using a microfluidic chip sample preparation with image cytometry
US10139613B2 (en) | 2010-08-20 | 2018-11-27 | Sakura Finetek U.S.A., Inc. | Digital microscope and method of sensing an image of a tissue sample |
US8388891B2 (en) * | 2010-12-28 | 2013-03-05 | Sakura Finetek U.S.A., Inc. | Automated system and method of processing biological specimens |
CA2839531A1 (en) | 2011-06-17 | 2012-12-20 | Constitution Medical, Inc. | Systems and methods for sample display and review |
US9459196B2 (en) | 2011-07-22 | 2016-10-04 | Roche Diagnostics Hematology, Inc. | Blood analyzer calibration and assessment |
CN102359938B (en) * | 2011-09-16 | 2012-09-05 | Changsha High-Tech Industrial Development Zone AVE Science & Technology Industry Co., Ltd. | Morphological analytical apparatus and method for erythrocytes
BR112013032938B1 (en) * | 2011-09-16 | 2021-06-29 | AVE Science & Technology Co., Ltd | Device and methods for performing morphological analysis for erythrocytes
HUE048285T2 (en) | 2012-01-11 | 2020-07-28 | 77 Elektronika Mueszeripari Kft | Two stage categorization of objects in images |
WO2013106842A2 (en) * | 2012-01-13 | 2013-07-18 | The Charles Stark Draper Laboratory, Inc. | Stem cell bioinformatics |
US10201332B1 (en) | 2012-12-03 | 2019-02-12 | Healoe LLC | Device and method of orienting a biopsy device on epithelial tissue
US9904842B2 (en) | 2012-12-19 | 2018-02-27 | Koninklijke Philips N.V. | System and method for classification of particles in a fluid sample |
EP2973408A4 (en) * | 2013-03-15 | 2017-06-14 | Richard Harry Turner | A system and methods for the in vitro detection of particles and soluble chemical entities in body fluids |
DE102013103971A1 (en) | 2013-04-19 | 2014-11-06 | Sensovation AG | Method for generating an overall picture of an object composed of several partial images
US10007102B2 (en) | 2013-12-23 | 2018-06-26 | Sakura Finetek U.S.A., Inc. | Microscope with slide clamping assembly |
DE102014202860B4 (en) * | 2014-02-17 | 2016-12-29 | Leica Microsystems CMS GmbH | Providing sample information with a laser microdissection system
WO2016115537A2 (en) * | 2015-01-15 | 2016-07-21 | Massachusetts Institute Of Technology | Systems, methods, and apparatus for in vitro single-cell identification and recovery |
US10304188B1 (en) | 2015-03-27 | 2019-05-28 | Caleb J. Kumar | Apparatus and method for automated cell analysis |
US10007863B1 (en) | 2015-06-05 | 2018-06-26 | Gracenote, Inc. | Logo recognition in images and videos |
CN105259095A (en) * | 2015-10-14 | 2016-01-20 | Nanchang Xierdaier Medical Technology Co., Ltd. | Negative-exclusion-method intelligent screening system for cervical cancer cell pathology
US11013466B2 (en) | 2016-01-28 | 2021-05-25 | Healoe, LLC | Device and method to control and manipulate a catheter
CN105717116A (en) * | 2016-02-23 | 2016-06-29 | China Agricultural University | Species identification method and system for animal-origin meat and bone meal
JP6941123B2 (en) | 2016-06-30 | 2021-09-29 | Konica Minolta Laboratory U.S.A., Inc. | Cell annotation method and annotation system using adaptive additional learning
US11280803B2 (en) | 2016-11-22 | 2022-03-22 | Sakura Finetek U.S.A., Inc. | Slide management system |
US10345218B2 (en) | 2016-12-06 | 2019-07-09 | Abbott Laboratories | Automated slide assessments and tracking in digital microscopy |
US20190385707A1 (en) * | 2017-03-03 | 2019-12-19 | Fenologica Biosciences, Inc. | Phenotype measurement systems and methods |
CN107909017A (en) * | 2017-11-06 | 2018-04-13 | Yu Diqian | Method, apparatus and system for license plate recognition against a complex background
US10402623B2 (en) | 2017-11-30 | 2019-09-03 | Metal Industries Research & Development Centre | Large scale cell image analysis method and system |
TWI699816B (en) * | 2017-12-26 | 2020-07-21 | aetherAI Co., Ltd. | Method for controlling autonomous microscope system, microscope system, and computer readable storage medium
KR102041402B1 (en) * | 2018-08-09 | 2019-11-07 | Buzzpol Co., Ltd. | Cervical learning data generation system
EP3844482A4 (en) * | 2018-08-30 | 2022-05-25 | Becton, Dickinson and Company | Characterization and sorting for particle analyzers |
SE544735C2 (en) * | 2018-11-09 | 2022-11-01 | MM18 Medical AB | Method for identification of different categories of biopsy sample images
CN111767929A (en) * | 2019-03-14 | 2020-10-13 | Shanghai First People's Hospital | Method and system for constructing a submacular neovascularization model
CN110633651B (en) * | 2019-08-26 | 2022-05-13 | Wuhan University | Automatic abnormal-cell identification method based on image stitching
CN111209879B (en) * | 2020-01-12 | 2023-09-19 | Hangzhou Dianzi University | Unsupervised 3D object identification and retrieval method based on depth circle view
Family Cites Families (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB203586A (en) * | 1922-11-20 | 1923-09-13 | Gerhard Kerff | Improvements in and relating to the manufacture of corrugated headers for water tube boilers |
US3333248A (en) * | 1963-12-20 | 1967-07-25 | Ibm | Self-adaptive systems |
US4000417A (en) * | 1975-08-25 | 1976-12-28 | Honeywell Inc. | Scanning microscope system with automatic cell find and autofocus |
US4048616A (en) * | 1975-12-03 | 1977-09-13 | Geometric Data Corporation | Pattern recognition system with keyboard entry for adaptive sensitivity |
DE2903625A1 (en) * | 1978-02-03 | 1979-08-09 | Rush Presbyterian St Luke | Device for automatic blood analysis
JPS5661650A (en) * | 1979-10-24 | 1981-05-27 | Omron Tateisi Electronics Co | Cell analyzing device
US4612614A (en) * | 1980-09-12 | 1986-09-16 | International Remote Imaging Systems, Inc. | Method of analyzing particles in a fluid sample |
US4501495A (en) * | 1981-06-17 | 1985-02-26 | Smithkline Beckman Corporation | Slide carrier |
US4513438A (en) * | 1982-04-15 | 1985-04-23 | Coulter Electronics, Inc. | Automated microscopy system and method for locating and re-locating objects in an image |
US4591980A (en) * | 1984-02-16 | 1986-05-27 | Xerox Corporation | Adaptive self-repairing processor array |
US4700298A (en) * | 1984-09-14 | 1987-10-13 | Branko Palcic | Dynamic microscope image processing scanner |
GB8514591D0 (en) * | 1985-06-10 | 1985-07-10 | Shandon Southern Prod | Centrifugation |
US4807979A (en) * | 1986-01-24 | 1989-02-28 | Geno Saccomanno | Microscope slide marking device |
US4833625A (en) * | 1986-07-09 | 1989-05-23 | University Of Arizona | Image viewing station for picture archiving and communications systems (PACS) |
US4821118A (en) * | 1986-10-09 | 1989-04-11 | Advanced Identification Systems, Inc. | Video image system for personal identification |
US4805225A (en) * | 1986-11-06 | 1989-02-14 | The Research Foundation Of The State University Of New York | Pattern recognition method and apparatus |
US4965725B1 (en) * | 1988-04-08 | 1996-05-07 | Neuromedical Systems Inc | Neural network based automated cytological specimen classification system and method |
- 1988
- 1988-04-08 US US07179060 patent/US4965725B1/en not_active Expired - Lifetime
- 1989
- 1989-03-22 DE DE68926796T patent/DE68926796T2/en not_active Expired - Fee Related
- 1989-03-22 EP EP89302889A patent/EP0336608B1/en not_active Expired - Lifetime
- 1989-03-22 SG SG1996004815A patent/SG46454A1/en unknown
- 1989-03-22 ES ES89302889T patent/ES2090033T3/en not_active Expired - Lifetime
- 1989-03-22 AT AT89302889T patent/ATE140327T1/en not_active IP Right Cessation
- 1989-03-24 RO RO146063A patent/RO106931B1/en unknown
- 1989-03-24 JP JP1504988A patent/JPH04501325A/en active Pending
- 1989-03-24 MC MC892101D patent/MC2101A1/en unknown
- 1989-03-24 WO PCT/US1989/001221 patent/WO1989009969A1/en active IP Right Grant
- 1989-03-24 AU AU35415/89A patent/AU628342B2/en not_active Ceased
- 1989-03-24 RU SU894831388A patent/RU2096827C1/en not_active IP Right Cessation
- 1989-03-24 BR BR898907355A patent/BR8907355A/en not_active IP Right Cessation
- 1989-03-24 HU HU892848A patent/HU208186B/en not_active IP Right Cessation
- 1989-04-04 CA CA000595659A patent/CA1323700C/en not_active Expired - Fee Related
- 1989-04-05 IL IL89859A patent/IL89859A0/en unknown
- 1989-04-07 ZA ZA892558A patent/ZA892558B/en unknown
- 1989-04-07 CN CN89102194A patent/CN1031811C/en not_active Expired - Fee Related
- 1989-10-11 US US07420105 patent/US5287272B1/en not_active Expired - Lifetime
- 1990
- 1990-10-05 BG BG92969A patent/BG51463A3/en unknown
- 1990-10-05 FI FI904922A patent/FI101653B1/en not_active IP Right Cessation
- 1990-11-01 DK DK262490A patent/DK262490D0/en not_active Application Discontinuation
- 1996
- 1996-10-03 GR GR960402602T patent/GR3021252T3/en unknown
- 1998
- 1998-03-27 HK HK98102660A patent/HK1003583A1/en not_active IP Right Cessation
Also Published As
Publication number | Publication date |
---|---|
RU2096827C1 (en) | 1997-11-20 |
FI101653B (en) | 1998-07-31 |
SG46454A1 (en) | 1998-02-20 |
GR3021252T3 (en) | 1997-01-31 |
EP0336608B1 (en) | 1996-07-10 |
ES2090033T3 (en) | 1996-10-16 |
FI101653B1 (en) | 1998-07-31 |
US4965725B1 (en) | 1996-05-07 |
ZA892558B (en) | 1989-12-27 |
WO1989009969A1 (en) | 1989-10-19 |
FI904922A0 (en) | 1990-10-05 |
DE68926796D1 (en) | 1996-08-14 |
AU628342B2 (en) | 1992-09-17 |
HK1003583A1 (en) | 1998-10-30 |
JPH04501325A (en) | 1992-03-05 |
MC2101A1 (en) | 1991-02-15 |
ATE140327T1 (en) | 1996-07-15 |
EP0336608A2 (en) | 1989-10-11 |
HU208186B (en) | 1993-08-30 |
HUT59239A (en) | 1992-04-28 |
CN1037035A (en) | 1989-11-08 |
RO106931B1 (en) | 1993-07-30 |
BR8907355A (en) | 1991-03-19 |
US4965725A (en) | 1990-10-23 |
CN1031811C (en) | 1996-05-15 |
US5287272A (en) | 1994-02-15 |
IL89859A0 (en) | 1989-12-15 |
BG51463A3 (en) | 1993-05-14 |
AU3541589A (en) | 1989-11-03 |
DK262490A (en) | 1990-11-01 |
DE68926796T2 (en) | 1996-11-07 |
US5287272B1 (en) | 1996-08-27 |
EP0336608A3 (en) | 1990-08-29 |
HU892848D0 (en) | 1990-12-28 |
DK262490D0 (en) | 1990-11-01 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CA1323700C (en) | Neural network based automated cytological specimen classification system and method | |
Beaufort et al. | Automatic recognition of coccoliths by dynamical neural networks | |
US6549661B1 (en) | Pattern recognition apparatus and pattern recognition method | |
Jambhekar | Red blood cells classification using image processing | |
US6330350B1 (en) | Method and apparatus for automatically recognizing blood cells | |
WO1991002329A1 (en) | A method and an apparatus for differentiating a sample of biological cells | |
WO1995034050A1 (en) | Neural network for cell image analysis for identification of abnormal cells | |
Beksaç et al. | An artificial intelligent diagnostic system on differential recognition of hematopoietic cells from microscopic images | |
Şengür et al. | White blood cell classification based on shape and deep features | |
Basnet et al. | A novel solution of using deep learning for white blood cells classification: enhanced loss function with regularization and weighted loss (ELFRWL) | |
CN103077399A (en) | Biological microscopic image classification method based on integrated cascade structure | |
Ridoy et al. | An automated approach to white blood cell classification using a lightweight convolutional neural network | |
Gumble et al. | Analysis & classification of acute lymphoblastic leukemia using KNN algorithm | |
Kareem | An evaluation algorithms for classifying leukocytes images | |
Bazoon et al. | A hierarchical artificial neural network system for the classification of cervical cells | |
Walker et al. | Image analysis as a tool for quantitative phycology: a computational approach to cyanobacterial taxa identification | |
Lina et al. | Focused color intersection for leukocyte detection and recognition system | |
Sankaran et al. | Quantitation of Malarial parasitemia in Giemsa stained thin blood smears using Six Sigma threshold as preprocessor | |
Bengtsson et al. | High resolution segmentation of cervical cells. | |
Ridoy et al. | A lightweight convolutional neural network for white blood cells classification | |
Bhavana et al. | Identification of Blood group and Blood cells through Image Processing | |
McKenna et al. | A comparison of neural network architectures for cervical cell classification | |
Anilkumar et al. | Efficacy of CIELAB and CMYK color spaces in leukemia image analysis: a comparison by statistical techniques
Iqbal et al. | Towards Efficient Segmentation and Classification of White Blood Cell Cancer Using Deep Learning | |
Olaniyi et al. | In-line grading system for mango fruits using GLCM feature extraction and soft-computing techniques |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
MKLA | Lapsed |