IMAGE ANALYSIS AND SEGMENTATION OF ANATOMICAL FEATURES OF CERVIX UTERI IN COLOR SPACE
Viara Van Raad, STI Medical Systems, 733 Bishop St, Makai Tower, Honolulu, HI, USA 96813 (2005)

Introduction
Computer Aided Diagnosis (CAD) in colposcopy can lead to automatic evaluation of the health status of the cervix in vivo. The aim is to develop an algorithm for automated segmentation of the diagnostic and anatomic features using a Gaussian Mixture Model (GMM). The detection of cancer precursors in vivo from color characteristics is performed by estimating the a priori probability of the modelled tissues; the method is known as maximum a posteriori (MAP) estimation. The anatomic landmarks are the cervical canal (CC, also known as the os), the columnar epithelium (CE), the squamous epithelium (SE) and the transformation zone (TZ), the latter representing both healthy and metaplastic tissue, depicted in Figure 1.

MAP Mathematical Model
For each image point, the RGB 3-tuple x is scored against each of the two tissue classes with a distance measure (Eq. 1) built from the pre-calculated class statistics: the mean values μ_0, μ_1, the covariance matrices Σ_0, Σ_1, and the 'predicted' probabilities of occurrence P_0, P_1 of the two tissue descriptors. These parameters are the pooled average estimates from the experimental part described below.

d_i(x) = (x − μ_i)^T Σ_i^{−1} (x − μ_i) + ln|Σ_i| − 2 ln P_i,  i ∈ {0, 1}   (Eq. 1)

Each image pixel is then assigned to one of the two classes, labeled '0' or '1', using the inequality

d_1(x) < d_0(x)  ⇒  label '1', otherwise label '0'   (Eq. 2)

Figure 1. a) Digital cervical image with visible transformation zone (TZ) and squamous epithelium (SE) of a normal cervix. b) The cervical canal (CC) appears centrally and is surrounded by the TZ. c) The columnar epithelium (CE) has a prominent villous structure.

Results
Testing on 20 images, we achieved 95% accuracy in differentiating the CC from the rest of the image, and 85% accuracy in differentiating CIN I SIL lesions. The automatic differentiation task separating the SE and CE areas was unsuccessful. The results were verified by measuring the location-wise overlap between the binary image produced by (Eq. 2) and the corresponding annotated binary mask.
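The per-pixel two-class MAP decision described above can be sketched as follows. This is a minimal NumPy sketch, assuming the standard Gaussian MAP discriminant (Mahalanobis distance plus log-determinant and log-prior terms); the poster does not show Eq. 1 and Eq. 2 explicitly, so the exact form and the function names here are illustrative.

```python
import numpy as np

def map_discriminant(x, mean, cov, prior):
    """Gaussian MAP discriminant for one RGB 3-tuple:
    Mahalanobis distance plus log-determinant and log-prior terms
    (a smaller value means the class is more likely)."""
    diff = x - mean
    d2 = diff @ np.linalg.inv(cov) @ diff
    return d2 + np.log(np.linalg.det(cov)) - 2.0 * np.log(prior)

def classify_pixels(img, mean0, cov0, p0, mean1, cov1, p1):
    """Label each pixel '1' when the class-1 discriminant is smaller
    than the class-0 discriminant, producing a binary mask."""
    h, w, _ = img.shape
    pixels = img.reshape(-1, 3).astype(float)
    d0 = np.array([map_discriminant(x, mean0, cov0, p0) for x in pixels])
    d1 = np.array([map_discriminant(x, mean1, cov1, p1) for x in pixels])
    return (d1 < d0).astype(np.uint8).reshape(h, w)
```

The resulting binary mask plays the role of the segmentation output that is compared against the annotated mask during verification.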
Both hit rates, on S and on its complement, were taken into account and averaged.

MAP Probability Estimates - Flowchart and Results
The algorithm for obtaining the MAP probability estimates for the three tissues is depicted by the flowchart (left) and Figure 2.
Step 1: Input an image from the 'training' set of N pre-selected annotated images (true-color RGB, 512 x 512 pixels, TIFF format).
Step 2: Delineate the modelled tissues (CC, TZ, SE and background) with a custom-built Matlab function.
Step 3: Automatically calculate the MAP estimates of area, mean RGB values and standard deviations for CC, TZ, SE and background. Binary 'masks' are formed and later automatically 'matched' with the corresponding training images. The annotation object is selected starting from the most enclosed object, for example the CC (Figure 2).
Step 4: Proceed with the next image.
Step 5: Calculate the pooled average of the values obtained in Step 3.

Results of pooled-average MAP estimates: on average, for an adult female below 50 years of age, the CC occupies 2%, the TZ 28% and the SE 70% of the entire visible cervical area.

Figure 2. A stylised image of the cervix (far left) and the four 'compartment-like' descriptors of the four classes: CC, TZ, SE and background.

Figure 3. a) Three color RGB images from healthy women. b) The segmented cervical canal (CC) as binary images. c) Three color RGB images, each containing a lesion. d) Segmented CIN I and II SIL lesions as binary images. The images are segmented using the MAP algorithm described above for S and its complement; the algorithm detects lesions with blood-like color.

Conclusions
It is possible to segment and differentiate CIN SIL lesions automatically using the MAP algorithm. The CC can be segmented accurately, while the SE and CE cannot be differentiated, because color is inconsistent within both the CE and the SE.
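Steps 3 and 5 of the training flowchart can be sketched as below. This is a hedged sketch, not the poster's Matlab code: it computes per-class mean RGB, covariance (the poster reports standard deviations; a full covariance is assumed here for consistency with a Gaussian MAP rule) and area fraction from an annotated binary mask, then pools the per-image estimates with a plain unweighted average, since the poster does not state the pooling weights.

```python
import numpy as np

def class_stats(img, mask):
    """Step 3: mean RGB, covariance and area fraction of the pixels
    selected by one binary annotation mask of one training image."""
    pixels = img[mask].astype(float)              # (n, 3) RGB samples
    area_fraction = pixels.shape[0] / mask.size
    return pixels.mean(axis=0), np.cov(pixels, rowvar=False), area_fraction

def pooled_average(per_image_stats):
    """Step 5: pool the per-image estimates across the N training
    images (unweighted average; an assumption, not stated in the poster)."""
    means, covs, areas = zip(*per_image_stats)
    return np.mean(means, axis=0), np.mean(covs, axis=0), float(np.mean(areas))
```

The pooled means, covariances and area fractions are exactly the parameters that the classification stage consumes as its pre-calculated class statistics and priors.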