
1 Automatic Lung Cancer Diagnosis from CT Scans (Week 4)
REU Student: Maria Jose Mosquera Chuquicusma
Graduate Student: Sarfaraz Hussein
Professor: Dr. Ulas Bagci

2 Background & Project Objective
Lung cancer is the leading cause of mortality among all types of cancer.
- CT scans
- CAD systems
Project objective: classify lung nodules as malignant or benign using unsupervised learning methods.
CAD systems help radiologists with diagnosis (validation, blind spots) and save time. They help avoid misdiagnosis, which could lead to unnecessary surgeries, deaths, etc.

3 Previous Methods: TumorNet & 3D CNN-Based Multi-Task Learning
TumorNet
- Supervised multi-view CNN
- 3D tensor built from 2D image patches
- Data augmentation (Gaussian noise)
- Gaussian Process Regression: 1) to estimate testing labels, 2) to reduce inter-observer (radiologist) variation
3D CNN-Based Multi-Task Learning
- Lung nodule dataset used to fine-tune (transfer learning) a 3D CNN pre-trained on the Sports-1M dataset
- The 3D CNN learns discriminative features corresponding to 6 attributes
- Graph-regularized sparse representation for multi-task learning
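As a rough illustration of the Gaussian Process Regression step in TumorNet (a minimal sketch, not the authors' code; the feature dimensions, kernel choice, and placeholder data below are assumptions):

```python
# Minimal sketch: Gaussian Process Regression over pooled CNN features to
# estimate continuous malignancy scores for test nodules (placeholder data).
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# X_train: CNN features of rated nodules (n_train x d)
# y_train: mean radiologist malignancy rating per nodule (e.g., 1-5 scale)
X_train = np.random.rand(100, 256)           # placeholder features
y_train = np.random.uniform(1, 5, size=100)  # placeholder ratings

kernel = RBF(length_scale=1.0) + WhiteKernel(noise_level=0.1)
gpr = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
gpr.fit(X_train, y_train)

# Predict scores (and uncertainty) for test nodules; averaging ratings and
# modelling observation noise helps absorb inter-observer variation.
X_test = np.random.rand(20, 256)
scores, std = gpr.predict(X_test, return_std=True)
```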

4 What I Have Learned
- Medical imaging concepts/terminology/software
- Hospital visit
- Deep learning concepts/terminology
- MATLAB/Python
- Supervised & unsupervised learning concepts/terminology
  - Architectures that support unsupervised learning (clustering, RBMs [Markov chains], autoencoders, GANs, etc.)
  - Unsupervised pre-training to aid supervised learning
  - Multi-task learning / transfer learning (fine-tuning)
  - Unsupervised representation learning

5 Literature Review

6 Previous Approaches & Challenges
Challenges:
- Not enough labeled data
- Variation among lung nodules
- Time-consuming diagnosis
- False-positive and false-negative misdiagnoses
Supervised learning architectures:
- 3D CNNs vs. 2D CNNs
- Multi-view / multi-scale inputs to aid feature extraction from volumetric scans
- Fine-tuning pre-trained networks
Unsupervised learning architectures:
- Autoencoders: focus on capturing all features and reconstructing images, which is not ideal for discrimination (see the sketch after this list)
- Representation learning (stacked architectures, multi-task and transfer learning)
  - Disentangled factor learning: transformations of the data help feature learning
  - Redundancy reduction
  - Invariance properties of the learned representations
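A minimal sketch of the plain-autoencoder idea noted above, assuming flattened 64x64 nodule patches and a single dense bottleneck (the architecture and sizes are illustrative, not taken from any cited paper): the loss is purely reconstruction, which is why the learned features are not necessarily discriminative for benign-vs-malignant classification.

```python
# Minimal plain autoencoder on flattened 64x64 grayscale patches (hypothetical
# input size); trained only to reconstruct, not to discriminate classes.
from tensorflow.keras import layers, models

inputs = layers.Input(shape=(64 * 64,))
encoded = layers.Dense(128, activation='relu')(inputs)       # compressed code
decoded = layers.Dense(64 * 64, activation='sigmoid')(encoded)

autoencoder = models.Model(inputs, decoded)
autoencoder.compile(optimizer='adam', loss='mse')
# autoencoder.fit(patches, patches, epochs=20, batch_size=32)
```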

7 Tasks Accomplished (This Week)
Programming (MATLAB/Python):
- Used pre-trained CNNs to extract features and classify lung nodules from the LIDC-IDRI dataset (a minimal sketch follows below)
- Fine-tuned CNNs
- GAN code
Literature:
1. Generative Adversarial Nets
2. Lung Nodule Classification Based on Computed Tomography Using Taxonomic Diversity Indexes and an SVM
3. Categorical Generative Adversarial Nets
4. Stacked Generative Adversarial Nets
LIDC-IDRI dataset
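A minimal sketch of the pre-trained-CNN feature-extraction pipeline described above, assuming a torchvision ResNet-18 backbone and an SVM on top (the actual networks and classifier used this week may differ):

```python
# Minimal sketch: extract features from nodule patches with a pre-trained CNN,
# then classify benign (0) vs. malignant (1) with an SVM (hypothetical data).
import torch
import torchvision.models as models
from sklearn.svm import SVC

cnn = models.resnet18(pretrained=True)
cnn.fc = torch.nn.Identity()   # drop the classifier head, keep 512-d features
cnn.eval()

def extract_features(patches):
    # patches: tensor of shape (N, 3, 224, 224), normalized like ImageNet
    with torch.no_grad():
        return cnn(patches).numpy()

# Hypothetical usage with patch tensors and labels from LIDC-IDRI:
# feats = extract_features(train_patches)
# clf = SVC(kernel='rbf').fit(feats, train_labels)
# preds = clf.predict(extract_features(test_patches))
```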

8 Results (Supervised Classification)

9 Results (Supervised Classification)

10 Results (Supervised Classification)

11 Results (Supervised Classification)

12 Our Next Steps…
- Study different GAN architectures and incorporate them (a minimal training-loop sketch follows below)
- Why GANs?
  - Unsupervised learning
  - They have not been used for this task before
  - Generate instead of reconstruct
  - Better feature representation learning
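A minimal sketch of a vanilla GAN training loop, assuming flattened 64x64 patches and fully connected networks (purely illustrative; the architectures to be studied, such as categorical or stacked GANs, are more involved):

```python
# Minimal vanilla GAN in PyTorch: the generator learns to synthesize
# nodule-like patches; the discriminator's features can later serve as an
# unsupervised representation.
import torch
import torch.nn as nn

G = nn.Sequential(nn.Linear(100, 256), nn.ReLU(),
                  nn.Linear(256, 64 * 64), nn.Tanh())
D = nn.Sequential(nn.Linear(64 * 64, 256), nn.LeakyReLU(0.2),
                  nn.Linear(256, 1), nn.Sigmoid())

bce = nn.BCELoss()
opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)

def train_step(real_patches):
    # real_patches: tensor of shape (batch, 64*64), scaled to [-1, 1]
    batch = real_patches.size(0)
    ones, zeros = torch.ones(batch, 1), torch.zeros(batch, 1)

    # Discriminator step: real patches -> 1, generated patches -> 0
    fake = G(torch.randn(batch, 100))
    loss_d = bce(D(real_patches), ones) + bce(D(fake.detach()), zeros)
    opt_d.zero_grad()
    loss_d.backward()
    opt_d.step()

    # Generator step: try to make the discriminator output 1 for fakes
    loss_g = bce(D(fake), ones)
    opt_g.zero_grad()
    loss_g.backward()
    opt_g.step()
```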

