Automatic Lung Cancer Diagnosis from CT Scans (Week 3)


Automatic Lung Cancer Diagnosis from CT Scans (Week 3)
REU Student: Maria Jose Mosquera Chuquicusma
Graduate Student: Sarfaraz Hussein
Professor: Dr. Ulas Bagci

Tasks Accomplished

Programming/Other:
- Familiarization with MATLAB and its pre-trained neural networks
- Familiarization with the ITK-SNAP software

Literature:
1. Representation Learning: A Review and New Perspectives
2. Why Does Unsupervised Pre-training Help Deep Learning?
3. Multiview Convolutional Neural Network for Lung Nodule Classification
4. Multi-stage Neural Networks with Single-sided Classifiers for False Positive Reduction and its Evaluation using Lung X-ray CT Images
5. Pulmonary Nodule Classification with Deep Neural Networks on Computed Tomography Images
6. An Information-Theoretic Framework for Fast and Robust Unsupervised Learning via Neural Population Infomax

1: Representation Learning: A Review and New Perspectives

- General-purpose priors for good representations
- Deep representations
- PCA/ICA
- Markov random fields
- Bayesian networks
- Restricted Boltzmann machines
- KL divergence, contrastive divergence
- Different types of autoencoders (sparse, regularized, stacked, contractive)
- Deep belief networks
- Etc.
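The KL divergence listed above also doubles as the sparsity penalty in sparse autoencoders; a minimal Python sketch (the function names and the target activation rate are illustrative, not from the review):

```python
import math

def kl_divergence(p, q):
    """KL(p || q) between two discrete distributions given as probability lists."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def sparsity_penalty(rho, rho_hats):
    """Sparse-autoencoder penalty: for each hidden unit, compare the target
    activation rate rho with its observed mean activation via a Bernoulli KL."""
    return sum(kl_divergence([rho, 1 - rho], [r, 1 - r]) for r in rho_hats)
```

The penalty is zero when every hidden unit fires at exactly the target rate and grows as the observed activations drift away from it.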

2: Why Does Unsupervised Pre-training Help Deep Learning?

- Weights pre-trained through unsupervised learning are more beneficial than random initialization
- Better generalization: both training and testing errors were lower
- Optimized training cost → better regularization → less overfitting
- Great for large layers and deeper networks; it can hurt smaller networks
- Acts as a regularizer (the effect is maintained during training)
- The advantage does not vanish as the number of samples increases
- Variance reduction
- Pre-training more layers → better generalization

3: Multiview Convolutional Neural Network for Lung Nodule Classification

- Multiview CNN for classifying pulmonary nodules as malignant or benign
- Motivation: multiple views provide different information
- 2 tasks: binary and ternary classification
- Experiments performed for both tasks with 2 input-channel modes (1 and 7) and 5 different architectures with batch normalization
- Batch normalization used for faster learning and higher overall accuracy (but caused overfitting)
- Deeper network → better separability of classes
- Evaluation: sensitivity, specificity, ROC curves, t-SNE
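Batch normalization, which the slide credits for faster learning, can be sketched for a single feature in plain Python (a simplified forward pass; real layers also keep running statistics for use at inference time):

```python
import math

def batch_norm(xs, gamma=1.0, beta=0.0, eps=1e-5):
    """Normalize a batch of activations for one feature to zero mean and
    unit variance, then apply the learned scale (gamma) and shift (beta)."""
    n = len(xs)
    mean = sum(xs) / n
    var = sum((x - mean) ** 2 for x in xs) / n
    return [gamma * (x - mean) / math.sqrt(var + eps) + beta for x in xs]
```

Keeping activations on a standard scale at every layer is what lets training use a higher learning rate and converge faster.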

4: Multi-Stage Neural Networks with Single-Sided Classifiers

- Cascaded multi-stage CNN with single-sided classifiers
- Goal: reduce the false positives in lung nodule classification on CT scans
- Irregular lesions among the non-nodules create a class imbalance, so an inverse-balanced dataset was used
- Each CNN stage filters out non-nodules, separating suspicious from obvious non-nodules by a threshold
- The last CNN calculates the final probabilities
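The filtering cascade described above can be sketched abstractly. Here each stage is any callable returning P(nodule), and the rejection threshold is a made-up value, not the paper's:

```python
def cascade_predict(stages, x, threshold=0.3):
    """Pass a candidate through a cascade of classifiers. Early stages act as
    filters: scoring below the threshold rejects the candidate as an obvious
    non-nodule. Survivors receive the final stage's probability."""
    for stage in stages[:-1]:
        if stage(x) < threshold:
            return 0.0  # filtered out early
    return stages[-1](x)
```

For example, `cascade_predict([s1, s2, s3], roi)` only evaluates `s3` on candidates that both `s1` and `s2` consider suspicious, which is how the cascade cheaply discards obvious non-nodules.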

5: Pulmonary Nodule Classification with Deep Neural Networks on Computed Tomography Images

- Deep convolutional neural network
- Motivation: reduce false positives and false negatives
- Classifies whether an ROI contains a nodule or a non-nodule
- 2 tasks: binary and ternary classification
- Experiments tested CF and DD
- Adjusted the learning rate and the momentum of the weight updates
- 5 different training runs: training 4 gave the best results
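The learning-rate and momentum adjustments mentioned above refer to the classical momentum update rule; a sketch with placeholder hyperparameter values (not the paper's):

```python
def sgd_momentum_step(w, v, grad, lr=0.01, momentum=0.9):
    """One step of SGD with classical momentum:
    v <- momentum * v - lr * grad;  w <- w + v."""
    v = [momentum * vi - lr * gi for vi, gi in zip(v, grad)]
    w = [wi + vi for wi, vi in zip(w, v)]
    return w, v

# Minimizing f(w) = w^2 (gradient 2w) shows the update driving w toward 0.
w, v = [5.0], [0.0]
for _ in range(200):
    w, v = sgd_momentum_step(w, v, [2.0 * w[0]], lr=0.1, momentum=0.9)
```

Raising the momentum smooths the updates across steps, while the learning rate scales each step; tuning the two jointly is the adjustment the slide refers to.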

6: An Information-Theoretic Framework for Fast and Robust Unsupervised Learning via Neural Population Infomax

- Shannon's information theory (closely related to MI and MAP)
- Received signal ≈ transmitted signal + noise
- Monte Carlo (MC) sampling
- Markov chain X → Y → ⋯ → R
- Goal: establish a relationship between X and R
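The "received ≈ transmitted + noise" channel view can be illustrated with a Monte Carlo plug-in estimate of mutual information (a toy sketch, not the paper's infomax objective; the bin count and noise levels are arbitrary choices):

```python
import math
import random

def mutual_information(xs, rs, bins=8):
    """Plug-in MI estimate (in nats) from paired samples, discretized on a grid."""
    def digitize(vals):
        lo, hi = min(vals), max(vals)
        return [min(int((v - lo) / (hi - lo + 1e-12) * bins), bins - 1)
                for v in vals]
    bx, br = digitize(xs), digitize(rs)
    n = len(xs)
    joint, px, pr = {}, {}, {}
    for a, b in zip(bx, br):
        joint[(a, b)] = joint.get((a, b), 0) + 1
        px[a] = px.get(a, 0) + 1
        pr[b] = pr.get(b, 0) + 1
    return sum((c / n) * math.log(c * n / (px[a] * pr[b]))
               for (a, b), c in joint.items())

# Monte Carlo samples of a noisy channel: R = X + noise.
random.seed(0)
x = [random.gauss(0, 1) for _ in range(20000)]
r_low = [xi + random.gauss(0, 0.1) for xi in x]   # low-noise channel
r_high = [xi + random.gauss(0, 3.0) for xi in x]  # high-noise channel
```

More noise means the received signal R carries less information about the transmitted signal X, which the estimate reflects; an infomax objective picks the encoding that keeps this quantity as large as possible.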
