Inception and Residual Architecture in Deep Convolutional Networks

Inception and Residual Architecture in Deep Convolutional Networks
Wenchi Ma, Computer Vision Group, EECS, KU

Inception: From NIN to GoogLeNet
The NIN (network-in-network) "micro network" enhances the abstraction ability of the local model: it acts as a general nonlinear function approximator and yields better results in image recognition and detection.
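The NIN micro network can be sketched as a tiny MLP applied at every spatial position, which is equivalent to stacking 1×1 convolutions with nonlinearities. The shapes and function names below are illustrative assumptions, not the paper's code.

```python
import numpy as np

def conv1x1(x, w):
    """1x1 convolution: a per-pixel linear map across channels.
    x: (H, W, C_in) feature map; w: (C_in, C_out) weights."""
    return np.tensordot(x, w, axes=([2], [0]))  # -> (H, W, C_out)

def micro_network(x, w1, w2):
    """Two 1x1 conv layers with a ReLU in between: the local model
    becomes a general nonlinear function approximator."""
    h = np.maximum(conv1x1(x, w1), 0.0)  # ReLU
    return conv1x1(h, w2)
```

Because the micro network touches only the channel dimension, it adds nonlinearity to the local model without enlarging the receptive field.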

Residual learning: for deeper neural networks
As network depth increases, accuracy gets saturated and then degrades. This degradation is not caused by overfitting: adding more layers to a suitably deep model leads to higher training error. Residual learning addresses this with a building block in which the stacked layers learn a residual function that is added to an identity shortcut.
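The residual building block computes y = F(x) + x. A minimal sketch, where `f` stands in for the block's stacked conv layers (an assumption for illustration; any shape-preserving map works):

```python
import numpy as np

def residual_block(x, f):
    """Residual building block: output = F(x) + identity shortcut."""
    return f(x) + x

# If the optimal mapping is close to identity, F only has to learn a
# small perturbation; if F learns the zero function, the block is
# exactly identity, so adding blocks cannot raise training error
# in principle.
```

This is why the degradation problem is an optimization issue rather than overfitting: a deeper model can always represent its shallower counterpart by driving the extra residuals to zero.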

Inception: Balance model size and computation cost
Deeper: (a) integrates low/mid/high-level features and classifiers; (b) the "levels" of features can be enriched by the number of stacked layers (depth).
Wider: more powerful local abstraction.
The contradiction: increasing model size and computational cost tends to translate to immediate quality gains for most tasks, while computational efficiency decreases and the high parameter count invites overfitting.

Inception: Balance model size and computation cost
The main source of computational load is high-dimensional convolution. Higher-dimensional representations are easier to process locally within a network: increasing the activations per tile allows for more disentangled features, and the resulting networks train faster.
General principles:
Avoid representational bottlenecks, especially early in the network.
Maintain higher-dimensional representations.
Balance the width and depth of the network.
The representation size should gently decrease from the inputs to the outputs.

Inception: Balance model size and computation cost
Factorizing convolutions: a mini-network of stacked smaller convolutions replaces a larger convolution while keeping the same receptive field and the same inputs and outputs, with fewer parameters.
A mini-network of two stacked 3×3 convolutions replaces a 5×5 convolution.
A mini-network of a 3×1 convolution followed by a 1×3 convolution replaces a 3×3 convolution.
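The parameter savings from these factorizations can be checked with simple arithmetic. The channel count C below is an illustrative assumption; the ratios are independent of C.

```python
def conv_params(kh, kw, c_in, c_out):
    """Weights in a kh x kw convolution layer (bias ignored)."""
    return kh * kw * c_in * c_out

C = 64  # assumed channel count, same in and out
five_by_five = conv_params(5, 5, C, C)           # 25 * C^2
two_3x3 = 2 * conv_params(3, 3, C, C)            # 18 * C^2: 28% fewer
one_3x3 = conv_params(3, 3, C, C)                # 9 * C^2
asym = conv_params(3, 1, C, C) + conv_params(1, 3, C, C)  # 6 * C^2: 33% fewer
```

Both factorizations preserve the receptive field (two 3×3 convs see a 5×5 window; a 3×1 followed by a 1×3 sees a 3×3 window) while cutting parameters and computation.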

Inception: Efficient grid size reduction
Naive grid-size reduction is either computationally expensive (convolve at full resolution, then pool) or creates a representational bottleneck (pool first, then convolve).
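A rough cost comparison (in multiply-accumulates) illustrates the trade-off; the grid size and channel counts below are illustrative assumptions, not measurements from the paper.

```python
def conv_cost(d, k, c_in, c_out):
    """Approximate MACs of a k x k convolution producing a d x d output grid."""
    return d * d * k * k * c_in * c_out

d, k, c_in, c_out = 35, 3, 320, 640  # assumed sizes for illustration
conv_then_pool = conv_cost(d, k, c_in, c_out)       # expensive: full-resolution conv
pool_then_conv = conv_cost(d // 2, k, c_in, c_out)  # ~4x cheaper, but pooling first
                                                    # is a representational bottleneck
```

The inception solution sidesteps both drawbacks: stride-2 convolution and pooling branches run in parallel and their outputs are concatenated, halving the grid while expanding the channel dimension cheaply.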

Inception-v3 (the factorization idea)

Inception-v1, Inception-v2, Inception-v3
All evaluations are done on the 48,238 non-blacklisted examples of the ILSVRC-2012 validation set (227). The networks are trained with stochastic gradient descent using the TensorFlow distributed machine learning system, with 50 replicas running each on an NVIDIA Kepler GPU. BN: the fully connected layer is also batch-normalized.

Inception-V4

Residual connections and Inception ResNet model
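The Inception-ResNet residual unit can be sketched as a residual block whose branch is an inception module, with the branch output scaled down before the addition; the Inception-v4 paper reports that scaling factors around 0.1 to 0.3 stabilize training of very wide residual variants. `branch` below is a stand-in for the inception branch.

```python
import numpy as np

def inception_resnet_unit(x, branch, scale=0.2):
    """Residual unit with a scaled inception branch:
    output = x + scale * branch(x)."""
    return x + scale * branch(x)
```

Without the scaling, very wide residual units were observed to "die" early in training (activations blowing up); damping the residual keeps the shortcut signal dominant.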

Inception-ResNet-v1 and Inception-ResNet-v2 networks

Inception-ResNet-v1 and Inception-ResNet-v2 networks
Top-1 and top-5 error are reported on the ILSVRC-2012 validation set. The networks are trained with stochastic gradient descent using the TensorFlow distributed machine learning system, with 20 replicas running each on an NVIDIA Kepler GPU.

Combination of Inception and Residual
A large-scale deep convolutional network that:
ensures stable training (residual connections);
decreases the overall scale of the net (fewer parameters);
computes more efficiently;
improves accuracy.

Deeper and wider, but not bigger. Thank you!