Object Classification through Deconvolutional Neural Networks

Presentation transcript:

Object Classification through Deconvolutional Neural Networks
Student: Carlos Rubiano
Mentor: Oliver Nina

Deep Convolutional Neural Networks have achieved success in large-scale object classification.

Problem
- Despite this success, there is an issue in the way networks learn filters from the training data.
- Even with a large amount of training data, a network risks overfitting, which can imply memorization of the training samples.
- Overfitting can be minimized by training several independent models on the same or different parts of the training data.
- The classification results from the different models are then combined.

Methodology
- Use bootstrap aggregation (bagging): uniformly sample with replacement from a training set D of n samples to create d subsets of size n', where n ≥ n'.
- Train one network per subset and combine their results through majority voting, probabilistic averaging, and the geometric mean (see the sketch below).
- A voting ensemble can also be built from the same dataset with the same amount of data, because the randomness of the minibatch process in stochastic gradient descent makes each trained DCNN different.
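
A minimal NumPy sketch of this bagging-and-fusion scheme follows. It is illustrative only: the helper names (make_bootstrap_subsets, combine_predictions), the placeholder dataset D, and the assumption that each trained network exposes per-class probabilities are mine, not the presenters' implementation.

```python
# Sketch of bootstrap subset creation and the three combination rules
# (majority vote, probabilistic average, geometric mean). Illustrative only.
import numpy as np

def make_bootstrap_subsets(n, d, n_prime, seed=0):
    """Create d index subsets of size n' by uniform sampling with replacement."""
    rng = np.random.default_rng(seed)
    return [rng.integers(0, n, size=n_prime) for _ in range(d)]

def combine_predictions(prob_list, rule="vote"):
    """Fuse a list of per-model class-probability arrays of shape (num_samples, num_classes)."""
    probs = np.stack(prob_list)                       # (d, num_samples, num_classes)
    if rule == "vote":                                # majority voting over predicted labels
        votes = probs.argmax(axis=2)                  # (d, num_samples)
        num_classes = probs.shape[2]
        counts = np.apply_along_axis(
            lambda v: np.bincount(v, minlength=num_classes), 0, votes)
        return counts.argmax(axis=0)                  # most-voted class per sample
    if rule == "average":                             # probabilistic averaging
        return probs.mean(axis=0).argmax(axis=1)
    if rule == "geometric":                           # geometric mean of probabilities
        return np.exp(np.log(probs + 1e-12).mean(axis=0)).argmax(axis=1)
    raise ValueError(rule)

# Example (hypothetical): subsets = make_bootstrap_subsets(n=len(D), d=5, n_prime=40000)
# Each network trained on D[idx] for idx in subsets yields a probability array;
# combine_predictions(prob_list, "average") fuses them into final labels.
```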

Preliminary Testing

Additional Results
- Kaggle, the world's largest community of data scientists, is currently holding a CIFAR-10 Object Recognition competition.
- We applied our methodology and submitted the results to the competition.
- We achieved 7th place out of 155.

What's Next?
- Explore other ways of performing data augmentation, such as sampling with custom features and generating image variants (a simple sketch follows below).
- Implement the state-of-the-art Network in Network architecture and combine it with maxout and our methodology.
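
As one purely illustrative reading of "image variants", the sketch below applies random cropping and horizontal flipping to a single CIFAR-style H x W x C image array; the function name augment and the crop size are hypothetical choices, not the presenters' pipeline.

```python
# Hedged illustration of simple image-variant augmentation: random crop plus
# random horizontal flip. Names and parameters are hypothetical.
import numpy as np

def augment(image, crop_size=28, rng=None):
    """Return a randomly cropped and possibly flipped copy of an HxWxC image array."""
    rng = rng or np.random.default_rng()
    h, w = image.shape[:2]
    top = rng.integers(0, h - crop_size + 1)
    left = rng.integers(0, w - crop_size + 1)
    patch = image[top:top + crop_size, left:left + crop_size]
    if rng.random() < 0.5:                    # flip horizontally with probability 0.5
        patch = patch[:, ::-1]
    return patch.copy()
```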