Week 8 Farzain Majeed.

Reading Dataset Much of the work involved in training a neural net comes from experimentation and from actually formatting the data. The CommaAI dataset is rather tricky in that the files come in .h5 format, which we first un-package, saving the frames as .jpg's. This makes the dataset much easier to work with. A lot of time was spent scripting and automating this whole process.
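A minimal sketch of that un-packaging step, assuming the camera frames live in an HDF5 dataset keyed "X" with channels-first shape (num_frames, 3, height, width) — the key name and layout here are assumptions, not confirmed details of the pipeline:

```python
import os

import h5py
import numpy as np
from PIL import Image


def unpack_h5_frames(h5_path, out_dir, key="X"):
    """Extract each frame from a .h5 camera file and save it as a .jpg."""
    os.makedirs(out_dir, exist_ok=True)
    with h5py.File(h5_path, "r") as f:
        frames = f[key]  # assumed shape: (num_frames, channels, height, width)
        for i in range(len(frames)):
            # Move channels last so PIL can interpret the array as an RGB image.
            img = np.transpose(frames[i], (1, 2, 0)).astype(np.uint8)
            Image.fromarray(img).save(
                os.path.join(out_dir, f"frame_{i:06d}.jpg")
            )
```

Once the frames are plain .jpg files, the rest of the training scripts can use ordinary image loaders instead of touching HDF5 at all.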

YOLO Next we needed to use YOLO, preferably in Keras. I found an open-source project by the name of YAD2K which provides the YOLO9000 weights and model in a Keras/TF-friendly manner. YOLO outputs many classes, while we are looking for just a few (car, pedestrian, traffic light). It may be worth retraining YOLO in the future to predict only those classes, though this may actually lead to worse performance because of a lack of data.
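Rather than retraining, the cheaper option is to filter YOLO's output down to the classes of interest at post-processing time. A sketch, assuming the decoding step (e.g. YAD2K's evaluation code) yields (class_name, score, box) tuples — that tuple format is an assumption for illustration:

```python
# Keep only the object classes the driving model cares about.
KEEP_CLASSES = {"car", "person", "traffic light"}


def filter_detections(detections, min_score=0.3):
    """Drop detections outside KEEP_CLASSES or below a confidence threshold.

    detections: iterable of (class_name, score, box) tuples, as assumed to be
    produced by the YOLO decoding step.
    """
    return [
        (name, score, box)
        for name, score, box in detections
        if name in KEEP_CLASSES and score >= min_score
    ]
```

This keeps the full pretrained model (and its training data) intact while the downstream task only ever sees the relevant detections.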

Early Results Early results with the new attention-based model were interesting. I found that the training loss converged nicely to a value lower than the old model's. The validation loss jumped around more, but at one point converged to a smaller loss as well. The main problem is that the side-task loss didn't converge at all, which indicates the side task was not learned.

Explanation and Next Week It's possible that the noise from the side task (or even having a side task at all) led to less overfitting and improved results. I need to keep adjusting parameters to help it converge. I would also like to look more into visualizing the layers to see what the network pays attention to.
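One simple way to start on that visualization in Keras is to build a second model that exposes an intermediate layer's activations; the layer name below is a placeholder for whichever layer of the attention model we want to inspect:

```python
import numpy as np
from tensorflow import keras


def activation_model(model, layer_name):
    """Wrap a trained model so it outputs one intermediate layer's activations.

    Running frames through the returned model shows what that layer responds
    to, which is a first step toward seeing what the network attends to.
    """
    return keras.Model(
        inputs=model.input,
        outputs=model.get_layer(layer_name).output,
    )
```

Calling `activation_model(net, "some_conv_layer").predict(frames)` then gives per-frame activation maps that can be plotted alongside the input images.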