LAB 3: AIRBAG DEPLOYMENT SENSOR PREDICTION NETWORK
Warning: This lab could save someone's life!


Two Approaches to Investigate in Lab 3
- Part I: Time Series Forecasting Using Madaline
- Part II: Time Series Forecasting Using a Tapped Delay Line Neural Network (TDNN)
Submit a formal report on Lab 3 by 11/2/04 (no class 10/26).

Part I: Time Series Forecasting Using Madaline
- Aim is to predict the airbag sensor output at time t+1 based on prior outputs at t, t-1, t-2, t-3, ... using a Madaline
- Initially, 3 delay elements will be applied to the inputs
- Select and organize the appropriate data file AIRBAGx for training and test of the Madaline, where x = last digit of your SS#
- Select a network architecture with 1 hidden layer of nodes and one output
- Select a suitable number of hidden-layer nodes for the problem, giving reasons for your choice
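The tapped-delay setup above amounts to sliding a window over the series: each pattern holds the current sample plus its delayed copies, and the target is the next sample. The lab's tooling is MATLAB-based, but the windowing is the same in any language; here is a Python/NumPy sketch (the sine series stands in for an AIRBAGx file and is purely illustrative):

```python
import numpy as np

def make_delay_patterns(series, depth=3):
    """Build (input, target) pairs from a 1-D time series.

    Each input row holds [x(t-depth), ..., x(t-1), x(t)] and the
    target is the next sample x(t+1).
    """
    X, y = [], []
    for t in range(depth, len(series) - 1):
        X.append(series[t - depth:t + 1])  # depth delayed samples + current
        y.append(series[t + 1])            # one-step-ahead target
    return np.array(X), np.array(y)

# Illustrative series standing in for an AIRBAGx file
x = np.sin(0.4 * np.arange(20))
X, y = make_delay_patterns(x, depth=3)
print(X.shape, y.shape)  # (16, 4) (16,)
```

The same routine serves both parts of the lab, since the Madaline and the TDNN consume identically windowed inputs.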

- Construct the Madaline network; train and test it (choose a suitable number of epochs)
- Plot and comment on the prediction results
- How accurate is the Madaline prediction for the additional future time interval t+2?
- Compile a brief report on the results of Madaline forecasting

Example of Training File
!Square Rotate Training File (80 samples) 3/27/04
!This data collected at 800 samples per second.
!Shaker table moving at 20 Hz with a square wave.
!Noise added to the sample by having the MEMS loosely mounted.
!MEMS rotated during collection of samples.
!
!  t-2    t-1    t     t+1
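A loader for files in this format is a few lines of code: lines beginning with `!` are comments, and the rest are whitespace-separated numeric columns. The lab itself uses MATLAB tooling; this Python sketch shows the idea (the temporary file and its two rows are illustrative, not real AIRBAGx data):

```python
import tempfile
import numpy as np

def load_training_file(path):
    """Read a lab data file: lines starting with '!' are comments;
    the rest are whitespace-separated numeric columns."""
    rows = []
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("!"):
                continue
            rows.append([float(tok) for tok in line.split()])
    return np.array(rows)

# Tiny illustrative file in the same format
with tempfile.NamedTemporaryFile("w", suffix=".trn", delete=False) as f:
    f.write("!Square Rotate Training File\n"
            "0.10 0.20 0.30 0.40\n"
            "0.20 0.30 0.40 0.50\n")
    path = f.name

data = load_training_file(path)
print(data.shape)  # (2, 4)
```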

Results for Madaline
- Construct, train, and test the Madaline
- Report your test results using an Excel chart
- Compute the RMS error and comment on the accuracy of the Madaline prediction
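The RMS error called for above is the square root of the mean squared difference between actual and predicted outputs. A minimal Python sketch (the four sample values are made up for illustration, standing in for the 20 test samples):

```python
import numpy as np

def rms_error(actual, predicted):
    """Root-mean-square error between actual and predicted outputs."""
    actual = np.asarray(actual, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    return np.sqrt(np.mean((actual - predicted) ** 2))

# Illustrative values standing in for the test samples
actual    = [0.50, 0.52, 0.48, 0.51]
predicted = [0.48, 0.53, 0.47, 0.50]
print(round(rms_error(actual, predicted), 4))  # prints 0.0132
```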

Test Results for Madaline
[chart: actual vs. predicted sensor output]

Part II: Time Series Forecasting Using a Tapped Delay-Line Neural Network (TDNN) + Backprop Learning
- Aim is to predict the airbag sensor output at time t+1 based on prior outputs at t, t-1, t-2, t-3, ... using a TDNN
- Initially, 3 delay elements will be applied to the inputs
- Select and organize the appropriate airbag data file AIRBAGx for training and test of the TDNN, where x = last digit of your SS#
- Select the network architecture as a feedforward multilayer perceptron with 1 hidden layer of nodes and one output
- Select the number of hidden-layer nodes, giving reasons for your choice

- Construct the network; train and test it. Choose a suitable learning coefficient, momentum term, and number of epochs
- Record the train and test parameters, as well as the RMS error and classification rate, after each experiment
- Plot and comment on the TDNN prediction results
- How accurate is your TDNN prediction for the additional future time interval t+2?
- Repeat the experiment for a memory depth of 6 and compare with the results for a depth of 3
- Note that in this scheme the delayed inputs have the same weight as the current input, creating a linear trace memory
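The procedure above (feedforward MLP on tapped-delay inputs, backprop with a learning coefficient and momentum term) can be sketched end to end. This is not the lab's MATLAB implementation but a from-scratch Python illustration; the learning rate, momentum, hidden-layer size, and the sine series are all stand-in choices:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class TDNN:
    """Single-hidden-layer feedforward net fed by a tapped delay line,
    trained with backprop plus a momentum term."""

    def __init__(self, n_taps, n_hidden, lr=0.05, momentum=0.5, seed=0):
        rng = np.random.default_rng(seed)
        self.W1 = rng.normal(0.0, 0.5, (n_hidden, n_taps + 1))  # +1: bias
        self.W2 = rng.normal(0.0, 0.5, (1, n_hidden + 1))
        self.lr, self.mom = lr, momentum
        self.dW1 = np.zeros_like(self.W1)
        self.dW2 = np.zeros_like(self.W2)

    def forward(self, x):
        self.a0 = np.append(x, 1.0)                        # taps + bias
        self.a1 = np.append(sigmoid(self.W1 @ self.a0), 1.0)
        return float(self.W2[0] @ self.a1)                 # linear output

    def train_step(self, x, target):
        err = target - self.forward(x)
        # Output delta is the raw error (linear unit); hidden deltas pass
        # back through the sigmoid derivative a * (1 - a).
        h = self.a1[:-1]
        d1 = (self.W2[0, :-1] * err) * h * (1.0 - h)
        self.dW2 = self.mom * self.dW2 + self.lr * err * self.a1
        self.dW1 = self.mom * self.dW1 + self.lr * np.outer(d1, self.a0)
        self.W2 += self.dW2
        self.W1 += self.dW1
        return err ** 2

# Illustrative smooth series standing in for an AIRBAGx file
series = 0.5 + 0.4 * np.sin(0.3 * np.arange(200))
depth = 3                                # 3 delay elements -> 4 taps
net = TDNN(n_taps=depth + 1, n_hidden=5)
epoch_sse = []
for epoch in range(100):
    sse = sum(net.train_step(series[t - depth:t + 1], series[t + 1])
              for t in range(depth, len(series) - 1))
    epoch_sse.append(sse)
print("RMS after training:", np.sqrt(epoch_sse[-1] / (len(series) - depth - 1)))
```

As the slide notes, this is a linear trace memory: the delay line passes every tap to the network unchanged, and only the trained weights differentiate recent from old inputs.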

Non-Linear Trace Memories
- Should more recent inputs have greater influence than older inputs?
- Empirically verify your answer by applying a kernel function to the delayed inputs to produce a non-linear trace memory
- Refer to Mohan, page 140: use convolution of the input sequence with a kernel function c_i
- One example is a kernel function in which each previous input has half the weight of the input immediately succeeding it
- Write a MATLAB function to compute the transformed inputs; use a memory depth of 6
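The halving kernel described above gives the newest tap weight 1, the next-oldest weight 1/2, then 1/4, and so on. The lab asks for a MATLAB function; the same transform in Python (the all-ones series is illustrative, and this is one reading of the kernel in Mohan, not a transcription of it):

```python
import numpy as np

def trace_memory_inputs(series, depth=6):
    """Weight each delayed tap by a halving kernel: the current sample
    gets weight 1, each older tap half the weight of the one after it."""
    kernel = 0.5 ** np.arange(depth, -1, -1)   # [1/2^depth, ..., 1/2, 1]
    X = []
    for t in range(depth, len(series)):
        window = series[t - depth:t + 1]       # oldest ... newest
        X.append(window * kernel)              # oldest tap scaled by 1/64
    return np.array(X)

series = np.ones(10)
X = trace_memory_inputs(series, depth=6)
print(X[0])  # weights run 1/64, 1/32, ..., 1/2, 1
```

Feeding these transformed rows to the same network as before is what turns the linear trace memory into a non-linear one.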

My TDNN After Training
[figure: trained network diagram]

Test Results File
[table: DESIRED vs. PREDICTED columns; RMS Test ERROR = ]
- Trained for 20,000 presentations
- 80 training samples, 20 test samples
- Linear trace memory, memory depth = 3

TDNN Test Results: Linear Trace Memory, Depth = 3

Airbag Deployment with Advance Warning
- Devise a scheme for airbag deployment with advance warning using the TDNN predictor
- Assume that under crash conditions the airbag sensor output voltage is in the range 0.25 ± 0.25 or 4.75 ± 0.25
- Modify your database to introduce a minimum of 2 crash conditions at least 10 time delays apart
- Does your TDNN predict impending crash conditions?
- How much notice does the TDNN offer the driver (in time delays)?
- If one time delay = 1 ms, can the average adult driver react to the information in time to prevent disaster?
- Comment on the usefulness of the advance-warning scheme
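The crash test above reduces to a range check on the predicted voltage, and the advance notice is just how many samples ahead the predictor flags it. A minimal Python sketch (the thresholds come from the slide; the prediction sequence is made up):

```python
def is_crash_voltage(v):
    """Crash bands from the slide: 0.25 +/- 0.25 or 4.75 +/- 0.25 volts."""
    return 0.0 <= v <= 0.5 or 4.5 <= v <= 5.0

def advance_warning(predictions, dt_ms=1.0):
    """Return (sample index, lead time in ms) of the first predicted
    crash sample, or None if no crash condition is predicted."""
    for i, v in enumerate(predictions):
        if is_crash_voltage(v):
            return i, i * dt_ms
    return None

# Illustrative predicted-voltage sequence drifting into the upper crash band
print(advance_warning([2.5, 2.4, 4.8, 4.9]))  # (2, 2.0)
```

With one time delay = 1 ms, even several delays of notice is far below typical human reaction times, which is the point the final question asks you to weigh.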

Airbag Sensor Reliability Enhancement
- Automobile airbag sensors sometimes deploy accidentally, leading to many unnecessary car crashes
- Describe a scheme to enhance airbag sensor reliability through redundancy (multiple sensors)
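One standard redundancy scheme is majority voting: deploy only when most sensors agree, so a single faulty sensor cannot trigger the airbag alone. A hedged Python sketch (the three-sensor setup is illustrative, and the crash bands are carried over from the previous slide, not part of this one):

```python
def is_crash_voltage(v):
    """Crash bands from the previous slide: 0.25 +/- 0.25 or 4.75 +/- 0.25."""
    return 0.0 <= v <= 0.5 or 4.5 <= v <= 5.0

def majority_deploy(readings):
    """Deploy only when a strict majority of redundant sensors report
    a crash-band voltage, masking a single faulty sensor."""
    crash_votes = sum(is_crash_voltage(v) for v in readings)
    return crash_votes > len(readings) // 2

print(majority_deploy([4.8, 4.7, 2.5]))  # True: 2 of 3 sensors agree
print(majority_deploy([4.8, 2.5, 2.4]))  # False: only 1 of 3
```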