
CISR GW-TRI Center for Intelligent Systems Research, GW Transportation Research Institute, The George Washington University, Virginia Campus, 20101 Academic Way, Ashburn, VA. NDIA 3rd Annual Intelligent Vehicle Systems Symposium. Driving Simulator Experiment: Detecting Driver Fatigue by Monitoring Eye and Steering Activity. Dr. Azim Eskandarian, Riaz Sayed (GWU)

Research Objective Conduct a simulator experiment and analyze the data to develop a system for automatic detection of drowsiness based on the driver's performance.

Significance of the Problem Drowsiness/fatigue-related accident data: –NHTSA estimates 100,000 drowsiness/fatigue-related crashes annually –FARS indicates an annual average of 1,544 fatalities –Fatigue has been estimated to be involved in 10-40% of crashes on rural Interstate highways –15% of single-vehicle fatal truck crashes –Fatigue is the most frequent contributor to crashes in which a truck driver was fatally injured

Significance of the Problem A drowsy/sleepy driver is unable to determine when he/she will have an uncontrolled sleep onset. Fall-asleep crashes are very serious in terms of injury severity. An accident involving driver drowsiness has a high fatality rate because perception, recognition, and vehicle-control abilities degrade sharply as the driver falls asleep. Driver drowsiness detection technologies can reduce the risk of a catastrophic accident by warning the driver of his/her drowsiness.

Driver Drowsiness Detection Techniques 1. Sensing of the driver's physical and physiological phenomena –Analyzing changes in brain waves (EEG) –Analyzing changes in eye activity and facial expressions These techniques achieve good detection accuracy. Disadvantages: –Electrodes have to be attached to the driver's body to sense the signals –Non-contact sensing is also highly dependent on environmental conditions

Driver Drowsiness Detection Techniques 2. Analyzing changes in the performance output of the vehicle hardware –Steering, speed, acceleration, lateral position, braking, etc. Advantages: –No wires, cameras, monitors, or other devices need to be attached to or aimed at the driver –Due to their non-obtrusive nature, these methods are more practically applicable

Approach for Drowsiness Detection and Driver Warning

Experiment Conducted in the Vehicle Simulator Lab of the CISR, GWU Virginia Campus, Ashburn, VA. Twelve subjects between the ages of 23 and 43. The test scenario consisted of a continuous rural Interstate highway with traffic in both directions and a speed limit of 55 mph. Morning session: 8-10 am; night session: 1-3 am.

CISR Driving Simulator

Eye Tracking Equipment

Sample Data From Simulator Recorded channels: RUN#, ZONETIME, SPEEDLIM, CRASHB, CRASHV, LANEX, BRAKEFOR, BRAKETAP, STEERPOS, STEERVAR, LATPLACE, LATPLVAR, SPEED, SPEEDVAR, SPEEDDEV [table values not recoverable from the slide]

Lateral Position of Vehicle

Power Spectral Density of Vehicle Lateral Position

Steering Angle: Filter Correction for Curves

Hypothesis The hypothesized relationship between the driver's state of alertness and steering wheel position is that under an alert state, drivers make small-amplitude movements of the steering wheel, corresponding to small adjustments in vehicle trajectory; under a drowsy state, these movements become less precise and larger in amplitude, resulting in sharp changes in trajectory (Planque et al. 1991).

A Hybrid Artificial Neural Network Architecture –Unsupervised layer: clustering (competitive algorithm) –Supervised layer: classification (feedforward algorithm) [architecture diagram not recoverable]

Hybrid Artificial Neural Network Architecture

ANN Training for the Unsupervised Competitive Layer 1. Initialize the weight vector randomly for each neuron. 2. Present the input vector X(n). 3. Compute the winning neuron using the Euclidean distance as a metric, i* = arg min_i (||X(n) - W_i|| - b_i), where W_i = [w_1, w_2, ..., w_8]^T is the weight vector of neuron i and b_i is a bias that prevents the formation of dead neurons.

ANN Training, Competitive Layer (continued) 4. Update the weight vector of the winning neuron W_i* only. 5. Continue from step 2 until the change in the weight vectors reaches a minimum value. Notation: N_i is the number of times neuron i wins in the competitive layer; α and β are learning constants; o(n) is the outcome of the present competition (1 if the neuron wins, 0 otherwise); C_i is initially set to a small random value.

ANN Training, Competitive Layer (continued) The competitive algorithm moves the weight vectors of the neurons toward the centers of the clusters. Each neuron (or set of neurons) of the competitive layer represents a cluster. The output of a neuron is 1 if it wins the competition and 0 if it loses. The output of the competitive layer is an n-dimensional binary vector T(n) = [t_1, t_2, ..., t_n]^T.
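The winner-take-all procedure above can be sketched as follows. This is a minimal illustration assuming a standard competitive rule with a "conscience" bias; the learning constants are illustrative, not the study's values:

```python
import numpy as np

def train_competitive(X, n_neurons=2, lr=0.1, bias_lr=1e-4, epochs=50, seed=0):
    """Winner-take-all clustering; a small 'conscience' bias b_i discourages
    dead neurons by slightly favoring neurons that rarely win."""
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(n_neurons, X.shape[1]))   # one weight vector per neuron
    b = np.zeros(n_neurons)                        # conscience biases b_i
    for _ in range(epochs):
        for x in X:
            # winner = neuron with the smallest biased Euclidean distance
            i = np.argmin(np.linalg.norm(W - x, axis=1) - b)
            W[i] += lr * (x - W[i])                # move only the winner toward x
            o = np.zeros(n_neurons)
            o[i] = 1.0                             # outcome of this competition
            b += bias_lr * (1.0 / n_neurons - o)   # losers' bias slowly grows
    return W, b

def competitive_output(W, b, x):
    """One-hot output T(n): 1 for the winning neuron, 0 for the rest."""
    t = np.zeros(len(W))
    t[np.argmin(np.linalg.norm(W - x, axis=1) - b)] = 1.0
    return t
```

After training on well-separated data, distinct clusters activate distinct neurons, giving the binary vector T(n) that feeds the supervised layer.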

ANN Training for the Supervised Feedforward Layer Step 1: Initialize the synaptic weights and thresholds to small random numbers. Step 2: Present the network with an epoch of training exemplars. Step 3: Apply the input vector X(n) to the input layer and the desired response d(n) to the output layer. The output of each neuron j is calculated as y_j = φ(Σ_i w_ji x_i - θ_j), where φ is the activation function and θ_j is the threshold.

ANN Training (continued) Step 4: Update the synaptic weights by backpropagating the error, Δw_ji(n) = α Δw_ji(n-1) + η δ_j(n) y_i(n), where δ_j(n) is the local error gradient of neuron j.

ANN Training (continued) N = number of training examples in one epoch; η = learning-rate parameter; α = momentum constant. Step 5: Iterate the computation by presenting new epochs of training examples until the mean square error (MSE), computed over the entire epoch, reaches a minimum value. The MSE is given by MSE = (1/N) Σ_{n=1}^{N} Σ_j (d_j(n) - y_j(n))^2.
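Steps 1-5 can be sketched as a small batch-backpropagation routine. The layer sizes, learning rate, and momentum below are illustrative, since the slides do not give the study's exact values:

```python
import numpy as np

def train_mlp(X, D, hidden=4, lr=0.1, momentum=0.5, max_epochs=5000, tol=1e-3, seed=0):
    """Single-hidden-layer feedforward net with tanh units and thresholds
    (biases), trained by batch backpropagation with momentum; stops when
    the MSE over the entire epoch falls below tol."""
    rng = np.random.default_rng(seed)
    W1 = rng.normal(scale=0.5, size=(X.shape[1], hidden))
    b1 = rng.normal(scale=0.5, size=hidden)
    W2 = rng.normal(scale=0.5, size=(hidden, D.shape[1]))
    b2 = rng.normal(scale=0.5, size=D.shape[1])
    vW1 = np.zeros_like(W1); vb1 = np.zeros_like(b1)
    vW2 = np.zeros_like(W2); vb2 = np.zeros_like(b2)
    mse = np.inf
    for _ in range(max_epochs):
        H = np.tanh(X @ W1 + b1)         # hidden activations
        Y = np.tanh(H @ W2 + b2)         # outputs in (-1, 1)
        E = D - Y
        mse = np.mean(E ** 2)            # MSE over the entire epoch
        if mse < tol:
            break
        g2 = E * (1 - Y ** 2)            # tanh derivative is 1 - y^2
        g1 = (g2 @ W2.T) * (1 - H ** 2)
        # momentum update: new step = momentum * previous step + lr * gradient
        vW2 = momentum * vW2 + lr * (H.T @ g2) / len(X)
        vb2 = momentum * vb2 + lr * g2.mean(axis=0)
        vW1 = momentum * vW1 + lr * (X.T @ g1) / len(X)
        vb1 = momentum * vb1 + lr * g1.mean(axis=0)
        W2 += vW2; b2 += vb2; W1 += vW1; b1 += vb1
    return (W1, b1, W2, b2), mse
```

As a quick check, the routine drives the epoch MSE down on a simple ±1-coded classification task, matching the tanh output range used on the slides.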

ANN Training Parameters Hybrid architecture using an unsupervised clustering algorithm and a classifier (backpropagation learning algorithm in batch mode). Hyperbolic tangent (tanh) activation function, with output range from -1 to 1. Variable learning rate and momentum were used. Cross-validation during training.

Input Discretization of Steering Angle An algorithm selects the ranges r for each driver to compensate for performance variability between drivers. [Discretized steering-angle trace for one driver not recoverable]

Accounting for Individual Driver Behaviors Some drivers are more "sensitive" to vehicle lateral position and make very accurate steering corrections for lane keeping, while others are less "sensitive" and make less accurate corrections. The result is a low-amplitude steering-angle signal for more "sensitive" drivers and a relatively high-amplitude signal for less "sensitive" drivers. Larger values of P_k make the discretization ranges wider to accommodate large amplitudes, while smaller values make them narrower for small amplitudes. Therefore, the same ANN (8-dimensional discretization) can be used for all drivers.
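The per-driver scaling described above can be sketched as follows. The slides do not show the exact range-selection formula, so this sketch assumes P_k is the standard deviation of the driver's steering signal; the bin count of 8 matches the slides:

```python
import numpy as np

def discretize_steering(theta, n_bins=8, scale=2.0):
    """Map steering angles to one of n_bins ranges with per-driver widths.

    The bin edges are scaled by this driver's variability (here taken as
    the standard deviation of the angle signal -- an assumption, since
    the slide's exact P_k formula is not shown)."""
    p = np.std(theta)
    # symmetric edges covering roughly +/- scale * p
    edges = np.linspace(-scale * p, scale * p, n_bins - 1)
    return np.digitize(theta, edges)   # bin indices 0 .. n_bins-1
```

Because the edges scale with the driver's own variability, a "sensitive" driver's narrow signal and a less sensitive driver's wide signal map onto the same 8 categories, so one ANN serves all drivers.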

Input Discretization of Eye Closures –Eye closure data is recorded at 60 Hz –C_i = number of zeros (closed-eye samples) in 1 second of data –C_i is further discretized according to the following scheme: [discretization table not recoverable]

Input Discretization of Eye Closures An algorithm selects the ranges r for each driver to compensate for eye-closure variability between drivers. The P values are representative of the variability of eye closures (blinking) for each driver. [Sample of a few seconds of discretized eye closures for one driver not recoverable]
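The eye-closure counting step can be sketched as below; the per-second count C_i follows the slides, while the four-level thresholds are illustrative placeholders, since the per-driver ranges are not shown:

```python
import numpy as np

def eye_closure_counts(eye_open, rate=60):
    """C_i = number of closed-eye (zero) samples in each 1-second block,
    assuming the signal is 1 when the eye is open and 0 when closed."""
    n_sec = len(eye_open) // rate
    blocks = np.asarray(eye_open[:n_sec * rate]).reshape(n_sec, rate)
    return np.sum(blocks == 0, axis=1)

def discretize_closures(counts, thresholds=(5, 15, 30)):
    """Map per-second closure counts to 4 levels; these thresholds are
    illustrative -- the slide's per-driver ranges are not recoverable."""
    return np.digitize(counts, thresholds)
```

A second consisting mostly of closed-eye samples lands in the highest level, mirroring how prolonged closures should weigh more heavily than ordinary blinks.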

Input Vector –The two vectors are combined to form a 12-dimensional vector J(T) –Vector J(T) is summed over a 15-second time interval to obtain the input vector X(n)

Input and Desired Output Vector Each row represents the sum of the discretized input over a selected time interval, e.g., 15 seconds.
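Assembling X(n) from the two discretized streams can be sketched as follows. The 4-level eye-closure code is inferred from the 12-dimensional J(T) stated on the slides (12 = 8 steering + 4 eye), and one (steering bin, eye bin) pair per second is assumed:

```python
import numpy as np

def one_hot(idx, n):
    v = np.zeros(n)
    v[idx] = 1.0
    return v

def build_input_vectors(steer_bins, eye_bins, window=15):
    """Concatenate the 8-dim steering code and 4-dim eye code into J(T),
    then sum J(T) over each `window`-second interval to form X(n)."""
    J = np.array([np.concatenate([one_hot(s, 8), one_hot(e, 4)])
                  for s, e in zip(steer_bins, eye_bins)])
    n_win = len(J) // window
    return J[:n_win * window].reshape(n_win, window, 12).sum(axis=1)
```

Each row of the result counts how often each steering range and eye-closure level occurred in its 15-second interval, which is exactly the "sum of discretized input over a selected time interval" described above.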

ANN Performance During Training

ANN Test Data Driving data from 12 subjects were available: –One subject's night session was not recorded due to an equipment error –One subject's morning data were not available due to a software error The remaining 10 were used for training the ANN and testing the results. Note: the training and testing data were not the same; the testing data were selected randomly from the sets not used in training.

Results [Confusion matrices of actual state vs. network output (Wake/Sleep), with misclassification and false-alarm counts; numeric values not recoverable] Crash prediction: all crashes that occurred due to the driver falling asleep during the experiment were predicted before the crash occurred.

Morning and Night Session Results [Result charts spanning several slides; figures not recoverable]

Time Before Crash at Which the ANN Generated the First Warning

Conclusions –A non-intrusive method of drowsiness detection using steering data is possible –A method using an ANN was developed and successfully predicts drowsiness (91% success rate) –The method is based solely on the driver's steering performance (vehicle output) –The same method may be applied to the detection of fatigue or other related driver-performance degradation –Further refinement and validation of the algorithm is recommended –Capturing an individual driver's steering while drowsy requires additional research

Recommended Additional Research Additional simulator experiments –Validate the developed algorithm –Additional road conditions –More diversified group of drivers Road (experimental) tests in an instrumented vehicle. Further refining the algorithm based on the road-test data. Testing of other fatigue-related scenarios. Research on warning systems integrated with this detection system.