MLP Based Feedback System for Gas Valve Control in a Madison Symmetric Torus Andrew Seltzman Dec 14, 2010.

Background and Project Description:
- MST is a plasma physics experiment currently running in the physics department.
- MST operates in a pulsed mode, taking plasma shots that last approximately 40-60 ms.
- Before each shot, the gas input at 15 different points in time during the shot and the plasma current are set.
- Density values drift from one shot to the next, requiring operator input to adjust the gas input profile.
- Currently a human operator is required to manually set the gas input valves to stabilize density; the density fluctuates around the desired level due to human error.
- No automatic control system exists due to the complexity and non-linearity of the system in question.
- An MLP will be designed to emulate, and eventually replace, the skilled human operator.

ANN Control System Setup:
- The control system is given the system parameters and the requested output.
- The error is computed from the actual output.
- BP learning; eventual convergence(?)
- (Block diagram labels: requested density, human input, error, density, feature space compression, MLP, MST (system to control), current, gas input.)
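The feedback loop above can be sketched numerically. This is a minimal illustration only: a linear toy plant stands in for the real (complex, non-linear) MST response, and a single-layer delta-rule learner stands in for the full MLP/BP machinery; all names and sizes are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: a 20-point requested density profile maps to a
# 20-point gas-input waveform (values chosen for illustration only).
N = 20
W = rng.normal(scale=0.1, size=(N, N))  # stand-in single-layer "MLP" weights

def plant(gas):
    """Toy stand-in for MST: density responds linearly to gas input
    (an assumption; the real plasma response is non-linear)."""
    return 0.5 * gas

def control_step(requested, lr=0.01):
    """One pass around the loop: propose a gas waveform, take a "shot" to
    measure density, and learn from the error (delta rule as a BP stand-in)."""
    global W
    gas = W @ requested                  # controller proposes a gas waveform
    measured = plant(gas)                # shot produces an actual density
    err = requested - measured           # error = requested - actual density
    W += lr * np.outer(err, requested)   # error-driven weight update
    return float(np.mean(err ** 2))

requested = np.full(N, 1.0)
errors = [control_step(requested) for _ in range(100)]
```

With this toy plant the density error shrinks geometrically over successive shots, which is the "eventual convergence" the loop is designed to achieve.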

MLP Design:
- Simplest MLP required to adequately classify the output data:
  - 40-element feature space
  - <= 40 elements in the first hidden layer
  - <= 2 hidden layers
  - 20-element output layer
- Pre-processing of the raw data to compress the feature space prior to MLP input.
- BP training algorithm:
  - Initially train the MLP from human operator responses.
  - Eventual self-learning when integrated into the control system: the error (requested density - actual density) drives BP learning, allowing real-time adaptability to varying system conditions.
- Optimization of the initial training parameters and network design is in progress:
  - Number of layers, number of elements
  - Training parameters: learning rate, momentum, etc.
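The size constraints above (40-element input, up to two hidden layers of at most 40 units, 20-element output) can be sketched as a plain NumPy MLP trained with textbook backpropagation. This is an illustrative implementation, not the project's code; the weight scale, learning rate, and training data here are synthetic.

```python
import numpy as np

rng = np.random.default_rng(1)
sig = lambda z: 1.0 / (1.0 + np.exp(-z))  # logistic activation

# Layer sizes from the slide: 40-element feature space, two hidden layers
# of 40 units (the stated maximum), 20-element output (gas waveform).
sizes = [40, 40, 40, 20]
Ws = [rng.normal(scale=0.3, size=(m, n)) for n, m in zip(sizes[:-1], sizes[1:])]
bs = [np.zeros(m) for m in sizes[1:]]

def forward(x):
    """Forward pass; returns the activations of every layer."""
    acts = [np.asarray(x, dtype=float)]
    for W, b in zip(Ws, bs):
        acts.append(sig(W @ acts[-1] + b))
    return acts

def bp_step(x, target, lr=0.3):
    """One backpropagation step on squared error; returns the pre-update loss."""
    acts = forward(x)
    # Output-layer delta: error times the sigmoid derivative a*(1-a).
    delta = (acts[-1] - target) * acts[-1] * (1 - acts[-1])
    for i in reversed(range(len(Ws))):
        gW, gb = np.outer(delta, acts[i]), delta
        if i > 0:  # propagate the delta back through the previous layer
            delta = (Ws[i].T @ delta) * acts[i] * (1 - acts[i])
        Ws[i] -= lr * gW
        bs[i] -= lr * gb
    return float(np.sum((acts[-1] - target) ** 2))

# Synthetic stand-in for one operator-labelled example.
x = rng.uniform(size=40)
target = np.full(20, 0.25)
losses = [bp_step(x, target) for _ in range(500)]
```

In the project this training signal would initially come from recorded human operator responses, and later from the density error during self-learning.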

Feature Space Data:
- A high-speed digitizer captures the experiment data:
  - 30000 data points each for Ip and Ne (only ~8000 of which are actually relevant)
  - 21 data points for the gas input
- An ANN with ~60000 input neurons would be wasteful of computing resources; real-time operation is required (~1 update every 2 min).
- The feature space is compressed by averaging points in a given data set:
  - 20 data points each for Ip and Ne
  - 2 data points for the gas input
- The relevant data is extracted without loss of accuracy.
- (Figures: initial feature space vs. compressed feature space.)
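The block-averaging compression described above might look like the following sketch. The `compress` helper and the example waveform are hypothetical; `np.array_split` is used so that lengths which do not divide evenly are still handled.

```python
import numpy as np

def compress(signal, n_out):
    """Compress a waveform to n_out points by averaging consecutive
    blocks of samples (the slide's averaging scheme, as a sketch)."""
    sig = np.asarray(signal, dtype=float)
    blocks = np.array_split(sig, n_out)      # ~equal-length blocks
    return np.array([b.mean() for b in blocks])

# 30000-point digitizer trace (stand-in for Ip or Ne) -> 20 averaged features.
ip_raw = np.sin(np.linspace(0, np.pi, 30000))
ip_feat = compress(ip_raw, 20)
```

Averaging preserves the slow waveform shape the MLP needs while discarding the fast sample-level detail, which is why the slide can claim no practical loss of accuracy.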

Expected Results / Initial Testing:
- The MLP is given the requested plasma density and plasma current.
- The system successfully generates a gas waveform very similar to the training set.
- Not yet integrated with the actual system; the gas output appears similar to a human operator's. An accurate model?
- Further testing and training with multiple data sets is required.

Discussion / Problems / Future Modifications:
- Control method output types:
  - Output gas values based on the requested density: larger training data set, since data is available from every shot with a known density result.
  - Output the change in gas values in response to the density error: smaller training data set, since data exists only when the operator changes the gas settings, and the requested density value is not known.
- Over-specification of the fitted data: training sometimes results in the MLP copying the human gas output regardless of density.
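The two candidate output types can be made concrete with hypothetical target vectors (all values below are invented for illustration; the real waveforms come from shot data).

```python
import numpy as np

# Hypothetical 21-point gas-input waveforms before and after an operator tweak.
gas_before = np.full(21, 1.0)
gas_after = gas_before.copy()
gas_after[5:10] += 0.2   # operator opened the valve slightly mid-shot

# Output type 1: absolute gas values -- a training target exists for
# every shot, since the gas settings and density result are always logged.
target_absolute = gas_after

# Output type 2: the *change* in gas values driven by the density error --
# a target exists only for shots where the operator changed the settings.
target_delta = gas_after - gas_before
```

The delta formulation is closer to what the operator actually does, but as the slide notes it yields far fewer training examples and lacks the requested density value.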