An Artificial Neural Network Approach to Surface Waviness Prediction in a Surface Finishing Process, by Chi Ngo (ECE/ME 539 Class Project)

Problem description: To control the finishing results in an automated surface finishing process, it is important to characterize the relationship between surface waviness and cutting parameters such as feed rate, cross feed, tool displacement, and spindle speed. In this project, an artificial neural network is used to predict surface waviness (the output) from feed rate, cross feed, tool displacement, and spindle speed (the inputs).

Outline: The data file consists of six columns. The first four columns (feed rate, cross feed, tool displacement, and spindle speed) are used as feature vectors; the last two columns (depth of cut and surface waviness) are used as target vectors. Twenty-eight real data sets, collected from experiments performed on an actual CNC machine and workpieces, are used to train and test the neural network.
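A minimal loading sketch (in Python/NumPy rather than the project's original MATLAB code), assuming a whitespace-delimited text file; the file name surface_data.txt is a hypothetical placeholder, not the project's actual file.

import numpy as np

# Hypothetical file name; 28 rows x 6 columns as described above.
data = np.loadtxt("surface_data.txt")
X = data[:, :4]   # feed rate, cross feed, tool displacement, spindle speed
Y = data[:, 4:]   # depth of cut, surface waviness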

Method: Back-propagation learning in a multi-layer perceptron is well suited to this application. A multi-layer perceptron with one hidden layer, trained with the back-propagation algorithm, is implemented in this project. Since this is a function-approximation problem, bpappro.m is used with some modifications. (Diagram: input vector fed into the multi-layer perceptron; the network output is compared against the target vector.)
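The original bpappro.m is not reproduced here; as an illustration only, the following is a minimal Python/NumPy sketch of the same idea: a one-hidden-layer MLP trained by back-propagation with a momentum term and a mean-square-error loss. The function names train_mlp and predict, the tanh hidden activation, and the linear output unit are assumptions made for this sketch, not the project's actual implementation.

import numpy as np

def train_mlp(X, y, n_hidden, lr, momentum, epochs, batch_size, seed=0):
    # One-hidden-layer MLP (tanh hidden units, linear output) trained by
    # back-propagation with momentum. X and y are assumed to be scaled to a
    # comparable range (e.g. [0, 1]) beforehand.
    rng = np.random.default_rng(seed)
    n_in = X.shape[1]
    W1 = rng.normal(0.0, 0.1, size=(n_in + 1, n_hidden))   # +1 row for bias
    W2 = rng.normal(0.0, 0.1, size=(n_hidden + 1, 1))
    dW1, dW2 = np.zeros_like(W1), np.zeros_like(W2)         # momentum buffers

    def add_bias(A):
        return np.hstack([A, np.ones((A.shape[0], 1))])

    for _ in range(epochs):
        order = rng.permutation(len(X))
        for start in range(0, len(X), batch_size):
            idx = order[start:start + batch_size]
            Xb, yb = X[idx], y[idx].reshape(-1, 1)
            # forward pass
            H = np.tanh(add_bias(Xb) @ W1)
            out = add_bias(H) @ W2
            err = out - yb
            # backward pass: gradients of the mean-square error
            g2 = add_bias(H).T @ err / len(idx)
            dH = (err @ W2[:-1].T) * (1.0 - H ** 2)   # tanh derivative
            g1 = add_bias(Xb).T @ dH / len(idx)
            # weight update with momentum
            dW2 = momentum * dW2 - lr * g2
            dW1 = momentum * dW1 - lr * g1
            W2 += dW2
            W1 += dW1
    return W1, W2

def predict(X, W1, W2):
    H = np.tanh(np.hstack([X, np.ones((len(X), 1))]) @ W1)
    return (np.hstack([H, np.ones((len(H), 1))]) @ W2).ravel()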

Results: Parameters that produced the best results for depth of cut prediction:
- Learning rate (between 0 and 1) = 0.3
- Momentum constant (between 0 and 1) = 0.8
- Hidden neurons = 15
- Maximum number of epochs to run = 800
- Epoch size (# of samples) = 25
Training data set results:
- Mean square error =
- Maximum absolute error =
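As a usage illustration only, the depth-of-cut settings above map onto the train_mlp/predict sketch from the Method section roughly as follows; interpreting the epoch size as the batch size is an assumption.

W1, W2 = train_mlp(X, Y[:, 0], n_hidden=15, lr=0.3, momentum=0.8,
                   epochs=800, batch_size=25)
pred = predict(X, W1, W2)
mse = np.mean((pred - Y[:, 0]) ** 2)            # mean square error
max_abs_err = np.max(np.abs(pred - Y[:, 0]))    # maximum absolute error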

For waviness prediction:
- Learning rate (between 0 and 1) = 0.2
- Momentum constant (between 0 and 1) = 0.8
- Hidden neurons = 24
- Maximum number of epochs to run = 800
- Epoch size (# of samples) = 26
Training data set results:
- Mean square error =
- Maximum absolute error = 3.839
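The waviness settings above fit the same sketch in the same way:

W1, W2 = train_mlp(X, Y[:, 1], n_hidden=24, lr=0.2, momentum=0.8,
                   epochs=800, batch_size=26)
waviness_pred = predict(X, W1, W2)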

Conclusion/Discussion: The multi-layer perceptron trained with the back-propagation algorithm appears to work better than time-series prediction and the other methods considered for this project. Still, the mean square error is large, and it is very difficult for the training to converge. Other existing mathematical models for predicting depth of cut and waviness perform much better than the neural network developed in this project.