FUNCTION FITTING
Student’s name: Ruba Eyal Salman
Supervisor: Dr. Ahmad Elja’afreh

Neural networks
Neural networks are composed of simple elements operating in parallel. These elements are inspired by biological nervous systems. As in nature, the connections between elements largely determine the network function. You can train a neural network to perform a particular function by adjusting the values of the connections (weights) between elements. Typically, neural networks are adjusted, or trained, so that a particular input leads to a specific target output: the network is adjusted, based on a comparison of the output and the target, until the network output matches the target. Many such input/target pairs are usually needed to train a network.

Neural networks
Neural networks have been trained to perform complex functions in various fields, including pattern recognition, identification, classification, speech, vision, and control systems.

Dynamic and Static Neural Networks
Neural networks can be classified into dynamic and static categories. Static (feedforward) networks have no feedback elements and contain no delays; the output is calculated directly from the input through feedforward connections. In dynamic networks, the output depends not only on the current input to the network, but also on the current or previous inputs, outputs, or states of the network.
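As an illustrative sketch (not from the original slides), the toolbox provides constructors for both kinds of network: feedforwardnet builds a static network, while timedelaynet builds a simple dynamic one. The layer size and delays below are arbitrary choices.

staticNet  = feedforwardnet(10);      % static: output depends only on the current input
dynamicNet = timedelaynet(0:2, 10);   % dynamic: output uses the current input plus inputs delayed by 1 and 2 steps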

Using the Toolbox
There are four ways you can use the Neural Network Toolbox™ software. The first way is through the four graphical user interfaces (GUIs) that will be described later. (You can open these GUIs from a master GUI, launched with the command nnstart.) These provide a quick and easy way to access the power of the toolbox for the following tasks:
•Function fitting
•Pattern recognition
•Data clustering
•Time series analysis
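For reference, each task also has a dedicated GUI command that can be typed directly at the MATLAB prompt (these names match the toolbox release this presentation appears to use; treat them as an assumption for other releases):

nnstart   % master GUI linking to the four wizards
nftool    % function fitting
nprtool   % pattern recognition
nctool    % data clustering
ntstool   % time series analysis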

Neural Network Toolbox™ Applications
We will demonstrate only a few of the applications in function fitting, pattern recognition, clustering, and time series analysis. The following table provides an idea of the diversity of applications for which neural networks provide state-of-the-art solutions.

Neural Network Applications (Industry: Business Applications)
Electronics: code sequence prediction, integrated circuit chip layout, process control, chip failure analysis, machine vision, voice synthesis, and nonlinear modeling
Speech: speech recognition, speech compression, vowel classification, and text-to-speech synthesis
Securities: market analysis, automatic bond rating, and stock trading advisory systems
Banking: check and other document reading and credit application evaluation

Neural Network Design Steps
You will follow the standard steps for designing neural networks to solve problems in four application areas: function fitting, pattern recognition, clustering, and time series analysis.
1 Collect data.
2 Create the network.
3 Configure the network.
4 Initialize the weights and biases.
5 Train the network.
6 Validate the network.
7 Use the network.
You will follow these steps using both the GUI tools and command-line functions; a command-line sketch of the steps is given below.
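A minimal command-line sketch of steps 2 through 7 for a fitting problem might look as follows; the 10-neuron hidden layer and the placeholder variables inputs and targets are illustrative assumptions, not values from these slides.

% Step 1 (collect data) is assumed done: inputs and targets are already in the workspace.
net = fitnet(10);                          % Step 2: create a fitting network
net = configure(net, inputs, targets);     % Step 3: configure it for the data dimensions
net = init(net);                           % Step 4: initialize weights and biases
[net, tr] = train(net, inputs, targets);   % Step 5: train the network
outputs = net(inputs);                     % Step 7: use the network (also needed for validation)
perf = perform(net, targets, outputs);     % Step 6: validate with the mean squared error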

Function fitting
Neural networks are good at fitting functions. In fact, there is proof that a fairly simple neural network can fit any practical function.
Suppose, for instance, that you have data from a housing application. You want to design a network that can predict the value of a house (in $1000s), given 13 pieces of geographical and real estate information. You have a total of 506 example homes for which you have those 13 items of data and their associated market values.
You can solve this problem in two ways:
•Use a graphical user interface, nftool, as described in “Using the Neural Network Fitting Tool”.
•Use command-line functions, as described in “Using Command-Line Functions”.
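As a hedged sketch of the command-line route for this particular problem (house_dataset ships with the toolbox and holds the 13 x 506 inputs and their target prices; the 10 hidden neurons mirror the fitting tool's default but are still an assumption here):

[x, t] = house_dataset;         % x: 13x506 attributes, t: 1x506 median house values in $1000s
net = fitnet(10);               % two-layer fitting network with 10 hidden neurons
[net, tr] = train(net, x, t);   % train with the default random 70/15/15 data division
y = net(x);                     % predicted house values
perf = perform(net, t, y);      % mean squared error over all samples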

Using the Neural Network Fitting Tool
1 Open the Neural Network Start GUI with this command: nnstart

2 Click Fitting Tool to open the Neural Network Fitting Tool. (You can also use the command nftool.)

3 Click Next to proceed. 4 Click Load Example Data Set in the Select Data window. The Fitting Data Set Chooser window opens.

5 Select House Pricing, and click Import. This returns you to the Select Data window.

6 Click Next to display the Validation and Test Data window. The validation and test data sets are each set to 15% of the original data.

Notes
With these settings, the input vectors and target vectors will be randomly divided into three sets as follows:
•70% will be used for training.
•15% will be used to validate that the network is generalizing and to stop training before overfitting.
•The last 15% will be used as a completely independent test of network generalization.
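These percentages correspond to the network's data-division properties, which can also be set at the command line (a sketch assuming net was created with fitnet; the values shown are the defaults just described):

net.divideFcn = 'dividerand';         % divide the samples at random
net.divideParam.trainRatio = 0.70;    % 70% training
net.divideParam.valRatio   = 0.15;    % 15% validation (used to stop training early)
net.divideParam.testRatio  = 0.15;    % 15% completely independent test set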

7 Click Next. The standard network that is used for function fitting is a two-layer feedforward network, with a sigmoid transfer function in the hidden layer and a linear transfer function in the output layer.
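This is the architecture that fitnet creates; the transfer-function assignments below just make the defaults explicit (the hidden-layer size of 10 is an illustrative choice):

net = fitnet(10);                        % two-layer feedforward fitting network
net.layers{1}.transferFcn = 'tansig';    % sigmoid (tan-sigmoid) hidden layer, the default
net.layers{2}.transferFcn = 'purelin';   % linear output layer, the default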

8 Click Next. 9 Click Train. The training continued until the validation error failed to decrease for six iterations (validation stop).
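The number of allowed validation failures is a training parameter; a sketch of setting it and inspecting where training stopped (6 is the default, and x, t are assumed from the housing example):

net.trainParam.max_fail = 6;    % stop after 6 consecutive increases in validation error
[net, tr] = train(net, x, t);
bestEpoch = tr.best_epoch;      % epoch with the lowest validation error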

10 Under Plots, click Regression. This is used to validate the network performance. The regression plots display the network outputs with respect to the targets for the training, validation, and test sets.
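The same plots can be generated from the command line; this sketch assumes x, t, the trained net, and the training record tr from the earlier steps:

y = net(x);
plotregression(t, y)    % regression of outputs against targets for all data
plotregression(t(tr.trainInd), y(tr.trainInd), 'Training', ...
               t(tr.testInd),  y(tr.testInd),  'Test')   % per-subset plots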

For a perfect fit, the data should fall along a 45-degree line, where the network outputs are equal to the targets. For this problem, the fit is reasonably good for all data sets, with R values in each case of 0.93 or above. If even more accurate results were required, you could retrain the network by clicking Retrain in nftool. This will change the initial weights and biases of the network, and may produce an improved network after retraining. Other options are provided on the following pane.
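Clicking Retrain is roughly equivalent to re-initializing the weights and training again at the command line (a sketch of the idea, not the tool's exact internals):

net = init(net);                % new random initial weights and biases
[net, tr] = train(net, x, t);   % retrain; the fit may or may not improve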

11 View the error histogram to obtain additional verification of network performance. Under the Plots pane, click Error Histogram.
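The same histogram can be produced at the command line from the error vector:

e = t - y;         % errors between targets and network outputs
ploterrhist(e)     % histogram of the error distribution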

12 Click Next in the Neural Network Fitting Tool to evaluate the network.

Network’s Performance
At this point, you can test the network against new data. If you are dissatisfied with the network’s performance on the original or new data, you can do one of the following:
•Train it again.
•Increase the number of neurons.
•Get a larger training data set.
If the performance on the training set is good, but the test set performance is significantly worse (which could indicate overfitting), then reducing the number of neurons can improve your results. If training performance is poor, then you may want to increase the number of neurons, as in the sketch below.
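As an illustrative sketch of the second option, increasing the number of neurons just means creating and training a new, wider network (20 neurons here is an arbitrary choice):

net = fitnet(20);               % larger hidden layer than the earlier 10-neuron network
[net, tr] = train(net, x, t);
testPerf = perform(net, t(tr.testInd), net(x(:, tr.testInd)));   % generalization on the test split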

13 If you are satisfied with the network performance, click Next.
14 Use the buttons on this screen to generate scripts or to save your results.
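At the command line, comparable options include saving the trained network to a MAT-file or, in newer toolbox releases, generating a standalone function from it; the file and function names below are hypothetical.

save('fittingNet.mat', 'net', 'tr');   % save the trained network and training record
genFunction(net, 'houseFitFcn');       % generate a reusable MATLAB function (newer releases only)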

The End