
1 Introduction to Bio-Inspired Models

During the last three decades, several efficient machine learning tools have been inspired by biology and nature. Artificial Neural Networks (ANN) are inspired by the brain and automatically learn and generalize (model) observed data. Evolutionary and genetic algorithms offer a solution to standard optimization problems when little information about the function to be optimized is available. Artificial ant colonies offer an alternative solution for optimization problems. All these methods share some common properties: they are inspired by nature (not by human logical reasoning); they are automatic (requiring no human intervention) and nonlinear; and they provide efficient solutions to some NP-hard problems.

2 Introduction to Artificial Neural Networks

Artificial Neural Networks are inspired by the structure and functioning of the brain, which is a collection of interconnected neurons (the simplest computing elements performing information processing). Each neuron consists of a cell body that contains a cell nucleus. A number of fibers called dendrites, and a single long fiber called the axon, branch out from the cell body. The axon connects one neuron to others (through their dendrites). The connecting junction is called a synapse.

3 Functioning of a “Neuron”

Synapses release chemical transmitter substances. These substances enter the dendrites, raising or lowering the electrical potential of the cell body. When the potential reaches a threshold, an electric pulse or action potential is sent down the axon, affecting other neurons. The activation is therefore nonlinear, and synapses can be excitatory or inhibitory. In the artificial model, the neuron potential is the mixed input of the neighboring neurons, the weights can be positive or negative (excitatory or inhibitory), and the output is a nonlinear activation function applied to the potential minus a threshold: y = f(Σ_i w_i x_i − θ).
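A minimal sketch of this artificial neuron in Python (the function name and inputs are hypothetical, and the sigmoid is just one common choice of nonlinear activation):

```python
import numpy as np

def neuron(x, w, theta):
    """One artificial neuron: the potential is the weighted (mixed)
    input from neighboring neurons minus a threshold, passed through
    a nonlinear activation (here a sigmoid)."""
    potential = np.dot(w, x) - theta
    return 1.0 / (1.0 + np.exp(-potential))

# The sign of each weight plays the role of an excitatory (+)
# or inhibitory (-) synapse.
x = np.array([0.5, 1.0, 0.2])
w = np.array([0.8, -0.4, 1.5])
print(neuron(x, w, theta=0.3))
```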

4 Multilayer Perceptron. Backpropagation Algorithm

The neural activity (output) is given by a nonlinear function of the inputs. The backpropagation algorithm trains a multilayer perceptron by gradient descent:

1. Initialize the neural weights with random values.
2. Present the input data and propagate it forward to obtain the outputs.
3. Compute the error associated with the output neurons.
4. Compute the error associated with the hidden neurons.
5. Compute the weight corrections from these errors and update the weights (gradient descent).
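A minimal sketch of these five steps in Python (NumPy), assuming a small 2:3:1 network with sigmoid activations and the XOR problem as toy data; a real implementation would add a stopping criterion and possibly momentum:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy data: XOR
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)

# Step 1: initialize the weights with random values (2:3:1 topology)
W1 = rng.normal(0, 1, (2, 3)); b1 = np.zeros(3)
W2 = rng.normal(0, 1, (3, 1)); b2 = np.zeros(1)

eta = 0.5
for epoch in range(10000):  # may need more epochs for some seeds
    # Step 2: present the input data and propagate it forward
    h = sigmoid(X @ W1 + b1)          # hidden activations
    y = sigmoid(h @ W2 + b2)          # network output

    # Step 3: error associated with the output neurons
    delta_out = (y - T) * y * (1 - y)

    # Step 4: error associated with the hidden neurons
    delta_hid = (delta_out @ W2.T) * h * (1 - h)

    # Step 5: update the weights by gradient descent
    W2 -= eta * h.T @ delta_out; b2 -= eta * delta_out.sum(axis=0)
    W1 -= eta * X.T @ delta_hid; b1 -= eta * delta_hid.sum(axis=0)

print(y.round(2))  # approximately [[0], [1], [1], [0]]
```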

5 Time Series Modeling and Forecasting

Chaotic time series often have a stochastic look and are difficult to predict, even though they are generated by deterministic systems. An example is the Hénon map.
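A sketch of how such a series can be generated, using the classical chaotic Hénon parameters a = 1.4, b = 0.3 (the initial condition is an arbitrary choice):

```python
def henon(n, a=1.4, b=0.3, x0=0.0, y0=0.0):
    """Generate n points of the Henon map
    x_{k+1} = 1 - a*x_k^2 + y_k,  y_{k+1} = b*x_k."""
    x, y = x0, y0
    xs = []
    for _ in range(n):
        x, y = 1.0 - a * x * x + y, b * x
        xs.append(x)
    return xs

series = henon(2000)
print(series[:5])  # looks stochastic, but is fully deterministic
```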

6 Example: Supervised Fitting and Prediction

Given a time series with 2000 points (T = 20) generated from a Lorenz system (chaotic behavior), a neural network with topology 3:k:3 is trained to map the three variables (x_n, y_n, z_n) of the continuous system to (x_{n+1}, y_{n+1}, z_{n+1}). To check the modeling power, different architectures are tested, e.g. 3:6:3 and 3:15:3.
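A sketch of how such training pairs could be prepared, using a simple Euler integration of the Lorenz equations with the classical parameters σ = 10, ρ = 28, β = 8/3. The step size dt = 0.01 is an assumption (2000 steps then span T = 20), and a higher-order integrator would be preferable in practice:

```python
import numpy as np

def lorenz_series(n, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Integrate the Lorenz system with Euler steps and return
    n consecutive states (x, y, z)."""
    state = np.array([1.0, 1.0, 1.0])
    out = np.empty((n, 3))
    for i in range(n):
        x, y, z = state
        state = state + dt * np.array([sigma * (y - x),
                                       x * (rho - z) - y,
                                       x * y - beta * z])
        out[i] = state
    return out

data = lorenz_series(2001)
inputs, targets = data[:-1], data[1:]  # (x_n,y_n,z_n) -> (x_{n+1},y_{n+1},z_{n+1})
print(inputs.shape, targets.shape)     # (2000, 3) (2000, 3)
```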

7 Dynamical Behavior

A model that is too simple (3:6:3) does not capture the complete structure of the system, so the dynamics are not reproduced. A model that is too complex (3:15:3) overfits the problem, and again the dynamics are not reproduced. Only an intermediate model with an appropriate number of parameters can model both the functional structure of the system and its dynamics.

8 Time Series from an Infrared Laser

Infrared laser intensity is modeled using a neural network in which only time-lagged intensities are used as inputs: x_i is predicted from (x_{i-1}, x_{i-2}, x_{i-3}, …, x_{i-j}). With a 6:5:5:1 net, the neural network reproduces the laser behavior, and it can be synchronized with the time series obtained from the laser.
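A sketch of how the time-lagged inputs can be constructed (the lag count j = 6 matches the six inputs of the 6:5:5:1 net; the helper name and the synthetic stand-in series are hypothetical):

```python
import numpy as np

def delay_embed(series, lags):
    """Build (input, target) pairs where each input row is
    (x_{i-1}, x_{i-2}, ..., x_{i-lags}) and the target is x_i."""
    s = np.asarray(series, dtype=float)
    X = np.array([s[i - lags:i][::-1] for i in range(lags, len(s))])
    y = s[lags:]
    return X, y

t = np.linspace(0, 50, 1000)
intensities = np.sin(t) * np.sin(0.3 * t) ** 2  # stand-in for laser data
X, y = delay_embed(intensities, lags=6)
print(X.shape, y.shape)  # (994, 6) (994,)
```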

9 Structural Learning: Modular Neural Networks

With the aim of giving some flexibility to the network topology, modular neural networks combine different neural blocks into a global topology. A fully connected topology has too many parameters; combining several blocks reduces the parameter count, and by assigning different subnets to specific tasks we can reduce the complexity of the model. For example, a fully connected 2:4:4:1 network has 2*4 + 4*4 + 4*1 + 9 = 37 weights, whereas a modular version built from 2:2 blocks has 2(2*2) + 2(2*2) + 4*1 + 9 = 29 weights (see the sketch below). In most cases, however, block division is a heuristic task: how can we obtain an optimal “block division” for a given problem?
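The weight counts above can be checked with a few lines of Python (reading the fully connected topology as 2:4:4:1 is an inference from the slide's arithmetic, since 2*4 + 4*4 + 4*1 weights plus 4 + 4 + 1 biases gives 37):

```python
def mlp_params(layers):
    """Weights plus biases of a fully connected feed-forward net."""
    weights = sum(a * b for a, b in zip(layers[:-1], layers[1:]))
    biases = sum(layers[1:])
    return weights + biases

print(mlp_params([2, 4, 4, 1]))  # 28 weights + 9 biases = 37

# Modular version: independent 2:2 blocks replace the dense layers,
# keeping the same 9 biases.
print(2 * (2 * 2) + 2 * (2 * 2) + 4 * 1 + 9)  # 29
```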

10 Functional Networks

Functional networks are a generalization of neural networks that combine qualitative domain knowledge with data. The qualitative knowledge, e.g. a functional property satisfied by x_3 = F(x_1, x_2), determines the initial topology. A theorem then gives the simplest functional form, F(x_1, x_2) = f^{-1}(f(x_1) + f(x_2)), which yields a simplified topology. Learning (by least squares) estimates f from the data (x_{1i}, x_{2i}, x_{3i}), i = 1, 2, …, by approximating it with basis functions {φ_1, …, φ_n} and coefficients {a_1, …, a_n}. This is the optimal “block division” for this problem!
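A minimal sketch of this least-squares learning step, assuming a polynomial basis and synthetic data generated with f(x) = x²; the normalization f(1) = 1 is a hypothetical choice used to exclude the trivial solution a = 0:

```python
import numpy as np

def phi(x, n=4):
    """Polynomial basis {1, x, x^2, x^3} evaluated at x."""
    x = np.asarray(x, dtype=float)
    return np.column_stack([x**k for k in range(n)])

# Synthetic data consistent with x3 = F(x1, x2) = sqrt(x1^2 + x2^2),
# i.e. f(x) = x^2 in F = f^{-1}(f(x1) + f(x2)).
rng = np.random.default_rng(1)
x1, x2 = rng.uniform(0, 1, 200), rng.uniform(0, 1, 200)
x3 = np.sqrt(x1**2 + x2**2)

# The functional equation f(x3) = f(x1) + f(x2) is *linear* in the
# coefficients a of f(x) = sum_k a_k phi_k(x).
M = phi(x3) - phi(x1) - phi(x2)

# Append the normalization f(1) = 1 as an extra equation.
A = np.vstack([M, phi(np.array([1.0]))])
b = np.append(np.zeros(len(x3)), 1.0)
a, *_ = np.linalg.lstsq(A, b, rcond=None)
print(a.round(3))  # close to [0, 0, 1, 0], i.e. f(x) = x^2
```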

11 Some FN Architectures

Associative model: F(x, y) is an associative operator, leading to the form F(x, y) = f^{-1}(f(x) + f(y)) of the previous slide. Sliced-conditioned model: F is expanded in bases {φ_k} and {ψ_k} that are convenient for the x-constant and y-constant slices. Separable model: F(x, y) = Σ_k f_k(x) g_k(y), a simple topology.

12 A First Example. Functional vs. Neural

Training data: 100 points with uniform noise in (−0.01, 0.01); validation data: 25×25 points from the exact surface. The models compared:

- 2:2:2:1 MLP: 15 parameters, RMSE = …
- 2:3:3:1 MLP: 25 parameters, RMSE = …
- Functional network (separable model), Φ = {1, x, x², x³}: … parameters, RMSE = …

The functional network exploits knowledge of the network structure (separable) together with a non-parametric approach to learning the neuron functions; the key is choosing an appropriate family of functions (here polynomial).
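A sketch of how such a separable functional model can be fitted by linear least squares. The target surface is a stand-in (the slide's actual surface is not shown), and the tensor-product form of the basis is an assumption; note that the fit is then a single linear solve, with no gradient descent:

```python
import numpy as np

def design(x, y, n=4):
    """Tensor-product design matrix for the separable model
    z = sum_{j,k} a_{jk} x^j y^k with basis {1, t, t^2, t^3}."""
    cols = [x**j * y**k for j in range(n) for k in range(n)]
    return np.column_stack(cols)

rng = np.random.default_rng(2)
x, y = rng.uniform(-1, 1, 100), rng.uniform(-1, 1, 100)
z = x**3 * y - x * y**2 + rng.uniform(-0.01, 0.01, 100)  # noisy target

A = design(x, y)                            # 16 parameters
a, *_ = np.linalg.lstsq(A, z, rcond=None)
rmse = np.sqrt(np.mean((A @ a - z) ** 2))
print(rmse)                                 # training RMSE
```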

13 Functional Nets & Modular Neural Nets

Advantages and shortcomings of each approach:

- Neural nets: black-box topology with no connection to the problem, but efficient non-parametric models for approximating functions.
- Functional nets: parametric learning techniques (the user must supply the basis functions), but a model-driven optimal topology.

Hybrid functional-neural networks (modular networks) combine the two: the topology of the network is obtained from the functional network, and the neuron functions are approximated using MLPs.

14 Another Example. Nonlinear Time Series

Time series modeling and forecasting is an important problem with many practical applications. Goal: predicting the future using past values, i.e. given x_1, x_2, …, x_n, estimate x_{n+1}. Modeling methods fit x_{n+1} = F(x_1, x_2, …, x_n). There are many well-known techniques for linear time series (ARMA, etc.), but nonlinear time series may exhibit complex, seemingly stochastic behavior. Nonlinear time series modeling is a difficult task because of sensitivity to initial conditions and fractal geometry: trajectories starting at very close initial points (e.g. x_1 = 0.8 and tiny perturbations of it) split away after a few iterates, and they evolve in an irregular fractal space. Nonlinear maps such as the Lozi model illustrate this behavior (see the sketch below).
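A sketch of the Lozi map and its sensitivity to initial conditions, using commonly cited chaotic parameters a = 1.7, b = 0.5 and the slide's initial condition x_1 = 0.8:

```python
def lozi(n, a=1.7, b=0.5, x0=0.8, y0=0.0):
    """Iterate the Lozi map x_{n+1} = 1 - a*|x_n| + y_n, y_{n+1} = b*x_n."""
    x, y = x0, y0
    xs = []
    for _ in range(n):
        x, y = 1.0 - a * abs(x) + y, b * x
        xs.append(x)
    return xs

# Two trajectories from almost identical initial conditions.
s1 = lozi(30, x0=0.8)
s2 = lozi(30, x0=0.8 + 1e-8)
for i in (5, 15, 25):
    print(i, abs(s1[i] - s2[i]))  # the gap grows by orders of magnitude
```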

15 Functional Models (Separation)

Setup: 500 training points and 1000 validation points. Three models are compared:

- FN, separable functional net (which basis family to choose?), with Φ = {sin(x), …, sin(mx), cos(x), …, cos(mx)} and 4m parameters: m = 11 (44 parameters), RMSE = 5.3e−3.
- MFNN 1, symmetric modular functional-neural net with 1:m:1 blocks and 6m parameters: m = 7 (42 parameters), RMSE = 1.5e−3.
- MFNN 2, asymmetric modular functional-neural net with 1:2m:1 and 1:2:1 blocks and 2m−2 parameters: m = 7 (42 parameters), RMSE = 4.0e−4.

16 Minimum Description Length

The Minimum Description Length (MDL) principle has proved to be simple and efficient in several model selection problems. The description length for a model is the cost of encoding the model (its parameters) plus the cost of encoding the data given the model, so the best model balances goodness of fit against complexity.
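A sketch of MDL-based model selection, using a standard two-part approximation of the description length for regression models; the test function, noise level, and candidate degrees are arbitrary choices for illustration:

```python
import numpy as np

def description_length(rss, n, k):
    """Two-part MDL score: the cost of the residuals plus the cost
    of encoding k real-valued parameters from n data points."""
    return 0.5 * n * np.log(rss / n) + 0.5 * k * np.log(n)

rng = np.random.default_rng(3)
x = np.linspace(-1, 1, 100)
y = np.sin(np.pi * x) + rng.normal(0, 0.05, 100)

for deg in range(1, 10):
    coeffs = np.polyfit(x, y, deg)
    rss = np.sum((np.polyval(coeffs, x) - y) ** 2)
    print(deg, round(description_length(rss, len(x), deg + 1), 1))
# The minimum typically lands at a moderate degree: extra parameters
# only pay off while they reduce the residuals by enough bits.
```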