Neural Networks: Basic Concepts, Architecture, Operation


Neural Networks: Basic concepts. Architecture. Operation

2 What is a NN
- A large number of very simple neuron-like processing elements
- A large number of weighted connections between the elements; the weights encode the knowledge of the network
- Highly parallel, distributed control
- An emphasis on learning internal representations automatically

3 Architecture
- Simple processing elements linked together
- Numeric inputs with weights
- A function that computes the output
- The links:
  - Excitatory: positive weight
  - Inhibitory: negative weight

4 Basic Computational Neuron
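The basic computational neuron above can be sketched as a weighted sum of inputs followed by an activation function. This is a minimal illustration; the step threshold and the weight values are assumptions, not taken from the slides:

```python
def neuron(inputs, weights, bias):
    """A basic computational neuron: weighted sum of the inputs
    plus a bias, passed through a step activation function."""
    s = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1 if s > 0 else 0

# A mix of an excitatory (positive) and an inhibitory (negative) link:
print(neuron([1, 1], [0.7, -0.4], 0.0))  # 0.3 > 0, so the neuron fires: 1
print(neuron([1, 1], [0.3, -0.8], 0.0))  # -0.5 <= 0, so it stays off: 0
```

Excitatory links raise the weighted sum toward the firing threshold, while inhibitory links pull it down, matching the architecture slide above.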

5 Network Layers
- Input layer: nodes that accept input
- Output layer: nodes that deliver output, the result of the network processing
- Hidden layers: intermediate layers, optional, used for better learning


7 Feed-Forward and Recurrent NN
- Feed-Forward:
  - Signals travel one way only: from input to output
  - The output of any layer does not affect that same layer
- Recurrent:
  - Signals travel in both directions by introducing loops in the network
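A feed-forward pass can be sketched as signals flowing layer by layer in one direction only. This is a sketch with illustrative weights and an assumed sigmoid activation; layer sizes and values are not from the slides:

```python
import math

def sigmoid(x):
    """Squash a weighted sum into the range (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

def layer(inputs, weight_matrix):
    """One feed-forward layer: each node computes a weighted sum
    of all the layer's inputs and applies the activation."""
    return [sigmoid(sum(x * w for x, w in zip(inputs, row)))
            for row in weight_matrix]

# Input layer -> hidden layer -> output layer. There are no loops,
# so the output of a layer never feeds back into that same layer.
hidden = layer([0.5, 0.9], [[0.4, -0.2], [0.1, 0.7]])
output = layer(hidden, [[0.6, -0.3]])
print(output)
```

A recurrent network would differ only in that some of these outputs would be fed back as inputs on the next time step, introducing loops.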

8 Operation
1. Task: Find a mapping function between a given set of inputs and a predefined set of outputs
2. Algorithm:
   1) Feed the input
   2) Compare the obtained output with the desired output
   3) If different, adjust the weights
   4) Repeat until there is no difference
How to adjust the weights?
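The four algorithm steps above can be sketched as a perceptron-style training loop, one simple answer to "how to adjust the weights". The learning rate and the AND-gate training data are illustrative assumptions:

```python
def step(x):
    return 1 if x > 0 else 0

def train(samples, targets, lr=0.1, epochs=100):
    """Repeatedly feed each input, compare the obtained output with
    the desired output, and adjust the weights in proportion to the
    error, until there is no difference."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        mistakes = 0
        for x, t in zip(samples, targets):
            y = step(sum(xi * wi for xi, wi in zip(x, w)) + b)
            err = t - y                      # desired minus obtained
            if err != 0:
                w = [wi + lr * err * xi for wi, xi in zip(w, x)]
                b += lr * err
                mistakes += 1
        if mistakes == 0:                    # no difference: stop
            break
    return w, b

# Learn the AND function (linearly separable, so the loop converges).
X = [(0, 0), (0, 1), (1, 0), (1, 1)]
T = [0, 0, 0, 1]
w, b = train(X, T)
print([step(sum(xi * wi for xi, wi in zip(x, w)) + b) for x in X])  # [0, 0, 0, 1]
```

This single-neuron rule only handles linearly separable mappings; multi-layer networks need the back-propagation of the error through the hidden layers.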

9
Supervised learning:
- Feed-Forward: trained with the back-propagation algorithm
- Recurrent: trained with back-propagation through time (quite complicated; not explained here)
Unsupervised learning:
- The desired outcome is not known explicitly in advance
- Feedback from the environment
- An evaluation function

10 Advantages and Problems
Advantages:
- NN are robust: even if some inputs are missing, the results will be correct in most cases
Problems:
- How to specify the inputs:
  - Local NN: each input corresponds to an item in our model
  - Distributed NN: a set of inputs corresponds to an item
- How to implement reasoning:
  - Hybrid networks: symbolic networks and neural networks