Artificial Neural Network
Yalong Li
Some slides are from 701-3_24_2011_ann.pdf.



Structure
- Motivation
- Artificial neural networks
- Learning: Backpropagation Algorithm
- Overfitting
- Expressive Capabilities of ANNs
- Summary

Some facts about our brain
- Performance tends to degrade gracefully under partial damage
- Learns (reorganizes itself) from experience
- Recovery from damage is possible
- Performs massively parallel computations extremely efficiently: complex visual perception, for example, takes less than 100 ms, i.e., about 10 processing steps (synapses operate at roughly 100 Hz)
- Supports our intelligence and self-awareness

Neural Networks in the Brain
- Cortex, midbrain, brainstem, and cerebellum
- Visual system: 10 or 11 processing stages have been identified
- Feedforward: connections run from earlier processing stages (near the sensory input) to later ones (near the motor output)
- Feedback: connections also run in the reverse direction

Neurons and Synapses
The basic computational unit in the nervous system is the nerve cell, or neuron.

Synaptic Learning
- One way the brain learns is by altering the strengths of connections between neurons, and by adding or deleting connections between neurons
- LTP (long-term potentiation): an enduring (>1 hour) increase in synaptic efficacy that results from high-frequency stimulation of an afferent (input) pathway
- The efficacy of a synapse can change as a result of experience, providing both memory and learning through long-term potentiation. One way this happens is through release of more neurotransmitter.
- Hebb's postulate: "When an axon of cell A ... excite[s] cell B and repeatedly or persistently takes part in firing it, some growth process or metabolic change takes place in one or both cells so that A's efficiency, as one of the cells firing B, is increased."
- Points to note about LTP:
  - Synapses become more or less important over time (plasticity)
  - LTP is based on experience
  - LTP is based only on local information (Hebb's postulate)
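Hebb's postulate can be sketched as a simple weight-update rule. The two-input setup, the learning rate, and the assumption that the unit fired are illustrative choices, not from the slides:

```python
import numpy as np

def hebbian_update(w, x, y, lr=0.1):
    """Hebb's rule: strengthen w_i when input x_i and output y are active together.
    Weights only grow here, mirroring LTP; real models add decay or bounds."""
    return w + lr * y * x

w = np.zeros(2)
x = np.array([1.0, 0.0])   # only the first presynaptic input is active
y = 1.0                    # assume the postsynaptic unit fired this trial
for _ in range(5):         # repeated co-activation...
    w = hebbian_update(w, x, y)
print(w)                   # ...potentiates only the active synapse (first weight grows to 0.5)
```

Note the locality: the update for weight `w_i` uses only the activity of the two cells it connects, which is exactly the third point about LTP above.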


Structure
- Motivation
- Artificial neural networks
- Backpropagation Algorithm
- Overfitting
- Expressive Capabilities of ANNs
- Summary

Multilayer Networks of Sigmoid Units

Connectionist Models
Consider humans:
- Neuron switching time ~0.001 second
- Number of neurons ~10^10
- Connections per neuron ~10^4-10^5
- Scene recognition time ~0.1 second
- 100 inference steps doesn't seem like enough → much parallel computation
Properties of artificial neural nets (ANNs):
- Many neuron-like threshold switching units
- Many weighted interconnections among units
- Highly parallel, distributed processing

Structure
- Motivation
- Artificial neural networks
- Learning: Backpropagation Algorithm
- Overfitting
- Expressive Capabilities of ANNs
- Summary

Backpropagation Algorithm
Looks for the minimum of the error function in weight space using the method of gradient descent. The combination of weights that minimizes the error function is considered a solution of the learning problem.
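The gradient-descent idea can be sketched on a one-dimensional error function; the quadratic, the learning rate, and the step count below are illustrative choices, not from the slides:

```python
def gradient_descent(grad, x, lr=0.1, steps=100):
    """Repeatedly step against the gradient; converges to a (local) minimum."""
    for _ in range(steps):
        x = x - lr * grad(x)
    return x

# Minimize f(x) = (x - 3)**2, whose gradient is 2*(x - 3).
x_min = gradient_descent(lambda x: 2 * (x - 3), x=0.0)
print(x_min)  # close to 3.0, the minimizer of f
```

Backpropagation applies exactly this update in weight space, with the gradient of the training error computed layer by layer.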

Sigmoid unit
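A sigmoid unit outputs σ(w · x) for the logistic function σ. A minimal sketch (treating w[0] as the bias with x_0 = 1 is an assumed convention), including a numerical check of the identity σ′(z) = σ(z)(1 − σ(z)) that the error-gradient slide relies on:

```python
import numpy as np

def sigmoid(z):
    """Logistic squashing function: sigma(z) = 1 / (1 + e^(-z))."""
    return 1.0 / (1.0 + np.exp(-z))

def sigmoid_unit(w, x):
    """Output of a sigmoid unit: sigma(w . x)."""
    return sigmoid(np.dot(w, x))

# The derivative has the closed form sigma(z) * (1 - sigma(z)), which is
# what makes the backpropagation weight updates cheap to compute.
z = 0.7
analytic = sigmoid(z) * (1 - sigmoid(z))
numeric = (sigmoid(z + 1e-6) - sigmoid(z - 1e-6)) / 2e-6  # central difference
print(analytic, numeric)  # the two agree to many decimal places
```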

Error Gradient for a Sigmoid Unit

Gradient Descent

Incremental (Stochastic) Gradient Descent
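Instead of summing the gradient over the whole training set, the incremental (stochastic) version updates the weights after every single example. A sketch for one linear unit with squared error; the data, learning rate, and epoch count are illustrative choices:

```python
import random

def sgd_linear_unit(data, lr=0.05, epochs=200):
    """Incremental gradient descent for a linear unit o = w0 + w1*x.
    Weights are nudged after each example (the delta rule), not per epoch."""
    w0, w1 = 0.0, 0.0
    for _ in range(epochs):
        random.shuffle(data)        # visit examples in random order (in place)
        for x, t in data:
            o = w0 + w1 * x         # unit output
            err = t - o
            w0 += lr * err          # per-example update ...
            w1 += lr * err * x      # ... approximates the true gradient step
    return w0, w1

# Noise-free data from t = 2x + 1; SGD recovers the weights.
data = [(x, 2 * x + 1) for x in [-2, -1, 0, 1, 2]]
w0, w1 = sgd_linear_unit(data)
print(w0, w1)  # approximately 1.0 and 2.0
```

The per-example steps are noisier than batch gradient descent but far cheaper, which is why this is the variant usually used to train networks.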

Backpropagation Algorithm (MLE)

Derivation of the BP rule
Goal: minimize the training error by gradient descent on the weights.
Error (for training example d): E_d(w) = 1/2 Σ_k (t_k − o_k)², summing over the output units k.
Notation: x_ji is the i-th input to unit j, w_ji its weight, net_j = Σ_i w_ji x_ji the weighted input, o_j = σ(net_j) the unit's output, and δ_j = −∂E_d/∂net_j.

Backpropagation Algorithm (MLE)
For output unit j: δ_j = o_j (1 − o_j)(t_j − o_j)

Backpropagation Algorithm (MLE)
For hidden unit j: δ_j = o_j (1 − o_j) Σ_k δ_k w_kj, summing over the units k downstream of j
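These delta rules can be sketched for a small 2-2-1 sigmoid network. The network sizes and the random example are illustrative, and bias terms are omitted for brevity; the sketch checks the backpropagated gradient against a numerical gradient of the error:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(x, W_h, W_o):
    h = sigmoid(W_h @ x)   # hidden activations
    o = sigmoid(W_o @ h)   # output activation
    return h, o

def backprop(x, t, W_h, W_o):
    """Standard sigmoid-network delta rules:
    output unit: delta_o = o(1-o)(t-o)
    hidden unit: delta_h = h(1-h) * sum_k delta_k w_kh"""
    h, o = forward(x, W_h, W_o)
    delta_o = o * (1 - o) * (t - o)
    delta_h = h * (1 - h) * (W_o.T @ delta_o)
    # The outer products are the negated error gradients for each layer.
    return np.outer(delta_o, h), np.outer(delta_h, x)

# Verify the hidden-layer update against a numerical gradient of E = 1/2 (t-o)^2.
x, t = np.array([1.0, 0.5]), np.array([1.0])
W_h, W_o = rng.normal(size=(2, 2)), rng.normal(size=(1, 2))
_, grad_W_h = backprop(x, t, W_h, W_o)

def error(W):
    _, o = forward(x, W, W_o)
    return 0.5 * np.sum((t - o) ** 2)

eps, num = 1e-6, np.zeros_like(W_h)
for i in range(2):
    for j in range(2):
        Wp, Wm = W_h.copy(), W_h.copy()
        Wp[i, j] += eps
        Wm[i, j] -= eps
        num[i, j] = (error(Wp) - error(Wm)) / (2 * eps)
print(np.max(np.abs(grad_W_h + num)))  # ~0: backprop output equals -dE/dW
```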

More on Backpropagation

Structure
- Motivation
- Artificial neural networks
- Learning: Backpropagation Algorithm
- Overfitting
- Expressive Capabilities of ANNs
- Summary

Overfitting in ANNs

Dealing with Overfitting

K-Fold Cross Validation
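K-fold cross-validation partitions the data into k disjoint folds, holds each one out in turn for validation, and averages the k scores. A sketch of the index bookkeeping (the example count and fold count are illustrative):

```python
import numpy as np

def k_fold_splits(n, k):
    """Partition indices 0..n-1 into k disjoint folds; yield (train, validation)."""
    folds = np.array_split(np.arange(n), k)
    for i in range(k):
        val = folds[i]                # fold i is held out for validation
        train = np.concatenate([f for j, f in enumerate(folds) if j != i])
        yield train, val

# 10 examples, 5 folds: train on 8, validate on the held-out 2, average the scores.
sizes = [(len(tr), len(va)) for tr, va in k_fold_splits(10, 5)]
print(sizes)  # [(8, 2), (8, 2), (8, 2), (8, 2), (8, 2)]
```

Setting k = n gives the leave-one-out variant on the next slide: every example serves as a one-element validation fold exactly once.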

Leave-Out-One Cross Validation

Structure
- Motivation
- Artificial neural networks
- Backpropagation Algorithm
- Overfitting
- Expressive Capabilities of ANNs
- Summary

Expressive Capabilities of ANNs
Single layer: perceptron
The XOR problem

Single Layer: Perceptron

Representational Power of Perceptrons
- A perceptron represents a hyperplane decision surface in the n-dimensional space of instances: w · x = 0
- Linearly separable sets
- Logical functions: AND, OR, ...
- How to learn w?
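The perceptron training rule w ← w + η(t − o)x answers the last question: it converges to a separating hyperplane whenever the examples are linearly separable. A sketch on the separable AND function (the learning rate and epoch count are illustrative):

```python
def perceptron_train(data, lr=0.2, epochs=20):
    """Perceptron rule: w <- w + lr*(t - o)*x, applied per example.
    Converges when the examples are linearly separable."""
    w = [0.0, 0.0, 0.0]                 # w[0] is the bias weight (x_0 = 1)
    for _ in range(epochs):
        for x1, x2, t in data:
            x = (1.0, x1, x2)
            o = 1 if sum(wi * xi for wi, xi in zip(w, x)) > 0 else 0
            w = [wi + lr * (t - o) * xi for wi, xi in zip(w, x)]
    return w

def predict(w, x1, x2):
    """Threshold unit: fire iff the weighted sum crosses the hyperplane."""
    return 1 if w[0] + w[1] * x1 + w[2] * x2 > 0 else 0

# Logical AND is linearly separable, so training succeeds.
AND = [(0, 0, 0), (0, 1, 0), (1, 0, 0), (1, 1, 1)]
w = perceptron_train(AND)
print([predict(w, a, b) for a, b, _ in AND])  # [0, 0, 0, 1]
```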

Single Layer: Perceptron
What about nonlinear (non-separable) sets of examples?

Multi-layer perceptron, XOR
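No single perceptron can compute XOR, but a two-layer network of threshold units can. The particular hand-chosen weights below are one illustrative solution (XOR = OR AND NAND):

```python
def step(z):
    """Threshold unit: fire iff the weighted sum is positive."""
    return 1 if z > 0 else 0

def xor_mlp(x1, x2):
    """Two-layer threshold network computing XOR with hand-chosen weights."""
    h1 = step(x1 + x2 - 0.5)     # hidden unit 1: OR(x1, x2)
    h2 = step(-x1 - x2 + 1.5)    # hidden unit 2: NAND(x1, x2)
    return step(h1 + h2 - 1.5)   # output unit: AND(h1, h2)

print([xor_mlp(a, b) for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]])  # [0, 1, 1, 0]
```

In practice these weights are not wired by hand; backpropagation learns an equivalent hidden representation from the four training examples.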

Multi-layer perceptron

Expressive Capabilities of ANNs

Learning Hidden Layer Representations

Learning Hidden Layer Representations

Learning Hidden Layer Representations
Auto-encoder?

Training

Neural Nets for Face Recognition

Learning Hidden Layer Representations

Structure
- Motivation
- Artificial neural networks
- Backpropagation Algorithm
- Overfitting
- Expressive Capabilities of ANNs
- Summary

Thank you!