
Azhari, Dr., Computer Science, UGM

The human brain is a densely interconnected network of approximately 10^11 neurons, each connected, on average, to 10^4 others. Neuron activity is excited or inhibited through connections to other neurons. The fastest neuron switching times are known to be on the order of 10^-3 seconds.

Gross physical structure:
– There is one axon that branches.
– There is a dendritic tree that collects input from other neurons.
Axons typically contact dendritic trees at synapses:
– A spike of activity in the axon causes charge to be injected into the post-synaptic neuron.
Spike generation:
– There is an axon hillock that generates outgoing spikes whenever enough charge has flowed in at synapses to depolarize the cell membrane.
(Figure: a neuron's body, axon, and dendritic tree.)

The cell itself includes a nucleus (at the center). The dendrites provide input signals to the cell. The axon of cell 1 sends output signals to cell 2 via the axon terminals, which connect to the dendrites of cell 2.

Signals can be transmitted unchanged, or they can be altered by synapses. A synapse is able to increase or decrease the strength of the connection from neuron to neuron and cause excitation or inhibition of a subsequent neuron. This is where information is stored. The information processing abilities of biological neural systems must follow from highly parallel processes operating on representations that are distributed over many neurons. One motivation for ANNs is to capture this kind of highly parallel computation based on distributed representations.

When a spike travels along an axon and arrives at a synapse, it causes vesicles of transmitter chemical to be released.
– There are several kinds of transmitter.
The transmitter molecules diffuse across the synaptic cleft and bind to receptor molecules in the membrane of the post-synaptic neuron, thus changing their shape.
– This opens up holes that allow specific ions in or out.
The effectiveness of the synapse can be changed:
– vary the number of vesicles of transmitter;
– vary the number of receptor molecules.
Synapses are slow, but they have advantages over RAM:
– they are very small;
– they adapt using locally available signals (but how?).

Each neuron receives inputs from other neurons.
– Some neurons also connect to receptors.
– Cortical neurons use spikes to communicate.
– The timing of spikes is important.
The effect of each input line on the neuron is controlled by a synaptic weight.
– The weights can be positive or negative.
The synaptic weights adapt so that the whole network learns to perform useful computations:
– recognizing objects, understanding language, making plans, controlling the body.
You have about 10^11 neurons, each with about 10^4 weights.
– A huge number of weights can affect the computation in a very short time: much better bandwidth than a Pentium.

An ANN is composed of processing elements called perceptrons, organized in different ways to form the network's structure. An ANN consists of perceptrons. Each perceptron receives inputs, processes them, and delivers a single output. The input can be raw input data or the output of other perceptrons. The output can be the final result (e.g. 1 means yes, 0 means no), or it can serve as input to other perceptrons.
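The flow just described can be sketched in a few lines of Python. This is a minimal illustration; the weights, threshold, and wiring below are assumed example values, not taken from the slides:

```python
# A minimal sketch of one perceptron: it receives inputs, forms a weighted
# sum, and delivers a single output through a step activation.

def perceptron(inputs, weights, threshold):
    """Return 1 if the weighted sum of inputs reaches the threshold, else 0."""
    z = sum(x * w for x, w in zip(inputs, weights))
    return 1 if z >= threshold else 0

# The output of one perceptron can serve as input to another:
hidden = perceptron([1, 0], [0.6, 0.6], 0.5)       # fires: 0.6 >= 0.5
output = perceptron([hidden, 1], [1.0, 0.0], 0.5)  # uses the first unit's output
```

Chaining units this way is exactly how outputs of some perceptrons become inputs of others in a network.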


Inputs:
– Each input corresponds to a single attribute of the problem.
– For example, for the diagnosis of a disease, each symptom could represent an input to one node.
– The input could be an image (pattern) of skin texture if we are looking for the diagnosis of normal or cancerous cells.
Outputs:
– The outputs of the network represent the solution to a problem.
– For diagnosis of a disease, the answer could be yes or no.
Weights:
– A key element of an ANN is the weight.
– A weight expresses the relative strength of the entering signal from the various connections that transfer data from an input point to the output point.

These are simple but computationally limited:
– If we can make them learn, we may get insight into more complicated neurons.
The output is a linear function of the inputs: y = b + Σ_i x_i w_i, where b is a bias and the w_i are the weights.

McCulloch-Pitts (1943): influenced von Neumann!
– First compute a weighted sum of the inputs from other neurons: z = Σ_i x_i w_i.
– Then send out a fixed-size spike of activity if the weighted sum exceeds a threshold: y = 1 if z >= threshold, and y = 0 otherwise.
– Maybe each spike is like the truth value of a proposition, and each neuron combines truth values to compute the truth value of another proposition!
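The proposition-combining idea can be illustrated with a minimal McCulloch-Pitts unit in Python. The specific weights and thresholds used for AND and OR are standard textbook choices, not values from the slide:

```python
# A McCulloch-Pitts unit: fixed weights, fixed threshold, binary output.
def mp_unit(xs, ws, theta):
    z = sum(x * w for x, w in zip(xs, ws))  # weighted sum of inputs
    return 1 if z >= theta else 0           # spike if the sum reaches the threshold

# Combining truth values: with unit weights, threshold 2 gives AND,
# threshold 1 gives OR.
def AND(a, b):
    return mp_unit([a, b], [1, 1], 2)

def OR(a, b):
    return mp_unit([a, b], [1, 1], 1)
```

Each unit thus computes the truth value of a new proposition from the truth values on its input lines.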

These have a confusing name: they compute a linear weighted sum of their inputs, but the output is a non-linear function of the total input: y = z if z > 0, and y = 0 otherwise, where z is the weighted sum of the inputs.
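One common reading of this unit can be sketched as follows (a minimal illustration; the exact threshold convention on the original slide is not recoverable, so a threshold of zero is assumed):

```python
# Linear threshold unit: the total input z is a linear weighted sum,
# but the output is a non-linear function of z (z itself when positive, else 0).
def linear_threshold(xs, ws, bias=0.0):
    z = bias + sum(x * w for x, w in zip(xs, ws))
    return z if z > 0 else 0.0
```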

These give a real-valued output that is a smooth and bounded function of their total input.
– Typically they use the logistic function: y = 1 / (1 + e^(-z)).
– They have nice derivatives, which make learning easy.
If we treat the output y as the probability of producing a spike, we get stochastic binary neurons.
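The logistic function and its "nice" derivative can be written down directly; the convenient property is that dy/dz can be expressed in terms of the output y itself:

```python
import math

def logistic(z):
    """Smooth, bounded output in (0, 1)."""
    return 1.0 / (1.0 + math.exp(-z))

def logistic_deriv(z):
    y = logistic(z)
    return y * (1.0 - y)  # dy/dz = y(1 - y): cheap to compute during learning
```

Because the derivative reuses the already-computed output, gradient-based learning needs no extra function evaluations.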

The AND gate as a neuron model:
X1  X2 | Z (AND)
 0   0 |  0
 0   1 |  0
 1   0 |  0
 1   1 |  1
The weights w1 and w2 must be found, starting from the given initial values of w1 and w2.
(Figure: a real-world neuron alongside the neuron model.)
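The search for w1 and w2 can be sketched with the standard perceptron learning rule. This is a hedged illustration: the initial weights, learning rate, and the bias formulation (the threshold folded in as a bias term) are assumptions, since the slide does not specify them:

```python
# Perceptron learning rule for the AND gate: starting from given initial
# weights, repeatedly adjust w1, w2 (and a bias standing in for the threshold)
# until all four training rows are classified correctly.

data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]

w = [0.0, 0.0]   # assumed initial w1, w2
b = 0.0          # bias (acts as a negative threshold)
lr = 0.1         # assumed learning rate

for _ in range(20):                     # a few passes over the training set
    for (x1, x2), target in data:
        y = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
        err = target - y
        w[0] += lr * err * x1           # perceptron update rule
        w[1] += lr * err * x2
        b    += lr * err
```

Since AND is linearly separable, the rule converges to a stable set of weights after a handful of passes.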

(Figures: the result for stable weights; the activation function.)

ANN learning is well suited to problems in which the training data corresponds to noisy, complex sensor data. It is also applicable to problems for which more symbolic representations are used. It is appropriate for problems with these characteristics:
– input is high-dimensional, discrete or real-valued (e.g. raw sensor input);
– output is discrete or real-valued;
– output is a vector of values;
– possibly noisy data;
– long training times are acceptable;
– fast evaluation of the learned function is required;
– it is not important for humans to understand the weights.
Examples: speech phoneme recognition, image classification, financial prediction, medical diagnosis.


A NN is a machine-learning approach inspired by the way in which the brain performs a particular learning task.
– Knowledge about the learning task is given in the form of examples.
– Inter-neuron connection strengths (weights) are used to store the acquired information (the training examples).
– During the learning process the weights are modified in order to model the particular learning task correctly on the training examples.