Financial Informatics – XIV: Basic Principles. Khurshid Ahmad, Professor of Computer Science, Department of Computer Science, Trinity College, Dublin-2


1 Financial Informatics – XIV: Basic Principles. Khurshid Ahmad, Professor of Computer Science, Department of Computer Science, Trinity College, Dublin-2, IRELAND. November 19th,

2 Neural Networks. The basic premise of this course is to introduce students to an alternative paradigm for building information systems: artificial neural networks.

3 Artificial Neural Networks. An ANN system can be characterised by its ability to learn, its dynamic capability, and its interconnectivity.

4 Artificial Neural Networks: An Operational View. [Figure: model of neuron k — input signals x1…x4 arrive over weighted links wk1…wk4, are combined with the bias bk at the summing junction Σ, and pass through the activation function to give the output signal yk.]

5 Artificial Neural Networks: An Operational View. A neuron is an information-processing unit that forms the key ingredient of a neural network. The diagram above is a model of a biological neuron: the neuron, labelled k, is connected to the (rest of the) neurons in the network, whose outputs are labelled x1, x2, x3, … xj. There are three key ingredients to this model. The first is a set of links, the biological equivalent of synapses, which the kth neuron has with the (rest of the) neurons in the network. Note that each link carries a WEIGHT, denoted wk1, wk2, … wkj, where the first subscript (k in this case) denotes the recipient neuron and the second subscript (1, 2, 3, … j) denotes the neuron transmitting to the recipient. The synaptic weight wkj may lie in a range that includes negative (inhibitory) as well as positive (excitatory) values. (From Haykin 1999:10-12)

6 Artificial Neural Networks: An Operational View. The kth neuron adds up the inputs of all the transmitting neurons at the summing junction, or adder, denoted by Σ. The adder acts as a linear combiner and generates a weighted sum, usually denoted uk: uk = wk1·x1 + wk2·x2 + wk3·x3 + … + wkj·xj. The bias bk has the effect of increasing or decreasing the net input to the activation function, depending on its sign. (From Haykin 1999:10-12)

7 ANNs: an Operational View. Finally, the linear combination, denoted vk = uk + bk, is passed through the activation function, which engenders the non-linear behaviour seen in biological neurons: the inputs to and outputs from a given neuron show complex, often non-linear, behaviour. For example, with a threshold activation function the neuron emits a signal when the biased sum is positive or zero, yk = 1 if vk ≥ 0, and produces no output when it is negative, yk = 0 if vk < 0. There are other models of the activation function, as we will see later. (From Haykin 1999:10-12)
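To make the arithmetic concrete, here is a minimal Python sketch of the neuron just described; the function name and the example numbers are illustrative, not from the slides:

```python
import numpy as np

def neuron_k(x, w, b):
    """One threshold unit: weighted sum, bias, then step activation."""
    u = np.dot(w, x)            # summing junction: u_k = sum_j w_kj * x_j
    v = u + b                   # add the bias: v_k = u_k + b_k
    return 1 if v >= 0 else 0   # threshold activation: y_k = 1 iff v_k >= 0

# Illustrative values (not from the slides): u = 0.57, v = 0.07, so y_k = 1.
y_k = neuron_k(x=np.array([1.0, 0.5, -0.3, 2.0]),
               w=np.array([0.2, -0.4, 0.1, 0.3]),
               b=-0.5)
```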

8 ANNs: an Operational View. [Figure: the neuron model of slide 4, repeated.]

9 ANNs: an Operational View. [Figure: the neuron model of slide 4, repeated.]

10 ANNs: an Operational View: Discontinuous Output. [Figure: the neuron model with a step activation function — a plot of f(net) against net showing no output below the threshold θ and a (normalised) output (e.g. 1) at and above it.]

11 ANNs: an Operational View. [Figure: the neuron model of slide 4, repeated.] The notion of a discontinuous activation function captures the fundamental observation that biological neurons usually fire only if there is 'enough' stimulus available in the environment. But a strictly discontinuous output is biologically implausible, so there must be some degree of continuity in the output if an artificial neuron is to have a degree of biological plausibility.

12 ANNs: an Operational View: Pseudo-Continuous Output. [Figure: a plot of f(net) against net — the output rises from α at the threshold θ to β at the saturation threshold θ′.]

13 ANNs: an Operational View. [Figure: a schematic for an 'electronic' neuron — the same input signals, weights, bias, summing junction and activation function as the neuron model of slide 4.]

14 ANN’s: an Operational View Neural Nets as directed graphs A directed graph is a geometrical object consisting of a set of points (called nodes) along with a set of directed line segments (called links) between them. A neural network is a parallel distributed information processing structure in the form of a directed graph.
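As a toy illustration of this view, a network can be stored directly as a weighted directed graph; the node names below are made up:

```python
# Nodes are units; each directed link (source, target) carries a weight.
links = {
    ("x1", "k"): 0.8,    # from input unit x1 to unit k
    ("x2", "k"): -0.5,
    ("k",  "y"): 1.0,    # from unit k to output unit y
}

def fan_in(node):
    """All (source, weight) pairs feeding the given node."""
    return [(src, w) for (src, dst), w in links.items() if dst == node]
```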

15 ANNs: an Operational View. [Figure: a generic processing unit — several input connections enter the unit; its single output connection fans out to other units.]

16 ANNs: an Operational View. A neural network comprises (see the sketch after this list):
- a set of processing units;
- a state of activation;
- an output function for each unit;
- a pattern of connectivity among units;
- a propagation rule for propagating patterns of activities through the network;
- an activation rule for combining the inputs impinging on a unit with the current state of that unit, to produce a new level of activation for the unit;
- a learning rule whereby patterns of connectivity are modified by experience;
- an environment within which the system must operate.
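One way to see how these eight ingredients fit together is the schematic class below. It is a sketch under assumed choices (a tanh activation rule and a Hebbian-style learning rule), not a reference implementation:

```python
import numpy as np

class Network:
    """Schematic mapping of the eight ingredients listed above onto code."""
    def __init__(self, n_units):
        self.activation = np.zeros(n_units)           # state of activation
        self.weights = np.zeros((n_units, n_units))   # pattern of connectivity

    def output(self):                                 # output function per unit
        return self.activation

    def propagate(self, external_input):
        # propagation rule: activity flows along the weighted links
        net = self.weights @ self.output() + external_input
        # activation rule (assumed tanh): combine impinging input into a new state
        self.activation = np.tanh(net)

    def learn(self, rate=0.01):
        # learning rule (assumed Hebbian): strengthen co-active connections
        out = self.output()
        self.weights += rate * np.outer(out, out)

# The "environment" is whatever supplies external_input and judges the output.
```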

17 The McCulloch-Pitts Network. McCulloch and Pitts demonstrated that any logical function can be duplicated by some network of all-or-none neurons, referred to as an artificial neural network (ANN). Thus, an artificial neuron can be embedded into a network in such a manner as to fire selectively in response to any given spatio-temporal array of firings of other neurons in the ANN.

18 The McCulloch-Pitts Network. The network demonstrates that any logical function can be implemented by some network of neurons. There are rules governing the excitatory and inhibitory pathways:
- All computations are carried out in discrete time intervals.
- Each neuron obeys a simple form of a linear threshold law: a neuron fires whenever at least a given (threshold) number of excitatory pathways, and no inhibitory pathways, impinging on it were active in the previous time period.
- If a neuron receives a single inhibitory signal from an active neuron, it does not fire.
- The connections do not change as a function of experience; thus the network deals with performance, not learning.
A sketch of this firing law follows the list.
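A minimal Python sketch of the firing law stated above; the function and its arguments are illustrative:

```python
def mp_fires(excitatory_active, inhibitory_active, threshold):
    """McCulloch-Pitts law: fire iff enough excitation and no inhibition."""
    if any(inhibitory_active):                  # one inhibitory signal vetoes firing
        return False
    return sum(excitatory_active) >= threshold  # at least `threshold` active pathways

# Two active excitatory pathways, threshold 2, no inhibition: the cell fires.
assert mp_fires([1, 1, 0], [0, 0], threshold=2)
# Any active inhibitory pathway blocks firing, however strong the excitation.
assert not mp_fires([1, 1, 1], [1, 0], threshold=2)
```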

19 The McCulloch-Pitts Network: Computations in a McCulloch-Pitts Network. 'Each cell is a finite-state machine and accordingly operates in discrete time instants, which are assumed synchronous among all cells. At each moment, a cell is either firing or quiet, the two possible states of the cell' – the firing state produces a pulse and the quiet state has no pulse. (Bose and Liang 1996:21) 'Each neural network built from McCulloch-Pitts cells is a finite-state machine[, and each] finite-state machine is equivalent to and can be simulated by some neural network.' (ibid 1996:23) 'The importance of the McCulloch-Pitts model is its applicability in the construction of sequential machines to perform logical operations of any degree of complexity. The model focused on logical and macroscopic cognitive operations, not detailed physiological modelling of the electrical activity of the nervous system. In fact, this deterministic model with its discretization of time and summation rules does not reveal the manner in which biological neurons integrate their inputs.' (ibid 1996:25)

20 The McCulloch-Pitts Network. Consider a McCulloch-Pitts network which can act as a minimal model of the sensation of heat from holding a cold object to the skin and then removing it, or leaving it on permanently. Each cell has a threshold of TWO, and hence fires whenever it receives two excitatory (+) and no inhibitory (−) signals from other cells at the previous time step.

21 The McCulloch-Pitts Network. [Figure: the heat-sensing network — heat and cold receptors (cells 1 and 2), hidden cells a and b, and output cells 3 ('hot') and 4 ('cold').]

22 The McCulloch-Pitts Network. Truth table of the firing neurons when the cold object contacts the skin and is then removed:

Time | Cell 1 (INPUT) | Cell 2 (INPUT) | Cell a (HIDDEN) | Cell b (HIDDEN) | Cell 3 (OUTPUT) | Cell 4 (OUTPUT)
  1  | No  | Yes | No  | No  | No  | No
  2  | No  | No  | Yes | No  | No  | No
  3  | No  | No  | No  | Yes | No  | No
  4  | No  | No  | No  | No  | Yes | No

23 The McCulloch-Pitts Network: Heat-Sensing Network. The 'feel hot'/'feel cold' neurons show how to create an OUTPUT UNIT RESPONSE to given INPUTS that depends ONLY on the previous values. This is known as TEMPORAL CONTRAST ENHANCEMENT. The absence or presence of a stimulus in the PREVIOUS time cycle plays the major role here. The McCulloch-Pitts network demonstrates how this ENHANCEMENT can be simulated using an ALL-OR-NONE network.

24 The McCulloch-Pitts Network: Heat-Sensing Network. [Table repeated without entries: truth table of the firing neurons for the case when the cold object is left in contact with the skin — a simulation of temporal contrast enhancement; see the next slide.]

25 The McCulloch-Pitts Network: Heat-Sensing Network. Truth table of the firing neurons for the case when the cold object is left in contact with the skin — a simulation of temporal contrast enhancement:

Time | Cell 1 (INPUT) | Cell 2 (INPUT) | Cell a (HIDDEN) | Cell b (HIDDEN) | Cell 3 (OUTPUT) | Cell 4 (OUTPUT)
  1  | No  | Yes | No  | No  | No  | No
  2  | No  | Yes | Yes | No  | No  | No
  3  | No  | Yes | Yes | No  | No  | Yes

[Figure: the heat-sensing network, as on slide 21.]
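The slides give the cells' behaviour but not the individual synapses. The sketch below simulates one plausible wiring — an assumption, not taken from the slides — that reproduces both truth tables: cell 2 drives a; a drives b unless cell 2 inhibits it; b (or the heat receptor, cell 1) drives 3 ('feel hot'); and cells 2 and a jointly drive 4 ('feel cold'):

```python
THRESHOLD = 2  # every cell fires on two excitatory signals and no inhibition

# Assumed wiring: target cell -> list of (source cell, number of synapses)
EXCITATORY = {"a": [("2", 2)],
              "b": [("a", 2)],
              "3": [("1", 2), ("b", 2)],
              "4": [("2", 1), ("a", 1)]}
INHIBITORY = {"b": ["2"]}  # persistent cold stops b, and hence "feel hot"

def step(active):
    """Synchronous update: which non-receptor cells fire at the next instant."""
    fired = set()
    for cell, fan_in in EXCITATORY.items():
        if any(src in active for src in INHIBITORY.get(cell, [])):
            continue  # a single inhibitory signal vetoes firing
        if sum(n for src, n in fan_in if src in active) >= THRESHOLD:
            fired.add(cell)
    return fired

def simulate(cold_schedule):
    """cold_schedule[t] is True when the cold receptor (cell 2) is stimulated."""
    active = set()
    for t, cold in enumerate(cold_schedule, start=1):
        active = ({"2"} if cold else set()) | step(active)
        print(t, sorted(active))

simulate([True, False, False, False])  # brief cold: 2 -> a -> b -> 3 ("hot")
simulate([True, True, True])           # sustained cold: 2; 2,a; 2,a,4 ("cold")
```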

26 The McCulloch-Pitts Network: Memory Models. [Figure: two memory circuits built from cells A and B with excitatory (+) connections — the three-stimulus model and the permanent-memory model.]

27 The McCulloch-Pitts Network: Memory Models — the Permanent Memory Model. In the permanent memory model, the output neuron has threshold '1'; neuron 2 fires if the light has ever been on at any time in the past. Levine, D. S. (1991:16)

28 The McCulloch-Pitts Network: Memory Models — the Three-Stimulus Model. Consider the three-stimulus all-or-none neural network (cells 1, A, B and 2). In this network, neuron 1 responds to a light being on. Each of the neurons has threshold '3', and all connections are unit positive. Neuron 2 fires after the light has been on for three time units in a row:

Time | Cell 1 | Cell A | Cell B | Cell 2
  1  | Yes | No  | No  | No
  2  | Yes | No  | Yes | No
  3  | Yes | Yes | Yes | No
  4  | Yes | Yes | Yes | Yes
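Again the slide fixes the thresholds and the behaviour but not the synapse counts. One assumed wiring that reproduces the table is simulated below: B gets three unit links from cell 1, A gets two from cell 1 and one from B, and cell 2 gets one each from 1, A and B:

```python
# Assumed wiring (unit positive connections, threshold 3 everywhere):
FAN_IN = {"A": [("1", 2), ("B", 1)],
          "B": [("1", 3)],
          "2": [("1", 1), ("A", 1), ("B", 1)]}

active = set()
for t in range(1, 5):
    fired = {cell for cell, ins in FAN_IN.items()
             if sum(n for src, n in ins if src in active) >= 3}
    active = {"1"} | fired   # the light stays on, so cell 1 fires every step
    print(t, sorted(active))
# prints: 1 ['1'] | 2 ['1','B'] | 3 ['1','A','B'] | 4 ['1','2','A','B']
```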

29 The McCulloch-Pitts Network. Why is a McCulloch-Pitts network an FSM? A finite-state machine (FSM) is an AUTOMATON. An input string is read from left to right; the machine looks at each symbol in turn. At any time the FSM is in one of finitely many internal states. The state changes after each input symbol is read. The NEW STATE depends (only) on the symbol just read and on the current state.

30 The McCulloch-Pitts Network. 'The McCulloch-Pitts model, though it uses an oversimplified formulation of neural activity patterns, presages some issues that are still important in current cognitive models. [..] [Some] modern connectionist networks contain three types of units or nodes – input units, output units, and hidden units. The input units react to particular data features from the environment […]. The output units generate particular organismic responses […]. The hidden units are neither input nor output units themselves but, via network connections, influence output units to respond to prescribed patterns of input unit firings or activities. [..] [This] input-output-hidden trilogy can be seen as analogous to the distinction between sensory neurons, motor neurons, and all other [neurons] (interneurons) in the brain.' Levine, Daniel S. (1991:14-15)

31 The McCulloch-Pitts Network. Linear neuron: the output is the weighted sum of all the inputs. McCulloch-Pitts neuron: the output is the thresholded value of the weighted sum. Input vector: x = (1, −20, 4, −2); weight vector: wj = (wj1, wj2, wj3, wj4) = [0.8, 0.2, −1, −0.9]. [Figure: a unit j with inputs x1…x4, weights wj1…wj4, a 0th (bias) input, and output yj.]

32 The McCulloch-Pitts Network. vj = Σi wji·xi; y = f(vj); y = 0 if vj < 0, y = 1 if vj ≥ 0. Input vector: x = (1, −20, 4, −2); weight vector: wj = (wj1, wj2, wj3, wj4) = [0.8, 0.2, −1, −0.9]; wj0 = 0, x0 = 0 (the 0th input carries the bias). [Figure: the same unit j as on slide 31.]

33 The McCulloch-Pitts Network. Input vector: x = (1, −20, 4, −2); weight vector: w = (wj1, wj2, wj3, wj4) = [0.8, 0.2, −1, −0.9]; wj0 = 0, x0 = 0. vj = Σi wji·xi; y = f(vj), where f is the activation function. Linear neuron: y = v. McCulloch-Pitts neuron: y = 0 if v < 0, y = 1 if v ≥ 0. Sigmoid activation function: f(v) = 1/(1 + exp(−v)).
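Working through the slide's numbers (with bias wj0 = 0) gives v = −5.4, so the three neuron types give visibly different outputs; a quick check:

```python
import math

x = [1, -20, 4, -2]        # input vector from the slide
w = [0.8, 0.2, -1, -0.9]   # weight vector from the slide (bias w_j0 = 0)

v = sum(wi * xi for wi, xi in zip(w, x))  # 0.8 - 4.0 - 4.0 + 1.8 = -5.4

linear_out  = v                        # linear neuron:   y = v = -5.4
mp_out      = 1 if v >= 0 else 0       # McCulloch-Pitts: y = 0, since v < 0
sigmoid_out = 1 / (1 + math.exp(-v))   # sigmoid:         y ≈ 0.0045
```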

34 The McCulloch-Pitts Network. Under what circumstances will a neuron with a sigmoidal activation function act like a McCulloch-Pitts neuron? When the synaptic weights are large. Under what circumstances will it act like a linear neuron? When the synaptic weights are small.
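A quick numerical illustration of both answers: scaling every weight by a gain g turns f(v) into f(g·v), which is nearly a step for large g and nearly linear (f(v) ≈ 0.5 + v/4 near zero) for small g. The gain values below are illustrative:

```python
import math

def sigmoid(v):
    return 1 / (1 + math.exp(-v))

for v in (-1.0, -0.1, 0.1, 1.0):
    print(v, round(sigmoid(100 * v), 4), round(sigmoid(0.01 * v), 4))
# Gain 100: outputs are ~0 or ~1 -- McCulloch-Pitts (threshold) behaviour.
# Gain 0.01: outputs hug 0.5 + (0.01 * v) / 4 -- approximately linear behaviour.
```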

35 The McCulloch-Pitts Network. The key outcome of early research in artificial neural networks was a clear demonstration of the theoretical importance (brain-like behaviour and a logical basis) and the extensive utility (regime-switching models) of threshold behaviour. This behaviour was emulated through the use of squashing functions and is the basis of many a simulation.