
1 Financial Informatics XIV: Basic Principles. Khurshid Ahmad, Professor of Computer Science, Department of Computer Science, Trinity College, Dublin-2, IRELAND. November 19th, 2008. https://www.cs.tcd.ie/Khurshid.Ahmad/Teaching.html

2 Neural Networks: Artificial Neural Networks. The basic premise of this course, Neural Networks, is to introduce our students to an alternative paradigm for building information systems.

3 Artificial Neural Networks. An ANN system can be characterised by its ability to learn, its dynamic capability, and its interconnectivity.

4 Artificial Neural Networks: An Operational View. [Figure: model of neuron k. Input signals x_1 ... x_4 arrive over weighted links w_k1 ... w_k4, are combined with the bias b_k at the summing junction Σ, and pass through the activation function to give the output signal y_k.]

5 Artificial Neural Networks: An Operational View. A neuron is an information processing unit forming the key ingredient of a neural network. The diagram above is a model of such a neuron. There are three key ingredients of this neuron, labelled k, which is connected to the (rest of the) neurons in the network, labelled 1, 2, 3, ... j. The first is a set of links, the biological equivalent of synapses, which the kth neuron has with the (rest of the) neurons in the network. Note that each link has a WEIGHT, denoted w_k1, w_k2, ... w_kj, where the first subscript (k in this case) denotes the recipient neuron and the second subscript (1, 2, 3 ... j) denotes the neuron transmitting to the recipient. The synaptic weight w_kj may lie in a range that includes negative (inhibitory) as well as positive (excitatory) values. (From Haykin 1999:10-12)

6 Artificial Neural Networks: An Operational View. The kth neuron adds up the inputs of all the transmitting neurons at the summing junction, or adder, denoted by Σ. The adder acts as a linear combiner and generates a weighted sum, usually denoted u_k: u_k = w_k1*x_1 + w_k2*x_2 + w_k3*x_3 + ... + w_kj*x_j. The bias (b_k) has the effect of increasing or decreasing the net input to the activation function, depending on the value of the bias. (From Haykin 1999:10-12)

7 ANNs: an Operational View. Finally, the linear combination, denoted v_k = u_k + b_k, is passed through the activation function, which engenders the non-linear behaviour seen in biological neurons: the inputs to and outputs from a given neuron show a complex, often non-linear, behaviour. For example, if the output from the adder is positive or zero, the neuron emits a signal, y_k = 1 if φ(v_k) ≥ 0; if the output from the adder is negative, there is no output, y_k = 0 if φ(v_k) < 0. There are other models of the activation function, as we will see later. (From Haykin 1999:10-12)
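A minimal sketch of this computation in Python (the function name and the input values are illustrative, not from the lecture): weighted sum at the summing junction, bias, then a hard-limit activation.

```python
def neuron_output(x, w, b):
    """Compute y_k for one neuron: step activation over sum(w_kj * x_j) + b_k."""
    u = sum(w_j * x_j for w_j, x_j in zip(w, x))  # summing junction (linear combiner)
    v = u + b                                     # induced local field v_k = u_k + b_k
    return 1 if v >= 0 else 0                     # hard limit: fire iff v_k >= 0

# Example with four inputs, as in the diagram (values chosen arbitrarily).
print(neuron_output(x=[1.0, 0.5, -0.3, 0.2], w=[0.4, -0.1, 0.6, 0.3], b=-0.2))
```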

8 ANNs: an Operational View. [Figure: the neuron model of slide 4, repeated.]

9 ANNs: an Operational View. [Figure: the neuron model of slide 4, repeated.]

10 ANNs: an Operational View. Discontinuous Output. [Figure: the neuron model, with a plot of f(net) against net: no output below the threshold (θ), a (normalised) output (e.g. 1) above it.]

11 ANNs: an Operational View. [Figure: the neuron model of slide 4.] The notion of a discontinuous function simulates the fundamental notion that biological neurons usually fire only if there is 'enough' stimulus available in the environment. But a discontinuous output is biologically implausible, so there must be some degree of continuity in the output if an artificial neuron is to have a degree of biological plausibility.

12 ANNs: an Operational View. Pseudo-Continuous Output. [Figure: the neuron model, with a plot of f(net) against net: the output rises from α at the threshold (θ) to β at the saturation threshold (θ').]

13 ANNs: an Operational View. A schematic for an 'electronic' neuron. [Figure: the neuron model of slide 4, redrawn as a schematic.]

14 ANNs: an Operational View. Neural Nets as directed graphs. A directed graph is a geometrical object consisting of a set of points (called nodes) along with a set of directed line segments (called links) between them. A neural network is a parallel distributed information-processing structure in the form of a directed graph.
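As a sketch of this directed-graph view (the node labels and weights are mine, assumed for illustration), a network's links can be held as weighted directed edges:

```python
# Nodes are unit labels; each weighted link (source -> target) is a directed edge.
weights = {
    ("x1", "k"): 0.4,   # link from input node x1 into neuron k
    ("x2", "k"): -0.1,  # link from input node x2 into neuron k
    ("k", "y"): 1.0,    # neuron k's output link
}

def predecessors(node):
    """All nodes with a directed link into `node`."""
    return [src for (src, dst) in weights if dst == node]

print(predecessors("k"))  # ['x1', 'x2']
```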

15 ANNs: an Operational View. [Figure: a single processing unit with input connections, an output connection, and a fan-out to other units.]

16 ANNs: an Operational View. A neural network comprises (a rough code skeleton follows this list):
- A set of processing units
- A state of activation
- An output function for each unit
- A pattern of connectivity among units
- A propagation rule for propagating patterns of activities through the network
- An activation rule for combining the inputs impinging on a unit with the current state of that unit, to produce a new level of activation for the unit
- A learning rule whereby patterns of connectivity are modified by experience
- An environment within which the system must operate
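The skeleton below is one rough way to map those ingredients onto code; the class and method names are mine, and the Hebbian update stands in for "a learning rule" only as an illustration. The weight convention w[k][j] (into unit k, from unit j) follows slide 5.

```python
import math

class TinyNet:
    def __init__(self, n):
        self.activation = [0.0] * n                # state of activation
        self.w = [[0.0] * n for _ in range(n)]     # pattern of connectivity
    def output(self, k):                           # output function for each unit
        return 1.0 / (1.0 + math.exp(-self.activation[k]))
    def propagate(self, k):                        # propagation rule: weighted fan-in
        return sum(self.w[k][j] * self.output(j) for j in range(len(self.w)))
    def activate(self, k):                         # activation rule: new state from net input
        self.activation[k] = self.propagate(k)
    def hebb(self, k, j, rate=0.1):                # a learning rule (Hebbian, illustrative)
        self.w[k][j] += rate * self.output(k) * self.output(j)

# The "environment" is whatever supplies inputs to, and reads outputs from, the net.
net = TinyNet(3)
net.w[2][0] = 0.5     # connect unit 0 into unit 2
net.activate(2)
print(net.output(2))  # ~0.562
```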

17 The McCulloch-Pitts Network. McCulloch and Pitts demonstrated that any logical function can be duplicated by some network of all-or-none neurons, referred to as an artificial neural network (ANN). Thus, an artificial neuron can be embedded into a network in such a manner as to fire selectively in response to any given spatio-temporal array of firings of other neurons in the ANN.

18 The McCulloch-Pitts Network. The model demonstrates that any logical function can be implemented by some network of neurons. There are rules governing the excitatory and inhibitory pathways (a code sketch of the threshold law follows this list):
- All computations are carried out in discrete time intervals.
- Each neuron obeys a simple form of a linear threshold law: a neuron fires whenever at least a given (threshold) number of excitatory pathways, and no inhibitory pathways, impinging on it were active in the previous time period.
- If a neuron receives a single inhibitory signal from an active neuron, it does not fire.
- The connections do not change as a function of experience. Thus the network deals with performance, not learning.
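A minimal sketch of that linear threshold law (function and argument names are illustrative):

```python
def mp_fires(active_excitatory, active_inhibitory, threshold):
    """Fire iff enough excitatory inputs were active last step and no inhibitory one was."""
    if active_inhibitory > 0:          # a single inhibitory signal vetoes firing
        return False
    return active_excitatory >= threshold

print(mp_fires(active_excitatory=2, active_inhibitory=0, threshold=2))  # True
print(mp_fires(active_excitatory=5, active_inhibitory=1, threshold=2))  # False
```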

19 The McCulloch-Pitts Network. Computations in a McCulloch-Pitts Network. 'Each cell is a finite-state machine and accordingly operates in discrete time instants, which are assumed synchronous among all cells. At each moment, a cell is either firing or quiet, the two possible states of the cell': the firing state produces a pulse and the quiet state has no pulse. (Bose and Liang 1996:21)
'Each neural network built from McCulloch-Pitts cells is a finite-state machine [and, conversely, every] finite-state machine is equivalent to and can be simulated by some neural network.' (ibid 1996:23)
'The importance of the McCulloch-Pitts model is its applicability in the construction of sequential machines to perform logical operations of any degree of complexity. The model focused on logical and macroscopic cognitive operations, not detailed physiological modelling of the electrical activity of the nervous system. In fact, this deterministic model with its discretization of time and summation rules does not reveal the manner in which biological neurons integrate their inputs.' (ibid 1996:25)

20 The McCulloch-Pitts Network. Consider a McCulloch-Pitts network which can act as a minimal model of the sensation of heat from holding a cold object to the skin and then removing it, or leaving it on permanently. Each cell has a threshold of TWO; hence it fires whenever it receives two excitatory (+) and no inhibitory (-) signals from other cells at the previous time.

21 The McCulloch-Pitts Network. Heat Sensing Network. [Figure: the 'Hot' and 'Cold' heat receptors (cells 1 and 2) feed the hidden cells A and B and the output cells 3 ('hot') and 4 ('cold') through ten excitatory (+) links and one inhibitory (-) link.]

22 The McCulloch-Pitts Network. Heat Sensing Network. [Figure as on the previous slide.] Truth table of the firing neurons when the cold object contacts the skin and is then removed:

Time  Cell 1   Cell 2   Cell a    Cell b    Cell 3    Cell 4
      (INPUT)  (INPUT)  (HIDDEN)  (HIDDEN)  (OUTPUT)  (OUTPUT)
1     No       Yes      No        No        No        No
2     No       No       Yes       No        No        No
3     No       No       No        Yes       No        No
4     No       No       No        No        Yes       No

23 The McCulloch-Pitts Network. Heat Sensing Network. The 'feel hot'/'feel cold' neurons show how to create an OUTPUT UNIT RESPONSE to given INPUTS that depends ONLY on the previous values. This is known as TEMPORAL CONTRAST ENHANCEMENT. The absence or presence of a stimulus in the PREVIOUS time cycle plays a major role here. The McCulloch-Pitts network demonstrates how this enhancement can be simulated using an ALL-OR-NONE network.


25 The McCulloch-Pitts Network. Heat Sensing Network. Truth table of the firing neurons for the case when the cold object is left in contact with the skin – a simulation of temporal contrast enhancement:

Time  Cell 1   Cell 2   Cell a    Cell b    Cell 3    Cell 4
      (INPUT)  (INPUT)  (HIDDEN)  (HIDDEN)  (OUTPUT)  (OUTPUT)
1     No       Yes      No        No        No        No
2     No       Yes      Yes       No        No        No
3     No       Yes      Yes       No        No        Yes

[Figure as on slide 21.]
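The simulation below reproduces both truth tables. The wiring is my reconstruction from the figure and the tables (four double-strength excitatory links, two single-strength ones, and one inhibitory link), so treat it as an assumption rather than the lecture's exact circuit. Thresholds are all 2 and inhibition is absolute, per slide 18.

```python
# Cells: 1 = heat receptor, 2 = cold receptor, a/b hidden, 3 = 'feel hot', 4 = 'feel cold'.
EXCITATORY = {          # (source -> target): total excitatory weight (assumed wiring)
    ("1", "3"): 2, ("2", "a"): 2, ("a", "b"): 2, ("b", "3"): 2,
    ("2", "4"): 1, ("a", "4"): 1,
}
INHIBITORY = [("2", "b")]   # the single '-' link in the figure (assumed placement)

def step(state, heat_on, cold_on):
    """One synchronous update: the new firing pattern from the previous one."""
    new = {"1": heat_on, "2": cold_on}
    for cell in ("a", "b", "3", "4"):
        vetoed = any(state[src] for (src, dst) in INHIBITORY if dst == cell)
        drive = sum(w for (src, dst), w in EXCITATORY.items()
                    if dst == cell and state[src])
        new[cell] = (not vetoed) and drive >= 2   # threshold 2, absolute inhibition
    return new

state = dict.fromkeys("12ab34", False)
for t, cold in enumerate([True, False, False, False], start=1):  # cold touched, then removed
    state = step(state, heat_on=False, cold_on=cold)
    print(t, {c: state[c] for c in "12ab34"})
# Cell 3 ('feel hot') fires at t=4, matching the first truth table; feeding
# cold_on=True at every step makes cell 4 fire from t=3, matching the second.
```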

26 The McCulloch-Pitts Network. Memory Models. [Figure: two networks. The three stimulus model (cells 1, A, B and 2, with excitatory links) and the permanent memory model (cells 1 and 2, with two excitatory links).]

27 The McCulloch-Pitts Network. Memory Models: the permanent memory model. [Figure: permanent memory model, cells 1 and 2 with excitatory links.] In the permanent memory model, the output neuron has threshold 1; neuron 2 fires if the light has ever been on at any time in the past. Levine, D. S. (1991:16)
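A sketch of this latch in code. The self-excitatory link on neuron 2 is assumed from the figure (with threshold 1, either input alone suffices to fire), and the one-step delay follows the discrete-time rule of slide 18.

```python
def permanent_memory(light_sequence):
    """Firing history of neuron 2, which latches once the light has been on."""
    n2 = False
    prev_light = False
    history = []
    for light in light_sequence:
        n2 = prev_light or n2   # fires if the receptor fired, or it fired itself, last step
        history.append(n2)
        prev_light = light
    return history

print(permanent_memory([False, True, False, False]))  # [False, False, True, True]
```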

28 The McCulloch-Pitts Network. Memory Models: the three stimulus model. Consider the three-stimulus all-or-none neural network. In this network, neuron 1 responds to a light being on. Each of the neurons has threshold 3, and all connections are unit positive. In the three stimulus model, neuron 2 fires after the light has been on for three time units in a row.

Time  Cell 1  Cell A  Cell B  Cell 2
1     Yes     No      No      No
2     Yes     No      Yes     No
3     Yes     Yes     Yes     No
4     Yes     Yes     Yes     Yes

29 The McCulloch-Pitts Network. Why is a McCulloch-Pitts network an FSM? A finite state machine (FSM) is an AUTOMATON. An input string is read from left to right, the machine looking at each symbol in turn. At any time the FSM is in one of finitely many internal states. The state changes after each input symbol is read: the NEW STATE depends (only) on the symbol just read and on the current state.
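As an illustration (the transition table below is mine, not from the lecture), here is a two-state FSM that remembers whether a '1' has ever been read, which is exactly the function the permanent-memory network computes. This is the sense in which each McCulloch-Pitts cell, firing or quiet, is a finite-state machine.

```python
TRANSITIONS = {             # (current state, symbol read) -> new state
    ("quiet", "0"): "quiet",
    ("quiet", "1"): "firing",
    ("firing", "0"): "firing",
    ("firing", "1"): "firing",
}

def run_fsm(input_string, state="quiet"):
    for symbol in input_string:   # read left to right, one symbol at a time
        state = TRANSITIONS[(state, symbol)]
    return state

print(run_fsm("0010"))  # 'firing': a 1 was seen at some point in the past
```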

30 The McCulloch-Pitts Network. 'The McCulloch-Pitts model, though it uses an oversimplified formulation of neural activity patterns, presages some issues that are still important in current cognitive models. [...] [Some] modern connectionist networks contain three types of units or nodes – input units, output units, and hidden units. The input units react to particular data features from the environment [...]. The output units generate particular organismic responses [...]. The hidden units are neither input nor output units themselves but, via network connections, influence output units to respond to prescribed patterns of input unit firings or activities. [...] [This] input-output-hidden trilogy can be seen as analogous to the distinction between sensory neurons, motor neurons, and all other (interneurons) in the brain.' Levine, Daniel S. (1991:14-15)

31 The McCulloch-Pitts Network. Linear neuron: the output is the weighted sum of all the inputs. McCulloch-Pitts neuron: the output is the thresholded value of the weighted sum. Input vector? x = (1, -20, 4, -2). Weight vector? w_j = (w_j1, w_j2, w_j3, w_j4) = [0.8, 0.2, -1, -0.9]. [Figure: neuron j with inputs x_1 ... x_4, weights w_j1 ... w_j4, a 0th (bias) input, and output y_j.]

32 The McCulloch-Pitts Network. v_j = Σ_i w_ji x_i; y = f(v_j); y = 0 if v_j < 0, y = 1 if v_j ≥ 0. Input vector? x = (1, -20, 4, -2). Weight vector? w_j = (w_j1, w_j2, w_j3, w_j4) = [0.8, 0.2, -1, -0.9]; w_j0 = 0, x_0 = 0 (the 0th input). [Figure as on the previous slide.]

33 The McCulloch-Pitts Network. Input vector? x = (1, -20, 4, -2). Weight vector? w = (w_j1, w_j2, w_j3, w_j4) = [0.8, 0.2, -1, -0.9]; w_j0 = 0, x_0 = 0. v_j = Σ_i w_ji x_i; y = f(v_j), where f is the activation function. Linear neuron: y = v. McCulloch-Pitts: y = 0 if v < 0, y = 1 if v ≥ 0. Sigmoid activation function: f(v) = 1/(1 + exp(-v)).
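Working the slide's numbers through the three activation functions (a short sketch; the arithmetic follows directly from the definitions above):

```python
import math

x = [1, -20, 4, -2]          # input vector from the slide
w = [0.8, 0.2, -1, -0.9]     # weight vector from the slide (bias w_j0 = 0)

v = sum(wi * xi for wi, xi in zip(w, x))   # v_j = 0.8 - 4.0 - 4.0 + 1.8 = -5.4

print(v)                         # linear neuron:     y = v = -5.4
print(1 if v >= 0 else 0)        # McCulloch-Pitts:   y = 0, since v < 0
print(1 / (1 + math.exp(-v)))    # sigmoid:           y ~ 0.0045
```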

34 The McCulloch-Pitts Network. Under what circumstances will a neuron with a sigmoidal activation function act like a McCulloch-Pitts neuron? With large synaptic weights. Under what circumstances will a neuron with a sigmoidal activation function act like a linear neuron? With small synaptic weights.
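A quick numerical check of both claims (the scale factors are illustrative): large weights push the net input into the sigmoid's saturated regions, giving step-like behaviour; small weights keep it near v = 0, where f(v) ≈ 1/2 + v/4 is nearly linear.

```python
import math

def sigmoid(v):
    return 1 / (1 + math.exp(-v))

v = 0.3  # a modest net input

# Large weights scale v up: the sigmoid saturates, behaving like a threshold unit.
print(sigmoid(100 * v))                           # ~1.0, a McCulloch-Pitts-style output

# Small weights keep v near 0, where the sigmoid is approximately linear.
print(sigmoid(0.01 * v), 0.5 + (0.01 * v) / 4)    # both ~0.50075
```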

35 The McCulloch-Pitts Network. The key outcome of early research in artificial neural networks was a clear demonstration of the theoretical importance (brain-like behaviour and a logical basis) and the extensive utility (regime-switching models) of threshold behaviour. This behaviour was emulated through the use of squashing functions and is the basis of many a simulation.

