Slide 1: Ch 7. Computing with Population Coding
Bayesian Brain: Probabilistic Approaches to Neural Coding, P. Latham & A. Pouget
Summarized by Kim, Kwonill, 2008.12.22
Slide 2: Summary
Designing a neural network ≡ designing a function.
General algorithm for building a network that computes an arbitrary smooth function:
–1. Feed-forward connections from the input layer to the intermediate layer
–2. Recurrent connections to remove the ridges
–3. Feed-forward connections from the intermediate layer to the output layer
Analysis & optimal network:
–Adding feedback turns the network into a recurrent network: a dynamical system with an attractor
–The optimal network minimizes the variance of the estimate
–Not suitable for high-dimensional problems: the number of neurons required grows rapidly with the number of input dimensions
Slide 3: Contents
–Introduction: computing, invariance, throwing away information
–Computing a Function with Networks of Neurons: A General Algorithm
–Efficient Computation: qualitative analysis and quantitative analysis
Slide 4: Introduction
–Encoding information in population activity
–Computing with population codes, e.g. sensorimotor transformations
–Invariance means throwing away information, e.g. face recognition: the representation must be invariant across different patterns of the same face (such as those caused by noise)
Slide 5: Computing a Function with Networks of Neurons: A General Algorithm
–Setup: a smooth function maps an input variable to an output variable
–The input population activity encodes the input variable via the i-th neuron's activity, eq. (7.1); the output population activity encodes the output variable, eq. (7.2)
–Network architecture: input layer → intermediate layer (a 2-D array) → output layer
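The equation bodies are not reproduced on the slide. As a minimal sketch of what eq.-(7.1)-style population encoding usually looks like, here is an input population with Gaussian tuning curves and Poisson spike-count noise; all parameters are illustrative assumptions, not values from the chapter:

```python
import numpy as np

def population_response(x, n_neurons=100, x_range=(-10.0, 10.0),
                        width=1.0, gain=20.0, seed=0):
    """Noisy population code for a scalar x (in the spirit of eq. 7.1):
    activity = tuning curve f_i(x) plus Poisson spike-count noise."""
    rng = np.random.default_rng(seed)
    centers = np.linspace(*x_range, n_neurons)  # preferred values
    f = gain * np.exp(-0.5 * ((x - centers) / width) ** 2)
    return rng.poisson(f), centers

r, centers = population_response(x=2.5)  # noisy activity of 100 neurons
```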
Slide 6: Computing a Function with Networks of Neurons: A General Algorithm
General building algorithm for an arbitrary smooth function:
–1. Feed-forward connections from the input layer to the intermediate layer
–2. Recurrent connections to remove the ridges
–3. Feed-forward connections from the intermediate layer to the output layer
Slide 7: Step 1, Feed-forward Connections from the Input Layer to the Intermediate Layer
–The intermediate layer is a 2-D array of units
–Each input population projects along one axis of the array, creating a ridge of activity (see the sketch below)
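A sketch of how such ridges can arise; the grid, tuning widths, and example values are assumptions for illustration, not the chapter's exact connectivity:

```python
import numpy as np

# Two 1-D input populations project onto a 2-D intermediate layer;
# each creates a ridge of activity along one axis (noise omitted for
# clarity).
n = 100
grid = np.linspace(-10.0, 10.0, n)
r1 = np.exp(-0.5 * (grid - 2.5) ** 2)   # input population encoding x1 = 2.5
r2 = np.exp(-0.5 * (grid + 1.0) ** 2)   # input population encoding x2 = -1.0
ridges = r1[:, None] + r2[None, :]      # crossing ridges in the 2-D layer;
                                        # their intersection marks (x1, x2)
```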
Slide 8: Step 2, Recurrent Connections to Remove the Ridges
–Mexican-hat connectivity within the intermediate layer, eq. (7.3)
–The recurrent dynamics act as a winner-take-all: the ridges are suppressed, and only a bump at their intersection survives
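Continuing the sketch above: a difference-of-Gaussians ("Mexican hat") kernel plus a rectifying nonlinearity gives a soft winner-take-all. The kernel shape and update rule are stand-ins; eq. (7.3)'s exact form may differ:

```python
import numpy as np

def mexican_hat(n, sigma_e=2.0, sigma_i=6.0, w_e=1.0, w_i=0.9):
    """Difference-of-Gaussians weights: local excitation, broad inhibition."""
    d = np.abs(np.subtract.outer(np.arange(n), np.arange(n)))
    return (w_e * np.exp(-0.5 * (d / sigma_e) ** 2)
            - w_i * np.exp(-0.5 * (d / sigma_i) ** 2))

def settle(a, n_steps=200, dt=0.1):
    """Relax the 2-D layer under Mexican-hat recurrence + rectification."""
    W = mexican_hat(a.shape[0])
    for _ in range(n_steps):
        drive = W @ a + a @ W          # recurrent input along both axes
        a = a + dt * (-a + np.maximum(drive, 0.0))
    return a

bump = settle(ridges)  # the ridges fade; a bump remains at the crossing
```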
Slide 9: Step 3, Feed-forward Connections from the Intermediate Layer to the Output Layer
–The feed-forward weights, eq. (7.4), are chosen so that the bump in the intermediate layer drives an output population encoding the function's value
–More input dimensions require a higher-dimensional intermediate layer
–Together, the three steps give a general algorithm for computing any smooth function
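A hedged sketch of an eq.-(7.4)-style readout, continuing the example: each output unit tuned to a value y_k pools the intermediate units (i, j) whose grid location satisfies y_k ≈ phi(x1_i, x2_j). The function phi and the tolerance are assumptions:

```python
import numpy as np

phi = lambda a, b: a + b                          # example smooth function
y_grid = np.linspace(-20.0, 20.0, n)
X1, X2 = np.meshgrid(grid, grid, indexing="ij")   # intermediate-layer coords
# W_out[k, i, j] = 1 where phi at grid point (i, j) falls near y_k
W_out = np.array([np.abs(phi(X1, X2) - yk) < 0.5 for yk in y_grid])
output = np.tensordot(W_out.astype(float), bump, axes=([1, 2], [0, 1]))
```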
Slide 10: Efficient Computation, Qualitative Analysis
Two ways to compute more efficiently:
–Replace the feed-forward stages with recurrent connections, merging the input, intermediate, and output layers into a single recurrent network
–Search for the optimal network
Multi-dimensional attractor networks:
–Ex) with a line attractor, the 1-D attractor manifold carries the encoded value (the invariant direction), while activity in the remaining (n−1) dimensions converges onto the attractor
Slide 11: Efficient Computation, Quantitative Analysis
Transient dynamics:
–t = 0: a transient, noisy population input
–t → ∞: the activity settles into a smooth population bump
–The network thereby estimates the encoded variables
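One simple way to read such an estimate out of the settled bump is a center-of-mass decoder over preferred values; this is an illustrative estimator, continuing the running example, and the chapter's analysis concerns the variance of estimates like it:

```python
import numpy as np

def estimate(activity, preferred):
    """Center-of-mass estimate of the encoded value from a bump of activity."""
    a = np.maximum(activity, 0.0)
    return float(np.sum(a * preferred) / np.sum(a))

y_hat = estimate(output, y_grid)  # estimate of y = phi(x1, x2) = 1.5
```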
Slide 12: Efficient Computation, Quantitative Analysis
Question: how accurate are those estimates? Equivalently, what is the variance of the estimates?
–Analyzed below for the single-variable case
Slide 13: Efficient Computation, Quantitative Analysis
State equation of the single-variable case, eq. (7.5):
–The dynamics can have a line attractor: there exists a one-parameter family of network states, one for each value of the encoded variable, each of which is a fixed point, eq. (7.6)
–Initial condition: the noisy population activity from eq. (7.1), eq. (7.7)
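The slide omits the equation bodies. A hedged reconstruction of the generic forms such an analysis uses (the chapter's exact notation may differ):

```latex
% Generic recurrent state equation (cf. eq. 7.5); W is the recurrent
% connectivity, g a pointwise nonlinearity, \eta(t) noise:
\frac{d\mathbf{a}}{dt} = -\mathbf{a} + g(\mathbf{W}\mathbf{a}) + \boldsymbol{\eta}(t)

% Line attractor (cf. eq. 7.6): a one-parameter family of fixed points,
% one for each value x of the encoded variable:
\exists\,\bar{\mathbf{a}}(x) \ \text{such that}\ \bar{\mathbf{a}}(x) = g\bigl(\mathbf{W}\,\bar{\mathbf{a}}(x)\bigr) \quad \forall x

% Initial condition (cf. eq. 7.7): the noisy encoded input of eq. (7.1):
\mathbf{a}(t=0) = \mathbf{f}(x) + \boldsymbol{\eta}_0
```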
Slide 14: Efficient Computation, Quantitative Analysis
Steps for solving for the variance:
–1. Assume the noise is small
–2. Linearize the dynamics around an equilibrium point on the attractor, eqs. (7.8)-(7.9)
–3. Solve for the variance
Slide 15: Efficient Computation, Quantitative Analysis
–Coordinate transform: express the state as a point on the attractor plus a deviation, eqs. (7.8)-(7.9)
–Linearization of the dynamics in the deviation, eq. (7.10)
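In the generic notation introduced above, the transform-and-linearize step typically reads as follows (a sketch, not the chapter's exact equations):

```latex
% State = attractor point + small deviation (coordinate transform):
\mathbf{a}(t) = \bar{\mathbf{a}}(x) + \delta\mathbf{a}(t)

% First-order (linearized) dynamics of the deviation:
\frac{d\,\delta\mathbf{a}}{dt} = \mathbf{J}\,\delta\mathbf{a} + \boldsymbol{\eta}(t),
\qquad
\mathbf{J} = \left.\frac{\partial}{\partial\mathbf{a}}\bigl[-\mathbf{a} + g(\mathbf{W}\mathbf{a})\bigr]\right|_{\mathbf{a}=\bar{\mathbf{a}}(x)}
```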
Slide 16: Efficient Computation, Quantitative Analysis
Eigenvalue analysis, eqs. (7.11)-(7.14):
–Expand the deviation in the eigenvectors of the linearized dynamics
–The line attractor implies one zero eigenvalue, whose eigenvector points along the attractor; the other modes decay, so only the zero-eigenvalue component survives as t → ∞
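A sketch of that decomposition in the same hedged notation:

```latex
% Expand the deviation on the right eigenvectors e_k of J
% (J e_k = \lambda_k e_k), with adjoint (left) eigenvectors \tilde{e}_k:
\delta\mathbf{a}(t) = \sum_k c_k(t)\,\mathbf{e}_k,
\qquad c_k(t) = \tilde{\mathbf{e}}_k^{\top}\,\delta\mathbf{a}(t)

% Modes with Re(\lambda_k) < 0 decay; the zero-eigenvalue mode along the
% attractor retains the projection of the initial noise:
c_0(t \to \infty) = \tilde{\mathbf{e}}_0^{\top}\,\delta\mathbf{a}(0)
```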
Slide 17: Efficient Computation, Quantitative Analysis
Variance, eq. (7.14):
–The efficiency of the network depends only on the adjoint eigenvector of the linearized dynamics whose eigenvalue is zero
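One consistent way to write the resulting variance, assuming the zero-mode eigenvector points along the attractor (e_0 ∝ ā′(x)) and the initial noise has covariance Σ; this is a reconstruction in the hedged notation above, not a quotation of eq. (7.14):

```latex
\mathrm{Var}(\hat{x}) =
\frac{\tilde{\mathbf{e}}_0^{\top}\,\boldsymbol{\Sigma}\,\tilde{\mathbf{e}}_0}
     {\bigl(\tilde{\mathbf{e}}_0^{\top}\,\bar{\mathbf{a}}'(x)\bigr)^{2}}
```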
Slide 18: Efficient Computation, Quantitative Analysis
Optimal network:
–The optimal network minimizes the variance; the optimal variance, eq. (7.16), depends on the correlation structure of the input noise
–The approach is not suitable for high-dimensional problems
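A numeric sketch of evaluating that variance for a given network; it only scores one candidate network (the optimization of eq. (7.16) would search over such networks), and all inputs here are synthetic:

```python
import numpy as np
from scipy.linalg import eig

def estimate_variance(J, Sigma, da_dx):
    """Variance of the estimate for a linearized network: project the
    noise covariance Sigma onto the adjoint (left) eigenvector of the
    Jacobian J whose eigenvalue is (numerically) zero; da_dx is the
    derivative of the attractor state with respect to the encoded value."""
    vals, vl, _ = eig(J, left=True, right=True)
    k = np.argmin(np.abs(vals))            # index of the zero mode
    e_adj = np.real(vl[:, k])
    return (e_adj @ Sigma @ e_adj) / (e_adj @ da_dx) ** 2
```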
Slide 19: Summary
Designing a neural network ≡ designing a function.
General algorithm for building a network that computes an arbitrary smooth function:
–1. Feed-forward connections from the input layer to the intermediate layer
–2. Recurrent connections to remove the ridges
–3. Feed-forward connections from the intermediate layer to the output layer
Analysis & optimal network:
–Adding feedback gives a recurrent attractor network
–The optimal network minimizes the variance of the estimate; the approach is not suitable for high-dimensional problems, since the number of neurons required grows rapidly with the number of input dimensions