Ch. 7: Computing with Population Codes
Bayesian Brain: Probabilistic Approaches to Neural Coding, P. Latham & A. Pouget
Summarized by Kim, Kwonill
Summary
– Designing a neural network ≡ implementing a function
– General building algorithm for an arbitrary smooth function:
  – 1. Feed-forward connections from the input layer to the intermediate layer
  – 2. Recurrent connections to remove the ridges
  – 3. Feed-forward connections from the intermediate layer to the output layer
– Analysis & optimal network
  – Adding feedback turns the network into a recurrent network: a dynamical system with an attractor
  – The optimal network minimizes the variance of the estimate
  – Not suitable for high-dimensional problems
Contents
– Introduction: computing, invariance, throwing away information
– Computing a Function with Networks of Neurons: A General Algorithm
– Efficient Computing
  – Qualitative analysis
  – Quantitative analysis
Introduction
– Encoding information in population activity → computing with population codes
  – Ex) sensorimotor transformations
– Invariance ≡ throwing away information
  – Ex) face recognition: the response should be invariant across the different activity patterns evoked by the same face (e.g., because of noise)
Computing a Function with Networks of Neurons: A General Algorithm
– Setup: an input variable is encoded in the activity of an input population; the i-th neuron's activity is its tuning-curve response plus noise (7.1)
– Goal: build a network whose output population activity encodes a smooth function of the input variable (7.2)
[Figure: neural network with an input layer, a 2-D intermediate layer, and an output layer; input variable → smooth function → output variable]
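The equation images for (7.1) and (7.2) did not survive extraction. A standard form consistent with the slide's labels (my notation, not necessarily the book's) is:

```latex
% Assumed reconstruction; the slide's equation images were lost.
\begin{align*}
  r_i &= f_i(x) + \eta_i,
  \qquad
  f_i(x) = r_{\max}\exp\!\left(-\frac{(x - x_i)^2}{2\sigma_f^2}\right)
  && \text{(cf. 7.1: $i$-th neuron's activity)} \\
  y &= \phi(x), \qquad \phi \text{ an arbitrary smooth function}
  && \text{(cf. 7.2)}
\end{align*}
```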
Computing a Function with Networks of Neurons: A General Algorithm
General building algorithm for an arbitrary smooth function:
– 1. Feed-forward connections from the input layer to the intermediate layer
– 2. Recurrent connections to remove the ridges
– 3. Feed-forward connections from the intermediate layer to the output layer
1. Feed-forward connections from the input layer to the intermediate layer
– Intermediate layer: a 2-D array of neurons
– Feed-forward input lays down a ridge of activity across the array for each input population; the ridges cross at the encoded values (see the sketch below)
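No formula survives on this slide; as an illustration (my notation, not the chapter's), with two scalar inputs encoded by populations r^x and r^y, a simple feed-forward wiring gives intermediate neuron (j, k) the drive:

```latex
% Illustrative feed-forward drive (assumed, not from the chapter):
% pooling x-inputs near x_j and y-inputs near y_k lays down one ridge
% per input across the 2-D array; the ridges cross at (x, y).
\[
  h_{jk} \;=\; \sum_i W^{x}_{ji}\, r^{x}_i \;+\; \sum_i W^{y}_{ki}\, r^{y}_i,
  \qquad
  W^{x}_{ji} = w(x_j - x_i), \quad W^{y}_{ki} = w(y_k - y_i).
\]
```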
2. Recurrent connections to remove the ridges
– Mexican-hat connectivity: local excitation, broader inhibition (7.3)
– Implements a winner-take-all competition: the ridges die away and a single bump survives at their intersection
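The weight profile in (7.3) was lost with the slide image; a Mexican hat is conventionally written as a difference of Gaussians (short-range excitation minus longer-range inhibition), which is assumed here:

```latex
% Assumed difference-of-Gaussians form of the Mexican-hat weights (7.3).
\[
  M_{jk,\,j'k'} = A_e \exp\!\left(-\frac{d^2}{2\sigma_e^2}\right)
                - A_i \exp\!\left(-\frac{d^2}{2\sigma_i^2}\right),
  \qquad
  d^2 = (x_j - x_{j'})^2 + (y_k - y_{k'})^2,
  \quad \sigma_e < \sigma_i .
\]
```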
3. Feed-forward connections from the intermediate layer to the output layer
– Intermediate neuron (j, k) projects to the output neurons that prefer values near φ(x_j, y_k), so the bump at the intersection is mapped to an output bump encoding the function value (7.4)
– More input dimensions require more intermediate-layer dimensions
– Together, the three steps give a general algorithm for computing any smooth function
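A minimal end-to-end sketch of the three-step construction, assuming two scalar inputs and the example function φ(x, y) = x + y (a common sensorimotor-style example in this literature); every size, width, and gain below is an illustrative guess, not a value from the book.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Illustrative parameters; sizes, widths, and gains are guesses -----
N = 40                                    # neurons per 1-D population
grid = np.linspace(-2.0, 2.0, N)          # preferred values
sig_f = 0.3                               # tuning-curve width

def tuning(s, centers, sig=sig_f):
    """Gaussian tuning-curve response of a population to stimulus s."""
    return np.exp(-(s - centers) ** 2 / (2 * sig ** 2))

# Noisy input populations encoding x and y (cf. 7.1).
x_true, y_true = 0.5, -1.0
rx = tuning(x_true, grid) + 0.05 * rng.standard_normal(N)
ry = tuning(y_true, grid) + 0.05 * rng.standard_normal(N)

# Step 1: feed-forward drive to the 2-D intermediate array. Identity-like
# pooling lays down one ridge per input; the ridges cross at (x, y).
h = rx[:, None] + ry[None, :]             # shape (N, N)

# Step 2: Mexican-hat recurrent dynamics (cf. 7.3). Starting from the
# ridge pattern, winner-take-all competition lets the crossing point win
# while the ridges die out (parameters tuned only roughly here).
pos = np.stack(np.meshgrid(grid, grid, indexing="ij"), -1).reshape(-1, 2)
d2 = ((pos[:, None, :] - pos[None, :, :]) ** 2).sum(-1)
M = (np.exp(-d2 / (2 * 0.3 ** 2))
     - 0.5 * np.exp(-d2 / (2 * 1.2 ** 2))) / 18.0
u = h.reshape(-1).copy()
for _ in range(300):                      # crude Euler integration
    u += 0.1 * (-u + M @ np.tanh(np.maximum(u, 0.0)))
bump = np.tanh(np.maximum(u, 0.0))        # activity concentrated near (x, y)

# Step 3: feed-forward readout (cf. 7.4). Intermediate neuron (j, k)
# projects to output neurons preferring z near phi(x_j, y_k).
z_grid = np.linspace(-4.0, 4.0, N)
phi = (grid[:, None] + grid[None, :]).reshape(-1)   # phi(x, y) = x + y
W_out = tuning(phi[None, :], z_grid[:, None])       # (N, N*N) weights
out = W_out @ bump

# Decode the output bump with a center-of-mass estimate.
z_hat = (z_grid * out).sum() / out.sum()
print(f"true z = {x_true + y_true:.2f}, network estimate = {z_hat:.2f}")
```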
Efficient Computing: Qualitative Analysis
– Two routes to more efficient computation:
  – Replace the purely feed-forward architecture with a recurrent one (feedforward → recurrent)
  – Use optimal networks
– Multi-dimensional attractor networks
  – Ex) a 1-D attractor manifold encodes the variable; the dynamics converge onto it, discarding the remaining (n−1) dimensions of the state and thereby implementing invariance
[Figure: input → intermediate → output]
Efficient Computing: Quantitative Analysis
– Transient dynamics:
  – t = 0: transient, noisy population input
  – t → ∞: a smooth population bump on the attractor
– The network thereby estimates the encoded variable
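To make the transient-dynamics picture concrete, here is a minimal 1-D sketch (all parameter values are illustrative guesses): a noisy population pattern is handed to a ring network with Mexican-hat connectivity, relaxes to a smooth bump, and the bump position is read out as the network's estimate.

```python
import numpy as np

rng = np.random.default_rng(1)

# --- Illustrative 1-D ring network; parameter values are guesses -------
N = 60
theta = np.linspace(0.0, 2 * np.pi, N, endpoint=False)  # preferred angles

def circ_dist(a, b):
    """Shortest signed distance between angles a and b on the circle."""
    return (a - b + np.pi) % (2 * np.pi) - np.pi

# t = 0: transient, noisy population input encoding x_true.
x_true = 2.0
f = np.exp(-circ_dist(x_true, theta) ** 2 / (2 * 0.3 ** 2))
r0 = f + 0.2 * rng.standard_normal(N)

# Recurrent weights: local excitation, broader inhibition (Mexican hat).
d = circ_dist(theta[:, None], theta[None, :])
W = (np.exp(-d ** 2 / (2 * 0.3 ** 2))
     - 0.5 * np.exp(-d ** 2 / (2 * 1.0 ** 2))) / 2.0

# Run the dynamics with the input removed; the state relaxes onto the
# attractor, i.e., a smooth bump whose position is the estimate.
u = r0.copy()
for _ in range(400):
    u += 0.1 * (-u + W @ np.tanh(np.maximum(u, 0.0)))

# Decode the t -> infinity activity with a circular population vector.
act = np.tanh(np.maximum(u, 0.0))
x_hat = np.angle((act * np.exp(1j * theta)).sum()) % (2 * np.pi)
print(f"true x = {x_true:.3f}, network estimate = {x_hat:.3f}")
```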
Efficient Computing: Quantitative Analysis
– Question: how accurate are these estimates? Equivalently, what is the variance of the estimates?
– Consider the single-variable case
Efficient Computing: Quantitative Analysis
– State equation for the single-variable case (7.5)
– The network can have a line attractor: there exists a one-parameter family of fixed points, one for each value of the encoded variable (7.6)
– Initial condition: the noisy population input from (7.1) (7.7)
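Equations (7.5)–(7.7) were slide images; the following reconstruction matches the slide's description (notation mine): standard rate dynamics, a line attractor defined as a one-parameter family of fixed points, and the noisy encoding from (7.1) as the initial condition.

```latex
% Assumed reconstruction of (7.5)-(7.7):
\begin{align*}
  \tau\,\dot{\mathbf r} &= -\mathbf r + F(W\mathbf r)
    && \text{state equation (cf. 7.5)} \\
  \exists\,\mathbf r^{*}(x):\quad
  -\mathbf r^{*}(x) + F\!\big(W\mathbf r^{*}(x)\big) &= \mathbf 0
    \quad \forall x
    && \text{line attractor (cf. 7.6)} \\
  \mathbf r(0) &= \mathbf f(x) + \boldsymbol\eta
    && \text{initial condition, from (7.1) (cf. 7.7)}
\end{align*}
```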
Efficient Computing: Quantitative Analysis
Solving steps:
– 1. Assume small noise
– 2. Linearize the dynamics around an equilibrium point on the attractor
– 3. Solve for the variance (7.8), (7.9)
Efficient Computing: Quantitative Analysis
– Coordinate transform and linearization around the attractor (7.8)–(7.10)
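A reconstruction of the linearization step (7.8)–(7.10), under the small-noise assumption (notation mine): decompose the state into a point on the attractor plus a small deviation, and expand the dynamics to first order.

```latex
% Assumed reconstruction of (7.8)-(7.10):
\[
  \mathbf r(t) = \mathbf r^{*}(x_0) + \delta\mathbf r(t),
  \qquad
  \tau\,\delta\dot{\mathbf r} \approx J\,\delta\mathbf r,
  \qquad
  J = \frac{\partial\,[-\mathbf r + F(W\mathbf r)]}{\partial \mathbf r}
      \bigg|_{\mathbf r = \mathbf r^{*}(x_0)} .
\]
```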
Efficient Computing: Quantitative Analysis
– Eigenvalue analysis of the linearized dynamics (7.11)–(7.14)
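The eigenvalue analysis (7.11)–(7.14) presumably expands the deviation in eigenmodes of J; on a line attractor, J has one zero eigenvalue whose right eigenvector points along the attractor, and all other modes decay. In my notation:

```latex
% Assumed reconstruction (cf. 7.11-7.14): e_n and \tilde e_n are right and
% adjoint (left) eigenvectors of J, normalized so \tilde e_m^T e_n = \delta_{mn}.
\[
  \delta\mathbf r(t)
  = \sum_n \big(\tilde{\mathbf e}_n^{\top}\delta\mathbf r(0)\big)\,
    e^{\lambda_n t/\tau}\,\mathbf e_n,
  \qquad
  \lambda_0 = 0,\;\; \mathbf e_0 \propto \frac{d\mathbf r^{*}}{dx},
  \qquad
  \operatorname{Re}\lambda_{n\neq 0} < 0 .
\]
```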
Efficient Computing: Quantitative Analysis
– Variance of the estimate (7.14)
– The efficiency of the network depends only on the adjoint eigenvector of the linearized dynamics whose eigenvalue is zero
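Hence only the component of the initial noise along the zero mode survives, shifting the estimate along the attractor; this gives the variance formula the slide refers to as (7.14) (reconstruction in my notation):

```latex
% Assumed reconstruction of the variance result (cf. 7.14), with \Sigma
% the covariance of the input noise \eta:
\[
  \hat x - x \;\approx\;
    \frac{\tilde{\mathbf e}_0^{\top}\boldsymbol\eta}
         {\tilde{\mathbf e}_0^{\top}\mathbf r^{*\prime}(x)},
  \qquad
  \operatorname{Var}(\hat x) \;=\;
    \frac{\tilde{\mathbf e}_0^{\top}\Sigma\,\tilde{\mathbf e}_0}
         {\big(\tilde{\mathbf e}_0^{\top}\mathbf r^{*\prime}(x)\big)^{2}} .
\]
```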
Efficient Computing: Quantitative Analysis
– Optimal network: the one that minimizes the variance of the estimate
– Optimal variance (7.16): depends on the correlation structure of the input noise
– Not suitable for high-dimensional problems (the number of neurons required grows rapidly with the dimension of the encoded variables)
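The explicit form of (7.16) was lost, but minimizing the variance above over the adjoint eigenvector is a standard generalized-Rayleigh-quotient problem, so the slide's claim that the optimum "depends on the correlation structure" plausibly corresponds to:

```latex
% Assumed reconstruction (cf. 7.16): minimize Var over \tilde e_0.
\[
  \min_{\tilde{\mathbf e}_0} \operatorname{Var}(\hat x)
  \;=\; \frac{1}{\mathbf r^{*\prime}(x)^{\top}\,\Sigma^{-1}\,\mathbf r^{*\prime}(x)},
  \qquad \text{attained at}\;\;
  \tilde{\mathbf e}_0 \propto \Sigma^{-1}\mathbf r^{*\prime}(x) .
\]
```

For Gaussian input noise this coincides with the Cramér–Rao bound, so an optimal network can match maximum-likelihood performance.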
Summary
– Designing a neural network ≡ implementing a function
– General building algorithm for an arbitrary smooth function:
  – 1. Feed-forward connections from the input layer to the intermediate layer
  – 2. Recurrent connections to remove the ridges
  – 3. Feed-forward connections from the intermediate layer to the output layer
– Analysis & optimal network
  – Adding feedback → a recurrent attractor network
  – The optimal network minimizes the variance of the estimate
  – Not suitable for high-dimensional problems