Projects:
1. Predictive coding in balanced spiking networks (Erwan Ledoux).
2. Using Canonical Correlation Analysis (CCA) to analyse neural data (David Schulz).
3. Encoding and Decoding in the Auditory System (Izzett Burak Yildiz).
4. Quadratic programming of tuning curves: a theory for tuning curve shape (Ralph Bourdoukan).
5. The Bayesian synapse: a theory for synaptic short-term plasticity (Sophie Deneve).
Projects:
1. Choose a project and send an email to sophie.deneve@ens.fr.
2. Once the project is assigned, make an appointment with your advisor as soon as possible (before April 17).
3. Plan another meeting with your advisor (mid-May).
4. Prepare the oral presentation (June 5). Pedagogy, context, and clarity matter most; results are not so important.
The efficient coding hypothesis: predicting sensory receptive fields
Schematics of the visual system
The retina
Center-surround RFs
Hubel and Wiesel
V1 orientation selective cell
Hubel and Wiesel model
How are receptive fields measured?
It is a linear regression problem
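To make this concrete, a minimal least-squares sketch in Python (an illustration added here, not slide content; the names S for the stimulus matrix, r for the responses, and w for the receptive field are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: T white-noise stimulus frames of P pixels each,
# and a linear neuron whose receptive field we try to recover.
T, P = 5000, 64
S = rng.standard_normal((T, P))                 # stimulus matrix (time x pixels)
w_true = rng.standard_normal(P)                 # the neuron's true receptive field
r = S @ w_true + 0.5 * rng.standard_normal(T)   # noisy responses

# Least-squares solution of r = S w, i.e. w = (S^T S)^{-1} S^T r.
w_hat, *_ = np.linalg.lstsq(S, r, rcond=None)

print(np.corrcoef(w_true, w_hat)[0, 1])         # close to 1
```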
Solution: the least-squares weights, $w = (S^\top S)^{-1} S^\top r$ (the normal equations).
Receptive fields of V1 simple cells
Optimal sensory coding?
The notion of surprise
The entropy of a distribution
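For reference, the standard definition this slide relies on:

```latex
H(X) \;=\; -\sum_x p(x)\,\log p(x) \;=\; \mathbb{E}\!\left[-\log p(X)\right]
```

that is, the average surprise $-\log p(x)$ over the distribution.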
Minimal and maximal entropy
Maximizing information transfer. Conditional entropy $H(Y\mid X)$: the surprise about Y when one knows X, $H(Y\mid X) = \sum_x p(x)\,H(Y\mid X{=}x)$, or more shortly $H(Y\mid X) = -\sum_{x,y} p(x,y)\,\log p(y\mid x)$, with $H(Y\mid X{=}x) = -\sum_y p(y\mid x)\,\log p(y\mid x)$.
Maximizing information transfer. Conditional entropy $H(Y\mid X)$: the surprise about Y when one knows X. Mutual information between X and Y: $I(X;Y) = H(Y) - H(Y\mid X)$.
Maximizing information. Mutual information between x and y: $I(x;y) = H(y) - H(y\mid x)$. To maximize it, either maximize $H(y)$ (interesting rather than boring responses) or minimize $H(y\mid x)$ (precise rather than unreliable responses).
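A toy numerical check of these identities (the joint distribution below is made up for illustration):

```python
import numpy as np

# Made-up joint distribution p(x, y) over two binary variables.
p_xy = np.array([[0.4, 0.1],
                 [0.1, 0.4]])
p_x = p_xy.sum(axis=1)
p_y = p_xy.sum(axis=0)

def H(p):
    """Shannon entropy in bits of a probability vector."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

H_y = H(p_y)                             # entropy of the responses
H_y_given_x = H(p_xy.ravel()) - H(p_x)   # H(Y|X) = H(X,Y) - H(X)
I_xy = H_y - H_y_given_x                 # I(X;Y) = H(Y) - H(Y|X)
print(H_y, H_y_given_x, I_xy)            # 1.0, ~0.72, ~0.28 bits
```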
Sensory system as information channel
Maximizing information transfer. Mutual information between x and r: $I(x;r) = H(r) - H(r\mid x)$. In analysis models, $H(r\mid x)$ is fixed (no noise), so one maximizes $H(r)$; generative models offer the complementary route (below).
Maximizing information transfer
Distribution of responses
Entropy maximization
Infomax activation function
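A minimal sketch of the infomax solution for a single noiseless neuron with a bounded output: the optimal activation function is the cumulative distribution of the inputs, so the responses come out uniformly distributed (the input distribution here is made up):

```python
import numpy as np

rng = np.random.default_rng(1)

# Made-up skewed input distribution (e.g. stimulus contrasts).
x = rng.exponential(scale=1.0, size=100_000)

# Infomax activation: the empirical CDF of the inputs maps them to a
# uniform distribution on [0, 1], maximizing the output entropy for a
# bounded response.
xs = np.sort(x)
y = np.searchsorted(xs, x) / xs.size

print(np.histogram(y, bins=10, range=(0.0, 1.0))[0])  # roughly flat
```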
An example in the fly
But: neurons cannot have any activation function!
Information maximization
Two neurons
Each neuron maximizing its own entropy
Entropy of a 2D distribution
Two neurons
Entropy maximization = Independent component analysis
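The identity behind this equivalence (standard information theory):

```latex
H(Y_1,\dots,Y_N) \;=\; \sum_i H(Y_i) \;-\; I(Y_1;\dots;Y_N)
```

where $I \ge 0$ is the redundancy (total correlation), which is zero exactly when the responses are independent. With each marginal entropy $H(Y_i)$ already at its maximum, maximizing the joint entropy amounts to removing the redundancy, i.e. finding independent components.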
Entropy maximization, 2 neurons
Independent component analysis, N neurons
Application: visual processing
Transformation of the visual input
Entropy maximization
Weights learnt by ICA (image patch)
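A sketch of how such weights can be learned, assuming scikit-learn is available; random noise stands in for the natural-image patches used in the lecture, so the filters learned from this toy input will not be Gabor-like:

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(2)

# Stand-in data: on real natural images, each row would be a
# flattened 12x12 patch cut from a photograph.
img = rng.standard_normal((256, 256))
patches = np.array([img[i:i + 12, j:j + 12].ravel()
                    for i, j in rng.integers(0, 244, size=(2000, 2))])

# ICA finds a basis whose coefficients are as independent as possible;
# on natural images the rows of ica.components_ come out as localized,
# oriented, Gabor-like filters.
ica = FastICA(n_components=49, whiten="unit-variance",
              max_iter=500, random_state=0)
ica.fit(patches - patches.mean(axis=0))
filters = ica.components_.reshape(49, 12, 12)
print(filters.shape)  # (49, 12, 12)
```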
The distribution of natural images
Geometric interpretation of ICA
First stages of visual processing
The efficient coding hypothesis
Limitations of ICA: it works only once (great!)… and then what?
Limitations of ICA: a complete basis; the number of features equals the number of pixels.
Limitations of ICA: the bottleneck; the number of optic nerve fibers << the number of retinal receptors.
Maximizing information transfer. Mutual information between x and r, two routes: analysis models, where $H(r\mid x)$ is fixed (no noise) and one maximizes $H(r)$; and generative models, where $H(x)$ is fixed and one minimizes the reconstruction error.
Maximizing information. The mutual information between x and y can also be written $I(x;y) = H(x) - H(x\mid y)$. $H(x)$ is fixed, so maximize information by minimizing $H(x\mid y)$: the responses must predict the sensory input as well as possible (precise rather than unreliable predictions).
Generative model: the sensory input is generated from hidden causes drawn from an independent prior. Find the dictionary of features that minimizes the reconstruction error.
The Gaussian distribution: with Gaussian noise, maximizing the likelihood is equivalent to minimizing the mean squared error (see below).
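Concretely, for Gaussian noise with variance $\sigma^2$ around the reconstruction $\hat x$:

```latex
-\log p(x \mid \hat{x}) \;=\; \frac{1}{2\sigma^2}\,\lVert x - \hat{x}\rVert^2 \;+\; \text{const}
```

so maximizing the likelihood of the input given its reconstruction is the same as minimizing the squared error.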
65
Generative model, recognition model Generate Recognize Independent, prior Minimize entropy Minimize expected reconstruction error
66
Separate the problem in two: Given current sensory input, and dictionary estimate the hidden state Given the current state estimates and sensory input update the to minimize reconstruction error. Repeat until convergence. Start with some random dictionary
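A minimal sketch of this alternation in Python, assuming an L1 cost (ISTA for the inference step, a gradient step for the dictionary); this is one concrete instance of the scheme, not necessarily the exact algorithm used in the course:

```python
import numpy as np

rng = np.random.default_rng(3)

# Made-up data standing in for image patches: N samples of
# dimension D, to be coded with K features.
D, K, N = 64, 96, 1000
X = rng.standard_normal((D, N))

Phi = rng.standard_normal((D, K))        # random initial dictionary
Phi /= np.linalg.norm(Phi, axis=0)       # unit-norm features
lam, eta = 0.1, 0.05                     # sparseness weight, learning rate

for _ in range(50):
    # 1) Inference: estimate hidden states by a few ISTA steps on
    #    ||x - Phi h||^2 / 2 + lam * ||h||_1 (MAP with a Laplace prior).
    H = np.zeros((K, N))
    L = np.linalg.norm(Phi, 2) ** 2      # Lipschitz constant of the gradient
    for _ in range(30):
        H = H - Phi.T @ (Phi @ H - X) / L
        H = np.sign(H) * np.maximum(np.abs(H) - lam / L, 0.0)
    # 2) Learning: gradient step on the dictionary to reduce the MSE.
    R = X - Phi @ H                      # reconstruction error
    Phi += eta * (R @ H.T) / N
    Phi /= np.linalg.norm(Phi, axis=0)   # keep features unit-norm

print(np.mean(R ** 2))                   # reconstruction MSE after learning
```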
67
How to estimate r= h? Generate Recognize Maximum a-posteriori (MAP)
68
How to estimate r= h? Generate Recognize Bayes rule:
69
Reconstruction error and MAP Normal distribution Variance of pixel noise Minus log posterior equivalent to reconstruction error with cost: Prior Cost
70
Minimize reconstruction error Reconstructed sensory input Neural responses Dictionary of features Reconstruction error Penalty or cost
71
Generate Recognize How to estimate r= h? Maximize log posterior probability:
72
Generate Recognize How to update the dictionary Minimize mean-squared error:
73
Generative model, recognition model Generate Recognize 1. Find 2. Update to minimize MSE most probable hidden states
74
What prior to use? Sparse coding Cost = number of neurons with non-zero responses Good!Bad! Many cortical neurons are near-silent…
75
Sparse responses of an edge detector … … Sparse prior:
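A common concrete choice of sparse prior (an assumption here; the slide's exact prior is not in the transcript) is the Laplace prior, whose negative log gives a linear cost on response magnitude:

```latex
p(h) \;\propto\; e^{-\lambda \lvert h \rvert}
\quad\Longrightarrow\quad
C(h) \;=\; -\log p(h) \;=\; \lambda \lvert h \rvert + \text{const}
```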
76
Elementary features found by sparse coding
77
Limitation of the sparse coding approach applied to sensory RFs Generate Recognize “Predictive fields” “Receptive fields” Different!
78
Receptive fields depend on stimulus type
79
Carandini et al, JNeurosci 2005
80
f t Responses to natural scene are poorly predicted by the RF. STRF: Machens CK, Wehr MS, Zador AM. J Neurosci. 2004