Neural Codes

Presentation transcript:

1 Neural Codes

2 Neuronal Codes – Action potentials as the elementary units: voltage recording from a brain cell of a fly

3 Neuronal Codes – Action potentials as the elementary units: voltage recording from a brain cell of a fly, after band-pass filtering

4 Neuronal Codes – Action potentials as the elementary units: voltage recording from a brain cell of a fly, after band-pass filtering; spike pulses generated electronically by a threshold discriminator circuit
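The slides describe spikes being extracted from the raw voltage by band-pass filtering followed by a threshold discriminator. Below is a minimal offline sketch of that pipeline, assuming a sampled voltage trace; the filter band and threshold value are illustrative assumptions, not values from the lecture:

```python
import numpy as np
from scipy.signal import butter, filtfilt

def detect_spikes(voltage, fs, low=300.0, high=3000.0, thresh_sd=4.0):
    """Band-pass filter a voltage trace and return threshold-crossing times.

    voltage   : 1-D array of sampled voltage
    fs        : sampling rate in Hz
    low, high : band-pass edges in Hz (illustrative assumptions)
    thresh_sd : threshold in units of the filtered trace's standard deviation
    """
    b, a = butter(2, [low / (fs / 2), high / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, voltage)
    thresh = thresh_sd * filtered.std()
    above = filtered > thresh
    # a "spike" is the sample where the trace first crosses the threshold
    crossings = np.flatnonzero(above[1:] & ~above[:-1]) + 1
    return crossings / fs  # spike times {t_i} in seconds
```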

5 Neuronal Codes – Probabilistic response and Bayes’ rule. Conditional probability P[{t_i} | s(t)]: given a stimulus s(t), how likely is it to observe a certain spike train {t_i}?

6 Neuronal Codes – Probabilistic response and Bayes’ rule. In the natural situation, stimuli and spike trains occur as ensembles of signals, described by the joint probability P[{t_i}, s(t)]. In the experimental situation we choose s(t), i.e. we fix the prior distribution P[s(t)], and the joint probability factorizes as P[{t_i}, s(t)] = P[{t_i} | s(t)] · P[s(t)]. Asking (left side): how frequently do we observe stimulus and spike train TOGETHER? Answer (right side): as often as this spike train follows the stimulus presentation (first term), times the probability of presenting that stimulus out of the many stimuli we use (second term).

7 Conditional probability is the probability of some event A, given the occurrence of some other event B. It is written P(A|B) and read "the probability of A, given B". Joint probability is the probability of two events in conjunction, that is, the probability of both events occurring together; the joint probability of A and B is written P(A,B). Consider the simple scenario of rolling two fair six-sided dice, labelled die 1 and die 2, and define three events: A: die 1 lands on 3. B: die 2 lands on 1. C: the dice sum to 8. The prior probability of each event describes how likely the outcome is before the dice are rolled, without any knowledge of the roll's outcome. For example, die 1 is equally likely to fall on each of its 6 sides, so P(A) = 1/6; similarly, P(B) = 1/6. Likewise, of the 6 × 6 = 36 possible ways that two dice can land, only 5 result in a sum of 8 (namely 2 and 6, 3 and 5, 4 and 4, 5 and 3, and 6 and 2), so P(C) = 5/36.

8 A: die 1 lands on 3. B: die 2 lands on 1. C: the dice sum to 8. Some of these events can occur at the same time; for example, events A and C both happen when die 1 lands on 3 and die 2 lands on 5. This is the only one of the 36 outcomes where both A and C occur, so its probability is 1/36. The probability of both A and C occurring is called the joint probability of A and C and is written P(A,C) = 1/36. On the other hand, if die 2 lands on 1, the dice cannot sum to 8, so P(B,C) = 0. Now suppose we roll the dice and cover up die 2, so we can only see die 1, and observe that die 1 landed on 3. Given this partial information, the probability that the dice sum to 8 is no longer 5/36; instead it is 1/6, since die 2 must land on 5 to achieve this result. This is called the conditional probability, because it is the probability of C under the condition that A is observed, and is written P(C|A), read "the probability of C given A". On the other hand, observing die 1 has no impact on the probability of event B, which only depends on die 2. We say events A and B are statistically independent, or just independent, and in this case P(B|A) = P(B).
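All of the dice numbers above are easy to verify by brute-force enumeration over the 36 outcomes; a minimal sketch:

```python
from itertools import product

outcomes = list(product(range(1, 7), repeat=2))  # all 36 (die1, die2) pairs
A = {o for o in outcomes if o[0] == 3}           # die 1 lands on 3
B = {o for o in outcomes if o[1] == 1}           # die 2 lands on 1
C = {o for o in outcomes if sum(o) == 8}         # the dice sum to 8

n = len(outcomes)
print(len(A) / n)           # P(A)   = 6/36 = 1/6
print(len(C) / n)           # P(C)   = 5/36
print(len(A & C) / n)       # P(A,C) = 1/36  (joint probability)
print(len(B & C) / n)       # P(B,C) = 0
print(len(A & C) / len(A))  # P(C|A) = 1/6   (conditional probability)
print(len(B & A) / len(A))  # P(B|A) = 1/6 = P(B): A and B are independent
```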

9 Neuronal Codes – Probabilistic response and Bayes’ rule. But: the brain "sees" only {t_i} and must "say" something about s(t). And: there is no unique stimulus in correspondence with a particular spike train; rather, some stimuli are more likely than others given a particular spike train. Experimental situation: the response-conditional ensemble P[s(t) | {t_i}].

10 Neuronal Codes – Probabilistic response and Bayes’ rule. Bayes’ rule: P[s(t) | {t_i}] · P[{t_i}] = P[{t_i} | s(t)] · P[s(t)]. What we see: P[{t_i} | s(t)]. What our brain "sees": P[s(t) | {t_i}]. What is the difference? Fundamentally, WE know the prior P[s(t)], as WE choose the stimuli; the brain knows the prior P[{t_i}], as the brain generates the spike trains. What would the animal (the percept) like to know? Given a spike train, what is the most likely stimulus behind it? That is P[s(t) | {t_i}].
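A minimal numerical sketch of this bookkeeping with discretized stimuli and spike counts; the three stimuli, four possible counts, and all probability values are made-up illustrations, not data from the fly:

```python
import numpy as np

P_s = np.array([0.5, 0.3, 0.2])                # prior over stimuli: WE choose this
P_t_given_s = np.array([[0.7, 0.2, 0.1, 0.0],  # P(t|s), rows sum to 1:
                        [0.1, 0.6, 0.2, 0.1],  # what WE can measure directly
                        [0.0, 0.1, 0.3, 0.6]])

P_ts = P_t_given_s * P_s[:, None]   # joint: P(t,s) = P(t|s) P(s)
P_t = P_ts.sum(axis=0)              # prior over responses: what the BRAIN "knows"
P_s_given_t = P_ts / P_t[None, :]   # Bayes: P(s|t) = P(t|s) P(s) / P(t)

# the most likely stimulus behind each observed spike count
print(P_s_given_t.argmax(axis=0))
```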

11 Neuronal Codes – Probabilistic response and Bayes’ rule. Motion-sensitive neuron H1 in the fly’s brain: the stimulus is the average angular velocity of motion across the visual field in a 200 ms window (determined by the experimenter); the response is the spike count (a property of the neuron). The two are correlated, NOT (!) independent.

12 Neuronal Codes – Probabilistic response and Bayes’ rule: determine the probability of a stimulus from a given spike train. [Figure: spikes vs. stimuli]

13 Neuronal Codes – Probabilistic response and Bayes’ rule: determine the probability of a stimulus from a given spike train

14 Neuronal Codes – Probabilistic response and Bayes’ rule: determine the probability of a spike train from a given stimulus

15 Neuronal Codes – Probabilistic response and Bayes’ rule: determine the probability of a spike train from a given stimulus

16 Neuronal Codes – Probabilistic response and Bayes’ rule. How do we measure this time-dependent firing rate?
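One standard answer (not spelled out on the slide) is the peri-stimulus time histogram: repeat the same stimulus many times, bin the spike times across trials, and divide by bin width and trial count. A minimal sketch, with the bin width as a free assumption to be tuned to the data:

```python
import numpy as np

def psth(trials, t_max, bin_width=0.01):
    """Estimate a time-dependent firing rate r(t) from repeated trials.

    trials    : list of arrays, each holding one trial's spike times (s)
    t_max     : duration of each trial (s)
    bin_width : histogram bin width (s); an assumption, tune to the data
    """
    edges = np.arange(0.0, t_max + bin_width, bin_width)
    counts = sum(np.histogram(t, bins=edges)[0] for t in trials)
    rate = counts / (len(trials) * bin_width)  # spikes per second
    return edges[:-1], rate
```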

17 Neuronal Codes – Probabilistic response and Bayes’ rule Nice probabilistic stuff, but SO, WHAT?

18 Neuronal Codes – Probabilistic response and Bayes’ rule. SO, WHAT? We can characterize the neuronal code in two ways: translating stimuli into spikes (the traditional approach), or translating spikes into stimuli. Bayes’ rule connects the two: if we can give a complete listing of either set of rules, then we can solve any translation problem; thus we can switch between these two points of view (how the brain "sees" it).

19

20 Neuronal Codes – Probabilistic response and Bayes’ rule. We can switch between these two points of view. And why is that important? Because the two points of view may differ in their complexity! Traditionally you would record this: the spike count n as a function of the stimulus (velocity v). This is a difficult "curved" function and seems to require a complex model to explain, doesn't it?

21–22 Let's measure this in a better (more complete) way: you choose P(v), and since for some reason you like some stimuli better than others, this distribution is peaked. Then you record the responses (spike count n) for these stimuli; for example, after many repetitions the red stimulus gives you the red response curve P(n|v_0). Do this for all stimuli, and don't forget to normalize everything to 1 before plotting, which yields the joint distribution P(n,v).
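The bookkeeping just described amounts to a two-dimensional histogram of paired (velocity, spike count) measurements, normalized to sum to 1. A minimal sketch, assuming arrays of paired samples and a set of velocity bin edges:

```python
import numpy as np

def joint_pnv(v_samples, n_samples, v_bins):
    """Histogram paired (stimulus, spike count) data into a joint P(n,v)."""
    n_max = int(max(n_samples))
    # integer-centered bins for the count n, user-supplied edges for v
    joint, _, _ = np.histogram2d(n_samples, v_samples,
                                 bins=[np.arange(n_max + 2) - 0.5, v_bins])
    return joint / joint.sum()  # normalize everything to 1 -> P(n,v)
```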

23 Neuronal Codes – Probabilistic response and Bayes’ rule. Summing all values of P(n,v) along the red arrow (i.e., over v) yields P(n), the prior: how often a certain number of spikes is observed in general. With Bayes’ rule and the knowledge of P(n), we can then obtain the two conditional probability curves as well.
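Continuing the sketch above: the marginals come from summing the joint over one variable, and dividing the joint by a marginal gives the corresponding conditional, exactly as Bayes' rule states. (Empty bins would need a guard against division by zero with real data.)

```python
import numpy as np

def conditionals(P_nv):
    """From a joint P(n,v) (rows: n, columns: v), get marginals and conditionals."""
    P_n = P_nv.sum(axis=1)             # marginal over v: the prior P(n)
    P_v = P_nv.sum(axis=0)             # marginal over n: the prior P(v)
    P_n_given_v = P_nv / P_v[None, :]  # each column sums to 1
    P_v_given_n = P_nv / P_n[:, None]  # each row sums to 1
    return P_n, P_v, P_n_given_v, P_v_given_n
```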

24 Neuronal Codes – Probabilistic response and Bayes’ rule. [Two panels: average number of spikes as a function of stimulus velocity; average stimulus as a function of spike count]

25 Neuronal Codes – Probabilistic response and Bayes’ rule. The average number of spikes as a function of stimulus velocity is non-linear; the average stimulus as a function of spike count is almost perfectly linear. The linear relation is MUCH easier to understand than the non-linear one (which is the one you would have measured naively)! This is how Bayes can help: you can deduce (guess) the stimulus velocity, here linearly, from just the spike count.
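The "almost perfectly linear" curve is just the conditional mean of the stimulus given the spike count. A sketch of how one could compute it from the conditionals above and summarize it with a straight-line fit; the helper names are hypothetical:

```python
import numpy as np

def decode_velocity(P_v_given_n, v_centers):
    """Conditional mean <v>(n): the best guess of the stimulus from the count."""
    v_mean = P_v_given_n @ v_centers  # one expected velocity per spike count n
    # if <v>(n) is close to linear, a one-line fit summarizes the code:
    n_values = np.arange(len(v_mean))
    slope, intercept = np.polyfit(n_values, v_mean, 1)
    return v_mean, slope, intercept
```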

26 Neuronal Codes – Probabilistic response and Bayes’ rule. For a deeper discussion read, for instance, this nice (but difficult) book: Rieke, F., Warland, D., de Ruyter van Steveninck, R., & Bialek, W. (1996). Spikes: Exploring the Neural Code. MIT Press.