1
Lecture 33: Stat mech approach to cooperativity; diffusion intro
2
Key concepts from biochemical approach to cooperativity
Cooperativity gives sigmoidal binding curves. The Hill equation models strong cooperative binding with reasonable accuracy. Plausible biological mechanisms for cooperativity include: concerted changes in subunit conformation, nearest-neighbor effects between subunits, and tethering. (There has to be a good tethering analogy for this.)
3
Cooperativity from a statistical mechanics approach
Jacques Monod, Jeffries Wyman, and Jean-Pierre Changeux, who were responsible for the MWC model you just saw, were by no means constrained to thinking of this problem from what I’ve deemed the “biochemist’s approach”. They wrote about the model and others like it from a stat mech perspective with equal facility and frequency. Some of those publications were joint with none other than Charles Kittel, your stat mech textbook author. Whereas Wyman studied hemoglobin doggedly throughout his life, Changeux followed his Ph.D. work with a career in neuroscience, and Monod did important work on gene regulation.
4
Modeling simple binding
What is the energy of a state in which n identical ligands are bound to a receptor? The overall change in free energy for the system has two major contributions: the free energy of binding between the ligand and the receptor, and the free energy of particle loss from the solution: dG = V dp − S dT + μ dN. We can assume pressure and temperature are held constant, so dG = μ dN.
5
Modeling simple binding
Suppose we have two states: unbound (0) and bound (1). What is the partition function for this system?
6
Modeling simple binding
Suppose we have two states: unbound (0) and bound (1). What is the probability of a receptor being in the bound state?
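The two-state occupancy the slide asks for can be sketched in a few lines of Python. This is an illustrative helper, not from the slides; here `delta_g` stands for the total free-energy change of the bound state relative to unbound (binding energy minus the chemical-potential cost of removing a ligand from solution):

```python
import math

def p_bound(delta_g, kT=1.0):
    """Probability of the bound state for a two-state receptor.

    delta_g is the total free-energy change of binding, in the same
    units as kT. The partition function is Z = 1 (unbound) + w (bound).
    """
    w = math.exp(-delta_g / kT)   # Boltzmann weight of the bound state
    return w / (1.0 + w)

# Stronger (more negative) binding free energy -> higher occupancy.
assert p_bound(-2.0) > p_bound(0.0) > p_bound(2.0)
assert abs(p_bound(0.0) - 0.5) < 1e-12
```

Since delta_g grows more favorable as ligand concentration rises (through μ), this is the familiar non-cooperative hyperbolic binding curve in disguise.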
7
Modeling cooperative binding
To model cooperativity, we need a non-additive change in energy as more ligands bind: consider the case where an extra energy change J occurs when the last ligand binds. Instead of hemoglobin with its four binding sites, for simplicity we’ll stick with two. The only thing you’ll miss from this simplification is some mathematical baggage.
8
Modeling cooperative binding
What is the mean number of sites bound?
9
Modeling cooperative binding
What is the partition function?
10
Modeling cooperative binding
Assume J large and negative:
11
Modeling cooperative binding
What is the average fraction of sites occupied by ligand? This is a Hill equation. (Assuming large, negative additional energy J in the last binding step is equivalent to assuming K4 >> K1, K2, K3.)
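The two-site model above can be sketched numerically. The notation here is mine, not the slides’: x plays the role of e^(−(ε−μ)/kT), so the states are empty (weight 1), singly bound (weight x, two ways), and doubly bound (weight x²·e^(−J/kT), the extra factor coming from the cooperative energy J):

```python
import math

def fraction_bound(x, J, kT=1.0):
    """Mean fraction of two sites occupied in the cooperative model.

    x grows with ligand concentration; J is the extra energy change
    when the second ligand binds (large and negative = cooperative).
    """
    w = math.exp(-J / kT)          # extra Boltzmann weight for double occupancy
    Z = 1.0 + 2.0 * x + x * x * w  # empty + two singly-bound + doubly-bound
    mean_n = (2.0 * x + 2.0 * x * x * w) / Z
    return mean_n / 2.0

# With strongly favorable J, the doubly-bound state dominates the
# singly-bound ones, and the curve approaches the Hill form
# x^2 w / (1 + x^2 w), i.e. a Hill coefficient of 2:
x, J = 0.05, -10.0
w = math.exp(-J)
hill = x * x * w / (1.0 + x * x * w)
assert abs(fraction_bound(x, J) - hill) < 0.01
```

Dropping the linear term when J is large and negative is exactly the step the slide describes: it is what turns the exact two-site expression into a Hill equation.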
12
Key concepts from statistical mechanics approach to cooperativity
The binding curves introduced earlier can be derived directly from the principles of stat mech, with less algebraic effort, while providing some insight into the origins of the parameters.
13
Diffusion lecture plan
Today: Diffusion of single particles as a random walk (parallels the first lac operon lecture; adds an introduction to the diffusion equation)
Tomorrow: Diffusion as a consequence of chemical potential differences (alternative derivation of Fick’s laws; description of the diffusion coefficient)
Friday: Gaussian integrals and FRAP
Monday: Example applications
Tuesday: Diffusion to detection (Berg & Purcell)
14
Particles in solution undergo Brownian motion
First observed by Robert Brown in 1827 with pollen grains and inorganic material. Brown appreciated that this motion was not active. The source of the jiggling: frequent collisions with molecules of the solvent, each of which may be traveling in a different direction. The trajectory of the visible particle reflects the sum of the momenta imparted.
15
Modeling a particle undergoing Brownian motion
Suppose a particle diffuses in one dimension and its position at t = 0 is x = 0. Also suppose that in a time step Δt, the particle must move either right or left (with equal probability) by a specific distance Δx. When we observe Brownian motion, the distance traveled in a small time step is not always the same; here we assume a fixed step size anyway, since it makes the discussion easier to follow. You might think of Δx as a “typical distance traveled”, like ⟨|x|⟩ or the RMS displacement. It turns out this assumption will not affect the accuracy of the result. After studying stochastic differential equations (in section 4 or 5), we will be able to write down a differential equation that includes a noise term and know how to integrate it. But for now this is what we can work with.
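The model just described is easy to simulate directly. A minimal sketch (helper names are mine), working in units where Δx = Δt = 1:

```python
import random

def random_walk(steps, rng=random):
    """Final position after `steps` unit steps of a symmetric 1D walk."""
    x = 0
    for _ in range(steps):
        x += rng.choice((-1, 1))  # left or right with equal probability
    return x

# Sample many independent walkers: the mean displacement stays near 0,
# while the mean squared displacement grows like t (t = number of steps).
random.seed(0)
t, n = 100, 20000
finals = [random_walk(t) for _ in range(n)]
mean = sum(finals) / n
msd = sum(x * x for x in finals) / n
assert abs(mean) < 1.0       # <x> ~ 0 by symmetry
assert abs(msd - t) < 5.0    # <x^2> = t for unit steps
```

The simulation previews the punch line of the derivation that follows: no net drift, but a variance that grows linearly in time.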
16
Modeling a particle undergoing Brownian motion
Initial probability distribution: Note that the probability distribution is discrete and normalized.
17
What is the probability distribution at time t = 1 (in units of Δt)?
Student-led discussion of what the probability distribution should be, beginning with enumerating the possible positions and their relative probabilities.
18
What is the probability distribution at time t = 2 (in units of Δt)?
Hopefully we have built up intuition about how the particle moves in this model. More possible positions are slowly but surely creeping in, so it’s time to find a general description of position.
19
Finding a general formula for the probability distribution
We need to find a systematic way of enumerating the possibilities before things get out of hand. Notice that in these units, t is the total number of steps taken. To write a general formula, we introduce a new variable: let k be the number of steps taken toward the right.
20
Finding a general formula for the probability distribution
How many ways can we choose k steps (out of t total steps) to take toward the right? What is the probability of any specific sequence of t steps? The probability distribution p(k,t) will be the product of these two values.
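Those two factors, C(t, k) orderings times 2^(−t) per ordering, can be checked in code. A sketch (function name mine), using x = 2k − t to convert back to position:

```python
from math import comb

def p(x, t):
    """Probability of being at position x after t unit steps.

    x must have the same parity as t; k = (t + x) / 2 is the number
    of rightward steps.
    """
    if (t + x) % 2 != 0 or abs(x) > t:
        return 0.0
    k = (t + x) // 2
    return comb(t, k) / 2 ** t   # C(t, k) sequences, each with prob 2^(-t)

# Matches the t = 2 intuition check: p(+-2) = 1/4, p(0) = 1/2.
assert p(-2, 2) == p(2, 2) == 0.25
assert p(0, 2) == 0.5
assert p(1, 2) == 0.0            # wrong parity: unreachable
# The distribution is normalized at every t.
assert abs(sum(p(x, 5) for x in range(-5, 6)) - 1.0) < 1e-12
```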
21
Finding a general formula for the probability distribution
Possible values of x increment by two (x = k − (t − k) = 2k − t, which has the same parity as t).
22
Intuition check: try t=2
Possible values of x increment by two. Try x = −2 and t = 2: p = 2! / (2^2 · 0! · 2!) = 1/4. Same for x = 2. Try x = 0 and t = 2: p = 2! / (2^2 · 1! · 1!) = 1/2. No other values are possible. Check!
23
The large t approximation
What will this distribution look like when t is large? Possible values of x increment by two
24
The large t approximation
Meaning of the highlighting in the Stirling-approximation steps: Purple: these three terms sum to zero and cancel each other out. Red: these terms are of the form “y ln y”, which grows faster than terms of the form “y” (those canceled anyway) and “ln y” (the terms we ignore entirely); the red-boxed terms appear in the third line while the other terms are relegated to the ellipsis. Blue: between the third and fourth lines we are just rearranging terms, bringing the two blue-boxed terms together to form the term boxed in green. Green: the next slide will be spent simplifying this term.
25
The large t approximation
26
The large t approximation
The very attentive among you may notice that we ignored some terms: the remaining exponent, −x²/(2t), is actually not exact, and corrections appear at higher order.
27
The large t approximation
The symbol that looks like a cut-off infinity sign (∝) means “proportional to”.
28
The large t approximation
We showed that in the limit of many steps (large t), the binomial probability distribution is well-approximated by a Gaussian distribution: We don’t need to wait a long time for this limit to be appropriate if Δt is small. (Recall t was being measured in units of Δt.)
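The approximation can be tested numerically against the exact binomial. One subtlety (my bookkeeping, not the slides’): the walker only reaches every other site, so the reachable positions are spaced two apart, and the pointwise comparison needs a factor of 2 in the Gaussian density:

```python
from math import comb, exp, pi, sqrt

def p_binomial(x, t):
    """Exact random-walk probability at position x after t unit steps."""
    if (t + x) % 2 != 0 or abs(x) > t:
        return 0.0
    return comb(t, (t + x) // 2) / 2 ** t

def p_gauss(x, t):
    """Large-t Gaussian approximation (variance t in step units).

    The factor of 2 accounts for parity: only every other x is reachable.
    """
    return 2.0 * exp(-x * x / (2.0 * t)) / sqrt(2.0 * pi * t)

# At large t the two agree closely near the peak.
t = 1000
for x in (0, 10, 40):
    assert abs(p_binomial(x, t) - p_gauss(x, t)) < 1e-4
```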
29
A change of units
If we express x and t in natural units (e.g. cm and seconds instead of Δx and Δt), this distribution will be scaled: a large D corresponds to a “faster” walk. Notice that D has units of [L]²/[T].
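In these units the rescaled distribution is the Gaussian with variance 2Dt quoted in the summary, p(x,t) = e^(−x²/(4Dt)) / √(4πDt). A quick numerical check of its normalization and variance (illustrative values of D and t, not from the slides):

```python
from math import exp, pi, sqrt

def propagator(x, t, D=1.0):
    """Free-diffusion propagator: a Gaussian with variance 2*D*t."""
    return exp(-x * x / (4.0 * D * t)) / sqrt(4.0 * pi * D * t)

# Numerically confirm normalization and variance = 2 D t on a fine grid.
D, t, dx = 0.5, 3.0, 0.01
xs = [i * dx for i in range(-3000, 3001)]
norm = sum(propagator(x, t, D) for x in xs) * dx
var = sum(x * x * propagator(x, t, D) for x in xs) * dx
assert abs(norm - 1.0) < 1e-6
assert abs(var - 2 * D * t) < 1e-3
```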
30
Time evolution of p(x,t)
For this plot, I used D = 1. What do we expect this probability distribution to look like as t → 0? Next time, introduce the Dirac delta distribution.
31
Introduction to Diffusion summary (so far)
The probability distribution of a random walker is a binomial distribution. After many steps, the probability distribution is well-approximated by a Gaussian. The variance of this Gaussian distribution, 2Dt, increases with time. Through the derivation, we practiced using Stirling’s approximation, Taylor expansions, and normalization via a Gaussian integral formula.