Molecular Information Theory
Niru Chennagiri
Probability and Statistics, Fall 2004
Dr. Michael Partensky

Overview
- Why do we study molecular information theory?
- What are molecular machines?
- The power of the logarithm
- Components of a communication system
- Discrete noiseless systems
- Channel capacity
- Molecular machine capacity

Motivation
- A needle-in-a-haystack situation.
- How will you go about looking for the needle?
- How much energy do you need to spend?
- How fast can you find the needle?
- Haystack = DNA, needle = binding site, you = ribosome.

What is a Molecular Machine?
- One or more molecules, or a molecular complex: not a macroscopic reaction.
- Performs a specific function.
- Is energized before the reaction.
- Dissipates energy during the reaction.
- Gains information.
- An isothermal engine.

Where is the candy?
- Is it in the left four boxes?
- Is it in the bottom four boxes?
- Is it in the front four boxes?
You need answers to three questions to find the candy.
Box labels: 000, 001, 010, 011, 100, 101, 110, 111.
You need log2(8) = 3 bits of information.

More candies…
- Box labels: 00, 01, 10, 11, 00, 01, 10, 11.
- The candy is in both boxes labeled 01.
- You need only log2(8) - log2(2) = 2 bits of information.
In general, finding n candies among m boxes needs log2(m) - log2(n) bits of information.
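A minimal Python sketch of this counting rule (not part of the original slides): with m equally likely boxes and n candies, you need log2(m) - log2(n) bits.

```python
import math

def bits_needed(num_boxes, num_candies=1):
    """Bits of information needed to locate a candy among equally likely boxes."""
    return math.log2(num_boxes) - math.log2(num_candies)

print(bits_needed(8))      # 3.0 bits: three yes/no questions locate one candy in 8 boxes
print(bits_needed(8, 2))   # 2.0 bits: two candies in 8 boxes need one question fewer
```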

Ribosomes
There are about 2600 binding sites in 4.7 million base pairs.
The ribosome needs log2(4.7 million) - log2(2600) ≈ 10.8 bits of information.
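A quick check of this arithmetic (base-2 logarithms, genome size and site count as given on the slide):

```python
import math

genome_size = 4.7e6    # base pairs, as stated on the slide
binding_sites = 2600   # ribosome binding sites

bits_needed = math.log2(genome_size) - math.log2(binding_sites)
print(round(bits_needed, 1))   # ~10.8 bits
```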

Communication System

Information Source
- Represented by a stochastic process.
- Modeled mathematically as a Markov chain.
- We are interested in ergodic sources: every sequence is statistically the same as every other sequence.

How much information is produced? A measure of uncertainty H should be:
- Continuous in the probabilities.
- A monotonically increasing function of the number of equally likely events.
- Additive: when a choice is broken down into two successive choices, the total H is the weighted sum of the individual values of H.

Enter Entropy
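The slide's equation is the Shannon entropy, the quantity (unique up to a constant factor) that satisfies the three requirements above: H = -Σ p_i log2 p_i. A minimal Python sketch, with illustrative values not taken from the slides:

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)), skipping zero-probability events."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))          # 1.0 bit: a fair coin
print(entropy([0.25] * 4))          # 2.0 bits: four equally likely outcomes
print(entropy([1.0, 0.0, 0.0]))     # 0.0 bits: no uncertainty at all
```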

Properties of Entropy
- H is zero if and only if all but one of the probabilities are zero.
- H is never negative.
- H is maximum when all the events are equally probable.
- If x and y are two events, H(x, y) ≤ H(x) + H(y).
- Conditional entropy: H_x(y) ≤ H(y).
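A small numerical check of the subadditivity property H(x, y) ≤ H(x) + H(y), using an arbitrary joint distribution chosen only for illustration:

```python
import math

def H(probs):
    """Shannon entropy in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Arbitrary joint distribution p(x, y) over two binary variables.
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.3}

px = [sum(p for (x, _), p in joint.items() if x == v) for v in (0, 1)]
py = [sum(p for (_, y), p in joint.items() if y == v) for v in (0, 1)]

print(H(joint.values()))      # H(x, y) ~= 1.85 bits
print(H(px) + H(py))          # H(x) + H(y) ~= 1.97 bits, so H(x, y) <= H(x) + H(y)
```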

Why is entropy important?
- Entropy is a measure of uncertainty.
- Shannon's H has the same form as the entropy of statistical thermodynamics.
- For every bit of information gained, the machine dissipates at least k_B T ln 2 joules.
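As an order-of-magnitude check of the dissipation bound (the temperature T = 300 K is an assumption, not from the slides):

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0            # assumed room temperature in kelvin (not given on the slide)

energy_per_bit = k_B * T * math.log(2)
print(f"{energy_per_bit:.3e} J per bit")   # ~2.87e-21 J
```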

Ribosome binding sites

Information in sequence

Information curve
The information gain at each position l of the aligned binding sites, plotted across the positions, gives the information curve.
For E. coli, the total information is about 11 bits, essentially the same as what the ribosome needs (the 10.8 bits computed above).
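A sketch of how such a curve can be computed from aligned binding-site sequences. This assumes an equiprobable four-letter background and omits the small-sample correction used in the original analysis; the example sites are made up for illustration.

```python
import math
from collections import Counter

def information_per_position(aligned_sites):
    """R(l) = 2 - H(l) bits for each position l of aligned DNA sites,
    where H(l) is the entropy of the base frequencies at that position."""
    length = len(aligned_sites[0])
    curve = []
    for l in range(length):
        counts = Counter(seq[l] for seq in aligned_sites)
        total = sum(counts.values())
        H = -sum((c / total) * math.log2(c / total) for c in counts.values())
        curve.append(2.0 - H)   # 2 bits is the maximum for a 4-letter alphabet
    return curve

sites = ["TAAGGAGGT", "AAAGGAGGA", "TAAGGAGCT", "CAAGGAGGA"]  # made-up example sites
print([round(r, 2) for r in information_per_position(sites)])
```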

Sequence Logo

Channel capacity
A source transmits 0s and 1s at 1000 symbols/sec, and 1 in 100 symbols is received in error. What is the rate of transmission?
We need to apply a correction: the uncertainty in x for a given received value y, which is the conditional entropy H_y(x). Here the correction is about 81 bits/sec, so the rate of transmission is roughly 1000 - 81 = 919 bits/sec.
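The correction can be computed directly; a short sketch of the arithmetic:

```python
import math

def binary_entropy(p):
    """Entropy in bits of a binary variable with probability p."""
    return 0.0 if p in (0.0, 1.0) else -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

symbol_rate = 1000    # symbols per second
error_prob = 0.01     # 1 in 100 symbols flipped

equivocation = binary_entropy(error_prob) * symbol_rate   # ~81 bits/sec
rate = symbol_rate * 1.0 - equivocation                   # source entropy is 1 bit/symbol

print(round(equivocation), round(rate))   # ~81 bits/sec correction, ~919 bits/sec rate
```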

Channel capacity contd.
Shannon's theorem: as long as the rate of transmission is below the capacity C, the number of errors can be made as small as needed.
For a continuous channel with white noise, C = W log2(1 + S/N), where W is the bandwidth and S/N is the signal-to-noise ratio.
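A sketch of the bandwidth-limited capacity formula; the channel parameters below are illustrative values, not from the slides:

```python
import math

def shannon_hartley_capacity(bandwidth_hz, snr_linear):
    """Capacity C = W * log2(1 + S/N) in bits per second."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# Illustrative example: a 3 kHz telephone-style channel with a 30 dB signal-to-noise ratio.
snr = 10 ** (30 / 10)                              # 30 dB -> 1000 in linear terms
print(round(shannon_hartley_capacity(3000, snr)))  # ~29,902 bits/sec
```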

Molecular Machine Capacity
- A lock-and-key mechanism.
- Each "pin" on the ribosome is a simple harmonic oscillator in a thermal bath.
- The velocity of each pin is represented by a point in a 2-D velocity space.
- More pins means more dimensions.
- The distribution of points is spherical.

Machine capacity
For large numbers of dimensions, essentially all points lie in a thin spherical shell.
The radius of the shell is the velocity, and hence proportional to the square root of the energy.
Before binding, the radius is set by the total (signal plus thermal-noise) energy; after binding, by the thermal-noise energy alone.

Number of choices = number of "after" spheres that can sit in the "before" sphere = volume of the "before" sphere / volume of the "after" sphere.
Machine capacity = logarithm of the number of choices.
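A sketch of this volume-ratio argument, under the assumption stated above that the "before" and "after" radii scale as the square roots of (P + N) and N; the pin count and power ratio are illustrative values only:

```python
import math

def machine_capacity_bits(dims, signal_power, noise_power):
    """Capacity = log2(volume ratio of the 'before' and 'after' spheres).

    In `dims` dimensions the sphere volume scales as r**dims, and the radii scale
    as sqrt(P + N) and sqrt(N), so the log of the volume ratio reduces to
    (dims / 2) * log2((P + N) / N).
    """
    volume_ratio = ((signal_power + noise_power) / noise_power) ** (dims / 2)
    return math.log2(volume_ratio)

# Illustrative numbers only: 4 pins -> 8 velocity-space dimensions, P/N = 3.
print(machine_capacity_bits(8, 3.0, 1.0))   # 8.0 bits = 4 * log2(4)
```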

References