Presentation transcript:

Grids
A 4×4 grid network over the variables A1,1 … A4,4:
A1,1 A1,2 A1,3 A1,4
A2,1 A2,2 A2,3 A2,4
A3,1 A3,2 A3,3 A3,4
A4,1 A4,2 A4,3 A4,4

[Figure: a cluster graph over four clusters arranged in a loop (1: A,B; 2: B,C; 3: C,D; 4: A,D), with sepsets B, C, D, A on the edges. All messages are initialized to one: $\delta_{1\to 2} = \delta_{2\to 1} = \delta_{2\to 3} = \delta_{3\to 2} = \delta_{3\to 4} = \delta_{4\to 3} = \delta_{4\to 1} = \delta_{1\to 4} = 1$.]

Message updates on the same loop (1: A,B; 2: B,C; 3: C,D; 4: A,D), where $\psi_i$ is the potential of cluster $i$:

$\delta_{1\to 4}(A) = \sum_B \psi_1(A,B)\,\delta_{2\to 1}(B)$
$\delta_{4\to 1}(A) = \sum_D \psi_4(A,D)\,\delta_{3\to 4}(D)$
$\delta_{1\to 2}(B) = \sum_A \psi_1(A,B)\,\delta_{4\to 1}(A)$
$\delta_{2\to 1}(B) = \sum_C \psi_2(B,C)\,\delta_{3\to 2}(C)$
$\delta_{2\to 3}(C) = \sum_B \psi_2(B,C)\,\delta_{1\to 2}(B)$
$\delta_{3\to 2}(C) = \sum_D \psi_3(C,D)\,\delta_{4\to 3}(D)$
$\delta_{3\to 4}(D) = \sum_C \psi_3(C,D)\,\delta_{2\to 3}(C)$
$\delta_{4\to 3}(D) = \sum_A \psi_4(A,D)\,\delta_{1\to 4}(A)$
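To make the updates concrete, here is a minimal runnable sketch of loopy BP on this four-cluster loop. The binary variables, the random potentials, and all names (psi1, delta, normalize) are my own illustration, not from the slides; it uses a synchronous schedule, computing every new message from the previous round's messages.

```python
import numpy as np

rng = np.random.default_rng(0)
# Cluster potentials psi1(A,B), psi2(B,C), psi3(C,D), psi4(A,D):
# random, strictly positive, over binary variables.
psi1, psi2, psi3, psi4 = [rng.random((2, 2)) + 0.1 for _ in range(4)]

# delta[(i, j)]: message from cluster i to cluster j over their sepset,
# initialized to 1 as on the slide above.
edges = [(1, 2), (2, 1), (2, 3), (3, 2), (3, 4), (4, 3), (4, 1), (1, 4)]
delta = {e: np.ones(2) for e in edges}

def normalize(m):
    return m / m.sum()  # renormalize to avoid numerical drift

for _ in range(200):
    old = {e: delta[e].copy() for e in edges}
    # The eight updates above: each multiplies in the incoming message and
    # sums out the variable that is not in the sepset.
    delta[(1, 4)] = normalize(psi1   @ old[(2, 1)])  # sum over B -> msg over A
    delta[(4, 1)] = normalize(psi4   @ old[(3, 4)])  # sum over D -> msg over A
    delta[(1, 2)] = normalize(psi1.T @ old[(4, 1)])  # sum over A -> msg over B
    delta[(2, 1)] = normalize(psi2   @ old[(3, 2)])  # sum over C -> msg over B
    delta[(2, 3)] = normalize(psi2.T @ old[(1, 2)])  # sum over B -> msg over C
    delta[(3, 2)] = normalize(psi3   @ old[(4, 3)])  # sum over D -> msg over C
    delta[(3, 4)] = normalize(psi3.T @ old[(2, 3)])  # sum over C -> msg over D
    delta[(4, 3)] = normalize(psi4.T @ old[(1, 4)])  # sum over A -> msg over D
    if max(abs(delta[e] - old[e]).max() for e in edges) < 1e-10:
        break

# Belief at cluster 1: beta1(A,B) = psi1(A,B) delta_{4->1}(A) delta_{2->1}(B);
# its A-marginal is the (approximate) posterior over A.
beta1 = psi1 * delta[(4, 1)][:, None] * delta[(2, 1)][None, :]
print(normalize(beta1.sum(axis=1)))  # approximate marginal over A
```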

Loopy BP Run
[Figure: the estimated marginal P(a1) plotted against iteration number (about 20 iterations shown; y-axis roughly 0.55 to 0.85), compared with the true posterior.]

Grid Cluster Graph
[Figure: a cluster graph for the grid: one singleton cluster per variable (A1,1; A1,2; A1,3; …) and one pairwise cluster per grid edge, e.g. (A1,1, A1,2), (A1,2, A1,3), (A1,1, A2,1), …]

Cluster Graphs
[Figure: a cluster graph with clusters 1: A,B,C; 2: B,C,D; 3: B,D,F; 4: B,E; 5: D,E, connected by edges labeled with their sepsets (C, B, D, E).]

Loopy BP in Practice
Synchronous BP: all messages are updated in parallel from the previous round's values. Asynchronous BP: messages are updated one at a time, in some order, so each update sees the most recent messages (a code sketch of the two schedules follows below).
[Figure: number of converged messages vs. time in seconds on an Ising grid, comparing synchronous BP against two asynchronous message orders.]
Asynchronous is faster than synchronous, and the order of messages has a significant effect on behaviour.
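The difference between the two schedules is easiest to see in code. This is a schematic sketch of my own, assuming some per-edge update function like the ones in the earlier snippet (update_message is a hypothetical name):

```python
def synchronous_round(edges, msgs, update_message):
    # Compute every new message from the *old* messages, then swap them in.
    new = {e: update_message(e, msgs) for e in edges}
    msgs.update(new)

def asynchronous_round(edges, msgs, update_message):
    # Update messages in place: later updates within the round already see
    # the earlier ones, so the ordering of `edges` changes the trajectory.
    for e in edges:
        msgs[e] = update_message(e, msgs)
```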

Shannon's Theorem
Goal: transmit bits over a noisy channel. How efficient can we make the transmission while keeping the probability of error arbitrarily low?
Shannon's result: define the channel capacity, a bound on the code rate for a given signal-to-noise ratio. All rates below capacity are achievable with arbitrarily low error rate (simply make the messages long enough); no rate above capacity is achievable.
Rate of a code = (# bits per message) / (# bits sent).
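As a small numeric illustration (mine, not from the slides), here are the two textbook capacity formulas: C = 1 - H2(p) bits per use for a binary symmetric channel with flip probability p, and C = (1/2) log2(1 + SNR) bits per use for a discrete-time Gaussian channel:

```python
import math

def h2(p):
    """Binary entropy in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity of a binary symmetric channel with flip probability p."""
    return 1.0 - h2(p)

def awgn_capacity(snr):
    """Capacity of a discrete-time Gaussian channel; snr is a plain ratio."""
    return 0.5 * math.log2(1.0 + snr)

# A rate-1/2 code sends 1 message bit per 2 transmitted bits, so by the
# theorem it can be made arbitrarily reliable whenever 1/2 < capacity:
print(bsc_capacity(0.05))  # ~0.71 bits/use: rate 1/2 is achievable here
print(awgn_capacity(1.0))  # 0.5 bits/use at SNR = 1: rate 1/2 is the edge
```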

[Figure: the factor graph used for decoding: message bits U1–U4 determine codeword bits X1–X7 (X1–X4 alongside the message bits, X5–X7 as parity), and each Xi is observed only through a noisy channel output Yi.]
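Four message bits mapped to seven codeword bits is the shape of the classic (7,4) Hamming code. The slide's exact code is not recoverable from the transcript, so the sketch below simply uses the standard systematic (7,4) generator as a stand-in:

```python
import numpy as np

# Systematic generator matrix of the (7,4) Hamming code: x = u G (mod 2),
# so x[0:4] repeat the message bits and x[4:7] are parity bits.
G = np.array([[1, 0, 0, 0, 1, 1, 0],
              [0, 1, 0, 0, 1, 0, 1],
              [0, 0, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])

# Matching parity-check matrix: H x = 0 (mod 2) for every codeword x.
H = np.array([[1, 1, 0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])

def encode(u):
    return (np.asarray(u) @ G) % 2

x = encode([1, 0, 1, 1])
assert ((H @ x) % 2 == 0).all()  # x really is a codeword
print(x)                         # [1 0 1 1 0 1 0]
```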

[Figure: the channel model: message bits U1 … Un are encoded into transmitted bits X1 … Xn, and each received bit Yi is Xi corrupted by channel noise Zi.]

Permuter
[Figure: a turbo code: message bits U1–U4 feed one encoder directly and, reordered by a permuter into W1–W4, feed a second encoder; the transmitted bits (X4, X8, …) are observed through noisy channel outputs, with Y1, Y3, Y5, Y7 on one encoder chain and Y4, Y8, … on the other.]
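To show the structure the permuter creates, here is a minimal sketch of a rate-1/3 turbo encoder: two identical recursive systematic convolutional (RSC) encoders, the second fed through the permuter. The (1, 5/7 octal) generator is my choice for illustration and trellis termination is ignored; neither detail is specified on the slide.

```python
import numpy as np

def rsc_parity(bits):
    """Parity stream of a (1, 5/7 octal) recursive systematic convolutional
    encoder, started from the all-zero state (termination ignored)."""
    s1 = s2 = 0                 # shift-register contents
    parity = []
    for u in bits:
        a = u ^ s1 ^ s2         # feedback taps: 1 + D + D^2
        parity.append(a ^ s2)   # feedforward taps: 1 + D^2
        s1, s2 = a, s1
    return parity

def turbo_encode(u, perm):
    """Return (systematic bits, parity on u, parity on permuted u)."""
    w = [u[i] for i in perm]    # the permuter reorders the message bits
    return list(u), rsc_parity(u), rsc_parity(w)

rng = np.random.default_rng(1)
u = rng.integers(0, 2, size=8).tolist()
perm = rng.permutation(8).tolist()
sys_bits, p1, p2 = turbo_encode(u, perm)
print(sys_bits, p1, p2)         # three streams -> overall rate 1/3
```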

Coding: Post 1993
[Figure: performance of codes developed after 1993, shown approaching the Shannon limit = -0.79 dB.]