Coding Theory: Packing, Covering, and 2-Player Games Robert Ellis Menger Day 2008: Recent Applied Mathematics Research Advances April 14, 2008.


2 Overview
- A problem from integrated circuit design
- Coding theory
  - Error-correcting codes and packings
  - Error-correcting codes as a 2-player liar game
  - Covering codes
  - Covering codes as a football pool
- Coding with feedback
  - A liar game and an adaptive football pool
  - Near-perfect radius-1 adaptive codes
- Results and research questions in liar games

3 A VLSI Layout Problem
[Figure: silicon substrate; wires & components; inert metal fill]
Fill library: 2^6 patterns → 2^3 patterns; compression ratio: 50%

4 An Asymmetric Covering Code
- Fill library = a (6,2)-asymmetric binary code
- Size bound Θ(2^n / n^R) (Cooper, Ellis, Kahng `02)
- Application to VLSI layout (Ellis, Kahng, Zheng `03)
- Improved fixed-parameter codes: Applegate, Rains, Sloane `03; Exoo `04; Östergård, Seuranen `04
- Improved size bound (Krivelevich, Sudakov, Vu `03)

5 Smallest (4,1)-Asymmetric Covering Code
Therefore K+(4,1) = 6 (length 4, radius 1).

6 Good (n,R)-Asymmetric Covering Codes
- Select each word to be in the code with probability p(n)
- Any uncovered word is added as a codeword
- This plus the hypercube structure yields codes of size O(2^n / n^R)
- Best possible up to a constant, since middle ball volumes are Θ(n^R)
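The two-phase construction on this slide can be sketched in a few lines. This is an illustrative sketch, not the authors' implementation; the function names are mine, and it assumes the convention that a codeword c covers a word w when w is obtained from c by changing at most R ones to zeros (asymmetric 1 → 0 errors).

```python
import itertools
import random

def down_ball(c, n, R):
    """All words obtained from c by flipping at most R ones to zeros,
    including c itself."""
    ones = [i for i in range(n) if (c >> i) & 1]
    ball = set()
    for r in range(min(R, len(ones)) + 1):
        for positions in itertools.combinations(ones, r):
            w = c
            for i in positions:
                w ^= 1 << i
            ball.add(w)
    return ball

def random_asymmetric_cover(n, R, p, seed=0):
    """Phase 1: put each word in the code independently with probability p.
    Phase 2: add every still-uncovered word as a codeword (it covers itself)."""
    rng = random.Random(seed)
    code = {w for w in range(2 ** n) if rng.random() < p}
    covered = set()
    for c in code:
        covered |= down_ball(c, n, R)
    code |= {w for w in range(2 ** n) if w not in covered}
    return code
```

The slide's Θ(2^n / n^R) size guarantee comes from tuning p(n) and from the hypercube structure; here p is simply a parameter, so the sketch shows only that the construction always yields a valid cover.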

7 Coding Theory Overview
- Coding theory concerns the properties of sets of codewords, or fixed-length strings from a finite alphabet
- Primary uses:
  - error correction for transmission in the presence of noise
  - compression of data, with or without loss
- Many viewpoints afforded: packings and coverings of Hamming balls in the n-cube; 2-player perfect-information games

8 Information Theory (Shannon Model)
- Noisy communication: add redundancy to counteract noise
- Noiseless communication: compress data using redundancy
- The binary symmetric channel for noise: each transmitted bit is flipped independently with crossover probability p, where 0 ≤ p < 1/2
- Message m is encoded as x_1 … x_n; the channel adds noise δ_1 … δ_n, and the decoder sees (x_1 + δ_1) … (x_n + δ_n)
[Photo: Claude Shannon]
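The binary symmetric channel is straightforward to simulate; a minimal sketch (the function name is mine, not from the slides):

```python
import random

def bsc(bits, p, seed=0):
    """Binary symmetric channel: flip each bit independently with
    crossover probability p (the Shannon model takes 0 <= p < 1/2)."""
    rng = random.Random(seed)
    return [b ^ (rng.random() < p) for b in bits]
```

With p = 0 the channel is the identity; as p grows toward 1/2 the output carries less and less information about the input.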

9 Coding Theory: (n,e)-Codes
- Transmit blocks of length n
- Noise changes at most e bits per block (||δ||_1 ≤ e)
- Repetition code {111, 000}: length n = 3, corrects e = 1 error, information rate 1/3
- Decoding: blockwise majority vote
[Photo: Richard Hamming]
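The repetition code's encoder and majority-vote decoder, as a quick sketch (names are mine):

```python
def rep_encode(bits, n=3):
    """Repeat each message bit n times (information rate 1/n)."""
    return [b for b in bits for _ in range(n)]

def rep_decode(received, n=3):
    """Blockwise majority vote; corrects up to (n - 1) // 2 flips per block."""
    return [1 if sum(received[i:i + n]) > n // 2 else 0
            for i in range(0, len(received), n)]
```

For n = 3 this is exactly the slide's (3,1)-code: any single flip inside a block is outvoted by the two intact copies.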

10 Block Codes from now on
Restricting to block codes still includes:
- Convolutional codes (cell phones, Bluetooth)
- Reed-Solomon codes (CDs, DSL, WiMAX)
- Turbo codes (Mars Reconnaissance Orbiter)
(assumptions on noise for these codes will vary)

11 Coding Theory – A Hamming (7,1)-Code
- Length n = 7, corrects e = 1 error
- 1 error: the received word is decoded correctly
- 2 errors: incorrect decoding
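The slide's (7,1)-code — length 7, one error corrected — is the classical Hamming code with 4 data bits. A sketch with syndrome decoding, assuming the standard layout with parity bits at positions 1, 2, 4 and data bits at positions 3, 5, 6, 7 (the layout and function names are mine; the slide does not fix a layout):

```python
def hamming74_encode(d):
    """Encode 4 data bits as a 7-bit Hamming codeword.
    Positions 1..7; parity bits sit at positions 1, 2, 4."""
    c = [0] * 8                      # index 0 unused
    c[3], c[5], c[6], c[7] = d
    c[1] = c[3] ^ c[5] ^ c[7]
    c[2] = c[3] ^ c[6] ^ c[7]
    c[4] = c[5] ^ c[6] ^ c[7]
    return c[1:]

def hamming74_decode(r):
    """Syndrome decoding: the XOR of the positions holding a 1 is zero for a
    codeword, and equals the error position after a single flip."""
    c = [0] + list(r)
    s = 0
    for pos in range(1, 8):
        if c[pos]:
            s ^= pos
    if s:
        c[s] ^= 1                    # flip the erroneous bit
    return [c[3], c[5], c[6], c[7]]
```

A second error makes the syndrome point at the wrong position, which is exactly the slide's "2 errors: incorrect decoding."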

12 A Repetition Code as a Packing
- (3,1)-code: {111, 000}
- Pairwise distance = 3, so 1 error can be corrected
- The M codewords of an (n,e)-code correspond to a packing of Hamming balls of radius e in the n-cube
[Figure: a packing of 2 radius-1 Hamming balls in the 3-cube]

13 A (5,2)-Code as a Packing
- (5,2)-code: 01100, … – a (disjoint) packing of radius-2 Hamming balls in the 5-cube
- Volume of a radius-e ball in the n-cube: V(n,e) = C(n,0) + C(n,1) + … + C(n,e); here V(5,2) = 1 + 5 + 10 = 16
- Sphere bound: an (n,e)-code with M codewords satisfies M · V(n,e) ≤ 2^n
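The ball volume and sphere bound from this slide are one-liners; the helpers below (names mine) confirm that V(5,2) = 16, so a (5,2)-code has at most ⌊32/16⌋ = 2 codewords and a 2-codeword packing is perfect:

```python
from math import comb

def ball_volume(n, e):
    """Number of words within Hamming distance e of a fixed word in the n-cube."""
    return sum(comb(n, i) for i in range(e + 1))

def sphere_bound(n, e):
    """Largest M permitted by the sphere bound M * V(n, e) <= 2^n."""
    return (2 ** n) // ball_volume(n, e)
```

The same computation gives sphere_bound(7, 1) = 16, which the length-7 Hamming code of the previous slide attains with equality.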

14 A (5,1)-Code as a 2-Player Game
- (5,1)-code: 11111, 10100, 01010, …
- Paul asks for the message bit by bit; Carole's answers may contain at most e = 1 error:
  Round 1: "What is the 1st bit?" – Carole: 0
  Round 2: "What is the 2nd bit?" – Carole: 0
  Round 3: "What is the 3rd bit?" – Carole: 0
  Round 4: "What is the 4th bit?" – Carole: 1
  Round 5: "What is the 5th bit?" – Carole: 0
- (Carole's answers 00010 lie within distance 1 of the codeword 01010)

15 Covering Codes
- Covering is the companion problem to packing
- Packing: (n,e)-code, packing radius e; covering: (n,R)-code, covering radius R
- A code that is both a (3,1)-packing code and a (3,1)-covering code is a "perfect code"
- Compare: a (5,1)-packing code vs. a (5,1)-covering code

16 Optimal Length-5 Packing & Covering Codes
[Figure: a (5,1)-packing code and a (5,1)-covering code]

17 A (5,1)-Covering Code as a Football Pool
Payoff: a bet with ≤ 1 bad prediction. Question: minimum number of bets to guarantee a payoff? Answer: 7.

         Round 1  Round 2  Round 3  Round 4  Round 5
  Bet 1:    W        W        W        W        W
  Bet 2:    L        W        W        W        W
  Bet 3:    W        L        W        W        W
  Bet 4:    W        W        L        L        L
  Bet 5:    L        L        W        L        L
  Bet 6:    L        L        L        W        L
  Bet 7:    L        L        L        L        W
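Reading W as 1 and L as 0, with round 1 as the leftmost bit (my reading of the slide's flattened table), the seven bets form a (5,1)-covering code. A brute-force check, with helper names of my choosing:

```python
def hamming_dist(u, v):
    """Hamming distance between two words given as integers."""
    return bin(u ^ v).count("1")

def is_covering(code, n, R):
    """True iff every word of the n-cube is within distance R of a codeword."""
    return all(any(hamming_dist(w, c) <= R for c in code)
               for w in range(2 ** n))

# The seven bets, W = 1 / L = 0, round 1 leftmost:
bets = ["11111", "01111", "10111", "11000", "00100", "00010", "00001"]
code = [int(b, 2) for b in bets]
```

Every one of the 32 possible season outcomes is within distance 1 of some bet, and (since K(5,1) = 7) no set of 6 bets can achieve this.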

18 Codes with Feedback (Adaptive Codes)
- Feedback: a noiseless, delay-less report to the sender of the actual received bits
- Feedback improves the number of decodable messages, e.g., from 20 to 28 messages for an (8,1)-code
[Figure: sender transmits over a noisy channel to the receiver, with a noiseless feedback link; sent 1, 0, 1, 1, 0 is received as 1, 1, 1, 1, 0. Photo: Elwyn Berlekamp]

19 A (5,1)-Adaptive Code as a 2-Player Liar Game
Messages A, B, C, D; answers encode Y → 1, N → 0; Carole may lie at most once.
  Round 1: "Is the message C or D?" – Y
  Round 2: "Is the message A or C?" – N
  Round 3: "Is the message B?" – N
  Round 4: "Is the message D?" – N
  Round 5: "Is the message C?" – Y
The encoding of each message adapts round by round: each message starts as ****, and its encoding is refined after every answer.

20 A (5,1)-Adaptive Covering Code as a Football Pool
Payoff: a bet with ≤ 1 bad prediction (≤ 1 lie). Question: minimum number of bets to guarantee a payoff? Answer: 6.
[Figure: an adaptive betting tree for Bets 1-6 against the outcome sequence L W L L W; later predictions depend on earlier outcomes, so six adaptive bets suffice where seven fixed bets were needed.]

21 Feedback and Adaptive Hamming Balls
- Form of an adaptive Hamming ball (radius 1); example: n = 5, e = R = 1
[Figure: the root is the original encoding; child i is the adapted encoding used when the error occurs in the i-th bit]

22 Classification of Coding Problems

            No feedback                      Feedback
Packing     error-correcting codes P(n,e)    adaptive error-correcting codes P'(n,e)
Covering    covering codes K(n,R)            adaptive covering codes K'(n,R)

P(n,e) ≤ P'(n,e) ≤ Sphere Bound ≤ K'(n,R) ≤ K(n,R)

23 Near-Perfect Radius-1 Adaptive Codes
Theorem (E. `05+). For all n ≥ 2 and e = R = 1, there exists an adaptive packing contained in an adaptive covering, with sizes P'(n,1) and K'(n,1) given by explicit formulas (lost in transcription). The sphere bound here is 2^n / (n + 1).

24 Proof Idea: Near-Perfect Radius-1 Adaptive Codes
[Figure: the packing and covering are built recursively from question strategies Q1, Q2, Q3; a strategy Q is duplicated into 0Q and 1Q, and questions are "stolen" between the packing and covering constructions.]

25 Adaptive Coding as an (M,n,e)-Liar Game
- M = # chips, n = # rounds, e = max # lies
- Carole picks a distinguished x ∈ {1, …, M}
- Each round: (1) Paul bipartitions {1, …, M} = A_0 ∪ A_1 and asks "Is x ∈ A_1?"; (2) Carole responds "Yes" or "No", and may lie up to e times

26 Original and Pathological Liar Games
Two variants:
- Original liar game (Berlekamp, Rényi, Ulam): Paul wins iff at most 1 chip survives after n rounds
- Pathological liar game (Ellis & Yan): Paul wins iff at least 1 chip survives after n rounds
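Both variants can be solved exactly for small parameters by game-tree search over chip states, where x[i] counts the surviving chips that have accrued i lies. A sketch, with the two win conditions taken from this slide (the solver and its function names are mine, not the talk's method):

```python
from functools import lru_cache
from itertools import product

def paul_wins(M, n, e, pathological=False):
    """Exact minimax solver for the (M, n, e)-liar game.
    State: x[i] = number of surviving chips that have accrued i lies."""

    def step(x, liars):
        # Chips counted in `liars` gain one lie; chips already at e lies die.
        new = list(x)
        for i in range(e, -1, -1):
            new[i] -= liars[i]
            if i < e:
                new[i + 1] += liars[i]
        return tuple(new)

    @lru_cache(maxsize=None)
    def win(x, r):
        alive = sum(x)
        if not pathological and alive <= 1:
            return True                 # original game: at most 1 chip left
        if r == 0:
            return alive >= 1 if pathological else alive <= 1
        # Paul chooses how many chips of each lie-count to put in A_1.
        for a in product(*(range(xi + 1) for xi in x)):
            yes = step(x, tuple(xi - ai for xi, ai in zip(x, a)))  # A_0 lied
            no = step(x, a)                                        # A_1 lied
            if win(yes, r - 1) and win(no, r - 1):
                return True
        return False

    return win((M,) + (0,) * e, n)

def min_rounds(M, e, pathological=False):
    """Smallest n for which Paul wins the (M, n, e)-game."""
    n = 0
    while not paul_wins(M, n, e, pathological):
        n += 1
    return n
```

For M = 3 in the original game, min_rounds reproduces the 3-chip result n = 3e + 2 stated on a later slide; in the pathological game with e = 0, a short calculation shows Paul wins iff M ≥ 2^n, since Carole can always eliminate the larger side of each split.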

27 Classification of Coding Problems

            No feedback                      Feedback
Packing     error-correcting codes P(n,e)    adaptive error-correcting codes P'(n,e) (liar game)
Covering    covering codes K(n,R)            adaptive covering codes K'(n,R) (pathological liar game)

P(n,e) ≤ P'(n,e) ≤ Sphere Bound ≤ K'(n,R) ≤ K(n,R)

28 3-Chip Original Liar Game
- Given M = 3 chips, in how many rounds can Paul guarantee winning the game with e lies?
- Label each chip with its distance to being eliminated
- Introduce the weight function f(x_1,x_2,x_3) = x_1 + x_2 + x_3 - 1; e.g., f(6,4,3) = 12
- Each round, Paul can force f to decrease by 1, while Carole can prevent f from decreasing by more than 1
- Paul wins iff n ≥ 3e + 2

29 4- and 5-Chip Original Liar Game
Order the chip labels so that x_1 ≥ x_2 ≥ … ≥ x_M.
- M = 4 chips: f_4(x_1,x_2,x_3,x_4) = x_1 + x_2 + x_3 - 1; e.g., f_4 = 12
- M = 5 chips: f_5(x_1,x_2,x_3,x_4,x_5) = x_1 + x_2 + x_3 + χ(x_1 = x_5) - 1; e.g., f_5 = 18 + 1 - 1 = 18, and after a split f_5 = 18 + 0 - 1 = 17 (f_3 = 18 - 1 = 17)
- Exercise: find/verify the weight function for M = 4, …, 8 (Ellis & Łuczak)
- Research problem: find the weight function for M > 8

30 2-, 3-, and 4-Chip Pathological Liar Game
- M = 2 chips: g_2(x_1,x_2) = x_1 + x_2 - 1; e.g., g_2 = 9
- M = 3 chips: g_3(x_1,x_2,x_3) = x_1 + x_2 - 1; e.g., g_3 = 9
- M = 4 chips: g_4(x_1,x_2,x_3,x_4) = x_1 + x_2 + χ(x_1 = x_4) - 1; e.g., g_4 = 12 + 1 - 1 = 12, and after a split g_4 = 12 + 0 - 1 = 11 (g_2 = 12 - 1 = 11)
- M = 2, 3, 4 solved (Ellis & Stanford); M > 4: research problem

31 Perfect Splits and the Pathological Liar Game
[Figure: start with 2^k chips at position k; a perfect split halves them to 2^(k-1) chips at position k-1 in one round; remove chips and repeat until 1 chip is left at position e.]

32 Perfect Splits and the Pathological Liar Game
- Upper bound on M = 2^k: e/n is the overall fraction of lies, so after k rounds the chips at position (e/n)k determine whether Paul wins
- Lower bound on M = 2^k: each chip survives in only a small number (formula lost in transcription) of the 2^n possible outcomes of the game

33 Many Open Questions at Every Level!
- Research problems appropriate for undergraduates, graduate students, dissertations, and beyond!
- Fixed-parameter games
- Games with constrained lies
- Non-binary alphabets
- Restricted feedback
- List decoding (win with L chips instead of 1)
- Applying feedback coding to real-world problems