236601 - Coding and Algorithms for Memories, Lecture 8



Flash Memory Cell
Can only add water (charge): a cell's charge level can only be increased, never decreased.

The Leakage Problem
Charge slowly leaks out of a cell over time, lowering its level and causing an error.

The Overshooting Problem
If a cell is programmed above its target level there is no way to lower it; the whole block must be erased and reprogrammed.

Possible Solution – Iterative Programming
Approach the target level with many cautious charge increments; this avoids overshooting, but it is slow.

Relative vs. Absolute Values
Storing information in the relative order of cell levels rather than in their absolute values gives fewer errors and better retention.
[Jiang, Mateescu, Schwartz, Bruck, "Rank Modulation for Flash Memories"]

The New Paradigm – Rank Modulation
Absolute values → relative values
Single cell → multiple cells
Physical cell → logical cell

Rank Modulation
- Ordered set of n cells (assume discrete levels)
- The relative levels define a permutation
- Basic operation: push-to-the-top (raise one cell's charge above all the others)
- Overshoot is not a concern, so writing is much faster
- Increased reliability (data retention)

New Number Representation System
The factoradic (factorial number system) indexes permutations in lexicographic order [Lehmer 1906, Laisant 1888]:
a_{n-1} … a_3 a_2 a_1 (factoradic) = a_{n-1}·(n-1)! + … + a_3·3! + a_2·2! + a_1·1! (decimal), with 0 ≤ a_i ≤ i.
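The factoradic conversion above can be sketched in Python (function names are my own, not the lecture's notation):

```python
def to_factoradic(value, n):
    """Digits (a_1, ..., a_{n-1}) of value in the factorial number system."""
    digits = []
    for i in range(1, n):          # digit a_i satisfies 0 <= a_i <= i
        digits.append(value % (i + 1))
        value //= (i + 1)
    return digits                  # digits[0] = a_1, digits[-1] = a_{n-1}

def from_factoradic(digits):
    """Evaluate a_{n-1}*(n-1)! + ... + a_2*2! + a_1*1!."""
    total, fact = 0, 1
    for i, a in enumerate(digits, start=1):
        fact *= i                  # fact = i!
        total += a * fact
    return total
```

With n digits this represents exactly the values 0 … n!-1, one per permutation of S_n in lexicographic order.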

Gray Codes for Rank Modulation
Goal: find a cycle through all n! states using only push-to-the-top transitions.
For n = 3, the transition graph decomposes into 3 cycles.
The problem: is it possible to visit all permutations in a single cycle?
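Before looking for a single cycle, one can at least check that push-to-the-top moves connect all n! states; a small breadth-first search does this (a sketch, names are mine):

```python
from collections import deque

def push_to_top(state, cell):
    """Move `cell` to the top of the ranking; the relative order of the rest is kept."""
    return (cell,) + tuple(c for c in state if c != cell)

def reachable(n):
    """All permutations reachable from the identity via push-to-the-top moves."""
    start = tuple(range(1, n + 1))
    seen, queue = {start}, deque([start])
    while queue:
        state = queue.popleft()
        for cell in state[1:]:      # pushing the current top cell changes nothing
            nxt = push_to_top(state, cell)
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen
```

For n = 3 this reaches all 3! = 6 permutations, so the question is whether the transitions can be arranged into one Hamiltonian cycle rather than whether the graph is connected.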

Gray Codes for Arbitrary n
Recursive construction: keep the bottom cell fixed and run through the ~(n-1)! transitions of a Gray code on the other cells; then move the bottom cell and repeat.

Kendall's Tau Distance
For a permutation σ, an adjacent transposition is the local exchange of two adjacent elements.
For σ, π ∈ S_n, the Kendall's tau distance d_τ(σ,π) is the number of adjacent transpositions needed to change σ into π; it is also called the bubble-sort distance.
Equivalently, d_τ(σ,π) is the number of pairs of elements that do not agree in their order in σ and π.
Example: σ = 2413, π = 1234; then d_τ(σ,π) = 3, via 2413 → 2143 → 2134 → 1234.
Lemma: Kendall's tau distance induces a metric on S_n.
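The "number of disagreeing pairs" characterization gives a direct brute-force computation (a sketch, not from the slides):

```python
def kendall_tau(sigma, pi):
    """Number of pairs ordered differently in sigma and pi
    (= number of adjacent transpositions needed to turn sigma into pi)."""
    pos = {v: i for i, v in enumerate(pi)}
    seq = [pos[v] for v in sigma]       # sigma expressed in pi's coordinates
    return sum(1 for i in range(len(seq))
                 for j in range(i + 1, len(seq))
                 if seq[i] > seq[j])
```

For the slide's example, kendall_tau([2, 4, 1, 3], [1, 2, 3, 4]) evaluates to 3.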

Kendall's Tau Distance
For a permutation σ, let W_τ(σ) = {(i,j) | i < j and σ^{-1}(i) > σ^{-1}(j)} be its set of inversions.
Lemma: d_τ(σ,π) = |W_τ(σ) Δ W_τ(π)| = |W_τ(σ)\W_τ(π)| + |W_τ(π)\W_τ(σ)|; in particular, d_τ(σ,id) = |W_τ(σ)|.
The maximum Kendall's tau distance is n(n-1)/2.
The inversion vector of σ is x_σ = (x_σ(2),…,x_σ(n)), where x_σ(i) = number of elements to the right of i that are less than i.
d_τ(σ,id) = |W_τ(σ)| = Σ_{2≤i≤n} x_σ(i)
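The inversion vector and the identity d_τ(σ,id) = Σ x_σ(i) can be checked directly (helper names are mine):

```python
def inversion_vector(sigma):
    """x_sigma(i) = number of elements to the right of i that are less than i,
    for i = 2..n, following the slide's definition."""
    pos = {v: idx for idx, v in enumerate(sigma)}
    n = len(sigma)
    return [sum(1 for j in range(1, i) if pos[j] > pos[i])
            for i in range(2, n + 1)]
```

For σ = 2413 the inversion vector is (1, 0, 2), whose entries sum to 3 = d_τ(2413, id), matching the worked example above.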

Kendall's Tau Distance
The Kendall's tau ball: B_r(σ) = {π | d_τ(σ,π) ≤ r}; the Kendall's tau sphere: S_r(σ) = {π | d_τ(σ,π) = r}.
Their sizes do not depend on the center σ.
|B_1(σ)| = n, |S_1(σ)| = n-1
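The ball and sphere sizes can be verified by brute force over S_n (a quick sketch; helper names are mine):

```python
from itertools import permutations

def kendall_tau(sigma, pi):
    """Kendall's tau distance as the number of discordant pairs."""
    pos = {v: i for i, v in enumerate(pi)}
    seq = [pos[v] for v in sigma]
    return sum(1 for i in range(len(seq))
                 for j in range(i + 1, len(seq)) if seq[i] > seq[j])

def ball_size(center, r):
    return sum(1 for p in permutations(center) if kendall_tau(p, center) <= r)

def sphere_size(center, r):
    return sum(1 for p in permutations(center) if kendall_tau(p, center) == r)
```

For n = 4 and r = 1 this gives a ball of size 4 (the center plus the 3 adjacent transpositions) and a sphere of size 3, matching |B_1| = n and |S_1| = n-1.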

How to Construct ECCs for the Kendall's Tau Distance?
Goal: construct codes with some prescribed minimum Kendall's tau distance d.

ECCs for Kendall's Tau Distance
Goal: construct codes correcting a single error. Assume k or k+1 is prime.
Encode a permutation in S_k into a permutation in S_{k+2}; this gives a code over S_{k+2} with k! codewords.
- s = (s_1,…,s_k) ∈ S_k is the information permutation.
- The locations of the two new elements k+1 and k+2 are set to
  loc(k+1) = Σ_{i=1}^k (2i-1)·s_i (mod m)
  loc(k+2) = Σ_{i=1}^k (2i-1)²·s_i (mod m)
  where m = k if k is prime and m = k+1 if k+1 is prime.
Example: k = 7, s = (7,6,1,3,2,4,5)
loc(8) = 1·7 + 3·6 + 5·1 + 7·3 + 9·2 + 11·4 + 13·5 = 3 (mod 7)
loc(9) = 1²·7 + 3²·6 + 5²·1 + 7²·3 + 9²·2 + 11²·4 + 13²·5 = 2 (mod 7)
E(s) = (…)
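The slide's k = 7 example can be re-derived with a small helper (a sketch; the function name is mine, not the lecture's notation):

```python
def loc(s, power, m):
    """Sum over i = 1..k of (2i-1)^power * s_i, reduced mod m."""
    return sum((2 * i - 1) ** power * si for i, si in enumerate(s, start=1)) % m

s = (7, 6, 1, 3, 2, 4, 5)   # information permutation from the slide's example, k = 7
# k = 7 is prime, so m = 7; loc(8) uses the weights (2i-1), loc(9) their squares
loc8 = loc(s, 1, 7)
loc9 = loc(s, 2, 7)
```

This reproduces loc(8) = 3 and loc(9) = 2 (mod 7).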

ECCs for Kendall's Tau Distance
Example: k = 3, m = 3.
123 => …; loc(4) = 1·1+3·2+5·3 = 1 (mod 3), loc(5) = 1·1+9·2+25·3 = 1 (mod 3)
132 => …; loc(4) = 1·1+3·3+5·2 = 2 (mod 3), loc(5) = 1·1+9·3+25·2 = 0 (mod 3)
213 => …; loc(4) = 1·2+3·1+5·3 = 2 (mod 3), loc(5) = 1·2+9·1+25·3 = 2 (mod 3)
231 => …; loc(4) = 1·2+3·3+5·1 = 1 (mod 3), loc(5) = 1·2+9·3+25·1 = 0 (mod 3)
312 => …; loc(4) = 1·3+3·1+5·2 = 1 (mod 3), loc(5) = 1·3+9·1+25·2 = 2 (mod 3)
321 => …; loc(4) = 1·3+3·2+5·1 = 2 (mod 3), loc(5) = 1·3+9·2+25·1 = 1 (mod 3)
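The k = 3 table can be regenerated with a small helper (names are mine); note that the computation gives loc(5) = 0 (mod 3) for s = 132, since 1·1 + 9·3 + 25·2 = 78 ≡ 0 (mod 3):

```python
from itertools import permutations

def loc(s, power, m):
    """Sum over i = 1..k of (2i-1)^power * s_i, reduced mod m."""
    return sum((2 * i - 1) ** power * si for i, si in enumerate(s, start=1)) % m

# k = 3 is prime, so m = 3; map each information permutation to (loc(4), loc(5))
table = {s: (loc(s, 1, 3), loc(s, 2, 3)) for s in permutations((1, 2, 3))}
```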

ECCs for Kendall's Tau Distance
Theorem: This code can correct a single Kendall's tau error.
Proof: It is enough to show that the Kendall's tau distance between every two codewords is at least 3. Let s = (s_1,…,s_k), t = (t_1,…,t_k) ∈ S_k and u = E(s), v = E(t).
- If d_τ(s,t) ≥ 3 then d_τ(u,v) ≥ 3, since every pair of elements of {1,…,k} that disagrees in order between s and t still disagrees between u and v. ✔
- If d_τ(s,t) = 1, write t = (s_1,…,s_{i+1},s_i,…,s_k) and let δ = s_{i+1} - s_i. Then
  loc_s(k+1) - loc_t(k+1) = (2i-1)s_i + (2i+1)s_{i+1} - (2i-1)s_{i+1} - (2i+1)s_i = 2s_{i+1} - 2s_i = 2δ (mod k).
  Since k is an odd prime and 0 < |δ| < k, we have 2δ ≢ 0 (mod k); thus k+1 is not positioned in the same location in u and v. ✔
  Similarly, loc_s(k+2) - loc_t(k+2) = (2i-1)²s_i + (2i+1)²s_{i+1} - (2i-1)²s_{i+1} - (2i+1)²s_i = 8is_{i+1} - 8is_i = 8iδ (mod k) ≢ 0; thus k+2 is not positioned in the same location either. ✔

ECCs for Kendall's Tau Distance
Proof (cont.):
- If d_τ(s,t) = 2 via two non-overlapping adjacent swaps, write t = (s_1,…,s_{i+1},s_i,…,s_{j+1},s_j,…,s_k) and let δ_1 = s_{i+1} - s_i, δ_2 = s_{j+1} - s_j. Then
  loc_s(k+1) - loc_t(k+1) = 2(δ_1 + δ_2) (mod k)
  loc_s(k+2) - loc_t(k+2) = 8iδ_1 + 8jδ_2 (mod k)
  If δ_1 + δ_2 is NOT a multiple of k, then k+1 appears in different locations. ✔
  Otherwise δ_2 ≡ -δ_1 (mod k), and loc_s(k+2) - loc_t(k+2) = 8(i-j)δ_1 (mod k) ≢ 0, so k+2 appears in different locations. ✔
- If d_τ(s,t) = 2 via two overlapping adjacent swaps, write t = (s_1,…,s_{i+2},s_i,s_{i+1},…,s_k). Then
  loc_s(k+1) - loc_t(k+1) = (2i-1)s_i + (2i+1)s_{i+1} + (2i+3)s_{i+2} - (2i-1)s_{i+2} - (2i+1)s_i - (2i+3)s_{i+1} = 4s_{i+2} - 2s_{i+1} - 2s_i (mod k)
  loc_s(k+2) - loc_t(k+2) = 8((2i+1)s_{i+2} - (i+1)s_{i+1} - i·s_i) (mod k)
  If both differences were 0 (mod k), the first would give 2s_{i+2} ≡ s_{i+1} + s_i, and substituting this into the second would yield s_i ≡ s_{i+1} (mod k), a contradiction; hence k+1 or k+2 appears in different locations (the symmetric overlapping case is analogous). ✔ ∎
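The congruences used in the d_τ(s,t) = 1 case of the proof can be verified exhaustively for small prime k (a sketch, assuming m = k; helper names are mine):

```python
from itertools import permutations

def loc(s, power, m):
    """Sum over i = 1..k of (2i-1)^power * s_i, reduced mod m."""
    return sum((2 * i - 1) ** power * si for i, si in enumerate(s, start=1)) % m

def check_adjacent_swap(k):
    """For every s in S_k and every adjacent swap t of s, verify the proof's claims:
    loc_s(k+1) - loc_t(k+1) = 2*delta and loc_s(k+2) - loc_t(k+2) = 8*i*delta (mod k),
    and that both differences are nonzero, so the new elements move."""
    for s in permutations(range(1, k + 1)):
        for idx in range(k - 1):               # swap at 1-based positions i = idx+1, i+2
            t = list(s)
            t[idx], t[idx + 1] = t[idx + 1], t[idx]
            delta = s[idx + 1] - s[idx]
            assert (loc(s, 1, k) - loc(t, 1, k)) % k == 2 * delta % k
            assert (loc(s, 2, k) - loc(t, 2, k)) % k == 8 * (idx + 1) * delta % k
            # k is an odd prime and delta != 0 (mod k), so neither difference vanishes
            assert 2 * delta % k != 0 and 8 * (idx + 1) * delta % k != 0
    return True
```

Running check_adjacent_swap(5) and check_adjacent_swap(7) exercises every permutation and every adjacent transposition for those primes.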