Coding and Algorithms for Memories Lecture 2
1 236601 - Coding and Algorithms for Memories Lecture 2

2 Overview
Lecturer: Eitan Yaakobi, yaakobi@cs.technion.ac.il, Taub 638
Lecture hours: Sundays, Taub 201
Course website:
Office hours: Sundays 17:30-18:30 and/or other times (please contact by e-mail beforehand)
Final grade: Class participation (10%), Homework (50%), Take-home exam / final homework + project (40%)

3 What is this class about?
Coding and Algorithms for Memories
Memories – HDDs, flash memories, and other non-volatile memories
Coding and algorithms – how to manage the memory and handle the interface between the physical level and the operating system
Both from the theoretical and practical points of view
Q: What is the difference between theory and practice?

4 Memories
Volatile Memories – need power to maintain the information
Ex: RAM memories – DRAM, SRAM
Non-Volatile Memories – do NOT need power to maintain the information
Ex: HDD, optical discs (CD, DVD), flash memories
Q: Examples of old non-volatile memories?

5 Optical Storage
First generation – CD (Compact Disc), 700MB
Second generation – DVD (Digital Versatile Disc), 4.7GB, 1995
Third generation – BD (Blu-ray Disc)
Blue laser (shorter wavelength)
A single layer can store 25GB, dual layer – 50GB
Supported by Sony, Apple, Dell, Panasonic, LG, Pioneer

6 The Magnetic Hard Disk Drive
[Figure: magnetic recording of “1” and “0” bits]

7 Flash Memories
[Figure: cell levels 1, 2, 3 – “Introduce errors”]

8 SLC, MLC and TLC Flash
SLC Flash – 1 bit per cell, 2 states (low voltage to high voltage: 1, 0)
MLC Flash – 2 bits per cell, 4 states (low voltage to high voltage: 11, 10, 00, 01)
TLC Flash – 3 bits per cell, 8 states (low voltage to high voltage: 111, 110, 100, 101, 001, 000, 010, 011)
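As an illustration of the level-to-bits mappings above, the short Python sketch below stores each cell type as a list of bit labels ordered from the lowest (erased) level to the highest. The orderings are the ones recovered from the slide figure; real devices may use different Gray codes.

# Sketch: bit labels per voltage level, lowest (erased) level first.
SLC = ["1", "0"]                                  # 1 bit per cell, 2 states
MLC = ["11", "10", "00", "01"]                    # 2 bits per cell, 4 states
TLC = ["111", "110", "100", "101",
       "001", "000", "010", "011"]                # 3 bits per cell, 8 states

def read_cell(level, mapping):
    """Return the bits stored in a cell programmed to the given level."""
    return mapping[level]

print(read_cell(0, SLC))   # '1'   - erased SLC cell
print(read_cell(3, MLC))   # '01'  - highest MLC level
print(read_cell(7, TLC))   # '011' - highest TLC level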

9 Flash Memories Programming
Array of cells made from floating-gate transistors
Typical size can be 32×2^15 cells
The cells are programmed by pulsing electrons via hot-electron injection

10 Flash Memories Programming
Array of cells made from floating-gate transistors
Typical size can be 32×2^15 cells
The cells are programmed by pulsing electrons via hot-electron injection
Each cell can have q levels, represented by different amounts of electrons
In order to reduce a cell level, the cell and its containing block must be reset to level 0 before rewriting – A VERY EXPENSIVE OPERATION

11 Programming of Flash Memory Cells
Flash memory cells are programmed in parallel in order to increase the write speed
Cells can only increase their value
In order to decrease a cell level, its entire containing block (~10^6 cells) has to be erased first
Flash memory cells do not behave identically
When charge is injected, only a fraction of it is trapped in the cell
Easy cells – most of the charge is trapped in the cell
Hard cells – a small fraction of the charge is trapped in the cell

12 Programming of Flash Memory Cells
Flash memory cells are programmed in parallel in order to increase the write speed
Cells can only increase their value
In order to decrease a cell level, its entire containing block (~10^6 cells) has to be erased first
Flash memory cells do not behave identically
When charge is injected, only a fraction of it is trapped in the cell
Easy cells – most of the charge is trapped in the cell
Hard cells – a small fraction of the charge is trapped in the cell
Goals:
Programming is done cautiously to prevent over-shooting
Programming should work for both easy and hard cells
And still… fast enough

13 Incremental Step Pulse Programming (ISPP)
Gradually increase the program voltage
First the easy cells reach their level
On subsequent steps, only cells which didn’t reach their level are programmed
Enables fast programming of both easy and hard cells
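A minimal Python simulation of the ISPP idea above: the program level is raised in small steps, each cell traps only a fraction of the injected charge (large for easy cells, small for hard cells), and a cell is excluded from further pulses once it reaches its target. The step size, trapping efficiencies, and targets are made-up illustration values, not device parameters.

import random

def ispp(targets, efficiencies, step=0.1, max_pulses=1000):
    """Incremental Step Pulse Programming, as a toy model.

    targets      -- desired charge level per cell
    efficiencies -- fraction of each pulse actually trapped
                    (close to 1 for easy cells, small for hard cells)
    """
    levels = [0.0] * len(targets)
    for pulse in range(max_pulses):
        # Only cells that have not yet reached their target are pulsed.
        pending = [i for i in range(len(targets)) if levels[i] < targets[i]]
        if not pending:
            return levels, pulse
        for i in pending:
            levels[i] += step * efficiencies[i]   # partial charge trapping
    return levels, max_pulses

random.seed(0)
targets = [1.0, 2.0, 1.0, 3.0]                        # illustration only
effs = [random.uniform(0.2, 1.0) for _ in targets]    # mix of easy and hard cells
levels, pulses = ispp(targets, effs)
print("levels after", pulses, "pulses:", levels)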

14 Rewriting Codes
Array of cells, made of floating-gate transistors
Each cell can store q different levels
Today, q typically ranges between 2 and 16
The levels are represented by the number of electrons
The cell’s level is increased by pulsing electrons
To reduce a cell level, all cells in its containing block must first be reset to level 0 – A VERY EXPENSIVE OPERATION

15 Rewriting Codes
Problem: Cannot rewrite the memory without an erasure
However… it is still possible to rewrite if only cells at a low level are programmed

16 From Wikipedia: One limitation of flash memory is that, although it can be read or programmed a byte or a word at a time in a random access fashion, it can only be erased a "block" at a time. This generally sets all bits in the block to 1. Starting with a freshly erased block, any location within that block can be programmed. However, once a bit has been set to 0, only by erasing the entire block can it be changed back to 1. In other words, flash memory (specifically NOR flash) offers random-access read and programming operations, but does not offer arbitrary random-access rewrite or erase operations. A location can, however, be rewritten as long as the new value's 0 bits are a superset of the over-written value's. For example, a nibble value may be erased to 1111, then written as 1110. Successive writes to that nibble can change it to 1010, then 0010, and finally 0000. Essentially, erasure sets all bits to 1, and programming can only clear bits to 0. File systems designed for flash devices can make use of this capability, for example to represent sector metadata.
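The rewrite rule quoted above (programming can only clear bits from 1 to 0; an erase resets the whole block to 1s) can be stated in a few lines of Python. This is only a sketch of the constraint itself, not of any real flash controller interface.

def can_rewrite(old_bits, new_bits):
    """True iff new_bits can be written over old_bits without an erase,
    i.e. no bit has to change from 0 back to 1."""
    return all(not (o == "0" and n == "1") for o, n in zip(old_bits, new_bits))

print(can_rewrite("1111", "1110"))  # True  - freshly erased nibble
print(can_rewrite("1110", "1010"))  # True  - only 1 -> 0 transitions
print(can_rewrite("0010", "0110"))  # False - would need 0 -> 1, i.e. a block erase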

17 Rewriting Codes
Rewrite codes significantly reduce the number of block erasures
Store 3 bits once
Store 1 bit 8 times
Store 4 bits once
Store 1 bit 16 times

18 Rewriting Codes
One of the most efficient schemes to decrease the number of block erasures
Floating Codes
Buffer Codes
Trajectory Codes
Rank Modulation Codes
WOM Codes

19 Write-Once Memories (WOM)
Introduced by Rivest and Shamir, “How to reuse a write-once memory”, 1982
The memory elements represent bits (2 levels) and are irreversibly programmed from ‘0’ to ‘1’

20 Write-Once Memories (WOM)
Examples (1st write, then 2nd write):
data 00 → memory state 000; then data 11 → memory state 011
data 10 → memory state 010; then data 00 → memory state 111
data 11 → memory state 100; then data 10 → memory state 101
data 01 → memory state 001
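The table above is consistent with the standard [3,2; 4,4] Rivest-Shamir scheme: on the first write a 2-bit message is stored as a codeword of weight at most one, and on the second write (if the message changes) the cells are raised to the complement of the new message's first-write codeword, which never requires a 1-to-0 change. A small Python sketch, assuming that convention:

# First-write codewords for the four 2-bit messages (weight <= 1).
E1 = {"00": "000", "01": "001", "10": "010", "11": "100"}
D1 = {v: k for k, v in E1.items()}

def complement(c):
    return "".join("1" if b == "0" else "0" for b in c)

def write1(msg):
    return E1[msg]

def write2(msg, state):
    """Second write: keep the state if the message is unchanged, otherwise
    program up to the complement of the new message's first-write codeword."""
    return state if D1.get(state) == msg else complement(E1[msg])

def decode(state):
    # Weight <= 1: still a first-write codeword; weight >= 2: a complement.
    return D1[state] if state.count("1") <= 1 else D1[complement(state)]

# Reproduce one row of the table: first write 10 -> 010, then 00 -> 111.
s1 = write1("10")
s2 = write2("00", s1)
print(s1, decode(s1))   # 010 10
print(s2, decode(s2))   # 111 00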

21 Write-Once Memories (WOM)
Introduced by Rivest and Shamir, “How to reuse a write-once memory”, 1982
The memory elements represent bits (2 levels) and are irreversibly programmed from ‘0’ to ‘1’
Q: How many cells are required to write 100 bits twice?
P1: Is it possible to do better…?
P2: How many cells are needed to write k bits twice?
P3: How many cells are needed to write k bits t times?
P3’: What is the total number of bits that can be written into n cells over t writes?

22 Binary WOM Codes
k_1,…,k_t: the number of bits written on each write
n cells and t writes
The sum-rate of the WOM code is R = (Σ_{i=1}^{t} k_i)/n
Rivest-Shamir: R = (2+2)/3 = 4/3

23 Definition: WOM Codes
Definition: An [n,t; M_1,…,M_t] t-write WOM code is a coding scheme that consists of n cells and guarantees any t writes of alphabet sizes M_1,…,M_t by programming cells only from zero to one
A WOM code consists of t encoding and decoding maps E_i, D_i, 1 ≤ i ≤ t
E_1: {1,…,M_1} → {0,1}^n
For 2 ≤ i ≤ t, E_i: {1,…,M_i}×Im(E_{i-1}) → {0,1}^n such that for all (m,c) ∊ {1,…,M_i}×Im(E_{i-1}), E_i(m,c) ≥ c
For 1 ≤ i ≤ t, D_i: {0,1}^n → {1,…,M_i} such that D_i(E_i(m,c)) = m for all (m,c) ∊ {1,…,M_i}×Im(E_{i-1})
The sum-rate of the WOM code is R = (Σ_{i=1}^{t} log M_i)/n
Rivest-Shamir: [3,2; 4,4], R = (log 4 + log 4)/3 = 4/3
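To see the two conditions of the definition in action, here is a small checker applied to a trivial code that is not from the slides: the [n,n; 2,…,2] code that writes one bit n times by letting the decoded bit be the parity of the number of programmed cells. Every write programs at most one additional cell, so cells never decrease and decoding always returns the last written bit.

from itertools import product

N = 4   # n cells; the trivial parity code then supports t = n one-bit writes

def decode(c):
    """D_i: the stored bit is the parity of the number of programmed cells."""
    return c.count("1") % 2

def encode(bit, c):
    """E_i: program one more cell only if the decoded bit must flip."""
    if decode(c) == bit:
        return c
    i = c.index("0")                  # leftmost unprogrammed cell
    return c[:i] + "1" + c[i + 1:]

def covers(a, b):
    """Cell-wise a >= b: cells are programmed only from 0 to 1."""
    return all(x >= y for x, y in zip(a, b))

# Exhaustively check every sequence of N one-bit writes.
for bits in product([0, 1], repeat=N):
    state = "0" * N
    for b in bits:
        new_state = encode(b, state)
        assert covers(new_state, state) and decode(new_state) == b
        state = new_state
print("valid [%d,%d; 2,...,2] WOM code, sum-rate %d/%d = 1" % (N, N, N, N))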

24 Definition: WOM Codes
There are two cases:
Fixed-rate – the individual rates on each write must all be the same
Unrestricted-rate – the individual rates are allowed to be different
We assume that the write number is known on each write. This knowledge does not affect the sum-rate:
Assume there exists an [n,t; M_1,…,M_t] t-write WOM code where the write number is known
Then it is possible to construct an [Nn+t, t; M_1^N,…,M_t^N] t-write WOM code where the write number is not known (e.g., run N copies of the original code in parallel and use the t extra cells as a unary counter of the write number), so asymptotically the sum-rate is the same

25 James Saxe’s WOM Code
[n, n/2-1; n/2, n/2-1, n/2-2, …, 2] WOM code
Partition the memory into two groups of n/2 cells each
First write: input symbol m ∊ {1,…,n/2}; program the m-th cell of the 1st group
The i-th write, i ≥ 2: input symbol m ∊ {1,…,n/2-i+1}; copy the first group to the second group, then program the m-th non-programmed cell in the 1st group
Decoding: there is always exactly one cell that is programmed in the 1st group and not in the 2nd group; its location, among the cells that are non-programmed in the 2nd group, is the message value
Sum-rate: (log(n/2) + log(n/2-1) + … + log 2)/n = log((n/2)!)/n ≈ ((n/2)·log(n/2))/n ≈ (log n)/2

26 James Saxe’s WOM Code
[n, n/2-1; n/2, n/2-1, n/2-2, …, 2] WOM code; partition the memory into two groups of n/2 cells each
Example: n=8, [8,3; 4,3,2]
First write: 3   0,0,0,0|0,0,0,0 → 0,0,1,0|0,0,0,0
Second write: 2  0,0,1,0|0,0,0,0 → 0,1,1,0|0,0,1,0
Third write: 1   0,1,1,0|0,0,1,0 → 1,1,1,0|0,1,1,0
Sum-rate: (log 4 + log 3 + log 2)/8 = 4.58/8 = 0.57
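A Python sketch of Saxe's construction as described above; the function names and the 1-indexed message convention are choices made here. Running it on the n=8 example reproduces the three states on the slide.

def saxe_write(m, state, n):
    """One write of Saxe's [n, n/2-1; n/2, n/2-1, ..., 2] WOM code.
    state is a list of n cells (0/1); the message m is 1-indexed."""
    half = n // 2
    A, B = state[:half], state[half:]
    B = [a | b for a, b in zip(A, B)]            # copy 1st group into 2nd group
    free = [i for i in range(half) if A[i] == 0]
    A[free[m - 1]] = 1                           # program the m-th free cell
    return A + B

def saxe_decode(state, n):
    half = n // 2
    A, B = state[:half], state[half:]
    free_in_B = [i for i in range(half) if B[i] == 0]
    # The unique cell programmed in the 1st group but not in the 2nd carries
    # the message: its position among the cells unprogrammed in the 2nd group.
    j = next(i for i in free_in_B if A[i] == 1)
    return free_in_B.index(j) + 1

n, state = 8, [0] * 8
for m in (3, 2, 1):                              # the three writes on the slide
    state = saxe_write(m, state, n)
    print(state, "decodes to", saxe_decode(state, n))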

27 WOM Codes Constructions
Rivest and Shamir ‘82: [3,2; 4,4] (R=1.33); [7,3; 8,8,8] (R=1.28); [7,5; 4,4,4,4,4] (R=1.42); [7,2; 26,26] (R=1.34); tabular WOM codes; “linear” WOM codes
David Klaner: [5,3; 5,5,5] (R=1.39)
David Leavitt: [7,4; 7,7,7,7] (R=1.60)
James Saxe: [n, n/2-1; n/2, n/2-1, n/2-2, …, 2] (R ≈ 0.5·log n); [12,3; 65,81,64] (R=1.53)
Merkx ‘84 – WOM codes constructed with projective geometries: [7,4; 7,7,7,7] (R=1.60), [31,10; 31,…,31] (R=1.598), [7,4; 8,7,8,8] (R=1.69), [7,4; 8,7,11,8] (R=1.75), [8,4; 8,14,11,8] (R=1.66), [7,8; 16,16,16,16,16,16,16,16] (R=1.75)
Wu and Jiang ‘09 – position modulation codes for WOM: [172,5; 2^56,…,2^56] (R=1.63), [196,6; 2^56,…,2^56] (R=1.71), [238,8; 2^56,…,2^56] (R=1.88), [258,9; 2^56,…,2^56] (R=1.95), [278,10; 2^56,…,2^56] (R=2.01)
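The quoted sum-rates follow directly from R = (Σ log2 M_i)/n. The short script below, written for this transcript, recomputes a few of them from the parameters in the list (reading the Wu-Jiang alphabet sizes as M_i = 2^56, which matches the quoted rates).

from math import log2

def sum_rate(n, Ms):
    """Sum-rate R = (sum of log2 M_i) / n of an [n,t; M_1,...,M_t] WOM code."""
    return sum(log2(M) for M in Ms) / n

examples = {
    "Rivest-Shamir [3,2; 4,4]":      (3, [4, 4]),
    "Rivest-Shamir [7,5; 4,...,4]":  (7, [4] * 5),
    "Klaner [5,3; 5,5,5]":           (5, [5] * 3),
    "Saxe [12,3; 65,81,64]":         (12, [65, 81, 64]),
    "Merkx [31,10; 31,...,31]":      (31, [31] * 10),
    "Wu-Jiang [172,5; 2^56 x 5]":    (172, [2 ** 56] * 5),
    "Wu-Jiang [278,10; 2^56 x 10]":  (278, [2 ** 56] * 10),
}
for name, (n, Ms) in examples.items():
    print("%s: R = %.2f" % (name, sum_rate(n, Ms)))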

