Error Correction Code (2)

Presentation transcript:

Error Correction Code (2) Fire Tom Wada Professor, Information Engineering, Univ. of the Ryukyus 2018/11/27

Two major FEC technologies
- Reed Solomon code (Block Code)
- Convolutional code
- Serially concatenated code
- Interleaving
[Figure: source information is Reed Solomon coded, interleaved, and convolutionally coded; the concatenated-coded stream goes to storage or transmission]

(n, k) Block Code
The time-t information block i_t is encoded into the time-t code word w_t.
k : information length
n : code length
Code Rate : R = k/n
[Figure: k-bit information blocks ... i_{t-2}, i_{t-1}, i_t are encoded block by block into n-bit code words ... w_{t-2}, w_{t-1}, w_t]

Convolutional Code
The time-t code w_t is determined by the past K information blocks.
K : constraint length
Code Rate : R = k/n
[Figure: k-bit information blocks i_{t-2}, i_{t-1}, i_t feed an encoder whose n-bit output w_t depends on a window of K inputs, the constraint length]

Simple Convolutional Coder
D is the delay operator.
c1_t and c2_t depend not only on the current input i_t but also on the past inputs i_{t-1} and i_{t-2}.
[Figure: shift-register encoder with two delay (D) elements; the register contents and the input are combined to produce the two coded outputs c1_t and c2_t]

Simple Convolutional Coder (2)
If the input information is "110010", then the output code is "11 10 10 11 11 01".
Constraint length K = 3, R = 1/2
Time t         1  2  3  4  5  6
i_t            1  1  0  0  1  0
F1 = i_{t-1}   0  1  1  0  0  1
F2 = i_{t-2}   0  0  1  1  0  0
c1_t           1  1  1  1  1  0
c2_t           1  0  0  1  1  1
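As a check on this example, here is a minimal Python sketch of the encoder. The generator taps are not spelled out in the transcript; the taps below (c1_t = i_t XOR i_{t-2}, c2_t = i_t XOR i_{t-1} XOR i_{t-2}, i.e. generators 1 + D^2 and 1 + D + D^2) are an assumption inferred from the example output, so treat them as illustrative rather than the lecture's exact circuit:
# Rate-1/2, constraint-length-3 convolutional encoder.
# Tap assumption inferred from the slide's example:
#   c1_t = i_t XOR i_{t-2}              (generator 1 + D^2)
#   c2_t = i_t XOR i_{t-1} XOR i_{t-2}  (generator 1 + D + D^2)
def conv_encode(bits):
    f1 = f2 = 0                      # shift-register contents i_{t-1}, i_{t-2}
    out = []
    for i in bits:
        c1 = i ^ f2
        c2 = i ^ f1 ^ f2
        out.append((c1, c2))
        f1, f2 = i, f1               # shift the register
    return out

# Reproduces the slide: input "110010" -> "11 10 10 11 11 01"
info = [1, 1, 0, 0, 1, 0]
print(" ".join(f"{c1}{c2}" for c1, c2 in conv_encode(info)))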

State Transition Diagram
Number of states: 4 (the register contents F1 = i_{t-1}, F2 = i_{t-2} take the values 00, 01, 10, 11).
[Figure: state transition diagram over the four register states, with the encoding table from the previous slide alongside]

State Transition Diagram (2)
At each time t, a "traveler" stays in one state and moves from state to state cycle by cycle.
[Figure: example walk through the state transition diagram for t = 0 through t = 5]

Trellis Diagram
Convert the state transition diagram into a 2-D diagram.
Vertical axis : states (00, 01, 10, 11)
Horizontal axis : time
[Figure: trellis with each branch labelled output/input: from state 00, 00/0 and 11/1; from state 01, 11/0 and 00/1; from state 10, 01/0 and 10/1; from state 11, 10/0 and 01/1]
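The same assumed encoder taps generate these branches mechanically. The sketch below tabulates, for every state (F1, F2) and input bit, the branch output and the next state, matching the output/input branch labels above:
# Build the state-transition / branch table used by the trellis.
# State = (i_{t-1}, i_{t-2}); taps assumed as in the encoder sketch above.
def branch_table():
    table = {}                        # (state, input) -> (output, next_state)
    for f1 in (0, 1):
        for f2 in (0, 1):
            for i in (0, 1):
                c1 = i ^ f2
                c2 = i ^ f1 ^ f2
                table[((f1, f2), i)] = ((c1, c2), (i, f1))
    return table

for (state, i), (out, nxt) in sorted(branch_table().items()):
    print(f"state {state[0]}{state[1]}, input {i} -> "
          f"output {out[0]}{out[1]}, next state {nxt[0]}{nxt[1]}")
Tracing the input 110010 through this table reproduces the path and the branch outputs 11 10 10 11 11 01 used on the next slide.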

Encoding in Trellis
The input information determines a path in the trellis, and each branch outputs a 2-bit code.
[Figure: the path for input 110010 traced through the trellis from t = 0 to t = 6, producing the branch outputs 11 10 10 11 11 01]

Punctured Convolutional Code
The example encoder has code rate R = 1/2.
A punctured convolutional code is one in which some of the coded output bits are removed (not transmitted).
The code rate can then be modified (raised).

Merit of Punctured Code
A larger code rate is better (more information bits per transmitted bit).
Using the puncturing technique:
- When the communication channel condition is good, weaker error correction but a higher code rate can be chosen.
- When the communication channel condition is bad, stronger error correction but a lower code rate can be chosen.
This capability is supported by a small circuit change: one coder or decoder can be used for several code rates adaptively.

R = 2/3 example
[Figure: puncturing the rate-1/2 encoder output to obtain code rate R = 2/3]
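The figure for this slide is not reproduced in the transcript. As a hedged illustration, the sketch below deletes one of every four mother-code bits to turn R = 1/2 into R = 2/3; the puncturing pattern (X: 1 0, Y: 1 1) is an assumption in the style of common DVB-type puncturing tables, not necessarily the one used in the lecture:
# Puncture a rate-1/2 code stream to rate 2/3.
# Pattern assumption: per two information bits, keep X1, Y1, Y2 and drop X2.
PUNCTURE_X = [1, 0]   # keep c1 only in even bit periods
PUNCTURE_Y = [1, 1]   # keep c2 always

def puncture(pairs):
    sent = []
    for t, (c1, c2) in enumerate(pairs):
        if PUNCTURE_X[t % 2]:
            sent.append(c1)
        if PUNCTURE_Y[t % 2]:
            sent.append(c2)
    return sent

# Mother-code output for the slide's example input 110010:
# 6 information bits -> 12 coded bits -> 9 transmitted bits, R = 6/9 = 2/3
pairs = [(1, 1), (1, 0), (1, 0), (1, 1), (1, 1), (0, 1)]
tx = puncture(pairs)
print(tx, "rate =", 6 / len(tx))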

Viterbi Decoding
Viterbi decoding is a method of maximum-likelihood decoding for convolutional codes.
Maximum-likelihood decoding: the likelihood function is P(xi | r), the probability that xi was sent under the condition that r was received.
There are N messages x1, x2, x3, ..., xN; the sender takes one message and sends it out; the receiver finds the xi with the maximum conditional probability.
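A short, hedged derivation (assuming equally likely messages and a memoryless binary symmetric channel with error probability p < 1/2; the slide does not spell these steps out) connects this criterion to the Hamming-distance branch metric used in the following example:
\hat{x} = \arg\max_i P(x_i \mid r)
        = \arg\max_i \frac{P(r \mid x_i)\, P(x_i)}{P(r)}
        = \arg\max_i P(r \mid x_i) \qquad \text{(equal priors, } P(x_i) = 1/N\text{)}
P(r \mid x_i) = p^{\,d_H(r,\,x_i)} (1-p)^{\,L - d_H(r,\,x_i)}
\;\Longrightarrow\;
\arg\max_i P(r \mid x_i) = \arg\min_i d_H(r, x_i)
where L is the length of the code sequence and d_H(r, x_i) is the Hamming distance between r and x_i.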

Viterbi Decoding (2)
Branch metric: the (negative log-) likelihood of each branch, -ln p(xk | ri); high probability gives a small value. Example: the Hamming distance.
Path metric: the sum of the branch metrics along a possible trellis path.
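The recursion carried out in the following example can be written compactly (the standard Viterbi add-compare-select step, assuming the encoder starts in state 00 as in the example; the slides give it only pictorially):
PM_0(s) = \begin{cases} 0 & s = 00 \\ +\infty & \text{otherwise} \end{cases}
\qquad
PM_t(s) = \min_{s' \to s} \big[ PM_{t-1}(s') + BM_t(s' \to s) \big]
where BM_t(s' -> s) is the branch metric (for example the Hamming distance between the received pair and the branch output) and the minimizing predecessor s' is stored as the survivor of state s at time t.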

Viterbi Decoding Example
R = 1/2, received sequence: 11 10 11 01 11 01
Calculate each branch metric (this time, the Hamming distance between the received pair and the branch output).
[Figure: trellis for t = 0 to t = 6 with the received pairs and the Hamming-distance branch metric written on each branch]

Viterbi Decoding Example (2)
Calculate the path metrics in order to find the path with the minimum path metric, here up to t = 2.
Where two paths merge into a state, only the survivor is kept; in the figure, the lower path has the smaller path metric, so take it.
[Figure: trellis up to t = 2 with accumulated path metrics at each state; at each merge the smaller-metric path survives]

Viterbi Decoding Example (3)
Calculate the path metrics in order to find the path with the minimum path metric, now up to t = 6.
[Figure: full trellis for t = 0 to t = 6 with the surviving path metric written at every state]

Viterbi Decoding Example (4)
Select the minimum path metric and read off the original information.
In this example there are two minimum-metric paths:
Upper path : 1 1 0 0 1 0
Lower path : 1 1 1 1 1 1
If we increase the time (decode over a longer span), we might find only one minimum path.
[Figure: the two tied minimum-metric paths highlighted in the trellis]
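For reference, a minimal hard-decision Viterbi decoder sketch (same assumed taps as the encoder sketch above, Hamming-distance branch metric, ties broken by whichever candidate is examined first). Run on this example's received sequence it returns one of the two tied minimum-metric paths, so, exactly as the slide notes, hard decisions alone cannot choose between 1 1 0 0 1 0 and 1 1 1 1 1 1:
import math

# Hard-decision Viterbi decoder for the rate-1/2, K=3 code sketched above
# (tap assumption: c1 = i ^ i2, c2 = i ^ i1 ^ i2).
# Branch metric = Hamming distance between branch output and received pair.
def viterbi_hard(received_pairs):
    states = [(f1, f2) for f1 in (0, 1) for f2 in (0, 1)]
    metric = {s: (0 if s == (0, 0) else math.inf) for s in states}  # start in state 00
    survivor = {s: [] for s in states}                              # decoded bits per state
    for r1, r2 in received_pairs:
        new_metric = {s: math.inf for s in states}
        new_survivor = {s: [] for s in states}
        for (f1, f2) in states:
            if metric[(f1, f2)] == math.inf:
                continue                                            # state not yet reachable
            for i in (0, 1):
                c1, c2 = i ^ f2, i ^ f1 ^ f2                        # branch output
                m = metric[(f1, f2)] + (c1 != r1) + (c2 != r2)      # candidate path metric
                nxt = (i, f1)
                if m < new_metric[nxt]:                             # add-compare-select
                    new_metric[nxt] = m
                    new_survivor[nxt] = survivor[(f1, f2)] + [i]
        metric, survivor = new_metric, new_survivor
    best = min(states, key=lambda s: metric[s])
    return survivor[best], metric[best]

# Received sequence from the example: 11 10 11 01 11 01
rx = [(1, 1), (1, 0), (1, 1), (0, 1), (1, 1), (0, 1)]
print(viterbi_hard(rx))   # one of the two tied minimum-metric paths, path metric 2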

Received signal has many levels
In the previous example we assumed the received sequence was 11 10 11 01 11 01.
Usually the received signal is analog (it has many levels), for example quantized into signal levels 1 to 7.
[Figure: analog received waveform quantized to signal levels 1 through 7]

Hard Decision
A hard-decision line in the middle of the signal range slices every sample to 0 or 1 before decoding.
Hard decision loses the reliability information: a highly reliable '1' and a barely reliable '1' both become the same hard-decision output '1', yet we would like to distinguish them.
[Figure: samples at signal levels 1 to 7 against the hard-decision line; all samples above it produce the output '1' regardless of reliability]
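A tiny sketch of what hard slicing does to the level samples that appear in the soft-decision example a few slides later; the threshold (level 4 and above maps to '1') is an assumption, chosen to be consistent with that example:
# Hard-decision slicing of quantized samples (levels 1..7).
# Threshold assumption: levels 4..7 -> '1', levels 1..3 -> '0'.
def hard_decision(level):
    return 1 if level >= 4 else 0

# Level samples from the later soft-decision example: 65 63 54 36 66 27
levels = [6, 5, 6, 3, 5, 4, 3, 6, 6, 6, 2, 7]
print([hard_decision(lv) for lv in levels])
# -> [1, 1, 1, 0, 1, 1, 0, 1, 1, 1, 0, 1], i.e. the hard sequence 11 10 11 01 11 01,
#    but a level-7 '1' and a level-4 '1' are now indistinguishable.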

Soft Decision
Use a soft-decision metric. One example (levels 1 to 7):
Level                    1   2   3   4   5   6   7
Branch metric for '0'    0   0   0   0   1   2   3
Branch metric for '1'    3   2   1   0   0   0   0
Difference              -3  -2  -1   0   1   2   3
How to compute a branch metric: for received levels 6 and 5 against the expected branch output 00, the branch metric is 1 + 2 = 3.
A sample at level 4 contributes 0 to both metrics, so it has no effect on the Viterbi decoding; it looks as if it had been erased or punctured.
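A small sketch of this soft branch metric. The table values are a reconstruction from the slide's clues (the worked 1 + 2 = 3 example at levels 6 and 5, the differences -3, -2, -1, and the remark that level 4 acts like an erasure), so treat them as an assumption:
# Soft-decision branch metric from quantized samples (levels 1..7).
# Reconstructed table: metric-for-'0' = max(level - 4, 0), metric-for-'1' = max(4 - level, 0).
METRIC_FOR_0 = {lvl: max(lvl - 4, 0) for lvl in range(1, 8)}   # 0 0 0 0 1 2 3
METRIC_FOR_1 = {lvl: max(4 - lvl, 0) for lvl in range(1, 8)}   # 3 2 1 0 0 0 0

def soft_branch_metric(branch_bits, levels):
    """Sum the per-bit soft metrics for one trellis branch."""
    return sum((METRIC_FOR_1 if b else METRIC_FOR_0)[lvl]
               for b, lvl in zip(branch_bits, levels))

# Slide example: received levels 6, 5 against branch output 00
print(soft_branch_metric((0, 0), (6, 5)))                      # -> 3

# Full received levels for the soft-decision example: 65 63 54 36 66 27
rx_levels = [(6, 5), (6, 3), (5, 4), (3, 6), (6, 6), (2, 7)]
print([soft_branch_metric((1, 1), lv) for lv in rx_levels])    # -> [0, 1, 0, 1, 0, 2]
Replacing the Hamming-distance line in the viterbi_hard sketch above with soft_branch_metric((c1, c2), levels) and feeding it rx_levels should resolve the earlier tie in favor of 1 1 0 0 1 0, the result shown on the final soft-decision slide.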

Soft Decision Viterbi
Calculate the branch metrics from the soft-decision table.
Received levels: 65 63 54 36 66 27 (two quantized samples per trellis stage).
[Figure: trellis for t = 0 to t = 6 with the soft branch metrics written on each branch]

Soft Decision Viterbi (2)
Calculate the path metrics in order to find the path with the minimum path metric, up to t = 6.
[Figure: trellis with the accumulated soft path metrics at every state]

Soft Decision Viterbi (3)
Select the minimum path metric and read off the original information.
In this example the decoded information is 1 1 0 0 1 0; the soft metrics resolve the tie that the hard-decision example left open.
[Figure: the single minimum-metric path in the soft-decision trellis]

Summary
Two types of FEC:
- Block codes, such as RS
- Convolutional codes: code rate, puncturing
Viterbi decoder: maximum-likelihood decoding
- Trellis
- Hard decision vs. soft decision
- Branch metric, path metric