IV. Convolutional Codes

Introduction

Block codes produce codewords on a block-by-block basis: the encoder must buffer an entire block of message bits before generating the associated codeword. Some applications deliver bits serially rather than in large blocks. Convolutional codes instead operate on the incoming message sequence continuously, in a serial manner.

Convolutional Codes: Specification

A convolutional code is specified by three parameters (n, k, K). The ratio k/n is the code rate: the number of data bits carried per coded bit. K is the constraint length of the encoder; the encoder contains K - 1 memory elements.

Convolutional Encoder: Example

[Figure, repeated over four slides: a rate ½ convolutional encoder built from a two-stage shift register and two modulo-2 adders producing the output pair c2, c1. The slides step the input sequence 1 0 1 through the register, showing the register contents and the encoder output after each shift.]
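The encoder stepped through above can be sketched in a few lines of Python. The generator polynomials used here, c1 = 1 + D + D² and c2 = 1 + D² (octal 7 and 5), are an assumption inferred from the state diagram on the following slides; the slides do not state them explicitly.

```python
def conv_encode(bits, K=3, g2=0b101, g1=0b111):
    """Rate 1/2 convolutional encoder with constraint length K = 3.

    Assumed generators (inferred from the state diagram, not stated
    on the slide): c2 taps 1 + D^2 (g2 = 101), c1 taps 1 + D + D^2
    (g1 = 111). Output pairs are written c2 c1, as in the diagram.
    """
    state = 0                              # K-1 = 2 memory elements, all zero
    out = []
    for b in bits:
        reg = (b << (K - 1)) | state       # current input bit ahead of memory
        c2 = bin(reg & g2).count("1") % 2  # modulo-2 sum of the g2 taps
        c1 = bin(reg & g1).count("1") % 2  # modulo-2 sum of the g1 taps
        out.extend([c2, c1])
        state = reg >> 1                   # shift the register one position
    return out

print(conv_encode([1, 0, 1]))  # -> [1, 1, 0, 1, 0, 0], i.e. the pairs 11 01 00
```

With these assumed generators, each input bit produces two output bits, giving the rate ½.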

State Diagram Representation

[Figure: state diagram of the encoder. States are labelled by the register contents (b0 b1): s0 = 00, s1 = 10, s2 = 01, s3 = 11. Branches are labelled input/output, and a legend distinguishes input-0 branches from input-1 branches. The transitions shown are: s0 0/00 s0, s0 1/11 s1, s1 0/01 s2, s1 1/10 s3, s2 0/11 s0, s2 1/00 s1, s3 0/10 s2, s3 1/01 s3.]
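The state diagram can be driven directly from a lookup table mapping (state, input bit) to (next state, output pair). The table below is a sketch read off the diagram; treat the exact branch labels as an assumption rather than a definitive transcription.

```python
# (state, input bit) -> (next state, output pair), read off the state
# diagram; the branch labels are an assumption, not stated as a table
# on the slide.
TRELLIS = {
    ("s0", 0): ("s0", "00"), ("s0", 1): ("s1", "11"),
    ("s1", 0): ("s2", "01"), ("s1", 1): ("s3", "10"),
    ("s2", 0): ("s0", "11"), ("s2", 1): ("s1", "00"),
    ("s3", 0): ("s2", "10"), ("s3", 1): ("s3", "01"),
}

def encode_via_diagram(bits, start="s0"):
    """Encode by walking the state diagram from the all-zero state."""
    state, out = start, []
    for b in bits:
        state, pair = TRELLIS[(state, b)]
        out.append(pair)
    return out

print(encode_via_diagram([1, 0, 1]))  # -> ['11', '01', '00']
```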

Trellis Representation

[Figure: the state diagram unrolled in time as a trellis. Each column holds the four states s0 (00), s1 (10), s2 (01), s3 (11); each branch carries the output pair produced by the corresponding state transition.]

Trellis Representation

Input: 101, Output: 001011

[Figure: the same trellis with the path for input 101 highlighted, showing the output pair emitted on each traversed branch.]

Maximum Likelihood Decoding

Which transmitted sequence most likely produced the sequence observed at the decoder? Viterbi decoding of convolutional codes is a maximum-likelihood decoding algorithm: it finds the codeword closest to a given received sequence. Hard decision: "closest" means minimum Hamming distance. Soft decision: "closest" accounts for the reliability of each bit decision.

Viterbi Decoding: Hard Decision

Hard decision: the receiver makes a firm decision on whether a one or a zero was received. It provides the decoder with no information characterizing the reliability of that decision, so the input to the decoder is only zeros and ones.

[Figure: the encoder state diagram, repeated for reference, with states (b0 b1): s0 = 00, s1 = 10, s2 = 01, s3 = 11.]

Viterbi Decoder: Hard Decision

Assume the received (hard decision) vector is 01 11 00 10 11 01.

[Figures, over several slides: the Viterbi algorithm steps through the trellis. At each stage, the Hamming distance between each branch label and the received pair is added to the path metric of the state the branch leaves; of the paths entering each state, only the one with the smaller accumulated metric survives. When two equivalent paths enter a state with equal metrics, one of them is eliminated at random. After the final stage, the surviving path with the smallest metric is traced back.]

Decoded vector: 01 01 00 10 11 00, corresponding to the message (0 1 1 0 1 0).
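The hard-decision procedure stepped through on these slides can be sketched as a short Python routine. The trellis tables below are the assumed branch labels read off the state diagram; ties between equal-metric survivors are broken arbitrarily, as the slides note.

```python
from itertools import product

# Trellis of the rate 1/2 encoder, states 0..3 = s0, s1, s2, s3.
# Next state and output per (state, input bit); the branch labels are
# an assumption read off the state diagram.
NEXT = {0: (0, 1), 1: (2, 3), 2: (0, 1), 3: (2, 3)}
OUT = {0: ("00", "11"), 1: ("01", "10"), 2: ("11", "00"), 3: ("10", "01")}

def hamming(a, b):
    """Number of positions where two equal-length bit strings differ."""
    return sum(x != y for x, y in zip(a, b))

def viterbi_hard(pairs):
    """Hard-decision Viterbi decoding of a list of received 2-bit strings."""
    INF = float("inf")
    metric = [0, INF, INF, INF]        # encoder starts in state s0
    paths = [[], None, None, None]     # surviving input sequence per state
    for r in pairs:
        new_metric = [INF] * 4
        new_paths = [None] * 4
        for s, b in product(range(4), (0, 1)):
            if metric[s] == INF:       # state not yet reachable
                continue
            ns = NEXT[s][b]
            m = metric[s] + hamming(OUT[s][b], r)
            if m < new_metric[ns]:     # keep the smaller-metric survivor;
                new_metric[ns] = m     # equal metrics keep the first seen,
                new_paths[ns] = paths[s] + [b]  # an arbitrary tie-break
        metric, paths = new_metric, new_paths
    best = min(range(4), key=lambda s: metric[s])
    return paths[best]

# One channel bit error in the first received pair is corrected:
received = ["01", "11", "10", "10", "00", "01"]
print(viterbi_hard(received))  # -> [0, 1, 1, 0, 1, 0]
```

Under the assumed labels, the message 0 1 1 0 1 0 encodes to 00 11 10 10 00 01; the example above corrupts one bit of that codeword and still recovers the message, since the minimum-distance path through the trellis is unchanged.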