EE 3220: Digital Communication
Lec-11: Soft and hard decisions
Dr. Hassan Yousif Ahmed, Department of Electrical Engineering, College of Engineering at Wadi Aldawaser, Salman bin Abdulaziz University

Today, we are going to talk about:
- What are the state diagram and trellis representations of the code?
- How is decoding performed for convolutional codes?
- What is a maximum likelihood decoder?
- What are soft decisions and hard decisions?
- How does the Viterbi algorithm work?

Block diagram of the DCS: information source -> rate 1/n convolutional encoder -> modulator -> channel -> demodulator -> rate 1/n convolutional decoder -> information sink.

State diagram
- A finite-state machine can only be in a finite number of states.
- The state of a machine is the smallest amount of information that, together with the current input to the machine, can predict the output of the machine.
- In a convolutional encoder, the state is represented by the content of the memory (the $K-1$ most recent input bits). Hence, there are $2^{K-1}$ states, where $K$ is the constraint length.
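
A concrete illustration: a minimal Python sketch of a rate-1/2 convolutional encoder viewed as a finite-state machine. The specific code (constraint length K = 3, generator polynomials 111 and 101) is an assumption chosen to be consistent with the branch words shown on the following slides; the state is the content of the K-1 memory elements, so there are $2^{K-1} = 4$ states.

```python
# Rate-1/2 convolutional encoder as a finite-state machine (sketch).
# Assumed example code: constraint length K = 3, generators g1 = 111, g2 = 101.
K = 3                      # constraint length
G = (0b111, 0b101)         # generator polynomials (MSB taps the current input bit)

def encode_bit(state, bit):
    """Return (output branch word, next state) for one input bit.
    'state' holds the K-1 most recent input bits as an integer in 0..2**(K-1)-1."""
    reg = (bit << (K - 1)) | state                        # shift the new bit into the register
    out = tuple(bin(reg & g).count("1") % 2 for g in G)   # modulo-2 sums selected by each generator
    return out, reg >> 1                                  # next state drops the oldest bit

def encode(bits):
    """Encode a message and append K-1 zero tail bits to flush the encoder."""
    state, coded = 0, []
    for b in list(bits) + [0] * (K - 1):
        out, state = encode_bit(state, b)
        coded.extend(out)
    return coded

# The encoder can only be in 2**(K-1) = 4 states.
print(encode([1, 0, 1]))   # -> [1, 1, 1, 0, 0, 0, 1, 0, 1, 1]
```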

State diagram, cont'd
- A state diagram is a way to represent the encoder: it contains all the states and all possible transitions between them.
- For a binary-input encoder, only two transitions can initiate from each state (one for each input bit).
- Likewise, only two transitions can end up in each state.

State diagram, cont'd
State diagram of the example rate-1/2 encoder: each branch is labeled input/output (branch word), the branch words being 0/00, 1/11, 0/10, 1/01, 0/11, 1/00, 0/01 and 1/10, together with a table listing current state, input, next state and output for every transition (figure not reproduced; the sketch below enumerates the same transitions).
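
Because the original state-diagram figure and table are not reproduced, the sketch below enumerates the same information for the assumed example encoder (rate 1/2, K = 3, generators 111 and 101): for every current state and input bit it prints the next state and the output branch word.

```python
# State diagram of the assumed rate-1/2, K = 3 encoder (g1 = 111, g2 = 101):
# every (current state, input) pair mapped to (next state, output branch word).
K, G = 3, (0b111, 0b101)

def step(state, bit):
    reg = (bit << (K - 1)) | state
    out = "".join(str(bin(reg & g).count("1") % 2) for g in G)
    return reg >> 1, out

print("current  input  next  output")
for state in range(2 ** (K - 1)):
    for bit in (0, 1):
        nxt, out = step(state, bit)
        print(f"   {state:02b}      {bit}    {nxt:02b}     {out}")
# Each state has exactly two outgoing and two incoming transitions, as stated above.
```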

Trellis
The trellis diagram is an extension of the state diagram that explicitly shows the passage of time: the states are drawn on the vertical axis, time on the horizontal axis, and each branch is labeled input/output. (The figure shows one section of the trellis for the rate-1/2 example code; not reproduced.)

Trellis, cont'd
A complete trellis diagram for the example code, spanning the message bits plus the tail bits; each branch is labeled input/output, and the corresponding input bits, output bits and tail bits are listed beneath the trellis (figure not reproduced).

Trellis, cont'd
A further trellis diagram for the example code with its branch words, input bits, output bits and tail bits indicated (figure not reproduced).

Optimum decoding
If the input message sequences are equally likely, the optimum decoder, i.e. the one that minimizes the probability of error, is the maximum likelihood (ML) decoder. The ML decoder selects, among all possible codewords, the codeword that maximizes the likelihood function $p(\mathbf{Z} \mid \mathbf{U}^{(m)})$, where $\mathbf{Z}$ is the received sequence and $\mathbf{U}^{(m)}$ is one of the possible codewords.
ML decoding rule: choose $\mathbf{U}^{(m')}$ if $p(\mathbf{Z} \mid \mathbf{U}^{(m')}) = \max_{\mathbf{U}^{(m)}} p(\mathbf{Z} \mid \mathbf{U}^{(m)})$.
For a message of $L$ bits there are $2^L$ codewords to search!
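
To make the rule concrete, here is a brute-force ML decoder sketch that scores every one of the $2^L$ candidate messages against a hard-decision received sequence. The encoder (the assumed rate-1/2, K = 3 example code) and the received sequence are illustrative assumptions; this exhaustive search is exactly what the Viterbi algorithm, introduced later, avoids.

```python
from itertools import product

# Brute-force ML (minimum Hamming distance) decoding over all 2**L messages.
# Assumed example encoder: rate 1/2, K = 3, generators g1 = 111, g2 = 101.
K, G = 3, (0b111, 0b101)

def encode(bits):
    state, coded = 0, []
    for b in list(bits) + [0] * (K - 1):              # append K-1 zero tail bits
        reg = (b << (K - 1)) | state
        coded += [bin(reg & g).count("1") % 2 for g in G]
        state = reg >> 1
    return coded

def ml_decode(received, L):
    """Try every length-L message and keep the codeword closest to 'received'."""
    best, best_dist = None, None
    for msg in product((0, 1), repeat=L):             # 2**L candidates: exponential!
        dist = sum(r != c for r, c in zip(received, encode(msg)))
        if best_dist is None or dist < best_dist:
            best, best_dist = msg, dist
    return best, best_dist

rx = [1, 1, 1, 0, 0, 1, 1, 0, 1, 1]                   # hypothetical received word (one bit in error)
print(ml_decode(rx, L=3))                             # -> ((1, 0, 1), 1)
```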

ML decoding for memoryless channels
Because the channel statistics are independent from symbol to symbol on a memoryless channel, the likelihood function factors as
$p(\mathbf{Z} \mid \mathbf{U}^{(m)}) = \prod_{i} p(Z_i \mid U_i^{(m)}) = \prod_{i} \prod_{j=1}^{n} p(z_{ji} \mid u_{ji}^{(m)})$,
and, equivalently, the log-likelihood function becomes
$\gamma^{(m)} = \log p(\mathbf{Z} \mid \mathbf{U}^{(m)}) = \sum_{i} \log p(Z_i \mid U_i^{(m)}) = \sum_{i} \sum_{j=1}^{n} \log p(z_{ji} \mid u_{ji}^{(m)})$.
The outer sum is the path metric, each term $\log p(Z_i \mid U_i^{(m)})$ is a branch metric, and each term $\log p(z_{ji} \mid u_{ji}^{(m)})$ is a bit metric. The path metric accumulated up to time index $i$ is called the partial path metric.
ML decoding rule: choose the path with the maximum metric among all the paths in the trellis. This path is the "closest" path to the transmitted sequence.
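
A small sketch of how the path metric decomposes into branch metrics and bit metrics. The bit metric $\log p(z \mid u)$ is taken here from a BSC with an assumed crossover probability p = 0.1, purely for illustration.

```python
import math

p = 0.1                                    # assumed BSC crossover probability

def bit_metric(z, u):
    """Bit metric: log-likelihood of receiving z given that code bit u was sent."""
    return math.log(p) if z != u else math.log(1 - p)

def branch_metric(z_branch, u_branch):
    """Branch metric: sum of the bit metrics over the n code bits of one branch."""
    return sum(bit_metric(z, u) for z, u in zip(z_branch, u_branch))

def path_metric(z_branches, u_branches):
    """(Partial) path metric: sum of the branch metrics along the path so far."""
    return sum(branch_metric(z, u) for z, u in zip(z_branches, u_branches))

# Example: a 3-branch path of a rate-1/2 code (2 received bits per branch).
received = [(1, 1), (1, 0), (0, 0)]
codeword = [(1, 1), (1, 0), (0, 1)]        # disagrees with the received word in one bit
print(path_metric(received, codeword))     # larger (less negative) means more likely
```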

Binary symmetric channel (BSC)
On a BSC with crossover probability $p$, each transmitted bit is received correctly with probability $1-p$ and flipped with probability $p$. If $d_m$ is the Hamming distance between $\mathbf{Z}$ and $\mathbf{U}^{(m)}$, and $L_c$ is the size (length) of the coded sequence, then
$\log p(\mathbf{Z} \mid \mathbf{U}^{(m)}) = d_m \log p + (L_c - d_m)\log(1-p) = -d_m \log\frac{1-p}{p} + L_c \log(1-p)$.
The second term is the same for every codeword, and $\log\frac{1-p}{p} > 0$ for $p < 0.5$, so maximizing the likelihood is equivalent to minimizing $d_m$.
ML decoding rule: choose the path with the minimum Hamming distance from the received sequence.
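
A quick numeric check of this equivalence, assuming p = 0.1 and a coded length of 10 bits: the log-likelihood is an affine, strictly decreasing function of the Hamming distance, so the ML codeword is the minimum-distance codeword.

```python
import math

p, n = 0.1, 10                                   # assumed crossover probability and coded length

def log_likelihood(d):
    """log p(Z|U) for a codeword at Hamming distance d from the received word."""
    return d * math.log(p) + (n - d) * math.log(1 - p)

def affine_form(d):
    """Equivalent form: -d*log((1-p)/p) + n*log(1-p)."""
    return -d * math.log((1 - p) / p) + n * math.log(1 - p)

for d in range(4):
    print(d, round(log_likelihood(d), 4), round(affine_form(d), 4))
# Both columns agree and strictly decrease with d (for p < 0.5), so the ML path
# is the minimum-Hamming-distance path.
```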

AWGN channels
For BPSK modulation, the transmitted sequence corresponding to the codeword $\mathbf{U}^{(m)}$ is denoted $\mathbf{S}^{(m)}$, where each coded bit is mapped to an antipodal amplitude, $s_{ji}^{(m)} = \pm\sqrt{E_c}$. Up to terms that are common to all codewords, the log-likelihood function reduces to the inner product, or correlation, between $\mathbf{Z}$ and $\mathbf{S}^{(m)}$:
$\gamma^{(m)} \propto \sum_{i} \sum_{j=1}^{n} z_{ji}\, s_{ji}^{(m)} = \langle \mathbf{Z}, \mathbf{S}^{(m)} \rangle$.
Maximizing this correlation is equivalent to minimizing the Euclidean distance between $\mathbf{Z}$ and $\mathbf{S}^{(m)}$.
ML decoding rule: choose the path with the minimum Euclidean distance to the received sequence.
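
A sketch of the soft metric computation for BPSK over AWGN. The received samples and the antipodal mapping (0 -> -1, 1 -> +1) are assumptions made for illustration; the point is that ranking candidate sequences by maximum correlation gives the same ordering as ranking them by minimum Euclidean distance.

```python
# Soft-decision metrics for BPSK over AWGN: correlation vs. Euclidean distance.
def bpsk(bits):
    return [2 * b - 1 for b in bits]           # assumed mapping: 0 -> -1, 1 -> +1

def correlation(z, s):
    return sum(zi * si for zi, si in zip(z, s))

def sq_euclidean(z, s):
    return sum((zi - si) ** 2 for zi, si in zip(z, s))

z = [0.8, -1.1, 0.2, -0.9]                     # hypothetical noisy received samples
for cand in ([1, 0, 0, 0], [1, 0, 1, 0], [1, 1, 1, 1]):
    s = bpsk(cand)
    print(cand, round(correlation(z, s), 2), round(sq_euclidean(z, s), 2))
# ||z - s||^2 = ||z||^2 + ||s||^2 - 2<z, s>, and ||z||^2, ||s||^2 are identical for
# every BPSK candidate, so maximum correlation and minimum distance pick the same path.
```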

Soft and hard decisions
Hard decision: the demodulator makes a firm (hard) decision on whether a one or a zero was transmitted and provides the decoder with no other information, such as how reliable that decision is. Its output is therefore only a zero or a one (quantized to just two levels); these outputs are called hard bits. Decoding based on hard bits is called hard-decision decoding.

Soft and hard decisions, cont'd
Soft decision: the demodulator provides the decoder with side information together with the decision; this side information gives the decoder a measure of confidence in the decision. The demodulator outputs, called soft bits, are quantized to more than two levels. Decoding based on soft bits is called soft-decision decoding. On AWGN channels a gain of about 2 dB, and on fading channels a gain of about 6 dB, is obtained by using soft-decision instead of hard-decision decoding.
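
A minimal sketch contrasting the two demodulator outputs for BPSK matched-filter samples: the hard decision keeps only the sign, while a hypothetical 8-level (3-bit) uniform quantizer also conveys how reliable each bit is.

```python
def hard_decision(sample):
    """2-level output: only the sign of the matched-filter sample (a hard bit)."""
    return 1 if sample >= 0 else 0

def soft_decision(sample, levels=8, clip=1.0):
    """Hypothetical uniform 'levels'-level quantizer over [-clip, +clip] (a soft bit)."""
    x = max(-clip, min(clip, sample))
    step = 2 * clip / levels
    return min(int((x + clip) / step), levels - 1)   # 0 .. levels-1; 0 = most confident '0'

samples = [0.9, 0.1, -0.05, -1.3]                    # hypothetical noisy BPSK samples
print([hard_decision(s) for s in samples])           # [1, 1, 0, 0]  -- no reliability information
print([soft_decision(s) for s in samples])           # [7, 4, 3, 0]  -- confidence is preserved
```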

The Viterbi algorithm
The Viterbi algorithm performs maximum likelihood decoding: it finds the path through the trellis with the best metric (maximum correlation or minimum distance), processing the demodulator outputs in an iterative manner. At each step in the trellis, it compares the metrics of all paths entering each state and keeps only the best one, called the survivor, together with its metric; the least likely paths are eliminated as it proceeds. This reduces the decoding complexity from an exhaustive search over $2^L$ codewords to on the order of $2^{K-1}$ survivor computations per trellis section.

The Viterbi algorithm, cont'd
A. Set-up: for a data block of $L$ bits, form the trellis. The trellis has $L + K - 1$ sections (levels); it starts at time $t_1$ and ends at time $t_{L+K}$. Label every branch in the trellis with its corresponding branch metric. For each state $j$ in the trellis at time $t_i$, denoted $S(j, i)$, define a parameter $\Gamma(S(j, i))$ (the partial path metric of the survivor ending in that state).
B. Then do the following:

The Viterbi algorithm, cont'd
1. Set $\Gamma(S(0, 1)) = 0$ and $i = 2$.
2. At time $t_i$, compute the partial path metrics for all the paths entering each state.
3. Set $\Gamma(S(j, i))$ equal to the best partial path metric entering state $j$ at time $t_i$. Keep the survivor path and delete the dead paths from the trellis.
4. If $i < L + K$, increase $i$ by 1 and return to step 2.
C. Finally, start at state zero at time $t_{L+K}$ and follow the surviving branches backwards through the trellis. The path found is unique and corresponds to the ML codeword. (A compact implementation sketch follows.)
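
Pulling steps A to C together, here is a compact hard-decision Viterbi decoder sketch for the assumed rate-1/2, K = 3 example code (generators 111 and 101). The branch metric is the Hamming distance, so the survivor entering each state is the path with the smallest partial metric; the received sequence is hypothetical.

```python
# Hard-decision Viterbi decoding (sketch) for the assumed rate-1/2, K = 3 example code.
K, G = 3, (0b111, 0b101)
N_STATES = 2 ** (K - 1)

def branch(state, bit):
    """Output branch word and next state for one encoder transition."""
    reg = (bit << (K - 1)) | state
    out = tuple(bin(reg & g).count("1") % 2 for g in G)
    return out, reg >> 1

def viterbi_decode(received, L):
    """Decode L message bits (followed by K-1 zero tail bits) from hard bits."""
    INF = float("inf")
    metric = [0.0] + [INF] * (N_STATES - 1)            # start in state 0 at time t1
    survivors = [[] for _ in range(N_STATES)]          # input bits along each survivor
    for t in range(L + K - 1):                         # L + K - 1 trellis sections
        z = received[2 * t: 2 * t + 2]                 # n = 2 received bits per section
        new_metric = [INF] * N_STATES
        new_surv = [None] * N_STATES
        for state in range(N_STATES):
            if metric[state] == INF:                   # state not yet reachable
                continue
            for bit in ((0, 1) if t < L else (0,)):    # only zero tail bits after L inputs
                out, nxt = branch(state, bit)
                m = metric[state] + sum(a != b for a, b in zip(z, out))
                if m < new_metric[nxt]:                # keep the survivor, drop the dead path
                    new_metric[nxt] = m
                    new_surv[nxt] = survivors[state] + [bit]
        metric, survivors = new_metric, new_surv
    return survivors[0][:L], metric[0]                 # ML path ends in state 0

rx = [1, 1, 1, 0, 0, 1, 1, 0, 1, 1]                    # hypothetical received word (one bit in error)
print(viterbi_decode(rx, L=3))                         # -> ([1, 0, 1], 1)
```

For readability this sketch stores the decoded bits of each survivor directly instead of performing an explicit trace-back at the end; the result is the same ML codeword.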

Example of hard-decision Viterbi decoding
The trellis of the example code used for the decoding example, with every branch labeled input/output (figure not reproduced).

Example of hard-decision Viterbi decoding, cont'd
Label all the branches with their branch metric, i.e. the Hamming distance between the received symbol for that section and the branch word (figure not reproduced; a small sketch follows).
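
Since the labeled trellis is not reproduced, this small sketch computes the Hamming-distance branch labels for a single trellis section of the assumed example code, given one hypothetical received pair.

```python
# Hamming-distance branch labels for one trellis section of the assumed
# rate-1/2, K = 3 example code (generators 111 and 101).
K, G = 3, (0b111, 0b101)

def branch_word(state, bit):
    reg = (bit << (K - 1)) | state
    return tuple(bin(reg & g).count("1") % 2 for g in G)

z = (1, 1)                                       # hypothetical received pair for this section
for state in range(2 ** (K - 1)):
    for bit in (0, 1):
        out = branch_word(state, bit)
        d = sum(a != b for a, b in zip(z, out))  # branch metric = Hamming distance to z
        print(f"state {state:02b}, input {bit}: branch word {out}, metric {d}")
```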

Example of hard-decision Viterbi decoding, cont'd
The following slides step through the algorithm for successive time indices i, showing at each stage the partial path metrics of the paths entering every state and the surviving paths (figures not reproduced).

Example of hard-decision Viterbi decoding, cont'd
Finally, trace back through the surviving branches from the zero state to obtain the decoded codeword $\hat{\mathbf{U}}$ and message $\hat{\mathbf{m}}$ (figure not reproduced).

Example of soft-decision Viterbi decoding
The same trellis search carried out with soft branch metrics: each branch is labeled with a soft branch metric (values in multiples of 1/3 in the original figure, e.g. 5/3, -5/3, 4/3, 1/3, -1/3) and each node with the accumulated partial path metric; the survivor at each state is the path with the largest partial metric (figure not reproduced).
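
The specific quantized values of the original figure are not reproduced here; instead, the sketch below shows the soft-decision counterpart of the earlier hard-decision decoder for the assumed rate-1/2, K = 3 example code. The branch metric is the correlation between the soft received samples and the BPSK-mapped branch word, so the survivor at each state is now the path with the largest partial metric; the received samples are hypothetical.

```python
# Soft-decision Viterbi decoding (sketch): same trellis search as the hard-decision
# case, but with a correlation branch metric that is maximised rather than minimised.
K, G = 3, (0b111, 0b101)
N_STATES = 2 ** (K - 1)

def branch(state, bit):
    reg = (bit << (K - 1)) | state
    out = tuple(bin(reg & g).count("1") % 2 for g in G)
    return out, reg >> 1

def soft_viterbi(received, L):
    NEG = float("-inf")
    metric = [0.0] + [NEG] * (N_STATES - 1)
    surv = [[] for _ in range(N_STATES)]
    for t in range(L + K - 1):
        z = received[2 * t: 2 * t + 2]                         # two soft samples per section
        new_m, new_s = [NEG] * N_STATES, [None] * N_STATES
        for state in range(N_STATES):
            if metric[state] == NEG:
                continue
            for bit in ((0, 1) if t < L else (0,)):
                out, nxt = branch(state, bit)
                corr = sum(zi * (2 * o - 1) for zi, o in zip(z, out))  # BPSK: 0 -> -1, 1 -> +1
                if metric[state] + corr > new_m[nxt]:          # keep the largest-metric survivor
                    new_m[nxt], new_s[nxt] = metric[state] + corr, surv[state] + [bit]
        metric, surv = new_m, new_s
    return surv[0][:L]

# Hypothetical unquantized samples for the codeword of message 1 0 1 plus noise.
rx = [0.9, 1.1, 0.7, -0.8, 0.3, -0.2, 1.2, -0.9, 0.8, 1.0]
print(soft_viterbi(rx, L=3))                                   # -> [1, 0, 1]
```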