Maximum Likelihood Detection

Dr. Muqaibel

Example: A binary repetition code is used where 0 is encoded as 000 and 1 is encoded as 111. Suppose 101 is received over a BSC with crossover probability 0.3. What is your decoding decision (a) using minimum-distance decoding, and (b) using maximum-likelihood detection with priors p(0) = 0.8 and p(1) = 0.2? Comment on the results.
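The example can be checked numerically. The following Python sketch (an illustration, not part of the original slides) applies both rules to the received word 101:

```python
# Minimum-distance vs. prior-weighted (MAP-style) decoding for the repetition code.
# Codewords: 000 (message 0) and 111 (message 1); received word r = 101.
# BSC crossover probability p = 0.3; priors p(0) = 0.8, p(1) = 0.2.

def hamming(a, b):
    """Number of positions where the two words differ."""
    return sum(x != y for x, y in zip(a, b))

r = "101"
p = 0.3
codewords = {"000": 0.8, "111": 0.2}   # codeword -> prior probability
n = len(r)

# (a) Minimum-distance decoding: pick the codeword closest to r.
md = min(codewords, key=lambda c: hamming(r, c))

# (b) Decoding with priors: maximize p(c) * p^d * (1-p)^(n-d),
# where d = d(r, c) is the Hamming distance to the codeword.
def posterior_metric(c):
    d = hamming(r, c)
    return codewords[c] * p**d * (1 - p)**(n - d)

ml = max(codewords, key=posterior_metric)

print(md)  # 111: minimum distance favours 111 (d = 1 vs d = 2)
print(ml)  # 000: the strong prior on 0 flips the decision
```

The two rules disagree here: minimum distance decodes 111, but the prior p(0) = 0.8 is strong enough to outweigh the extra bit error and flip the decision to 000.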

Matlab illustration

% Dr. Ali Muqaibel 072, March 2008
% Codewords: c1 = 000, c2 = 111; received word r = 101
pc1 = 0:0.01:1;   % prior probability of c1 (i.e., of sending 0)
pc2 = 1 - pc1;    % prior probability of c2
t1 = 2;           % Hamming distance d(r, c1)
t2 = 1;           % Hamming distance d(r, c2)
p = 0.3;          % BSC crossover probability
y = t1 - t2 - log(pc2./pc1)/log(p/(1-p));
% note: log(p/(1-p)) is a negative number, which flips the inequality:
% decode c1 (000) when y < 0, and c2 (111) when y > 0.
plot(pc1, y)
xlabel('pc1')
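The plot crosses zero at the prior where the two decisions break even. Setting y = 0 gives the boundary in closed form: pc2/pc1 = (p/(1-p))^(t1-t2). A short Python check of this threshold (an illustration, not from the original slides):

```python
# Decision-boundary prior for the MATLAB illustration above:
# r = 101 compared against c1 = 000 and c2 = 111 over a BSC.
p = 0.3        # BSC crossover probability
t1, t2 = 2, 1  # Hamming distances d(r, c1) and d(r, c2)

# Setting y = t1 - t2 - log(pc2/pc1)/log(p/(1-p)) to zero
# gives pc2/pc1 = (p/(1-p))**(t1 - t2), so:
ratio = (p / (1 - p)) ** (t1 - t2)   # pc2/pc1 at the boundary
pc1_threshold = 1 / (1 + ratio)      # using pc2 = 1 - pc1

print(round(pc1_threshold, 4))  # 0.7
```

So the decision flips to c1 = 000 only when p(0) exceeds 0.7, which is why the example's prior p(0) = 0.8 overrides the minimum-distance answer.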