ADVANCED SIGNAL PROCESSING TECHNIQUES FOR WIRELESS COMMUNICATIONS Erdal Panayırcı Electronics Engineering Department IŞIK University

OUTLINE
- Introduction
- Knowledge Gaps in General
- The Essentials of the EM Algorithm
- The SAGE Algorithm
- Some Application Areas
- The Sequential Monte Carlo (SMC) Method
- Knowledge Gaps in SMC

INTRODUCTION
Future-generation wireless communication systems are confronted with new challenges, mainly due to:
- Hostile channel characteristics
- Limited bandwidth
- Very high data rates

Advanced signal processing techniques such as:
- The Expectation-Maximization (EM) algorithm
- The SAGE algorithm
- The Baum-Welch algorithm
- Sequential Monte Carlo techniques
- Kalman filters and their extensions
- Hidden Markov modeling
- Stochastic approximation algorithms

Combined with inexpensive and rapidly growing computational power, these techniques provide powerful tools to overcome the limitations of current technologies.

Applications of advanced signal processing algorithms include, but are not limited to:
- Joint/blind/adaptive sequence (data) detection
- Frequency, phase, and timing synchronization
- Equalization
- Channel estimation techniques

These techniques are employed in advanced wireless communication systems such as:
- OFDM/OFDMA
- CDMA
- MIMO and space-time-frequency coding
- Multi-user detection

In particular, developing suitable algorithms for wireless multiple-access systems in non-stationary, interference-rich environments presents major challenges.

Optimal solutions to these problems mostly cannot be implemented in practice, mainly due to their high computational complexity.

The advanced signal processing tools mentioned above provide a promising route to the design of low-complexity algorithms whose performance approaches the theoretical optimum for fast and reliable communication in highly severe and dynamic wireless environments.

Over the past decade, such methods have been successfully applied to several communication problems. However, many technical challenges remain in emerging applications whose solutions will bridge the gap between the theoretical potential of these techniques and their practical utility.

The Key Knowledge Gaps
- Theoretical performance and convergence analysis of these algorithms.
- New, efficient algorithms need to be developed for some of the problems mentioned above.
- The computational complexity of these algorithms must be addressed when they are applied in on-line implementations running in digital receivers.

Whether these algorithms should be implemented with batch processing or with sequential (adaptive) processing, depending on how the data are processed and how the inference is made, has not been completely resolved for some of the techniques mentioned above.

Some classes of algorithms require the efficient generation of random samples from an arbitrary target probability distribution known only up to a normalizing constant. So far, two basic types of algorithms, the Metropolis algorithm and the Gibbs sampler, have been widely used in diverse fields. However, they are substantially complex and difficult to apply in on-line applications such as wireless communications. There is a gap in devising new, more efficient types of algorithms that can be employed effectively in wireless applications.
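
To make the sampling problem concrete, here is a minimal random-walk Metropolis sketch in Python (an illustration, not from the slides; the bimodal target is an arbitrary example). It only ever evaluates the log of the unnormalized target density, since the unknown normalizing constant cancels in the acceptance ratio:

import numpy as np

def metropolis(log_target, x0, n_samples, step=0.5, seed=0):
    # Random-walk Metropolis sampler; log_target is the log of the
    # unnormalized target density.
    rng = np.random.default_rng(seed)
    x = x0
    logp = log_target(x)
    samples = np.empty(n_samples)
    for i in range(n_samples):
        x_prop = x + step * rng.standard_normal()   # symmetric proposal
        logp_prop = log_target(x_prop)
        # Accept with probability min(1, p(x')/p(x)); the normalizing
        # constant cancels in this ratio.
        if np.log(rng.uniform()) < logp_prop - logp:
            x, logp = x_prop, logp_prop
        samples[i] = x
    return samples

# Example: sample from an unnormalized two-mode Gaussian mixture.
log_target = lambda x: np.logaddexp(-0.5 * (x - 2.0)**2, -0.5 * (x + 2.0)**2)
draws = metropolis(log_target, x0=0.0, n_samples=5000)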

THE EM ALGORITHM
- The EM algorithm was popularized in 1977.
- An iterative "algorithm" for obtaining ML parameter estimates.
- Not really an algorithm, but a procedure: the same problem can have different EM formulations.
- Based on the definition of complete and incomplete data.

Main References
L. E. Baum, T. Petrie, G. Soules, and N. Weiss, "A Maximization Technique Occurring in the Statistical Analysis of Probabilistic Functions of Markov Chains," Annals of Mathematical Statistics, Vol. 41, pp. 164-171, 1970.
A. P. Dempster, N. M. Laird, and D. B. Rubin, "Maximum Likelihood from Incomplete Data via the EM Algorithm," Journal of the Royal Statistical Society, Series B, Vol. 39, pp. 1-38, 1977.
C. F. J. Wu, "On the Convergence Properties of the EM Algorithm," Annals of Statistics, Vol. 11, pp. 95-103, 1983.

The Essentials of the EM Algorithm
Consider estimating a parameter vector $s$ (the parameters to be estimated, possibly random) from the "incomplete" data $y$. The ML estimate of $s$ is then

$\hat{s}_{\mathrm{ML}} = \arg\max_{s} \log p(y \mid s).$

Thus, obtaining ML estimates may require:
- An expectation, which is often analytically intractable
- A maximization, which is computationally intensive

The EM Iteration
Define the complete data $x$, related to $y$ through a many-to-one mapping $y = g(x)$ and having conditional density $p(x \mid y, s)$. The EM iteration at the $i$-th step:

E-step: $Q(s \mid s^{(i)}) = E\left[\log p(x \mid s) \mid y, s^{(i)}\right]$

M-step: $s^{(i+1)} = \arg\max_{s} Q(s \mid s^{(i)})$
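
As a concrete illustration (an assumed toy example, not from the slides), the sketch below runs these E- and M-steps for a two-component Gaussian mixture with unit variances, where the complete data x = (y, z) augments each observation y with its unobserved component label z:

import numpy as np

def em_gmm(y, n_iter=50):
    # EM for a two-component Gaussian mixture with unit variances.
    rng = np.random.default_rng(0)
    mu = rng.standard_normal(2)          # initial component means
    pi = 0.5                             # initial mixing weight
    for _ in range(n_iter):
        # E-step: responsibility E[z | y, s^(i)] of component 1 for each sample
        p1 = pi * np.exp(-0.5 * (y - mu[0])**2)
        p2 = (1 - pi) * np.exp(-0.5 * (y - mu[1])**2)
        r = p1 / (p1 + p2)
        # M-step: maximize the expected complete-data log-likelihood
        mu = np.array([np.sum(r * y) / np.sum(r),
                       np.sum((1 - r) * y) / np.sum(1 - r)])
        pi = np.mean(r)
    return mu, pi

y = np.concatenate([np.random.default_rng(1).normal(-2, 1, 300),
                    np.random.default_rng(2).normal(+3, 1, 300)])
mu_hat, pi_hat = em_gmm(y)   # means converge near {-2, +3} (up to label swap)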

Convergence Properties
- At each iteration, the likelihood function is monotonically non-decreasing.
- If the likelihood function is bounded, the sequence of likelihood values converges.
- Under some conditions, the limit point coincides with the ML estimate.

EM Algorithm Extensions
J. A. Fessler and A. O. Hero, "Complete-Data Spaces and Generalized EM Algorithms," 1993 IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP-93), Vol. 4, pp. 1-4, 1993.
J. A. Fessler and A. O. Hero, "Space-Alternating Generalized Expectation-Maximization Algorithm," IEEE Transactions on Signal Processing, Vol. 42, pp. 2664-2677, October 1994.

The SAGE Algorithm
- The SAGE algorithm is an extension of the EM algorithm.
- It provides much faster convergence than EM.
- The algorithm alternates among several hidden-data spaces rather than using a single complete-data space, and
- it updates only a subset of the parameter elements in each iteration (see the sketch below).
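
A structural sketch of one SAGE cycle follows (a hypothetical interface, not from the slides: e_step and m_step stand in for the problem-specific expectation and maximization, and theta is assumed to be a NumPy parameter array):

def sage_cycle(theta, index_sets, e_step, m_step):
    # One SAGE cycle: for each parameter subset, run an E-step on its
    # own (smaller) hidden-data space and maximize only over that
    # subset, keeping the remaining parameters fixed.
    for idx in index_sets:
        q = e_step(theta, idx)               # expectation for the hidden data of subset idx
        theta[idx] = m_step(q, theta, idx)   # update only the chosen coordinates
    return theta

Because each update involves a less informative hidden-data space than the full complete data, the per-subset steps tend to take larger strides, which is the intuition behind the faster convergence noted above.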

Some Application Areas
- Positron emission tomography (PET)
- Genetics
- Neural networks
- Radar imaging
- Image/speech processing
- Communications: channel estimation/equalization, multiuser detection, sequence estimation, interference rejection

SEQUENTIAL MONTE CARLO TECHNIQUE (SMC)
Emerged in the field of statistics: J. S. Liu and R. Chen, "Sequential Monte Carlo Methods for Dynamic Systems," Journal of the American Statistical Association, Vol. 93, pp. 1032-1044, 1998.

Recently, SMC has been successfully applied to several problems in wireless communications, such as:
- Blind equalization
- Detection/decoding in fading channels
It is based on approximating the expectation operation by means of sequentially generated Monte Carlo samples of the unknown state variables or system parameters.

Main Advantages
- SMC is self-adaptive: no training/pilot symbols or decision feedback are needed.
- Tracking of fading channels and estimation of the data sequence are naturally integrated.
- The channel noise can be either Gaussian or non-Gaussian.
- It is suitable for MAP receiver design.

- If the system employs channel coding, the coded signal structure can easily be exploited to improve the accuracy of both the channel and data estimates.
- SMC is suitable for high-speed parallel implementation using VLSI.
- It does not require iterations, unlike the EM algorithm.
- Updating with new data can be done more efficiently.

SMC Method
Let $\theta$ denote the parameter vector of interest, and let $X_t$ denote the complete data, chosen so that $p(X_t \mid \theta)$ is assumed simple. $X_t$ is only partially observed and can be partitioned as $X_t = (Y_t, S_t)$, where $Y_t$ denotes the observed part and $S_t$ the incomplete (unobservable or missing) data.

Examples
1. Fading channel. Problem: jointly estimate the data signal and the unknown channel parameters.

2. Joint phase offset and SNR estimation, where $\phi$ is the unknown phase offset, $\sigma^2$ is the unknown noise variance, and $\{s_t\}$ is the data to be transmitted.

Problem: Estimate $\theta = (\phi, \sigma^2)$ based on the complete data $X_t = (Y_t, S_t)$, where $Y_t$ is the observed part and $S_t = (s_1, \ldots, s_t)$ is the incomplete (missing) data.
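
As a hedged illustration of this setup (assuming a BPSK observation model $y_t = s_t e^{j\phi} + n_t$ with complex Gaussian noise; the slides do not specify the modulation), the observed data $Y_t$ can be simulated as follows:

import numpy as np

rng = np.random.default_rng(0)
T = 200
phi = 0.3                        # unknown phase offset (to be estimated)
sigma2 = 0.5                     # unknown noise variance; SNR = 1/sigma2 for unit-energy symbols
s = rng.choice([-1.0, 1.0], T)   # BPSK data to be transmitted: the missing data S_t
n = np.sqrt(sigma2 / 2) * (rng.standard_normal(T) + 1j * rng.standard_normal(T))
y = s * np.exp(1j * phi) + n     # observed part Y_t of the complete data X_t = (Y_t, S_t)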

MAP SOLUTION USING THE SMC METHOD
The MAP solution for the unknown parameter vector $\theta$ is

$\hat{\theta}_{\mathrm{MAP}} = \arg\max_{\theta} p(\theta \mid Y_t),$

where $p(\theta \mid Y_t)$ can be computed by means of the incomplete-data sequence as

$p(\theta \mid Y_t) = \sum_{S_t} p(\theta \mid Y_t, S_t)\, p(S_t \mid Y_t) = E\left[ p(\theta \mid Y_t, S_t) \mid Y_t \right].$

Substituting this into the above, we have

$\hat{\theta}_{\mathrm{MAP}} = \arg\max_{\theta} E\left[ p(\theta \mid Y_t, S_t) \mid Y_t \right].$

To implement SMC, we need to draw $m$ independent samples (Monte Carlo samples) from the conditional distribution $p(S_t \mid Y_t)$. Usually, drawing samples directly from this distribution is difficult, but drawing samples from some trial distribution $q(S_t \mid Y_t)$ is easy.

In this case, we can use the idea of importance sampling as follows. Suppose a set of samples $\{S_t^{(j)}\}_{j=1}^{m}$ is drawn from the trial distribution $q(S_t \mid Y_t)$, and associate with the sample $S_t^{(j)}$ the weight

$w_t^{(j)} = \frac{p(S_t^{(j)} \mid Y_t)}{q(S_t^{(j)} \mid Y_t)}.$

The pair $(S_t^{(j)}, w_t^{(j)})$ is called a properly weighted sample with respect to the distribution $p(S_t \mid Y_t)$. We can now estimate $p(\theta \mid Y_t)$ as follows:

$p(\theta \mid Y_t) \approx \frac{1}{W_t} \sum_{j=1}^{m} w_t^{(j)}\, p(\theta \mid Y_t, S_t^{(j)}), \qquad W_t = \sum_{j=1}^{m} w_t^{(j)}.$
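
A minimal static importance-sampling sketch (a generic illustration with assumed Gaussian target and trial densities, not from the slides) shows how such properly weighted samples estimate an expectation:

import numpy as np

rng = np.random.default_rng(0)
m = 5000

# Trial distribution q: standard Gaussian; target p: Gaussian with mean 2.
s = rng.standard_normal(m)                    # samples S^(j) drawn from q
log_w = -0.5 * (s - 2.0)**2 + 0.5 * s**2      # log of w^(j) = p(S^(j)) / q(S^(j))
w = np.exp(log_w - log_w.max())               # numerically stabilized weights
w /= w.sum()

# Properly weighted estimate of E_p[h(S)] for h(s) = s:
est = np.sum(w * s)   # close to the target mean, 2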

By properly choosing the trial distribution $q(\cdot)$, the weighted samples can be generated sequentially. That is, suppose a set of properly weighted samples $\{(S_{t-1}^{(j)}, w_{t-1}^{(j)})\}_{j=1}^{m}$ at time $t-1$ is given. The SMC algorithm then generates from this set a new one at time $t$.

In summary, the SMC algorithm is as follows, for $j = 1, 2, \ldots, m$:
1. Draw samples $s_t^{(j)}$ from the trial distribution $q(\cdot)$ and let $S_t^{(j)} = (S_{t-1}^{(j)}, s_t^{(j)})$.
2. Compute the importance weight $w_t^{(j)}$ from $w_{t-1}^{(j)}$ sequentially.
3. Compute the MAP estimate.
A minimal sketch of this recursion is given below.
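
The sketch assumes a simple scalar state-space model (not the fading-channel model of the slides), uses the state-transition prior as the trial distribution, and, for brevity, replaces the MAP estimate in step 3 with a weighted posterior-mean estimate of the state:

import numpy as np

def sis_filter(y, m=100, seed=0):
    # Sequential importance sampling for the assumed model
    #   x_t = 0.9 x_{t-1} + v_t,  y_t = x_t + e_t,  v_t, e_t ~ N(0, 1).
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(m)          # particles S_0^(j)
    w = np.full(m, 1.0 / m)             # weights w_0^(j)
    estimates = []
    for yt in y:
        x = 0.9 * x + rng.standard_normal(m)    # step 1: draw s_t^(j) from q
        w = w * np.exp(-0.5 * (yt - x) ** 2)    # step 2: sequential weight update
        w = w / w.sum()
        estimates.append(np.sum(w * x))         # step 3: weighted estimate at time t
    return np.array(estimates)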

KNOWLEDGE GAPS IN SMC
- Choosing the effective sample size $m$ (empirically, usually $20 < m < 100$).
- The sampling weights measure the "quality" of the corresponding drawn data sequences. Small weights imply that those samples do not really represent the distribution from which they were drawn and contribute little to the final estimate.
- The resampling procedure was developed to address this; it needs to be improved for different applications (a common variant is sketched below).
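
One standard concrete form is systematic resampling (a generic technique, not specific to the slides), which replicates particles in proportion to their weights and resets the weights to uniform:

import numpy as np

def systematic_resample(particles, w, rng):
    # Typically invoked when the effective sample size 1 / sum(w**2)
    # drops below a chosen threshold.
    m = len(w)
    positions = (rng.uniform() + np.arange(m)) / m   # one stratified draw per particle
    idx = np.searchsorted(np.cumsum(w), positions)   # map positions onto the weight CDF
    return particles[idx], np.full(m, 1.0 / m)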

Delayed Estimation
Problem: Since the fading process is highly correlated, future received signals contain information about the current data and channel state. A delayed estimate therefore appears more efficient and promising than the instantaneous estimate summarized above.

In delayed estimation, instead of making inference on $(S_t, \theta)$ with the posterior density $p(\theta, S_t \mid Y_t)$, we delay this inference to a later time $t + \Delta$ and use the distribution $p(\theta, S_t \mid Y_{t+\Delta})$.

Note: Such a delayed estimation method does not increase the computational cost, but it requires some extra memory.
Knowledge Gap: Develop computationally efficient delayed-sample estimation techniques, which will find applications in channels with strong memory (ISI channels).
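
A sketch of the delayed (fixed-lag) estimate under the assumption that full particle trajectories are stored, which is exactly the extra memory noted above; trajectories and w would come from an SMC run such as the earlier sketch:

import numpy as np

def delayed_estimate(trajectories, w, t):
    # trajectories: (m, T) array holding each particle's full path S^(j);
    # w: weights computed at time t + delta. Reusing these later weights
    # for the state at time t lets the future observations
    # Y_{t+1}, ..., Y_{t+delta} inform the estimate of S_t.
    return np.sum(w * trajectories[:, t])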

Turbo Coding Applications
Because SMC is soft-input/soft-output in nature, the resulting algorithms are capable of exchanging extrinsic information with the outer MAP channel decoder, successively improving the overall receiver performance. Therefore, a blind MAP decoder for turbo receivers can be worked out.