Communication Over Unknown Channels: A Personal Perspective on Over a Decade of Research* Meir Feder, Dept. of Electrical Engineering-Systems, Tel-Aviv University


Communication Over Unknown Channels: A Personal Perspective on Over a Decade of Research* Meir Feder, Dept. of Electrical Engineering-Systems, Tel-Aviv University. * The presentation includes joint work with Amos Lapidoth, Neri Merhav, Nadav Shulman, Elona Erez, Ofer Shayevitz and Yuval Lomnitz

The Problem and The Models

Unknown Channels
Shannon: a channel is a known random function giving the output y from the input x
- p(y|x) is known
- Capacity is defined and known; rates up to capacity can be attained reliably
- Feedback may only help when the channel has memory; it is not needed to "learn" the channel
Unknown channels, possible models:
- Stochastic channels with an unknown probability model
- Individual channels: an arbitrary (unknown) input-output relation
Capacity? Communication schemes? Feedback?
Follow the success of universal source coding

The Compound Channel
First considered by Blackwell, Breiman and Thomasian, 1960
Worst-case "capacity": C = max_{p(x)} min_θ I(X;Y|θ)
Can be attained with "universal decoding": the MMI decoder for unknown DMCs
The compound capacity of unknown BSCs, of unknown scalar fading (y = θx + n), and of other "common" models is ZERO
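As an illustration, the MMI rule can be sketched in a few lines of Python: knowing nothing about the DMC, the decoder scores each codeword by its empirical mutual information with the received sequence and picks the maximizer. This is only a sketch with my own helper names; as noted later in the talk, scoring a full codebook this way is exponential in the block length.

```python
import numpy as np

def empirical_mi(x, y, Ax=2, Ay=2):
    """Empirical mutual information (in bits) between two equal-length
    sequences, computed from their joint type (empirical distribution)."""
    joint = np.zeros((Ax, Ay))
    for a, b in zip(x, y):
        joint[a, b] += 1.0
    joint /= len(x)
    px = joint.sum(axis=1, keepdims=True)   # empirical marginal of x
    py = joint.sum(axis=0, keepdims=True)   # empirical marginal of y
    mask = joint > 0
    return float((joint[mask] * np.log2(joint[mask] / (px @ py)[mask])).sum())

def mmi_decode(codebook, y):
    """MMI decoder: declare the codeword whose empirical mutual
    information with the channel output is largest."""
    return max(range(len(codebook)), key=lambda m: empirical_mi(codebook[m], y))
```

The decoder needs no channel parameters at all, which is exactly why it is "universal" over the compound family.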

The Arbitrarily Varying Channel
Also considered by Blackwell, Breiman and Thomasian, 1960
With randomization, the "worst case" capacity is known: C = max_{p(x)} min_{q(s)} I(X;Y), minimizing over distributions q on the channel state

The Individual Noise Channel
May be considered a special case of the AVC
However, since feedback and "variable rate" communication are natural here, it provides a new point of view
One can attain the "empirical capacity" 1 − Ĥ(z), where the noise sequence z is unknown. But what does it mean?

The Models
Make minimal assumptions about the channel
Yet go beyond the worst-case "outage" approach: adapt to the instantaneous, specific channel mode
Feedback seems to be a must
A broadcast-channel approach?

Outline: The Considered Topics
The Receiver Problem: Universal Decoding
- Universal decoding in the general case; channels with memory
- The "criterion"
Rateless Codes for Universal Communication
- "Static broadcast" with simple feedback (Ack/Nack)
- Code generation
Universal Communication with Feedback
- The "universal Horstein" scheme
Individual Channels
- The setting and the plausible "rateless" solution
Lesson Learned
- Practical: rateless (incremental redundancy), "universal" decoding
- Theoretical:

Universal Decoding

The problem

The solution

Composite Hypotheses

A simple fading example
Two codewords
GLRT
New decoder: uniformly improving the GLRT
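The GLRT for this example can be made concrete. For y = θx + n with Gaussian noise and an unknown gain θ, maximizing the likelihood over θ for each codeword reduces to a normalized-correlation metric; a minimal sketch (my own naming, for an arbitrary small codebook):

```python
import numpy as np

def glrt_decode(codebook, y):
    """GLRT for the unknown-gain fading channel y = theta*x + n:
    for each codeword x, the Gaussian likelihood maximized over theta
    (theta_hat = <x,y>/||x||^2) is monotone in <x,y>^2 / ||x||^2,
    so the GLRT picks the codeword maximizing that metric."""
    y = np.asarray(y, dtype=float)
    def metric(x):
        x = np.asarray(x, dtype=float)
        return np.dot(x, y) ** 2 / np.dot(x, x)
    return max(range(len(codebook)), key=lambda m: metric(codebook[m]))
```

Because the metric is invariant to the actual value of θ, the decoder works whatever gain the channel happened to apply.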

Practical Universal Decoders?
Both the GLRT and the new decoder are exponential in the block length
Training? Loses rate!
Decision feedback? Yes, but apply a "weighted" metric

Rateless Codes

A simple “universal” transmission strategy

Universal Prior
Look for a prior P attaining the rate guarantees above
For binary channels the uniform prior attains all of the above
For non-binary channels the loss is higher (uniform prior; conjectured)
A similar result holds for a Gaussian input
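To see what committing to a fixed prior costs, one can compute I(X;Y) under the uniform prior for any DMC and compare it with capacity. A small sketch (helper names are mine), shown on a Z-channel, where the uniform prior is close to, but not exactly, optimal:

```python
import math

def mutual_information(p_x, W):
    """I(X;Y) in bits for input prior p_x and channel matrix W,
    where W[x][y] = p(y|x)."""
    p_y = [sum(p_x[x] * W[x][y] for x in range(len(p_x)))
           for y in range(len(W[0]))]
    mi = 0.0
    for x, px in enumerate(p_x):
        for y, w in enumerate(W[x]):
            if px > 0 and w > 0:
                mi += px * w * math.log2(w / p_y[y])
    return mi

# Z-channel: input 0 is noiseless, input 1 flips with probability 1/2
Z = [[1.0, 0.0], [0.5, 0.5]]
i_uniform = mutual_information([0.5, 0.5], Z)
```

Sweeping p_x instead of fixing it uniform recovers the capacity, so the gap to i_uniform is exactly the price of universality for this channel.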

Practical Rateless Codes:
Use efficiently decodable codes
Incremental redundancy: "Fountain codes"? "Raptor codes"?
Rateless codes for Gaussian channels: a "multilevel" construction for incremental redundancy (Erez et al.)
Recent extension to MIMO channels
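The incremental-redundancy idea behind fountain codes can be sketched with a toy random linear fountain over GF(2): the transmitter keeps streaming random parity bits until the receiver's equations reach full rank, at which point it ACKs; the realized rate k/n thus adapts itself to the channel. This is a sketch under my own naming, not the actual LT/Raptor construction (which uses carefully designed sparse degree distributions for efficient decoding).

```python
import random
import numpy as np

def gf2_solve(A, b):
    """Gaussian elimination over GF(2). Returns the solution of A x = b
    if A has full column rank, else None."""
    A = A.copy() % 2
    b = b.copy() % 2
    m, k = A.shape
    row = 0
    for col in range(k):
        piv = next((r for r in range(row, m) if A[r, col]), None)
        if piv is None:
            return None                      # this column not yet pinned down
        A[[row, piv]] = A[[piv, row]]
        b[[row, piv]] = b[[piv, row]]
        for r in range(m):
            if r != row and A[r, col]:
                A[r] ^= A[row]
                b[r] ^= b[row]
        row += 1
    return [int(v) for v in b[:k]]           # fully reduced: x[col] = b[col]

def rateless_send(msg_bits, seed=0):
    """Stream random XOR parities of the k message bits until the
    receiver can solve for the message (the ACK); returns the decoded
    bits and the number of channel symbols that were needed."""
    rng = random.Random(seed)
    k = len(msg_bits)
    rows, rhs, sent = [], [], 0
    while True:
        mask = [rng.randint(0, 1) for _ in range(k)]
        if not any(mask):
            continue                         # all-zero parity is useless
        sent += 1
        rows.append(mask)
        rhs.append(sum(m & b for m, b in zip(mask, msg_bits)) % 2)
        x = gf2_solve(np.array(rows, dtype=np.uint8),
                      np.array(rhs, dtype=np.uint8))
        if x is not None:                    # receiver decodes -> ACK
            return x, sent
```

If some symbols were erased by the channel, the receiver would simply drop those rows and the loop would run a little longer: the same code serves every erasure rate, which is the rateless property.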

Individual Noise Channel with Feedback

The Individual Noise Channel with Feedback
SF ('09): use the "universal" Horstein scheme, i.e. a sequential estimate of the noise empirical probability

Specific scheme outline
Randomization is a must
Attains the "empirical capacity"
Similar performance was obtained using "rateless" codes by Eswaran et al. ('10)
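The "sequential estimate of the noise empirical probability" can be illustrated with an add-half (Krichevsky-Trofimov style) estimator; this particular estimator is my choice for the sketch, standing in for the sequential estimate the scheme maintains as feedback reveals the noise symbols one by one.

```python
def sequential_noise_estimates(noise_bits):
    """Add-half (Krichevsky-Trofimov style) sequential estimate of the
    noise crossover probability: after t symbols with n1 ones seen so
    far, the estimate is (n1 + 1/2) / (t + 1)."""
    ones, est = 0, []
    for t, z in enumerate(noise_bits, start=1):
        ones += z
        est.append((ones + 0.5) / (t + 1))
    return est
```

The add-half correction keeps the estimate bounded away from 0 and 1, so the rate the scheme targets never degenerates early in the transmission.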

Individual Channels

Short Summary – L&F

The BIG Question: A “True” Universal Communication Scheme

Plausible answer: the reference scheme is constrained to be "finite block" (or finite state)
For a modulo-additive "individual" channel the optimal performance is log|X| − Ĥ(z), where z is the additive individual noise sequence and Ĥ(z) its empirical entropy
This can be attained universally with feedback (LF, submitted to ISIT 2011)
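The benchmark is easy to state in code: the empirical capacity log|X| − Ĥ(z) of a modulo-additive channel with individual noise sequence z, i.e. the rate a scheme tuned to z in hindsight would achieve. A sketch using the zeroth-order empirical entropy (helper names are mine):

```python
import math

def empirical_entropy(z):
    """Zeroth-order empirical entropy of the sequence z, in bits/symbol."""
    n = len(z)
    counts = {}
    for s in z:
        counts[s] = counts.get(s, 0) + 1
    return -sum(c / n * math.log2(c / n) for c in counts.values())

def empirical_capacity(z, alphabet_size=2):
    """log|X| - H_hat(z): the rate achievable in hindsight on a
    modulo-additive channel whose noise sequence turned out to be z."""
    return math.log2(alphabet_size) - empirical_entropy(z)
```

A finite-state reference scheme would replace the zeroth-order entropy with a higher-order or state-conditioned one; the universal claim is that feedback lets an untuned scheme approach this hindsight benchmark at some redundancy cost.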

Lesson Learned

Practical Rateless Codes for Universal Communication
Feedback is a must
Simple feedback (Ack/Nack) can work: ARQ schemes with incremental redundancy
With practical universal decoding:
- Training data is a simple means of building universally decodable codebooks
- Training- and decision-feedback decoding with a modified "metric"

Theoretical

Individual channels
The problem and a possible solution seem well defined:
 A "limited"-resource scheme (finite block, finite state, with or without feedback) can be designed retrospectively, tuned to the channel's empirical behavior
 The performance of this scheme can be attained universally by a scheme that was not tuned to the empirical measurements, at some "redundancy cost"
 The universal scheme will use feedback (probably in the form of rateless coding, i.e. minimal feedback) and randomization
 A task to complete!!

THANK YOU!