Information Theory in an Industrial Research Lab Marcelo J. Weinberger Information Theory Research Group Hewlett-Packard Laboratories – Advanced Studies.

Information Theory in an Industrial Research Lab
Marcelo J. Weinberger
Information Theory Research Group, Hewlett-Packard Laboratories – Advanced Studies, Palo Alto, California, USA
with contributions from the ITR group
Purdue University – November 19, 2007

Information Theory research in the industry
 Mission: research the mathematical foundations and practical applications of information theory, generating intellectual property and technology for "XXX Company" through the advancement of scientific knowledge in these areas
 Applying the theory and working on the applications makes obvious sense for "XXX Company" research labs; but why invest in advancing the theory?
   - some simple answers, which apply to any basic research area: long-term investment, prestige, visibility, giving back to society...
   - this talk is about a different type of answer: differentiating technology vs. enabling technology
 Main claim: working on the theory helps develop the analytical tools needed to envision innovative, technology-differentiating ideas

Case studies
 JPEG-LS: from universal context modeling to a lossless image compression standard
 DUDE (Discrete Universal DEnoiser): from a formal setting for universal denoising to actual image denoising algorithms
 Error-correcting codes in nanotechnology: the advantages of interdisciplinary research
 2-D information theory: looking into the future of storage devices
[Figure: block diagram — Input → compress → store/transmit → decompress → Output]

Discrete Universal DEnoising (DUDE)
 The goal: upon observing the noisy sequence, choose a reconstruction that optimizes some fidelity criterion (e.g., minimize the number of symbol errors, squared distance, etc.)
 A natural extension of work on prediction/compression
 Applications: image and video denoising, text correction, financial data denoising, DNA sequence analysis, wireless communications...
[Figure: discrete source → discrete memoryless channel (noise) → denoiser]
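In symbols (the notation here is the conventional one for this setting, since the slide's formulas did not survive transcription): a clean sequence x^n is observed as z^n through a known discrete memoryless channel, and the denoiser outputs a reconstruction that aims to minimize the cumulative single-letter loss

```latex
L\left(x^n, \hat{X}^n(z^n)\right) \;=\; \frac{1}{n}\sum_{i=1}^{n} \Lambda\!\left(x_i,\, \hat{X}_i(z^n)\right)
```

where \Lambda(\cdot,\cdot) is, for example, Hamming loss (counting symbol errors) or squared distance.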

DUDE: how it's done
 Pass 1:
   - gather statistics on symbol occurrences per context pattern
   - estimate the noiseless symbol distribution given the context pattern and the noisy sample (posterior distribution)
 Pass 2: denoise each symbol, based on the estimated posterior
   - who do you believe: what you see, or what the global statistics tell you?
   - precise decision formula, proven asymptotically optimal
   - the context template size must be carefully chosen
[Figure: data sequence after the noisy channel; Z_i is the sample being denoised, surrounded by "context" samples]
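The two passes can be sketched for the simplest case, a binary sequence observed through a binary symmetric channel with known crossover probability delta. This is an illustrative mini-implementation of the published DUDE decision rule (argmin over reconstructions of m(c)^T Π^{-1} [λ_x ⊙ π_z]), not the authors' code; the one-sided context length `k` and all names are choices made here:

```python
import numpy as np
from collections import defaultdict

def dude_binary(z, delta, k=1):
    """Two-pass DUDE sketch for a binary sequence z observed through a
    BSC(delta), using k samples on each side as the context."""
    n = len(z)
    Pi = np.array([[1 - delta, delta], [delta, 1 - delta]])  # channel matrix
    Pi_inv = np.linalg.inv(Pi)
    Lam = 1.0 - np.eye(2)                                    # Hamming loss matrix
    # pass 1: empirical counts of the noisy symbol in each two-sided context
    m = defaultdict(lambda: np.zeros(2))
    for i in range(k, n - k):
        c = (tuple(z[i - k:i]), tuple(z[i + 1:i + 1 + k]))
        m[c][z[i]] += 1
    # pass 2: per-sample decision via the DUDE rule
    xhat = list(z)                                           # edges passed through
    for i in range(k, n - k):
        c = (tuple(z[i - k:i]), tuple(z[i + 1:i + 1 + k]))
        pi_z = Pi[:, z[i]]                                   # channel column for observed symbol
        costs = [m[c] @ Pi_inv @ (Lam[:, x] * pi_z) for x in (0, 1)]
        xhat[i] = int(np.argmin(costs))
    return xhat
```

On a long run of zeros with two isolated channel flips, the context counts overwhelmingly favor 0, so the rule flips both errors back, exactly the "what the global stats tell you" side winning.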

The main challenge in image denoising
 Key component of DUDE: model the conditional distribution P(Z_i | context of Z_i) and infer P(X_i | Z_i and context of Z_i) from it
 Main issue: a large alphabet means a large number of model parameters, hence a high learning cost
 Leveraged the "semi-universal" approach from image compression: rely on prior knowledge. Main tools:
   - prediction
   - contexts based on quantized data
   - parameterized distributions
 State-of-the-art for "salt-and-pepper" noise removal; competitive for Gaussian and "real-world" noise removal, but still room for improvement
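The "contexts based on quantized data" idea can be sketched as follows. This is an illustrative construction loosely in the spirit of JPEG-LS-style context formation, not the actual algorithm: the threshold `T`, the two gradients, and the 5-level quantizer are all choices made here for the example.

```python
import numpy as np

def quantized_context(img, i, j, T=8):
    """Map pixel (i, j)'s causal neighborhood to one of 25 context classes
    by quantizing two local gradients (threshold T is illustrative)."""
    def q(d):
        # quantize a difference into one of 5 levels {-2, -1, 0, 1, 2}
        if d == 0:
            return 0
        s = 1 if d > 0 else -1
        return s if abs(d) < T else 2 * s
    a = int(img[i, j - 1])      # west neighbor
    b = int(img[i - 1, j])      # north neighbor
    c = int(img[i - 1, j - 1])  # north-west neighbor
    return (q(a - c) + 2) * 5 + (q(c - b) + 2)   # context index in 0..24
```

The point is the parameter reduction: 25 context classes instead of the 256^3 raw neighbor triples an 8-bit image would otherwise induce, which is exactly the learning-cost issue the slide describes.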

Application 1: Image denoising
 ("Salt-and-pepper" noise, error rate = 30%)
 Noisy image: PSNR = 10.7 dB
 DUDE-denoised: PSNR = 38.3 dB
 Best previous result in the literature: PSNR = 35.6 dB (Chan, Ho & Nikolova, IEEE IP, Oct. '05)
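For reference, the PSNR figures quoted on this slide are the usual peak signal-to-noise ratio, 10·log10(peak² / MSE) with peak = 255 for 8-bit images; a minimal helper:

```python
import numpy as np

def psnr(reference, image, peak=255.0):
    """Peak signal-to-noise ratio in dB (8-bit images by default)."""
    mse = np.mean((np.asarray(reference, float) - np.asarray(image, float)) ** 2)
    return float('inf') if mse == 0 else float(10 * np.log10(peak ** 2 / mse))
```

So the gap reported above (38.3 dB vs. 35.6 dB) corresponds to roughly halving the mean squared error relative to the previous best result.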

Application 2: Denoiser-enhanced ECC
 Suitable for wireless communications
 Leaves the overall system "as-is", but enhances the receiver by denoising the signal prior to error-correction (ECC) decoding
 Makes it possible to design a "better receiver" that recovers signals other receivers would reject as undecodable
[Figure: transmitted codeword vs. received noisy codeword; ECC decoding handles code redundancy (structured), denoising handles source redundancy (natural); the non-enhanced receiver gets no reception where the DUDE-enhanced one decodes]
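A toy illustration of this pipeline, with every component a stand-in chosen for the example: a rate-1/3 repetition code plays the ECC, a 3-tap median filter plays the denoiser (a DUDE would go in its place), and the source's redundancy is a simple run structure. Two channel flips land in the same triple, beyond what majority voting alone can correct, but cleaning the raw stream first recovers the message:

```python
def rep3_encode(bits):
    # toy rate-1/3 repetition code standing in for a real ECC
    return [b for b in bits for _ in range(3)]

def rep3_decode(stream):
    # majority vote within each triple
    return [int(sum(stream[i:i + 3]) >= 2) for i in range(0, len(stream), 3)]

def median_denoise(stream):
    # toy pre-decoding denoiser (3-tap median; edges passed through)
    out = list(stream)
    for i in range(1, len(stream) - 1):
        out[i] = sorted(stream[i - 1:i + 2])[1]
    return out

src = [0, 0, 0, 1]                           # run-structured, redundant source
rx = rep3_encode(src)
rx[3] ^= 1                                   # two channel flips hit the same triple,
rx[5] ^= 1                                   # exceeding the code's correction capability
plain = rep3_decode(rx)                      # regular receiver: second bit decoded wrongly
enhanced = rep3_decode(median_denoise(rx))   # denoise first, then decode: recovers src
```

The enhanced receiver succeeds precisely because the denoiser exploits redundancy in the signal that the code's decoder does not see, which is the "better receiver" claim on the slide.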