Gentle tomography and efficient universal data compression. Charlie Bennett, Aram Harrow, Seth Lloyd. Caltech IQI, Jan 14, 2003.


Classical data compression

Asymptotically optimal: n copies of {p_i} are compressed to n(H({p_i}) + δ_n) bits with error ε_n, where δ_n, ε_n → 0 as n → ∞.
Computationally efficient: the running time is polynomial in n.
Universal: the algorithms work for any i.i.d. source.

Example: Huffman coding

Letter (i)   Probability (p_i)   Codeword
A            1/2                 0
B            1/4                 10
C            1/8                 110
D            1/8                 111

Map letter i to a prefix-free codeword of length -log p_i. The expected number of bits per symbol is -Σ_i p_i log p_i = H({p_i}), and over n symbols the standard deviation of the total length is O(√n).

(Figure: the Huffman tree with leaves A, B, C, D.)
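As a concrete illustration (not part of the original slides), a minimal Huffman-code construction in Python; for the four-letter source above it reproduces the codewords in the table:

```python
import heapq
from math import log2

def huffman_code(probs):
    """Build a Huffman code for a dict {symbol: probability}."""
    # Heap entries: (probability, tiebreak counter, partial codebook).
    # The counter keeps tuple comparison well-defined on probability ties.
    heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(sorted(probs.items()))]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        p0, _, c0 = heapq.heappop(heap)  # two least-probable subtrees...
        p1, _, c1 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c0.items()}  # ...are merged,
        merged.update({s: "1" + w for s, w in c1.items()})  # prepending one bit
        heapq.heappush(heap, (p0 + p1, count, merged))
        count += 1
    return heap[0][2]

probs = {"A": 0.5, "B": 0.25, "C": 0.125, "D": 0.125}
code = huffman_code(probs)
entropy = -sum(p * log2(p) for p in probs.values())
avg_len = sum(probs[s] * len(w) for s, w in code.items())
```

For a dyadic source like this one the expected length meets the entropy exactly (1.75 bits/symbol).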

Quantum data compression

A quantum source ρ = Σ_j q_j |ψ_j⟩⟨ψ_j| can be diagonalized as ρ = Σ_i p_i |i⟩⟨i| and treated as a classical source. The main difference between quantum and classical Huffman coding is that measuring the length of the output damages the state. Also, Schumacher compression assumes we know the basis in which ρ is diagonal; it is therefore optimal and efficient, but not universal.
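A small numerical sketch of the diagonalization step (illustrative only; the two-state source here is a hypothetical example, not one from the talk). The Schumacher rate is the von Neumann entropy S(ρ), i.e. the Shannon entropy of ρ's eigenvalues:

```python
import numpy as np

# Hypothetical source: emits |0> with prob 3/4 and |+> with prob 1/4.
ket0 = np.array([1.0, 0.0])
ketplus = np.array([1.0, 1.0]) / np.sqrt(2)
rho = 0.75 * np.outer(ket0, ket0) + 0.25 * np.outer(ketplus, ketplus)

# Diagonalize: rho = sum_i p_i |i><i| in its eigenbasis.
p, basis = np.linalg.eigh(rho)

# Schumacher rate = S(rho), the Shannon entropy of the eigenvalues.
S = -sum(pi * np.log2(pi) for pi in p if pi > 1e-12)
```

Note that S(ρ) is strictly less than the Shannon entropy of the ensemble {3/4, 1/4} because the emitted states are non-orthogonal.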

Universal quantum data compression

JHHH showed that for any R there exists a subspace H_{n,R} of dimension 2^{nR+o(n)} such that if S(ρ) < R, then most of the support of ρ^⊗n lies in H_{n,R} (quant-ph/). HM showed that you don't need to know the entropy, but they present no efficient implementation (quant-ph/). JP give an efficient, universal algorithm, but we like ours better (quant-ph/).

Goal: efficient, universal quantum data compression

Modeled after classical Huffman coding.
Pass 1: Read the data, determine the type class (empirical distribution), and decide on a code.
Pass 2: Encode.

Step 1: Gentle tomography

Problem: quantum state tomography damages the states it measures.
Solution: use weak measurements. For example, in NMR the average value of σ_x can be approximately measured for the spins without causing decoherence.

Gentle tomography algorithm

Reduce tomography to estimating d² observables of the form tr(ρσ_k). For each estimate:
– Randomly divide the interval 0,…,n into n^{1/4} bins.
– Measure only which bin n·tr(ρσ_k) falls into.

(Figure: an axis from 0 to n with the value n·tr(ρσ_k) marked; n^{1/4} random bin boundaries, bin width O(n^{3/4}), measurement uncertainty O(n^{1/2}).)
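A classical toy model of the binning step (a sketch only: the name gentle_bin_estimate is mine, and the real procedure is a weak quantum measurement on ρ^⊗n, not a classical function). Because the bin width O(n^{3/4}) is much larger than the O(n^{1/2}) statistical spread of the count, the count rarely falls near a boundary, which is why the measurement is gentle:

```python
import random

def gentle_bin_estimate(true_count, n, rng):
    """Toy model of one weak estimate: instead of reading out the exact
    count (an integer in 0..n, which would be too disturbing), learn only
    which of ~n^{1/4} randomly placed bins it lies in, and report the
    bin midpoint as the estimate."""
    num_bins = max(1, round(n ** 0.25))
    # num_bins - 1 random boundaries in [0, n], plus n as the last edge.
    boundaries = sorted(rng.uniform(0, n) for _ in range(num_bins - 1)) + [n]
    lo = 0.0
    for hi in boundaries:
        if true_count <= hi:
            return (lo + hi) / 2
        lo = hi
    return (lo + n) / 2  # unreachable since true_count <= n

rng = random.Random(0)
est = gentle_bin_estimate(600_000, 1_000_000, rng)
```

The estimate's resolution is set by the bin width, i.e. O(n^{3/4}) rather than the exact count.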

Gentle tomography lemma

If a projective measurement {M_j} has a high probability of yielding a particular outcome k, then obtaining k leaves the state nearly unchanged (Ahlswede, Winter). Thus, if we are very likely to estimate the state correctly, the estimation does not cause very much damage.

Implementation

(Circuit diagram: ρ^⊗n coupled to an ancilla register |0⟩^⊗(log n + 1), which is weakly measured.) Example of a circuit to gently estimate ⟨1|ρ|1⟩.

Result

An information/disturbance trade-off curve: error · disturbance > n^{-1/2}, up to logarithmic factors. (Can we prove this is optimal?) In particular, we can set both the error and the damage equal to n^{-1/4} log n.

Step 2: Compression

Given an estimate σ with ‖σ − ρ‖ < ε, how do we compress ρ^⊗n?

Huffman coding with an imperfect state estimate

Suppose we encode ρ^⊗n according to σ = Σ_i q_i |i⟩⟨i|. Codeword i has length −log q_i and occurs with probability ⟨i|ρ|i⟩, so the expected length per symbol is Σ_i (−log q_i)⟨i|ρ|i⟩ = −tr(ρ log σ) = S(ρ) + S(ρ‖σ). Unfortunately, S(ρ‖σ) is not a continuous function of σ.

Dealing with imperfect estimates

Replace σ with (1 − ε)σ + εI/d. This makes the rate no worse than S(ρ) + ε log(1/ε).
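Numerically, the smoothing step looks like this (an illustrative sketch in the commuting, diagonal case; cross_entropy and the example distributions are mine, not the talk's):

```python
import numpy as np

def cross_entropy(p, q):
    """Expected bits per symbol when a source p is coded for q:
    -sum_i p_i log2 q_i = H(p) + D(p||q)."""
    return float(-sum(pi * np.log2(qi) for pi, qi in zip(p, q) if pi > 0))

p = np.array([0.5, 0.5, 0.0])           # true (diagonalized) source
q = np.array([0.5, 0.5 - 1e-9, 1e-9])   # imperfect estimate; q[2] nearly 0
# If some q_i -> 0 while p_i > 0, D(p||q) -- and hence the rate -- diverges:
# that is the discontinuity in the relative entropy.

d, eps = len(p), 0.01
q_smooth = (1 - eps) * q + eps * np.ones(d) / d  # mix in the maximally mixed I/d
rate = cross_entropy(p, q_smooth)
H = cross_entropy(p, p)
# Every entry of q_smooth is at least eps/d, so the rate penalty is bounded
# even when the estimate assigns (near-)zero weight to a symbol.
```

The smoothed estimate can never be catastrophically wrong about any symbol, at the cost of a small additive overhead in the rate.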

Result

A polynomial-time algorithm that compresses ρ^⊗n into n(S(ρ) + O(n^{−s} log² n)) qubits with error O(n^{s−1/2} log n).

Other methods

Schumacher coding and the JHHH method both achieve S(ρ) + O(k n^{−1/2}) qubits per signal with error exp(−k²). The HM method has roughly the same rate/disturbance trade-off curve, though its overflow probability is lower. The JP method achieves error n^{−e} and rate S(ρ) + n^{−r} when e = 1/2 − r(1 + d²); for example, compressing qubits with constant error gives rate S(ρ) + O(n^{−1/10}).

Future directions

Stationary ergodic sources. Likewise, a visible quantum coding theory exists, but the only algorithm is classical Lempel-Ziv.
Proving converses:
– No exponentially vanishing error.
– No on-the-fly compression.
– Information/disturbance and rate/disturbance trade-offs.

A quantum Clebsch-Gordan transform (Bacon & Chuang)

H^⊗n = ⊕_{λ⊢n} A_λ ⊗ B_λ, where λ is a partition of n into at most d parts, A_λ is an irreducible representation of SU(d), and B_λ is an irrep of S_n.
Wanted: an efficient quantum circuit to map |i_1…i_n⟩ → Σ_λ |λ⟩ |a_λ⟩ |b_λ⟩.
Useful for universal compression, state estimation, hypothesis testing, and entanglement concentration/dilution/distillation.
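To make the index set of the decomposition concrete, here is a small enumerator (an illustrative sketch, not the Bacon & Chuang circuit) of the partitions λ ⊢ n with at most d parts that label the summands:

```python
def partitions_at_most_d(n, d, max_part=None):
    """Yield the partitions of n into at most d parts, as weakly
    decreasing tuples; these are the labels lambda in the
    decomposition of H^(tensor n)."""
    if max_part is None:
        max_part = n
    if n == 0:
        yield ()
        return
    if d == 0:
        return
    # Choose the largest part first, then recurse on the remainder
    # with one fewer part allowed and parts bounded by the choice.
    for first in range(min(n, max_part), 0, -1):
        for rest in partitions_at_most_d(n - first, d - 1, first):
            yield (first,) + rest

# For qubits (d = 2), the partitions of n into at most 2 parts label
# the total angular-momentum sectors: floor(n/2) + 1 of them.
qubit_labels = list(partitions_at_most_d(4, 2))
```

For n = 4 qubits this gives (4), (3,1), and (2,2), matching the three spin sectors j = 2, 1, 0.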