Quantum Shannon Theory Patrick Hayden (McGill) 17 July 2005, Q-Logic Meets Q-Info.


1 Quantum Shannon Theory Patrick Hayden (McGill) http://www.cs.mcgill.ca/~patrick/QLogic2005.ppt 17 July 2005, Q-Logic Meets Q-Info

2 Overview
Part I:
- What is Shannon theory?
- What does it have to do with quantum mechanics?
- Some quantum Shannon theory highlights
Part II:
- Resource inequalities
- A skeleton key

3 Information (Shannon) theory
A practical question:
- How best to make use of a given communications resource?
A mathematico-epistemological question:
- How to quantify uncertainty and information?
Shannon: solved the first by considering the second.
"A mathematical theory of communication" [1948]

4 Quantifying uncertainty
- Entropy: H(X) = -∑_x p(x) log₂ p(x)
- Proportional to the entropy of statistical physics
- Term suggested by von Neumann (more on him soon)
- Can arrive at the definition axiomatically: H(X,Y) = H(X) + H(Y) for independent X, Y, etc.
- Operational point of view…
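A minimal Python sketch (the distributions are invented for illustration) confirms the definition and the additivity axiom for independent variables:

```python
import math

def shannon_entropy(p):
    """H(X) = -sum_x p(x) log2 p(x); zero-probability terms contribute 0."""
    return -sum(px * math.log2(px) for px in p if px > 0)

fair_coin = [0.5, 0.5]
biased = [0.25, 0.75]
# Joint distribution of independent X, Y is the product distribution,
# and the axiom H(X,Y) = H(X) + H(Y) holds.
joint = [px * py for px in fair_coin for py in biased]
print(shannon_entropy(fair_coin))                       # 1.0
print(round(shannon_entropy(joint), 4),
      round(shannon_entropy(fair_coin) + shannon_entropy(biased), 4))  # 1.8113 1.8113
```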

5 Compression
Source of independent copies of X: X₁, X₂, …, X_n
{0,1}^n: 2^n possible strings, of which ~2^{nH(X)} are typical.
If X is binary: 0000100111010100010101100101 — about nP(X=0) 0's and nP(X=1) 1's.
Can compress n copies of X to a binary string of length ~nH(X).
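The counting behind this claim is easy to reproduce; the numbers below (n = 1000, bias 0.11) are purely illustrative:

```python
import math

def binary_entropy(p):
    """H of a {p, 1-p} distribution, in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

n, p = 1000, 0.11           # 1000 copies of a biased bit
H = binary_entropy(p)
print(round(H, 4))          # 0.4999
# Compressed length ~ nH bits, versus n = 1000 raw bits;
# the typical set is a fraction 2^{-n(1-H)} of all 2^n strings.
print(round(n * H))         # 500
```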

6 Quantifying information
Information is that which reduces uncertainty.
H(X|Y): uncertainty in X when the value of Y is known.
H(X|Y) = H(X,Y) - H(Y) = E_Y H(X|Y=y)
I(X;Y) = H(X) - H(X|Y) = H(X) + H(Y) - H(X,Y)
(Venn-diagram picture: H(X,Y) decomposes into H(X|Y), I(X;Y), and H(Y|X).)
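These identities can be checked numerically for a small made-up joint distribution (a sketch, not part of the original slides):

```python
import math

def H(dist):
    """Shannon entropy in bits of a probability list."""
    return -sum(p * math.log2(p) for p in dist if p > 0)

# Hypothetical correlated pair: p(x,y)
pxy = [[0.4, 0.1],
       [0.1, 0.4]]
px = [sum(row) for row in pxy]              # marginal of X
py = [sum(col) for col in zip(*pxy)]        # marginal of Y
Hxy = H([p for row in pxy for p in row])

H_X_given_Y = Hxy - H(py)                   # H(X|Y) = H(X,Y) - H(Y)
I = H(px) - H_X_given_Y                     # I(X;Y) = H(X) - H(X|Y)
# Same number via the symmetric form H(X) + H(Y) - H(X,Y):
assert abs(I - (H(px) + H(py) - Hxy)) < 1e-12
print(round(I, 4))                          # 0.2781
```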

7 Sending information through noisy channels
Statistical model of a noisy channel N: m → Encoding → N → Decoding → m'
Shannon's noisy coding theorem: in the limit of many uses, the optimal rate at which Alice can send messages reliably to Bob through N is given by the formula
C(N) = max_{p(X)} I(X;Y).

8 Shannon theory provides
Practically speaking:
- A holy grail for error-correcting codes
Conceptually speaking:
- An operationally-motivated way of thinking about correlations
What's missing (for a quantum mechanic)?
- Features arising from linear structure: entanglement and non-orthogonality

9 Quantum Shannon theory provides
General theory of interconvertibility between different types of communications resources: qubits, cbits, ebits, cobits, sbits…
Relies on:
- Major simplifying assumption: computation is free
- Minor simplifying assumption: noise and data have regular structure

10 Quantifying uncertainty
- Let ρ = ∑_x p(x) |φ_x⟩⟨φ_x| be a density operator.
- von Neumann entropy: H(ρ) = -tr[ρ log ρ]
- Equal to the Shannon entropy of the eigenvalues of ρ
- Analog of a joint random variable: ρ^{AB} describes a composite system A ⊗ B
- H(A)_ρ = H(ρ^A) = H(tr_B ρ^{AB})
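A short NumPy sketch of the von Neumann entropy, computed from the eigenvalues exactly as the slide describes (the example states are standard but chosen here for illustration):

```python
import numpy as np

def vn_entropy(rho):
    """H(rho) = -tr[rho log2 rho], via the eigenvalues of rho."""
    ev = np.linalg.eigvalsh(rho)
    ev = ev[ev > 1e-12]               # 0 log 0 = 0
    return float(-np.sum(ev * np.log2(ev)))

maximally_mixed = np.eye(2) / 2       # eigenvalues (1/2, 1/2): one bit of uncertainty
pure = np.diag([1.0, 0.0])            # |0><0|: no uncertainty
print(round(vn_entropy(maximally_mixed), 6))   # 1.0
print(round(abs(vn_entropy(pure)), 6))         # 0.0
```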

11 Compression
Source of independent copies of ρ^{AB}: states on A⊗B, A⊗B, …, A⊗B
dim(effective support of ρ_B^{⊗n}) ~ 2^{nH(B)} (aka the typical subspace)
Can compress n copies of B to a system of ~nH(B) qubits while preserving correlations with A.
No statistical assumptions: just quantum mechanics! [Schumacher, Petz]

12 Quantifying information
H(A|B): uncertainty in A when the value of B is known?
H(A|B) = H(AB) - H(B)
Example: |Φ⟩^{AB} = (|0⟩^A|0⟩^B + |1⟩^A|1⟩^B)/√2, for which ρ^B = I/2, so
H(A|B)_Φ = 0 - 1 = -1.
Conditional entropy can be negative!
(Venn-diagram picture: H(AB) decomposes into H(A|B), I(A;B), and H(B|A).)
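This example can be verified directly: the sketch below builds the Bell state, traces out A (the partial-trace helper is my own), and recovers H(A|B) = -1:

```python
import numpy as np

def vn_entropy(rho):
    """H(rho) = -tr[rho log2 rho], via the eigenvalues."""
    ev = np.linalg.eigvalsh(rho)
    ev = ev[ev > 1e-12]
    return float(-np.sum(ev * np.log2(ev)))

# |Phi> = (|00> + |11>)/sqrt(2) on A (x) B
phi = np.zeros(4)
phi[0] = phi[3] = 1 / np.sqrt(2)
rho_AB = np.outer(phi, phi)

# Partial trace over A: rho_B[b, b'] = sum_a rho[a, b, a, b']
rho_B = np.einsum('abac->bc', rho_AB.reshape(2, 2, 2, 2))

H_AB = vn_entropy(rho_AB)            # 0: the global state is pure
H_B = vn_entropy(rho_B)              # 1: the marginal is maximally mixed (I/2)
print(round(H_AB - H_B, 6))          # -1.0
```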

13 Quantifying information
Information is that which reduces uncertainty.
H(A|B): uncertainty in A when the value of B is known?
H(A|B) = H(AB) - H(B)
I(A;B) = H(A) - H(A|B) = H(A) + H(B) - H(AB) ≥ 0

14 Data processing inequality (strong subadditivity)
(Picture: Alice and Bob share a state ρ; over time, Bob applies a local operation U, producing σ.)
Local processing never increases correlations: I(A;B)_ρ ≥ I(A;B)_σ.

15 Sending classical information through noisy channels
Physical model of a noisy channel N: a trace-preserving, completely positive (TPCP) map.
m → Encoding (state) → N^{⊗n} → Decoding (measurement) → m'
HSW noisy coding theorem: in the limit of many uses, the optimal rate at which Alice can send messages reliably to Bob through N is given by the (regularization of the) formula
χ(N) = max_{{p_x, ρ_x}} I(X;B), where I(X;B) is evaluated on the state ∑_x p_x |x⟩⟨x|^X ⊗ N(ρ_x)^B.
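The quantity being maximized here is the Holevo quantity χ = H(∑_x p_x N(ρ_x)) - ∑_x p_x H(N(ρ_x)); the sketch below evaluates it for a hypothetical ensemble of two non-orthogonal pure output states (all parameters invented for illustration):

```python
import numpy as np

def vn_entropy(rho):
    """H(rho) = -tr[rho log2 rho], via the eigenvalues."""
    ev = np.linalg.eigvalsh(rho)
    ev = ev[ev > 1e-12]
    return float(-np.sum(ev * np.log2(ev)))

def ket_density(theta):
    """Density matrix of the pure qubit state cos(theta)|0> + sin(theta)|1>."""
    v = np.array([np.cos(theta), np.sin(theta)])
    return np.outer(v, v)

# Hypothetical channel outputs: two equiprobable non-orthogonal pure states
p = [0.5, 0.5]
outs = [ket_density(0.0), ket_density(np.pi / 8)]
avg = sum(px * rho for px, rho in zip(p, outs))
chi = vn_entropy(avg) - sum(px * vn_entropy(rho) for px, rho in zip(p, outs))
print(round(chi, 3))   # 0.233: non-orthogonal signals carry strictly less than 1 bit
```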

16 Sending classical information through noisy channels
(Picture: m → Encoding (state) → N^{⊗n} → Decoding (measurement) → m'. The output space B^{⊗n} has a typical subspace of dimension ~2^{nH(B)}; each codeword X₁, X₂, …, X_n occupies ~2^{nH(B|A)} dimensions of it.)

17 Sending quantum information through noisy channels
Physical model of a noisy channel N: a trace-preserving, completely positive (TPCP) map.
|ψ⟩ ∈ C^d → Encoding (TPCP map) → N^{⊗n} → Decoding (TPCP map) → |ψ'⟩
LSD noisy coding theorem: in the limit of many uses, the optimal rate (1/n) log d at which Alice can reliably send qubits to Bob through N is given by the (regularization of the) formula
Q(N) = max_ρ I_c(ρ, N), where the coherent information is I_c = H(B) - H(E) = -H(A|B). Conditional entropy!

18 Entanglement and privacy: More than an analogy
How to send a private message from Alice to Bob? [AC93]
Wiretap channel p(y,z|x): Alice sends x = x₁x₂…x_n; Bob receives y = y₁y₂…y_n; the eavesdropper receives z = z₁z₂…z_n.
From all inputs, pick 2^{n(I(X;Y)-ε)} random codewords x and group them into sets of size 2^{n(I(X;Z)+ε)}.
Can send private messages at rate I(X;Y) - I(X;Z).
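The achievable rate I(X;Y) - I(X;Z) is easy to evaluate for a toy wiretap setup (flip probabilities invented for illustration): Bob sees X through a 10% binary symmetric channel, Eve through a 30% one:

```python
import math

def h(p):
    """Binary entropy in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

p_bob, p_eve = 0.10, 0.30       # independent bit-flip probabilities
# For uniform X through a binary symmetric channel: I(X;Y) = 1 - h(flip prob)
I_XY = 1 - h(p_bob)
I_XZ = 1 - h(p_eve)
private_rate = I_XY - I_XZ      # = h(0.30) - h(0.10) > 0: Bob's less noisy view wins
print(round(private_rate, 4))   # 0.4123
```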

19 Entanglement and privacy: More than an analogy
How to send a private message from Alice to Bob? [D03]
Quantum version: the channel's isometric extension U_{A'→BE} acts on the signal states, |φ_x⟩^{A'} → |φ_x⟩^{BE} = U^{⊗n}|φ_x⟩.
From all inputs, pick 2^{n(I(X:A)-ε)} random signals x and group them into sets of size 2^{n(I(X:E)+ε)}.
Can send private messages at rate I(X:A) - I(X:E).

20 Entanglement and privacy: More than an analogy
How to send a private message from Alice to Bob? [SW97, D03]
Coherent version: Alice prepares the superposition ∑_x p_x^{1/2} |x⟩^A |φ_x⟩^{A'}; the isometry U_{A'→BE}^{⊗n} takes it to ∑_x p_x^{1/2} |x⟩^A |φ_x⟩^{BE}.
From all inputs, pick 2^{n(I(X:A)-ε)} random signals and group them into sets of size 2^{n(I(X:E)+ε)}.
Since H(E) = H(AB), can send private messages at rate I(X:A) - I(X:E) = H(A) - H(E).

21 Notions of distinguishability
Basic requirement: quantum channels do not increase "distinguishability".
Fidelity: F(ρ,σ) = {tr[(ρ^{1/2} σ ρ^{1/2})^{1/2}]}² = max |⟨φ_ρ|φ_σ⟩|², maximized over purifications.
- F = 0 for perfectly distinguishable states; F = 1 for identical states.
- F(N(ρ), N(σ)) ≥ F(ρ,σ)
Trace distance: T(ρ,σ) = ||ρ-σ||₁ = 2 max |p(k=0|ρ) - p(k=0|σ)|, where the max is over POVMs {M_k}.
- T = 2 for perfectly distinguishable states; T = 0 for identical states.
- T(ρ,σ) ≥ T(N(ρ), N(σ))
Statements made today hold for both measures.
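Both measures can be sketched in a few lines of NumPy (the matrix square-root helper is my own; the test states are the standard extremes):

```python
import numpy as np

def _sqrtm_psd(A):
    """Square root of a positive semidefinite matrix via eigendecomposition."""
    w, V = np.linalg.eigh(A)
    return V @ np.diag(np.sqrt(np.clip(w, 0.0, None))) @ V.conj().T

def fidelity(rho, sigma):
    """F(rho, sigma) = {tr[(rho^{1/2} sigma rho^{1/2})^{1/2}]}^2"""
    s = _sqrtm_psd(rho)
    return float(np.trace(_sqrtm_psd(s @ sigma @ s)).real ** 2)

def trace_distance(rho, sigma):
    """T(rho, sigma) = ||rho - sigma||_1 (so T = 2 for orthogonal states)."""
    return float(np.sum(np.abs(np.linalg.eigvalsh(rho - sigma))))

ket0 = np.diag([1.0, 0.0])            # |0><0|
ket1 = np.diag([0.0, 1.0])            # |1><1|
mixed = np.eye(2) / 2
# Perfectly distinguishable pair, then an identical pair:
print(round(fidelity(ket0, ket1), 6), round(trace_distance(ket0, ket1), 6))   # 0.0 2.0
print(round(fidelity(mixed, mixed), 6), round(trace_distance(mixed, mixed), 6))  # 1.0 0.0
```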

22 Conclusions: Part I
- Information theory can be generalized to analyze quantum information processing.
- Yields a rich theory with surprising conceptual simplicity.
- Operational approach to thinking about quantum mechanics: compression, data transmission, superdense coding, subspace transmission, teleportation.

23 Some references
Part I: Standard textbooks:
- Cover & Thomas, Elements of Information Theory.
- Nielsen & Chuang, Quantum Computation and Quantum Information (and references therein).
Part II: Papers available at arxiv.org:
- Devetak, The private classical capacity and quantum capacity of a quantum channel, quant-ph/0304127.
- Devetak, Harrow & Winter, A family of quantum protocols, quant-ph/0308044.
- Horodecki, Oppenheim & Winter, Quantum information can be negative, quant-ph/0505062.

