1
Counterexamples to the maximal p-norm multiplicativity conjecture
Patrick Hayden (McGill University)
C&QIC, Santa Fe 2008
2
A challenge to the physicists. John Pierce [1973]: "I think that I have never met a physicist who understood information theory. I wish that physicists would stop talking about reformulating information theory and would give us a general expression for the capacity of a channel with quantum effects taken into account rather than a number of special cases."
3
Sending classical information through noisy quantum channels.
Physical model of a noisy channel: a trace-preserving, completely positive map N.
[Figure: message m, encoding (state), channel N, decoding (measurement), output m'.]
HSW noisy coding theorem: In the limit of many uses, the optimal rate at which Alice can send bits reliably to Bob through N is given by the (regularization of the) formula shown below, where the maximization is over some family of input/output states.
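The formula itself is not rendered in this transcript; for reference (my addition, not transcribed from the slide), the standard HSW expression is the regularized Holevo quantity:

```latex
C(\mathcal N) \;=\; \lim_{n\to\infty}\frac{1}{n}\,\chi\bigl(\mathcal N^{\otimes n}\bigr),
\qquad
\chi(\mathcal N) \;=\; \max_{\{p_x,\,\rho_x\}}\;
  H\Bigl(\mathcal N\Bigl(\sum_x p_x\rho_x\Bigr)\Bigr) \;-\; \sum_x p_x\,H\bigl(\mathcal N(\rho_x)\bigr).
```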
5
The additivity conjecture: these two formulas (the regularized capacity and its single-letter version, see below) are equal.
Sustained, heroic, and so far inconclusive efforts by: Datta, Eisert, Fukuda, Holevo, King, Ruskai, Schumacher, Shirokov, Shor, Werner...
Why do they care so much?
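The two formulas compared here are not shown in the transcript; in standard notation (my reading of the slide), the conjecture is that the Holevo quantity is additive over tensor products, which would remove the need for regularization:

```latex
\chi(\mathcal N_1\otimes\mathcal N_2) \;=\; \chi(\mathcal N_1)+\chi(\mathcal N_2)
\quad\Longrightarrow\quad
C(\mathcal N) \;=\; \chi(\mathcal N).
```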
6
The additivity conjecture: these two formulas are equal.
Operational interpretation: Alice doesn't need to entangle her inputs across multiple uses of the channel. Codewords look like σ_{x_1} ⊗ σ_{x_2} ⊗ ... ⊗ σ_{x_n}.
7
Quantum multiple access channel (QMAC) solution, pre-QIP 2005. Interpretation: Alice and Bob treat each other's actions as noise. Independent decoding. No-go theorem for the use of quantum side information. [Yard/Devetak/Hayden 05 v1]
8
QMAC solution, post-QIP 2005. Interpretation: Charlie decodes Alice's quantum data first and uses it to help him decode Bob's. (Or vice-versa.) "Go" theorem for the use of quantum side information. [Yard/Devetak/Hayden 05 v2]
9
Capacity formulas matter.
Fair question to throw at the speaker if you're getting bored in any quantum Shannon theory talk: "Can you describe an effective procedure for calculating this capacity you claim to have determined?"
Lesson: If we can't write down a tractable formula for the solution to a capacity problem, then we don't fully understand the structure of the optimal codes.
10
An (almost) equivalent form: minimum entropy outputs.
H(ρ) = −Tr[ρ log ρ] (the von Neumann entropy of the density operator ρ). N, N_1 and N_2 are quantum channels (CPTP).
Notation: H_min(N) = min_ρ H(N(ρ)) is the minimum output entropy of N.
Conjecture: The minimum entropy output state for the product channel N_1 ⊗ N_2 is attained by a product-state input ρ_1 ⊗ ρ_2. [King-Ruskai 99]
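In the notation just introduced, the conjecture can be restated (a restatement I am adding, equivalent to the slide's wording) as additivity of the minimum output entropy:

```latex
H_{\min}\bigl(\mathcal N_1\otimes\mathcal N_2\bigr) \;=\; H_{\min}(\mathcal N_1) \;+\; H_{\min}(\mathcal N_2).
```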
12
Maximal p-norm multiplicativity conjecture.
Conjecture: The minimum entropy output state for the product channel N_1 ⊗ N_2 is attained by a product-state input ρ_1 ⊗ ρ_2.
Rényi entropy (1 < p), defined below. (Recover the von Neumann entropy as p → 1.)
Norm? What norm? [Amosov-Holevo-Werner 00]
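The definitions referenced on this slide, supplied here for completeness (standard, not transcribed from the slide):

```latex
H_p(\rho) \;=\; \frac{1}{1-p}\,\log\operatorname{Tr}\rho^{p}
         \;=\; \frac{p}{1-p}\,\log\lVert\rho\rVert_{p},
\qquad
\nu_p(\mathcal N) \;=\; \max_{\rho}\,\lVert\mathcal N(\rho)\rVert_{p},
```

and the maximal p-norm multiplicativity conjecture, which for p > 1 is equivalent to additivity of the minimum output Rényi entropy:

```latex
\nu_p(\mathcal N_1\otimes\mathcal N_2) \;=\; \nu_p(\mathcal N_1)\,\nu_p(\mathcal N_2)
\quad\Longleftrightarrow\quad
H_p^{\min}(\mathcal N_1\otimes\mathcal N_2) \;=\; H_p^{\min}(\mathcal N_1)+H_p^{\min}(\mathcal N_2).
```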
13
Partial results: additivity holds if one of the channels is unitary, a unital qubit channel, a generalized depolarizing channel, a generalized dephasing channel, entanglement-breaking, or a very noisy channel, and also for complements of these channels. [Amosov, Devetak, Eisert, Fujiwara, Hashizume, Holevo, King, Matsumoto, Nathanson, Ruskai, Shor, Wolf, Werner] [See Holevo ICM 2006]
14
But... 2002: Additivity fails for p > 4.79... [Holevo-Werner] 2007: Additivity fails for p > 2. [Winter]
15
Counterexamples for 1 < p < 2!
For all 1 < p < 2, there exist channels N_1 and N_2 to C^d such that:
i) H_p^min(N_1), H_p^min(N_2) ≥ log d − O(1)
ii) H_p^min(N_1 ⊗ N_2) ≤ p log d + O(1)
Additivity would have implied: H_p^min(N_1 ⊗ N_2) ≥ 2 log d − O(1).
Near p = 1, the minimum output entropy of N_1 ⊗ N_2 is not significantly greater than that of N_1 or N_2 alone!
Intuition: channels that look very noisy (nearly depolarizing) need not be anywhere near depolarizing on entangled inputs.
16
The counterexamples.
[Circuit figure: input ρ on S with ancilla |0⟩ on R, apply U, output on A, discard B (TRASH); the second channel N̄ is built the same way and also outputs on A.]
Fix dimensions |R| << |S|, |A| = |B| and choose U at random according to the Haar measure. Demonstrate that the resulting channels violate Rényi additivity with non-zero probability.
Two things to prove:
i) The product channel has low minimum output entropy.
ii) The individual channels have high minimum output entropies.
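A minimal numerical sketch of the construction as I read it from this and the following slides (not code from the talk): the channel is N(ρ) = Tr_B[U(|0⟩⟨0|_R ⊗ ρ)U†] with U Haar-random, and the partner channel N̄ is its conjugate, built from Ū. The helper names, dimensions dR, dA, dB, the Rényi parameter p, and the seed below are illustrative choices of mine, far too small to display the asymptotic separation sharply.

```python
# Minimal sketch of the random-unitary counterexample construction (assumed reading of the slide).
import numpy as np

rng = np.random.default_rng(7)

def haar_unitary(d):
    """Haar-random d x d unitary via QR of a complex Ginibre matrix (Mezzadri's recipe)."""
    z = (rng.standard_normal((d, d)) + 1j * rng.standard_normal((d, d))) / np.sqrt(2)
    q, r = np.linalg.qr(z)
    return q * (np.diag(r) / np.abs(np.diag(r)))   # fix the column phases

def renyi(rho, p):
    """Renyi-p entropy of a density matrix, in bits."""
    ev = np.linalg.eigvalsh(rho)
    ev = ev[ev > 1e-12]
    return float(np.log2(np.sum(ev ** p)) / (1 - p))

dR, dA, dB, p = 2, 6, 6, 1.5        # |R| << |S|, |A| = |B|, |R||S| = |A||B|
dS = dA * dB // dR

U = haar_unitary(dA * dB)
# Stinespring isometry V: S -> A(x)B with V|s> = U(|0>_R (x) |s>_S).  In kron(R, S)
# ordering, |0>_R (x) |s> is just basis vector s, so V is the first dS columns of U.
V = U[:, :dS]

# (i) One channel on a random pure input: the output entropy is close to log|A|.
psi = rng.standard_normal(dS) + 1j * rng.standard_normal(dS)
psi /= np.linalg.norm(psi)
out = (V @ psi).reshape(dA, dB)      # pure output state on A(x)B
rho_A = out @ out.conj().T           # trace out B
print("H_p(N(psi))          =", round(renyi(rho_A, p), 3), "  log|A| =", round(np.log2(dA), 3))

# (ii) The product channel N (x) Nbar on the maximally entangled input Phi_S.
phi = np.eye(dS).reshape(-1) / np.sqrt(dS)                   # |Phi_S> on S(x)S'
out2 = (np.kron(V, V.conj()) @ phi).reshape(dA, dB, dA, dB)  # indices (A, B, A', B')
rho_AA = np.einsum('abcd,ebfd->acef', out2, out2.conj()).reshape(dA * dA, dA * dA)
print("H_p((N x Nbar)(Phi)) =", round(renyi(rho_AA, p), 3), "  2 log|A| =", round(2 * np.log2(dA), 3))
print("largest eigenvalue   =", round(float(np.linalg.eigvalsh(rho_AA)[-1]), 3), " vs 1/|R| =", 1 / dR)
```

The last print checks the "key identity" of the next slides: the output of N ⊗ N̄ on the maximally entangled input has an eigenvalue of at least 1/|R|, which caps its Rényi-p entropy well below 2 log|A| for p > 1.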
18
N ⊗ N̄ has low output entropy.
The key identity (v1) and (v2): [equations and circuit figure not rendered; spelled out below].
Easy calculation: the overlap of (N ⊗ N̄)(Φ_S) with the maximally entangled state is BIG if |R| is small! (Compare 1/|A|² for the maximally mixed state.)
Choose |R| ~ |A|^{p-1}.
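The slide's equations are not reproduced in this transcript; the following is my reconstruction of the standard argument for this construction, which I take to be the content of the key identity and the "easy calculation". Entrywise conjugation in the computational basis gives

```latex
(U\otimes\bar U)\,|\Phi_{RS}\rangle \;=\; |\Phi_{AB}\rangle \;=\; |\Phi_A\rangle\otimes|\Phi_B\rangle,
\qquad
\bigl\langle \Phi_{RS}\,\big|\,0_R\,0_{R'}\otimes\Phi_S\bigr\rangle \;=\; \tfrac{1}{\sqrt{|R|}},
```

so the product-channel output on the maximally entangled input has a large overlap with Φ_A:

```latex
\bigl\langle \Phi_A\bigl|\,(\mathcal N\otimes\bar{\mathcal N})(\Phi_S)\,\bigr|\Phi_A\bigr\rangle
  \;\ge\; \Bigl|\bigl\langle \Phi_A\,\Phi_B\bigl|\,(U\otimes\bar U)\,\bigr|\,0_R\,0_{R'}\,\Phi_S\bigr\rangle\Bigr|^{2}
  \;=\; \frac{1}{|R|}.
```

A largest eigenvalue of at least 1/|R| then forces

```latex
H_p\bigl((\mathcal N\otimes\bar{\mathcal N})(\Phi_S)\bigr)
  \;\le\; \frac{p}{p-1}\,\log|R|
  \;=\; p\,\log|A| \quad\text{when } |R| = |A|^{\,p-1}.
```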
19
N and N̄ have high output entropy.
[Circuit figure as on the construction slide.]
If U is selected at random, what can be said about U(|φ⟩ ⊗ |0⟩)? For a fixed input, U(|φ⟩ ⊗ |0⟩) is highly entangled between A and B:
H_p(N(φ)) ≥ log|A| − O(1) (compare the maximally mixed state: log|A|). [Lubkin, Lloyd, Page, Foong & Kanno, Sanchez-Ruiz, Sen…]
Is this true simultaneously for all |φ⟩ ∈ S with a typical U? I.e., is min_{|φ⟩∈S} H_p(N(φ)) ≥ log|A| − O(1)?
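For orientation, a standard benchmark from the literature cited on the slide (my addition; Page's formula, stated in bits, for a Haar-random pure state on A ⊗ B with |A| ≤ |B|):

```latex
\mathbb{E}\bigl[H(\rho_A)\bigr]
  \;=\; \frac{1}{\ln 2}\Biggl(\sum_{k=|B|+1}^{|A||B|}\frac{1}{k} \;-\; \frac{|A|-1}{2|B|}\Biggr)
  \;\approx\; \log|A| \;-\; \frac{|A|}{2|B|\ln 2}.
```

The talk needs more than this average-case, p = 1 statement: a Rényi-p bound holding simultaneously for every state in the subspace S, which is what the next two slides establish.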
20
Concentration of measure.
LÉVY: Given an η-Lipschitz function f : S^n → R with median M, the probability that, for a uniformly random x ∈ S^n, f(x) is further than ε from M is bounded above by exp(−nε²C/η²) for some C > 0.
[Figure: a sphere S^n with a band A_ε around the equator; the measure outside the band is < exp[−n g(ε)] for some g(ε) independent of n, illustrated with f(x) = x_1.]
Just need a Lipschitz constant: choosing f to be the map from |φ⟩ to H_p(N(φ)), one can take η² ≲ |A|^{p-1}.
Pr[ H_p(N(φ)) < log|A| − const − ε ] ~ exp(−const · ε² · |A|^{3-p}).
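As I read how the numbers combine (an assumption on my part: the concentration is applied over the randomness of U, i.e., to the Haar-random unit vector U(|φ⟩ ⊗ |0⟩) on the sphere in A ⊗ B of complex dimension |A||B|):

```latex
n \;\sim\; 2\,|A||B| \;=\; 2|A|^{2},
\qquad
\eta^{2} \;\lesssim\; |A|^{\,p-1}
\;\;\Longrightarrow\;\;
\exp\!\Bigl(-\,\frac{C\, n\,\epsilon^{2}}{\eta^{2}}\Bigr)
\;\sim\; \exp\!\bigl(-\,\mathrm{const}\cdot \epsilon^{2}\, |A|^{\,3-p}\bigr).
```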
21
Connect the dots.
[Figure: the subspace S ⊗ |0⟩ is mapped by U into A ⊗ B.]
1) Choose a fine net F of states on the unit sphere of S ⊗ |0⟩.
2) P(not all states in UF highly entangled) ≤ |F| · P(one particular state isn't) (counting sketched below).
3) Highly entangled for a sufficiently fine net F implies the same for all states in S.
THEOREM: If |R| ~ |A|^{p-1}, then |S| ~ |A|^{3-p} and w.h.p. as |A| → ∞, min_{|φ⟩∈S} H_p(N(φ)) ≥ log|A| − O(1).
N and N̄ have high minimum output entropy.
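A sketch of the union-bound arithmetic in step 2 (my own, using the standard fact that a δ-net on the unit sphere of a d-dimensional complex space can be taken with at most (5/δ)^{2d} points):

```latex
\Pr\bigl[\exists\,|\varphi\rangle \in F:\ H_p(\mathcal N(\varphi)) < \log|A| - c - \epsilon\bigr]
\;\le\; |F|\cdot e^{-\,\mathrm{const}\cdot\epsilon^{2}|A|^{3-p}},
\qquad
|F| \;\le\; \Bigl(\tfrac{5}{\delta}\Bigr)^{2|S|} \;=\; e^{\,2|S|\ln(5/\delta)}.
```

Since |S| ~ |A|^{3-p}, both exponents scale the same way, so the bound vanishes only when the concentration constant beats the net-counting constant; this is the delicate, p-dependent balance behind the "constants blow up" remark later in the talk.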
22
Done! For all 1 < p < 2, there exist channels N_1 and N_2 to C^d such that:
i) H_p^min(N_1), H_p^min(N_2) ≥ log d − O(1)
ii) H_p^min(N_1 ⊗ N_2) ≤ p log d + O(1)
Additivity would have implied: H_p^min(N_1 ⊗ N_2) ≥ 2 log d − O(1).
Near p = 1, the minimum output entropy of N_1 ⊗ N_2 is not significantly greater than that of N_1 or N_2 alone!
23
What about von Neumann (p=1)??? Method fails: recall |R| ~ |A|^{p-1}, which tends to 1 as p → 1. Constants depend on p and blow up. Artifact of the analysis, or does the conjecture survive at p = 1?
24
[Figure: (N ⊗ N̄)(Φ) for |R| = 3, |A| = |B| = 24.]
25
What about von Neumann (p=1)??? Method fails: recall |R| ~ |A|^{p-1}. Constants depend on p and blow up. Artifact, or does the conjecture survive at p = 1? H_p for p > 1 is very sensitive to a single large eigenvalue, but H_1 is not.
26
Do some calculating.
Split the entropy into the contribution from the eigenvalue ~1/|R| and the contribution from all the others. For H_p with p > 1 the first term dominates, but the second term dominates H_1:
H_1((N ⊗ N̄)(Φ)) = 2 log|A| − O(1), which is BIG, not small. No additivity violations. To be sure, can anyone calculate the O(1) terms?
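A rough version of the split (my own model, not the talk's calculation: one eigenvalue λ ≈ 1/|R|, with the remaining weight spread over roughly |A|² dimensions):

```latex
\operatorname{Tr}\rho^{p} \;\approx\;
  \underbrace{|R|^{-p}}_{\text{the big eigenvalue}}
  \;+\;
  \underbrace{|A|^{2}\Bigl(\tfrac{1-1/|R|}{|A|^{2}}\Bigr)^{p}}_{\text{all the others}\;\approx\;|A|^{2(1-p)}},
\qquad
-\lambda\log\lambda \;\approx\; \frac{\log|R|}{|R|}.
```

For p > 1 the first term dominates Tr ρ^p, so H_p ≤ (p/(p−1)) log|R| stays small; the same big eigenvalue contributes only (log|R|)/|R| to H_1, whose value is instead dominated by the many small eigenvalues, consistent with the slide's conclusion.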
27
Summary. Additivity fails for 1 < p < 2. This closes the main approach to additivity for the capacity itself.
Further developments: Winter tightened the Lipschitz bound, showing the same examples work for all 1 < p < ∞. Dupuis showed the orthogonal group can replace the unitary group, so that N_1 = N_2. Cubitt, Harrow, Leung, Montanaro & Winter have found violations for p between 0 and 0.12.