The Concavity of the Auxiliary Function for Classical-Quantum Channels
Speaker: Hao-Chung Cheng
Joint work with Min-Hsiu Hsieh
Date: 01/09/2016
Discrete Memoryless Channel
Message set → n-block encoder → n-fold product channel → decoder
Error probability, rate of the code, and Shannon's coding theorem (standard forms are sketched below).
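The formulas on this slide are not reproduced in the transcript; the standard definitions (an assumption about what the slide displays) for a code with message set of size M and blocklength n are
\[
R = \frac{\log M}{n}, \qquad
P_e = \max_{m \in \mathcal{M}} \Pr[\hat{m} \neq m \mid m \text{ sent}],
\]
and Shannon's theorem states that for every rate $R$ below the capacity $C(W)$ there exist codes with $P_e \to 0$ as $n \to \infty$.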
Three Concerns
- Blocklength: the total number of channel uses
- Rate: the amount of information transmitted per channel use
- Error probability: of the given coding scheme
Goal: investigate the trade-off, i.e., the minimum achievable probability of error as a function of the rate R, the blocklength n, and the channel W.
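One standard way to make this trade-off precise (not written out in the transcript) is the reliability function,
\[
E(R) \;=\; \limsup_{n \to \infty} \, -\frac{1}{n} \log P_e^{*}(n, R),
\]
where $P_e^{*}(n, R)$ denotes the smallest error probability over all codes of blocklength $n$ and rate at least $R$.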
Error Exponent Analysis
The exact error exponent is difficult to derive even for classical channels.
- C. E. Shannon, "Probability of error for optimal codes in a Gaussian channel," Bell Syst. Tech. J., 38(5), 1959.
- A. S. Holevo, "Reliability function of general classical-quantum channel," IEEE Trans. Inf. Theory, 46(6), 2000.
Characterizations
Upper bound on E(R): sphere-packing exponent
- M. Dalai, "Lower bounds on the probability of error for classical and classical-quantum channels," IEEE Trans. Inf. Theory, 59(12), 2013.
Lower bound on E(R): random coding exponent
Both exponents are defined through the auxiliary function.
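The auxiliary function and the exponents are not typeset in the transcript; the standard forms (Gallager's for classical channels, Holevo's classical-quantum version written here, which is an assumption about what the slide displays) are
\[
E_0(s, P) = -\log \operatorname{Tr}\!\Big[\Big(\sum_x P(x)\,\rho_x^{1/(1+s)}\Big)^{1+s}\Big],
\qquad
E_r(R) = \max_{0 \le s \le 1} \max_P \big[E_0(s, P) - sR\big],
\]
for a classical-quantum channel $W : x \mapsto \rho_x$ and an input distribution $P$.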
Bounds on E(R) for Classical Channels
Figure: sphere-packing upper bound, random coding lower bound, straight-line upper bound, expurgated lower bound, exact reliability, and the resulting region for E(R).
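The figure is not reproduced; the standard picture it summarizes (writing $R_{\mathrm{crit}}$ for the critical rate, a symbol introduced here) is
\[
E_r(R) \;\le\; E(R) \;\le\; E_{sp}(R),
\qquad
E_r(R) = E_{sp}(R) = E(R) \;\text{ for } R_{\mathrm{crit}} \le R \le C.
\]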
Main Result
Technique: matrix quasi-arithmetic means
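The theorem statement is blank in the transcript. As a hedged reconstruction of the Cheng–Hsieh result (the exact parameter range stated on the slide may differ), the claim is that for every classical-quantum channel $W : x \mapsto \rho_x$ and every input distribution $P$,
\[
s \;\longmapsto\; E_0(s, P) \quad \text{is concave, at least for } s \in [0, 1].
\]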
Properties (1/4)
(1) Non-negativity:
(2) Monotonicity:
(4) Mutual information:
(5) Concavity:
(6) Variance: channel dispersion
- R. Bhatia, P. Grover, "Norm inequalities related to the matrix geometric mean," Linear Algebra Appl., 437, 2012.
- M. Tomamichel, M. Hayashi, "A hierarchy of information quantities for finite block length analysis of quantum tasks," IEEE Trans. Inf. Theory, 59(11), 2013.
- T. Ogawa, H. Nagaoka, "Strong converse to the quantum channel coding theorem," IEEE Trans. Inf. Theory, 45(7), 1999.
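The individual statements are blank in the transcript; the standard facts behind items (4) and (6) (assumed to match the slide) are that the first derivative at $s = 0$ recovers the mutual information (the Holevo quantity for classical-quantum channels),
\[
\frac{\partial E_0(s, P)}{\partial s}\bigg|_{s=0} = I(P; W),
\]
while the second derivative at $s = 0$ is, up to sign and normalization, the variance term that governs the channel dispersion.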
Properties (2/4)
(7) From E_0 to E_sp:
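The relation is not typeset; the standard way the sphere-packing exponent is obtained from the auxiliary function (assumed to be what the slide shows) is
\[
E_{sp}(R, P) = \sup_{s \ge 0} \big[E_0(s, P) - sR\big],
\qquad
E_{sp}(R) = \max_P E_{sp}(R, P).
\]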
Properties (3/4)
(8) Convexity:
(9) Convexity:
(10) Tensor invariance:
(11) Data-processing inequality:
- F. Hiai, "Concavity of certain matrix trace and normed functions," Linear Algebra Appl., 439, 2013.
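Of these, item (10) has a standard form that can be stated with confidence: under product inputs the auxiliary function is additive over tensor products (my reading of what "tensor invariance" refers to),
\[
E_0^{W^{\otimes n}}(s, P^{\otimes n}) = n \, E_0^{W}(s, P).
\]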
Properties (4/4)
(12) Alternative expression: scaled Rényi mutual information
Sphere-packing exponent:
Random coding exponent:
Strong converse exponent:
- M. Mosonyi, T. Ogawa, "Strong converse exponent for classical-quantum channel coding," arXiv:1409.3562.
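The alternative expression (12) is blank in the transcript; in the classical case the well-known identity (the classical-quantum analogue uses a suitable quantum Rényi mutual information, and the slide's exact choice is an assumption here) is
\[
E_0(s, P) = s \, I_{\frac{1}{1+s}}(P; W),
\]
so the sphere-packing, random coding, and strong converse exponents all become optimizations of a scaled Rényi mutual information over the Rényi order.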
Differentiation
Geometric Mean (1/2)
Definition:
Concavity:
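The definition and the concavity statement are blank in the transcript; the standard weighted matrix geometric mean and its joint concavity (standard facts from Kubo–Ando theory, assumed to be what the slide shows) are, for positive definite $A$ and $B$,
\[
A \,\#_t\, B = A^{1/2}\big(A^{-1/2} B A^{-1/2}\big)^{t} A^{1/2}, \qquad t \in [0, 1],
\]
and the map $(A, B) \mapsto A \,\#_t\, B$ is jointly concave.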
Geometric Mean (2/2)
(1) Commutativity:
(2) Joint homogeneity:
(3) Monotonicity:
(4) Congruence invariance:
(5) Self-duality:
(6) HM-GM-AM:
(7) Continuity:
(8) Tensor product:
(9) Concavity:
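As one concrete instance of this list, the HM-GM-AM item (6) is the standard operator inequality, for positive definite $A$ and $B$,
\[
2\big(A^{-1} + B^{-1}\big)^{-1} \;\le\; A \,\#\, B \;\le\; \frac{A + B}{2}.
\]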
Proof Strategy
Matrix Hölder inequality [Matharu & Aujla 2012]:
Lemma:
Concavity:
- J. S. Matharu, J. S. Aujla, "Some inequalities for unitarily invariant norms," Linear Algebra Appl., 436, 2012.
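The Hölder inequality is not typeset in the transcript; a common form (written here for Schatten norms, while the cited paper works with general unitarily invariant norms) is
\[
\|AB\|_1 \;\le\; \|A\|_p \, \|B\|_q, \qquad \frac{1}{p} + \frac{1}{q} = 1, \quad p, q \ge 1.
\]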
Discussions (1/2)
Result: the concavity of the auxiliary function is proved for classical-quantum channels.
Tool: an elegant proof via geometric means, instead of the differentiation approach.
Contributions: convex optimization; the sphere-packing exponent is convex in R; high-rate region:
Question: is the map concave for ?
Discussions (2/2)
Characterizations (2/3)
Lower bound on E(R): random coding exponent
Classical:
- Chernoff-bound method with an ML decoder: Fano 1961, Gallager 1965
- Method of types: Csiszár and Körner 1981
- Random union bound: Polyanskiy, Poor, and Verdú 2010
Quantum:
- Pure-state channels: Burnashev and Holevo 1998, 2000
- Mixed-state channels: open problem
- No ML decoder
High rate: tight for
Low rate: improved by the expurgated bound of Gallager 1965 and Holevo 2000
- R. M. Fano, Transmission of Information. MIT Press, 1961.
- R. G. Gallager, "A simple derivation of the coding theorem and some applications," IEEE Trans. Inf. Theory, 11, 1965.
- I. Csiszár, J. Körner, Information Theory: Coding Theorems for Discrete Memoryless Systems. Academic Press, 1981.
- M. V. Burnashev, A. S. Holevo, "On reliability function of quantum communication channel," Probl. Inf. Transm., 34, 1998.
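The bound these methods yield is not typeset in the transcript; stated loosely in Gallager's classical form (the pure-state classical-quantum version cited on the slide has the same shape), random coding gives, for any $s \in [0, 1]$ and input distribution $P$,
\[
P_e \;\le\; \exp\!\big(-n\,[\,E_0(s, P) - sR\,]\big),
\qquad\text{hence}\qquad
P_e \;\le\; \exp\!\big(-n\,E_r(R)\big).
\]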
Characterizations (3/3)
Upper bound on E(R) (a lower bound on the error probability): sphere-packing exponent
Classical:
- Hypothesis testing with constant-composition codes: Shannon, Gallager, Berlekamp 1967
- Method of types: Haroutunian 1968, Blahut 1974
Quantum:
- Nussbaum–Szkoła mapping: Dalai 2013
High rate: tight for
Low rate: loose for
- Improved straight-line bound by [SGB67]
- Still open
- C. E. Shannon, R. G. Gallager, E. R. Berlekamp, "Lower bounds to error probability for coding in discrete memoryless channels, I–II," Inf. Control, 10, 1967.
- E. A. Haroutunian, "Estimates of the error probability exponent for a semicontinuous memoryless channel," Probl. Inf. Transm., 4(4), 1968.
- R. E. Blahut, "Hypothesis testing and information theory," IEEE Trans. Inf. Theory, 20, 1974.
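The sphere-packing bound itself is blank in the transcript; stated loosely in the classical form of [SGB67] (which Dalai 2013 extends to classical-quantum channels), every code of blocklength $n$ and rate $R$ satisfies
\[
P_e \;\ge\; \exp\!\big(-n\,[\,E_{sp}(R - \varepsilon) + o(1)\,]\big)
\]
for every $\varepsilon > 0$, where $o(1) \to 0$ as $n \to \infty$.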
Lemmas
Matrix Hölder inequality [Matharu & Aujla 2012]:
Weak majorization [Bhatia 1997]:
Lemma (*):
Lemma:
- J. S. Matharu, J. S. Aujla, "Some inequalities for unitarily invariant norms," Linear Algebra Appl., 436, 2012.
- R. Bhatia, Matrix Analysis. Springer, 1997.
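The weak-majorization statement is blank in the transcript; the standard definition (from Bhatia's Matrix Analysis, assumed to be what the slide uses) is that for $x, y \in \mathbb{R}^n$ with entries sorted in decreasing order,
\[
x \prec_w y \quad\Longleftrightarrow\quad \sum_{i=1}^{k} x_i^{\downarrow} \;\le\; \sum_{i=1}^{k} y_i^{\downarrow} \quad \text{for all } k = 1, \dots, n.
\]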
Proof (1/2)
Convexity:
Define:
Concavity:
Proof (2/2)
By Lemma (*):
Hölder inequality: