Quantum Information Theory Introduction
Abida Haque
Outline
- Motivation
- Mixed States
- Classical Information Theory
- Quantum Information Theory
Motivation
How many bits are in a qubit? If we have a random variable X distributed according to some probability distribution P, how much information do we learn from seeing X?
Sending Information Alice wants to send Bob a string x. Classically:
Sending Information
Quantumly: Alice does a computation to create a quantum state. Bob can only reliably discriminate quantum states if they are orthogonal. But more generally…
Mixed States
State vectors alone are not enough to describe a quantum system. Mixed states model the quantum noise that arises when you implement a quantum system, e.g. a device that outputs one of several states at random.
Mixed States
Fix a basis; represent an ensemble of states |ψ_i⟩ prepared with probabilities p_i by the density matrix ρ = Σ_i p_i |ψ_i⟩⟨ψ_i|.
Examples
Note: two different ensembles can give the same density matrix ρ; such ensembles are indistinguishable for an observer.
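The note above can be made concrete. A minimal sketch (the two ensembles here, |0⟩/|1⟩ and |+⟩/|−⟩ with equal weights, are my own illustrative choice, since the slide's own examples are not reproduced in this transcript):

```python
import numpy as np

zero = np.array([1, 0], dtype=complex)
one = np.array([0, 1], dtype=complex)
plus = (zero + one) / np.sqrt(2)
minus = (zero - one) / np.sqrt(2)

def density(states, probs):
    """Density matrix rho = sum_i p_i |psi_i><psi_i| of an ensemble."""
    return sum(p * np.outer(s, s.conj()) for p, s in zip(probs, states))

rho1 = density([zero, one], [0.5, 0.5])    # equal mix of |0>, |1>
rho2 = density([plus, minus], [0.5, 0.5])  # equal mix of |+>, |->

# Both ensembles yield the same density matrix I/2, so no measurement
# can tell them apart.
print(np.allclose(rho1, rho2))  # True
```

Any observable statistic depends only on ρ, which is why the density matrix, not the ensemble, is the right description of a noisy quantum source.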
Examples II
More general scenario
Alice samples x ∈ A ⊆ {0,1}^n with probability P(x) and sends the quantum state ρ_x. Bob picks a POVM {E_y}_{y∈B}, measures ρ_x, and receives an outcome Y ∈ B, where Y = y given X = x with probability tr(E_y ρ_x). Bob tries to infer X from Y.
POVM: a positive-operator-valued measure is a set of positive-semidefinite matrices that sum to the identity; specifying one gives Bob a way to do a measurement.
Perspectives
Bob sees only his part of the joint state; Alice sees only hers.
Joint Mixed System
Note that Alice and Bob see |x⟩⟨x| ⊗ ρ_x with probability P(x), so the joint state is ρ ≔ Σ_{x∈A} P(x) |x⟩⟨x| ⊗ ρ_x.
Classical Information Theory
Alice samples a random message X with probability P(X). How much information can Bob learn about X?
Examples
If P is the uniform distribution over {0,1}^n, then Bob gets n bits of information from seeing X. If P has all its probability on a single string, Bob gets 0 bits of information from seeing X.
Shannon Entropy
For P(x) = Pr[X = x], define H(X) ≔ Σ_x P(x) log(1/P(x)).
Properties: 0 ≤ H(X) ≤ log|A|, and H is concave.
Claude Shannon
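The definition above can be sketched directly (log base 2, so entropy is measured in bits; the example distributions mirror the two cases from the previous slide):

```python
import numpy as np

def shannon_entropy(p):
    """H(X) = sum_x P(x) log2(1/P(x)); terms with P(x) = 0 contribute 0."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]  # drop zero-probability outcomes (0 * log(1/0) -> 0)
    return float((p * np.log2(1 / p)).sum())

# Uniform over 2^n strings gives n bits (here n = 3).
print(shannon_entropy([1/8] * 8))     # 3.0
# All mass on one string gives 0 bits.
print(shannon_entropy([1, 0, 0, 0]))  # 0.0
```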
Examples
X is uniformly distributed over {0,1}^n: H(X) = n.
Examples II
X has all its probability mass on a single string: H(X) = 0.
Classical Information Theory
How much information does Bob learn from seeing X? Maximum: H(X). How much does Bob actually learn?
Consider two correlated random variables X, Y on sets A and B. How much does knowing Y tell us about X?
Joint Distribution, Mutual Information
P(x,y) ≔ Pr[X = x, Y = y]
I(X;Y) ≔ H(X) + H(Y) − H(X,Y), where H(X,Y) = Σ_{x∈A, y∈B} P(x,y) log(1/P(x,y)).
The further the joint distribution P(x,y) is, on average, from the product of the marginals P(x)P(y), the greater the information gain.
Examples
If X and Y are independent, then I(X;Y) = 0.
If X and Y are perfectly correlated, then I(X;Y) = H(X).
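Both cases can be checked numerically from the formula I(X;Y) = H(X) + H(Y) − H(X,Y); the joint distributions below are illustrative one-bit examples:

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of any array of probabilities."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return float((p * np.log2(1 / p)).sum())

def mutual_information(pxy):
    """I(X;Y) = H(X) + H(Y) - H(X,Y), from joint distribution P(x,y)."""
    pxy = np.asarray(pxy, dtype=float)
    return entropy(pxy.sum(axis=1)) + entropy(pxy.sum(axis=0)) - entropy(pxy)

# Independent: P(x,y) = P(x)P(y), so I = 0.
print(mutual_information([[0.25, 0.25], [0.25, 0.25]]))  # 0.0
# Perfectly correlated: Y = X, so I = H(X) = 1 bit.
print(mutual_information([[0.5, 0.0], [0.0, 0.5]]))      # 1.0
```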
Analog of Shannon's entropy
Indistinguishable States
What if you see… Then…
Von Neumann Entropy
H(ρ) ≔ Σ_{i=1}^{n} α_i log(1/α_i) = H(α), where the α_i are the eigenvalues of ρ.
John von Neumann
Von Neumann Entropy
Equivalently: H(ρ) = tr(ρ log(1/ρ)).
Example
For the "maximally mixed state" on n qubits, H(ρ) = n.
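A minimal sketch of the eigenvalue formula, checked on a pure state (entropy 0) and on the single-qubit maximally mixed state I/2 (entropy 1 bit, i.e. n = 1 above):

```python
import numpy as np

def von_neumann_entropy(rho):
    """H(rho) = sum_i alpha_i log2(1/alpha_i), alpha_i = eigenvalues of rho."""
    alpha = np.linalg.eigvalsh(rho)   # rho is Hermitian
    alpha = alpha[alpha > 1e-12]      # zero eigenvalues contribute 0
    return float((alpha * np.log2(1 / alpha)).sum())

# Pure state |0><0|: eigenvalues (1, 0), so H = 0.
pure = np.diag([1.0, 0.0])
print(von_neumann_entropy(pure))
# Maximally mixed qubit I/2: eigenvalues (1/2, 1/2), so H = 1 bit.
mixed = np.eye(2) / 2
print(von_neumann_entropy(mixed))
```

The two prints give 0.0 and 1.0 up to numerical precision, matching H(ρ) = H(α) for the respective eigenvalue distributions.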
Quantum Mutual Information
If ρ is the joint state of two quantum systems A and B, with reduced states ρ_A and ρ_B:
I(ρ_A; ρ_B) ≔ H(ρ_A) + H(ρ_B) − H(ρ)
Example
If ρ = ρ_A ⊗ ρ_B, then I(ρ_A; ρ_B) = 0.
Holevo Information
The amount of quantum information Bob gets from seeing Alice's state, given Alice's choices of P and the encoding x ↦ ρ_x: χ(P, ρ) ≔ I(ρ_A; ρ_B).
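For the classical-quantum joint state ρ = Σ_x P(x) |x⟩⟨x| ⊗ ρ_x, this quantity works out to χ = H(Σ_x P(x) ρ_x) − Σ_x P(x) H(ρ_x), which is easy to compute. The encoding below (uniform X ∈ {0,1}, sent as |0⟩ or |+⟩) is an illustrative assumption of mine, not from the slides:

```python
import numpy as np

def von_neumann_entropy(rho):
    a = np.linalg.eigvalsh(rho)
    a = a[a > 1e-12]
    return float((a * np.log2(1 / a)).sum())

def holevo_chi(probs, states):
    """chi = H(sum_x p_x rho_x) - sum_x p_x H(rho_x): the quantum mutual
    information of the classical-quantum joint state."""
    avg = sum(p * r for p, r in zip(probs, states))
    return von_neumann_entropy(avg) - sum(
        p * von_neumann_entropy(r) for p, r in zip(probs, states))

# Encoding: x = 0 -> |0><0|, x = 1 -> |+><+|, each with probability 1/2.
zero = np.outer([1, 0], [1, 0]).astype(complex)
plus_vec = np.array([1, 1], dtype=complex) / np.sqrt(2)
plus = np.outer(plus_vec, plus_vec.conj())

chi = holevo_chi([0.5, 0.5], [zero, plus])
print(chi)  # about 0.601: strictly below H(X) = 1 bit
```

Because |0⟩ and |+⟩ are non-orthogonal, χ < 1 here: Bob cannot recover the full bit, illustrating the bound on the next slide.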
Recall
Alice samples x ∈ A ⊆ {0,1}^n with probability P(x) and sends the quantum state ρ_x. Bob picks a POVM {E_y}_{y∈B}, measures ρ_x, and receives Y ∈ B, where Y = y given X = x with probability tr(E_y ρ_x). Bob tries to infer X from Y.
Holevo's Bound
n qubits can represent no more than n classical bits. Holevo's bound proves that Bob can retrieve no more than n classical bits from Alice's n-qubit state, assuming Alice and Bob do not share entangled qubits. This seems odd, because quantum states look far more powerful: it takes 2^n − 1 complex numbers to describe a general n-qubit state.
Alexander Holevo
Thank You!
Abida Haque, ahaque3@ncsu.edu