Quantum Information Theory Introduction

Presentation transcript:

Quantum Information Theory Introduction Abida Haque ahaque3@ncsu.edu

Outline Motivation Mixed States Classical Information Theory Quantum Information Theory

Motivation How many bits are in a qubit? If we have a random variable X distributed according to some probability distribution P, how much information do we learn from seeing X?

Sending Information Alice wants to send Bob a string x. Classically:

Sending Information Quantumly: Alice does a computation to create the state. Bob can only discriminate quantum states with certainty if they are orthogonal. But more generally…

Mixed States Using state vectors to describe a quantum system is not enough. We need to model the quantum noise that arises when implementing a quantum system. E.g., the device outputs

Mixed States Basis: Represent by:

Examples Note: these are indistinguishable for an observer.
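As a concrete sanity check (not part of the original slides), the following Python sketch mixes two different assumed ensembles, 50/50 over {|0⟩, |1⟩} and 50/50 over {|+⟩, |−⟩}, and verifies that both give the same density matrix I/2, which is why an observer cannot tell them apart.

import numpy as np
# Hypothetical check: two different ensembles can give the same density matrix
# rho = sum_i p_i |psi_i><psi_i|.
ket0 = np.array([[1.0], [0.0]])
ket1 = np.array([[0.0], [1.0]])
ket_plus = (ket0 + ket1) / np.sqrt(2)
ket_minus = (ket0 - ket1) / np.sqrt(2)
def density(ensemble):
    # Mix pure states |psi_i> with probabilities p_i into a density matrix.
    return sum(p * (psi @ psi.conj().T) for p, psi in ensemble)
rho1 = density([(0.5, ket0), (0.5, ket1)])           # 50/50 mixture of |0>, |1>
rho2 = density([(0.5, ket_plus), (0.5, ket_minus)])  # 50/50 mixture of |+>, |->
print(np.allclose(rho1, rho2))  # True: both equal I/2, so no measurement distinguishes them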

Examples II

More general scenario Alice samples X ∈ Σ ⊆ {0,1}^n with probability p(x). Alice sends σ_X ∈ ℂ^{d×d}. Bob picks a POVM {E_y}_{y∈Γ}. Bob measures σ_X and receives Y ∈ Γ, where Y = y given X = x with probability tr(E_y σ_x). Bob tries to infer X from Y. POVM: a positive-operator-valued measure is a set of positive semidefinite matrices that sum to the identity. Specifying a POVM gives Bob a way to do a measurement.
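To illustrate the measurement rule tr(E_y σ_x), here is a minimal sketch; the two-outcome computational-basis POVM and the state |+⟩ are assumptions for the example, not taken from the slides.

import numpy as np
# Assumed two-outcome POVM: E_0 = |0><0|, E_1 = |1><1|, with E_0 + E_1 = I.
E = [np.diag([1.0, 0.0]), np.diag([0.0, 1.0])]
# An assumed state Alice might send: sigma_x = |+><+|.
plus = np.array([[1.0], [1.0]]) / np.sqrt(2)
sigma_x = plus @ plus.conj().T
# Pr[Y = y | X = x] = tr(E_y sigma_x)
probs = [float(np.real(np.trace(E_y @ sigma_x))) for E_y in E]
print(probs)  # [0.5, 0.5]: measuring |+> in the computational basis is a fair coin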

More general scenario

Perspectives Bob sees Alice sees

Joint Mixed System Note that Alice and Bob see |x⟩⟨x| ⊗ σ_x with probability p(x). ρ ≔ Σ_{x∈Σ} p(x) |x⟩⟨x| ⊗ σ_x
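A minimal sketch of building the classical-quantum state ρ = Σ_x p(x) |x⟩⟨x| ⊗ σ_x; the two-symbol alphabet and the particular σ_x chosen here are illustrative assumptions.

import numpy as np
def ket(i, d=2):
    # Computational-basis column vector |i> in dimension d.
    return np.eye(d)[:, [i]]
p = {0: 0.5, 1: 0.5}                 # Alice's distribution p(x)
sigma = {0: ket(0) @ ket(0).T,       # sigma_0 = |0><0|
         1: 0.5 * np.eye(2)}         # sigma_1 = I/2 (an assumed noisy signal)
# rho = sum_x p(x) |x><x| (tensor) sigma_x
rho = sum(p[x] * np.kron(ket(x) @ ket(x).T, sigma[x]) for x in p)
print(np.trace(rho))  # 1.0: rho is a valid density matrix on the joint system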

Classical Information Theory Alice samples a random message X with probability P(X). How much information can Bob learn about X?

Examples If P is the uniform distribution, then Bob gets n bits of information from seeing X. If P has all its probability on a single string, Bob gets 0 bits of information from seeing X.

Shannon Entropy H(X) = Σ_x p(x) log(1/p(x)), where p(x) = Pr[X = x]. Properties: 0 ≤ H(X) ≤ log|Σ|. H is concave. Claude Shannon
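A small sketch of the definition, using log base 2 so entropy is measured in bits; the example distributions are assumptions chosen to match the examples that follow.

import numpy as np
def shannon_entropy(p):
    # H(X) = sum_x p(x) log2(1/p(x)); zero-probability outcomes contribute nothing.
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(np.sum(p * np.log2(1.0 / p)))
print(shannon_entropy([0.25] * 4))  # 2.0 bits: uniform over 4 strings
print(shannon_entropy([1.0, 0.0]))  # 0.0 bits: all mass on a single string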

Examples X is uniformly distributed over Σ = {0,1}^n, so H(X) = log|Σ| = n bits.

Examples II X has all its probability mass on a single string, so H(X) = 0.

Classical Information Theory How much information does Bob learn from seeing X? Maximum: H(X). How much does Bob actually learn?

Classical Information Theory How much information does Bob learn from seeing X? Maximum: H(X). How much does Bob actually learn? Two correlated random variables X, Y on sets Σ, Γ. How much does knowing Y tell us about X?

Joint Distribution, Mutual Information P(x,y) = Pr[X = x, Y = y]. I(X;Y) = H(X) + H(Y) − H(X,Y), where H(X,Y) = Σ_{x∈Σ, y∈Γ} P(x,y) log(1/P(x,y)). The more the joint distribution P(x,y) differs from the product of the marginals P(x)P(y), the greater the mutual information.
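A sketch computing I(X;Y) from a joint probability table; the two example tables (independent fair bits and perfectly correlated fair bits) are assumptions chosen to match the examples on the next slide.

import numpy as np
def entropy(p):
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return float(np.sum(p * np.log2(1.0 / p)))
def mutual_information(P_xy):
    # I(X;Y) = H(X) + H(Y) - H(X,Y), with marginals taken from the joint table P[x, y].
    return entropy(P_xy.sum(axis=1)) + entropy(P_xy.sum(axis=0)) - entropy(P_xy)
print(mutual_information(np.full((2, 2), 0.25)))               # 0.0: X, Y independent
print(mutual_information(np.array([[0.5, 0.0], [0.0, 0.5]])))  # 1.0: X, Y perfectly correlated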

Examples If X and Y are independent, then I(X;Y) = 0. If X and Y are perfectly correlated, then I(X;Y) = H(X).

Analog of Shannon’s entropy for quantum states?

Indistinguishable States What if you see… Then…

Von Neumann Entropy H(ρ) = Σ_{i=1}^{d} α_i log(1/α_i) = H(α), where the α_i are the eigenvalues of ρ. John von Neumann

Von Neumann Entropy Equivalently: H(ρ) = tr(ρ log(1/ρ))
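A sketch of the eigenvalue form of the definition; the two test states (the maximally mixed qubit and a pure state) are assumptions for illustration.

import numpy as np
def von_neumann_entropy(rho):
    # H(rho) = sum_i alpha_i log2(1/alpha_i) over the eigenvalues alpha_i of rho.
    alpha = np.linalg.eigvalsh(rho)
    alpha = alpha[alpha > 1e-12]  # drop (numerically) zero eigenvalues: 0 log 0 = 0
    return float(np.sum(alpha * np.log2(1.0 / alpha)))
print(von_neumann_entropy(np.eye(2) / 2))                       # 1.0: maximally mixed qubit
print(von_neumann_entropy(np.array([[1.0, 0.0], [0.0, 0.0]])))  # 0.0: pure state |0><0|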

Example “Maximally mixed state” ρ = I/d, so H(ρ) = log d (for a single qubit, ρ = I/2 and H(ρ) = 1 bit).

Quantum Mutual Information If ρ is the joint state of two quantum systems A and B, with reduced states ρ_A and ρ_B: I(ρ_A; ρ_B) = H(ρ_A) + H(ρ_B) − H(ρ)

Example If ρ = ρ_A ⊗ ρ_B, then I(ρ_A; ρ_B) = 0.
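A sketch checking both extremes on two qubits: a product state gives I = 0, while a maximally entangled Bell state gives I = 2 bits. The partial-trace helper and the chosen states are assumptions for illustration.

import numpy as np
def vn_entropy(rho):
    a = np.linalg.eigvalsh(rho)
    a = a[a > 1e-12]
    return float(np.sum(a * np.log2(1.0 / a)))
def partial_trace(rho, keep):
    # Reduced state of one qubit of a two-qubit density matrix (keep = 0 for A, 1 for B).
    r = rho.reshape(2, 2, 2, 2)
    return np.trace(r, axis1=1, axis2=3) if keep == 0 else np.trace(r, axis1=0, axis2=2)
def quantum_mutual_information(rho):
    # I(rho_A; rho_B) = H(rho_A) + H(rho_B) - H(rho)
    return vn_entropy(partial_trace(rho, 0)) + vn_entropy(partial_trace(rho, 1)) - vn_entropy(rho)
rho_prod = np.kron(np.diag([1.0, 0.0]), np.eye(2) / 2)      # product state rho_A (x) rho_B
bell = np.array([[1.0], [0.0], [0.0], [1.0]]) / np.sqrt(2)  # (|00> + |11>)/sqrt(2)
print(quantum_mutual_information(rho_prod))       # 0.0
print(quantum_mutual_information(bell @ bell.T))  # 2.0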

Holevo Information Given Alice’s choices for σ and p, the amount of information Bob can get from seeing Alice’s state: χ(σ, p) = I(ρ_A; ρ_B)
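For the classical-quantum state ρ defined earlier, the quantum mutual information reduces to the standard form χ(σ, p) = H(Σ_x p(x) σ_x) − Σ_x p(x) H(σ_x), which the sketch below uses; the example ensemble ({|0⟩, |+⟩} with equal probability) is an assumption.

import numpy as np
def vn_entropy(rho):
    a = np.linalg.eigvalsh(rho)
    a = a[a > 1e-12]
    return float(np.sum(a * np.log2(1.0 / a)))
def holevo_chi(p, sigmas):
    # chi(sigma, p) = H(sum_x p(x) sigma_x) - sum_x p(x) H(sigma_x) = I(rho_A; rho_B)
    avg = sum(px * s for px, s in zip(p, sigmas))
    return vn_entropy(avg) - sum(px * vn_entropy(s) for px, s in zip(p, sigmas))
ket0 = np.array([[1.0], [0.0]])
plus = np.array([[1.0], [1.0]]) / np.sqrt(2)
sigmas = [ket0 @ ket0.T, plus @ plus.T]  # assumed ensemble: |0><0| and |+><+|
print(holevo_chi([0.5, 0.5], sigmas))  # about 0.60 < 1: Bob learns less than 1 bit per qubit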

Recall Alice samples X ∈ Σ ⊆ {0,1}^n with probability p(x). Alice sends σ_X ∈ ℂ^{d×d}. Bob picks a POVM {E_y}_{y∈Γ}. Bob measures σ_X and receives Y ∈ Γ, where Y = y given X = x with probability tr(E_y σ_x). Bob tries to infer X from Y.

Holevo’s Bound n qubits can represent no more than n classical bits: Holevo’s bound proves that Bob can retrieve no more than n classical bits from them (assuming Alice and Bob do not share entangled qubits). This seems odd, because quantum computing appears more powerful than classical, and it takes about 2^n − 1 complex numbers to describe an arbitrary n-qubit state. Alexander Holevo

Thank You! Abida Haque ahaque3@ncsu.edu