Fidelity-Optimized Quantum State Estimation. Itay Hen, Information Sciences Institute, USC. NIPS Quantum Machine Learning Workshop, December 12, 2015.


Fidelity-Optimized Quantum State Estimation
Itay Hen, Information Sciences Institute, USC
Joint work with Amir Kalev, UNM
NIPS Quantum Machine Learning Workshop, December 12, 2015

Disclaimer
Disclaimer: 1) a renunciation of any claim to or connection with; 2) disavowal; 3) a statement made to save one's own ass.
- no quantum machine learning per se here, but… machine learning for quantum systems.

The Problem
- an oven is emitting identical copies of the same unknown quantum state in a steady flow.
- objective: find out what this state is. what measurements should we perform?
[Figure: oven emitting states into a measurement apparatus]

The Problem
- what is the optimal sequence of measurements, i.e., the one that yields the best estimate, with the smallest error, in the fewest measurements?

Outline
- some probability theory
- the protocol
- some results
- conclusions and applications for actual quantum machine learning

Probability theory


Some probability theory
- given an emitted state, what is the probability of getting a particular outcome after a single measurement?
- what is the probability of getting a particular sequence of outcomes?
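The probability expressions on this slide appeared as images in the original slides. A standard reconstruction, assuming a pure emitted state $|\psi\rangle$ and projective measurements with outcome states $|\phi_n\rangle$ (the notation is mine, not the talk's):

```latex
% Born rule: probability of outcome n in a single measurement
p(n \mid \psi) = \left|\langle \phi_n \mid \psi \rangle\right|^2
% the copies are independent, so a sequence of outcomes factorizes,
% with |\phi^{(i)}_{n_i}> the outcome state of the i-th measurement
p(n_1,\dots,n_k \mid \psi) = \prod_{i=1}^{k} \left|\langle \phi^{(i)}_{n_i} \mid \psi \rangle\right|^2
```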


Some probability theory
- now, given the sequence of outcomes, what is the probability that the emitted state is a particular one? for that, we have Bayes' law: [equation]
- the terms: the probability of getting the state given the sequence of outcomes equals the probability of getting the sequence of outcomes given the state, times the a priori probability of the state, divided by the probability of obtaining the sequence of outcomes.
- let us assume for simplicity (we don't have to) that we have no knowledge about the oven, i.e., a uniform prior over states.
- moreover, we have: [equation]
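Bayes' law, with the slide's term labels attached in symbols (the notation is mine):

```latex
\underbrace{P(\psi \mid n_1,\dots,n_k)}_{\substack{\text{state given}\\ \text{the outcomes}}}
= \frac{\overbrace{p(n_1,\dots,n_k \mid \psi)}^{\text{outcomes given the state}}\;
        \overbrace{P(\psi)}^{\text{prior}}}
       {\underbrace{p(n_1,\dots,n_k)}_{\text{probability of the outcomes}}}
```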

Some probability theory
- we thus end up with: [equation]
- where: [equation]
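With the uniform-prior assumption above, the slide's "we thus end up with / where" pair plausibly reads as follows (a reconstruction, not the talk's exact formula):

```latex
P(\psi \mid n_1,\dots,n_k) = \frac{1}{\mathcal{N}} \prod_{i=1}^{k} \left|\langle \phi^{(i)}_{n_i} \mid \psi \rangle\right|^2,
\qquad
\mathcal{N} = \int \mathrm{d}\psi \; \prod_{i=1}^{k} \left|\langle \phi^{(i)}_{n_i} \mid \psi \rangle\right|^2
```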

The protocol

The protocol
- equipped with the above probability measure, we can give a general "learning" protocol for optimized adaptive tomography:
  1. perform a measurement in a randomly chosen basis
  2. based on the record of measurement outcomes thus far, find the most likely state
  3. exit if the convergence criterion has been reached; otherwise:
  4. compute the optimal basis for the next measurement
  5. execute the measurement in the optimal basis, then return to step 2
- remaining questions: how do we calculate the most likely state / best guess? how do we determine the optimal measurement basis?
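The five steps above can be sketched for a single qubit. This is an illustrative stand-in, not the authors' implementation: it represents the posterior by a weighted particle cloud, takes the best guess to be the leading eigenvector of the posterior-mean density matrix, and uses the "equally probable outcomes" rule derived later in the talk as the basis-selection step; all function names and parameters are mine.

```python
import numpy as np

rng = np.random.default_rng(42)

def haar_states(n):
    """n Haar-random pure qubit states, one per row, as unit 2-vectors."""
    v = rng.normal(size=(n, 2)) + 1j * rng.normal(size=(n, 2))
    return v / np.linalg.norm(v, axis=1, keepdims=True)

def basis_from_axis(a):
    """Orthonormal qubit basis (|+a>, |-a>) along the Bloch axis a."""
    th, ph = np.arccos(np.clip(a[2], -1.0, 1.0)), np.arctan2(a[1], a[0])
    plus = np.array([np.cos(th / 2), np.exp(1j * ph) * np.sin(th / 2)])
    minus = np.array([np.sin(th / 2), -np.exp(1j * ph) * np.cos(th / 2)])
    return plus, minus

def adaptive_estimate(true_state, n_meas=60, n_particles=20000):
    particles = haar_states(n_particles)       # particle cloud = uniform prior
    w = np.full(n_particles, 1.0 / n_particles)
    a = rng.normal(size=3)                     # step 1: random first axis
    a /= np.linalg.norm(a)
    for _ in range(n_meas):
        plus, minus = basis_from_axis(a)
        # simulate the measurement on the true state (steps 1 / 5)
        outcome = plus if rng.random() < abs(np.vdot(plus, true_state)) ** 2 else minus
        # step 2: Bayesian update of the particle weights by the Born rule
        w *= np.abs(particles @ outcome.conj()) ** 2
        w /= w.sum()
        # step 3 (convergence test) is replaced here by a fixed budget of n_meas
        # step 4: next basis via the "equally probable outcomes" rule: any axis
        # orthogonal to the Bloch vector of the posterior-mean density matrix
        rho = (w[:, None, None] * (particles[:, :, None] * particles.conj()[:, None, :])).sum(0)
        r = np.array([2 * rho[0, 1].real, -2 * rho[0, 1].imag, (rho[0, 0] - rho[1, 1]).real])
        u = np.cross(r, rng.normal(size=3))
        a = u / np.linalg.norm(u) if np.linalg.norm(u) > 1e-12 else a
    # best guess = leading eigenvector of the posterior-mean density matrix
    return np.linalg.eigh(rho)[1][:, -1]

true_state = np.array([0.8, 0.6j])             # an arbitrary pure test state
guess = adaptive_estimate(true_state)
fid = abs(np.vdot(guess, true_state)) ** 2
print(fid)  # close to 1 after 60 adaptive measurements
```

The particle filter is one convenient way to approximate the posterior integrals; the talk itself does not specify a numerical representation.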

Most likely state
- given the list of outcomes from all measurements thus far, what should be our best guess in the k-th step for the emitted state?
- this actually depends on how we define "best". let's say we'd like to maximize the fidelity of our guess with the real thing.
- obviously, we don't know the true state, but we know the probability of occurrence for each candidate state, so we can form a guess.
- plugging in what we already have for the posterior, we get: [equation]
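In symbols, the fidelity-maximizing guess described here is plausibly (a reconstruction in my notation):

```latex
\hat{\psi}_k
= \arg\max_{\phi} \; \int \mathrm{d}\psi \; P(\psi \mid n_1,\dots,n_{k-1}) \,
  \left|\langle \phi \mid \psi \rangle\right|^2
```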

Most likely state
- now, we can rewrite it as: [equation]
- put differently: [equation], where [equation] and [equation]
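A natural reading of the "put differently" step (my reconstruction): the average fidelity is a quadratic form in the guess, so the maximizer is the leading eigenvector of the posterior-mean operator:

```latex
\int \mathrm{d}\psi \; P(\psi \mid n_1,\dots,n_{k-1}) \,
\left|\langle \phi \mid \psi \rangle\right|^2
= \langle \phi | R | \phi \rangle,
\qquad
R = \int \mathrm{d}\psi \; P(\psi \mid n_1,\dots,n_{k-1}) \, |\psi\rangle\langle\psi|
```

so $\hat{\psi}_k$ is the eigenvector of $R$ with the largest eigenvalue.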

Determining next basis of measurement
- given the list of prior measurement outcomes, how shall we determine the next measurement basis?
- is there a simple, clear answer?
- we have to state carefully what we would like accomplished: we would like to maximize the fidelity of the emitted state with our best guess after the measurement.

Determining next basis of measurement
- how do we do that?
- let's say that the chosen measurement basis in the k-th step is: [equation]
- let's assume that after the measurement is carried out, the obtained outcome is the n-th one.
- we would like to maximize the fidelity of our best guess, based on all outcomes "so far", with the real state. but we have already calculated that; it's simply: [equation]

Determining next basis of measurement
- of course, we do not know which of the outcomes we'll get.
- we must therefore average over all possible outcomes, namely: [equation]
- here, the weight is the probability of obtaining the n-th outcome given what we know so far about the emitted state: [equation]
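The averaged objective plausibly takes the form below (a reconstruction, with $R = \int \mathrm{d}\psi \, P(\psi \mid \text{outcomes}) |\psi\rangle\langle\psi|$ the current posterior-mean operator, and $F^{(n)}$ the best-guess fidelity recomputed as if outcome $n$ had been obtained):

```latex
\overline{F} = \sum_{n=1}^{d} p_n \, F^{(n)},
\qquad
p_n = \langle \phi_n | R | \phi_n \rangle
```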

Determining next basis of measurement
- but what is that quantity, exactly? it's: [equation]
- putting it all together, we find that the optimal basis is simply: [equation], where [equation] and [equation]

Some Results

Next basis of measurement: an example
- example: an oven is emitting copies of a qubit.
- let's say the following outcomes have been obtained: 4 up-z, 3 up-x and 3 down-x, 2 up-y and 2 down-y.
- in which direction should the next measurement be performed?
- first, what's the "most likely" state?

Next basis of measurement: an example
- with outcomes 4 up-z, 3 up-x and 3 down-x, 2 up-y and 2 down-y, clearly the best guess is up-z.
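The "clearly up-z" answer can be checked with a small maximum-likelihood search over pure qubit states (my sketch, not the talk's code; the talk's fidelity-based criterion is more involved, but for this symmetric record it singles out the same direction):

```python
import numpy as np

# measurement record from the slide: 4 up-z, 3 up-x, 3 down-x, 2 up-y, 2 down-y
counts = {"z+": 4, "z-": 0, "x+": 3, "x-": 3, "y+": 2, "y-": 2}

def log_likelihood(r):
    """Log-likelihood of a pure qubit state with unit Bloch vector r.
    For a measurement along axis a, the Born probabilities are (1 ± r·a)/2."""
    rx, ry, rz = r
    probs = {"x+": (1 + rx) / 2, "x-": (1 - rx) / 2,
             "y+": (1 + ry) / 2, "y-": (1 - ry) / 2,
             "z+": (1 + rz) / 2, "z-": (1 - rz) / 2}
    return sum(c * np.log(max(probs[k], 1e-12)) for k, c in counts.items())

# brute-force grid search over pure states (points on the Bloch sphere)
best_ll, best_r = -np.inf, None
for th in np.linspace(0, np.pi, 181):
    for ph in np.linspace(0, 2 * np.pi, 360, endpoint=False):
        r = (np.sin(th) * np.cos(ph), np.sin(th) * np.sin(ph), np.cos(th))
        ll = log_likelihood(r)
        if ll > best_ll:
            best_ll, best_r = ll, r

print(best_r)  # the maximum sits at the north pole of the Bloch sphere: up-z
```

The balanced x and y counts pull those Bloch components to zero, and the 4-0 z record pushes the z component to +1, so the likelihood peaks exactly at up-z.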

Meaning of results
- what does the requirement of maximizing the expected fidelity mean, exactly?
- it tells us that we should find a measurement basis in which all outcomes are equally probable!
- in other words, it tells us to perform a measurement in a basis whose outcome we cannot possibly guess!
- putting it all together, we arrive at: [equation]
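One plausible formalization of the "equally probable outcomes" condition (my notation, with $R$ the posterior-mean operator $\int \mathrm{d}\psi \, P(\psi \mid \text{outcomes}) |\psi\rangle\langle\psi|$ and $d$ the Hilbert-space dimension):

```latex
p_n = \langle \phi_n | R | \phi_n \rangle = \frac{1}{d}
\qquad \text{for all } n = 1,\dots,d
```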


Meaning of results
- going back to the example, the outcomes are: 4 up-z, 3 up-x and 3 down-x, 2 up-y and 2 down-y. in which direction should the next measurement be performed?
- if we perform a measurement in the z direction, we have a pretty good guess of what the outcome is going to be.
- this is not the case in the x and y directions; but there is more certainty in the x direction (the 3-3 record pins down the x component more tightly than the 2-2 record pins down the y component).
- we should therefore measure in the y direction!


The qubit case: first few measurements
- consider the first few iterations of the protocol in the qubit case: an oven is emitting qubits one by one…
- the protocol dictates that we perform the first measurement in some random direction. let's call the outcome up-z.
- what does the protocol say about the next measurement basis? it should be "orthogonal to z": the measurement direction should lie on the equator of the Bloch sphere. let's call the outcome up-x.
- what about the next measurement basis? orthogonal to z and x, namely y.
- the next one is more complicated…
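The "orthogonal to z" claim can be checked numerically. A Monte Carlo sketch (mine, assuming a Haar-uniform prior over pure qubit states): after a single up-z outcome, the posterior-mean state is diag(2/3, 1/3), so both outcomes are equally probable only for measurement axes on the equator.

```python
import numpy as np

rng = np.random.default_rng(1)

# Haar-random pure qubit states, one per row
n = 200_000
v = rng.normal(size=(n, 2)) + 1j * rng.normal(size=(n, 2))
v /= np.linalg.norm(v, axis=1, keepdims=True)

# Bayesian update after one "up-z" outcome: weight each state by |<0|psi>|^2
w = np.abs(v[:, 0]) ** 2
w /= w.sum()

# posterior-mean density matrix rho = sum_i w_i |psi_i><psi_i|
rho = (w[:, None, None] * (v[:, :, None] * v.conj()[:, None, :])).sum(axis=0)
print(np.round(rho.real, 2))  # approximately [[2/3, 0], [0, 1/3]]

# for a measurement axis at polar angle theta, p(+) = (1 + r_z cos(theta)) / 2
rz = (rho[0, 0] - rho[1, 1]).real             # Bloch z-component, about 1/3
p_pole = (1 + rz * np.cos(0.0)) / 2           # z-axis measurement: biased
p_equator = (1 + rz * np.cos(np.pi / 2)) / 2  # equatorial axis: exactly 1/2
print(p_pole, p_equator)
```

Equal outcome probabilities therefore force the second measurement onto the equator of the Bloch sphere, matching the slide.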

Numerical results: the qubit case
- repeatedly performing a numerical experiment thousands of times, we calculated the mean infidelity (with respect to the true state) as a function of the number of measurements.
- we compared several methods: random-basis measurements; repeated measurements in the x, y, z directions; x, y, z measurements chosen in an optimal way; and fully optimized measurements.
- no surprise: the learning methods are superior.

Numerical results: the qudit case (d=4)
- another example: a qudit with d=4. again, mean infidelity as a function of the number of measurements.
- here we assume that the available measurements are only "local Pauli", i.e., xx, xy, xz, …, zz.
- we compare a random sequence of local Pauli measurements with an optimized sequence.

Conclusions and what's next

Conclusions
- optimized adaptive tomography helps!
- it is easily extended to emitted mixed states, generalized measurements, etc.
- what's next?
  - we have seen that the first few optimized measurement bases are "orthogonal", i.e., mutually unbiased. this procedure can therefore be used to generate sets of MUBs (or so we believe).
  - the approach can be carried over to machine learning protocols, e.g., Wiebe et al.'s "Quantum Hamiltonian Learning". in some situations the learning curve can be optimized, provided that we can utilize all the gathered information to maximize our knowledge about the desired quantities.

Thank You!