Gentle Measurement of Quantum States and Differential Privacy *

Gentle Measurement of Quantum States and Differential Privacy*
Scott Aaronson (University of Texas at Austin)
Guy Rothblum (Weizmann Institute of Science)
STOC, Phoenix, AZ, June 24, 2019
*Not a misprint. Talk will actually relate these things.

Gentle Measurement

Measurements in quantum mechanics are famously destructive. But not always! If a qubit is known to be either |0⟩ or |1⟩, checking to see which one doesn't damage it at all.

Given a quantum measurement M, let's call M α-gentle on a set of states S if for every state ρ∈S, and every possible measurement outcome y,

‖ρ_y − ρ‖_tr ≤ α,

where ‖·‖_tr is the trace distance (the standard metric on quantum states) and ρ_y is the post-measurement state if the outcome is y.

Typical choice for S: product states ρ = ρ₁⊗…⊗ρₙ.

Example: Measuring the total Hamming weight of n unentangled qubits could be extremely destructive: it collapses the product state (|0⟩+|1⟩)⊗…⊗(|0⟩+|1⟩) (omitting normalization) onto a superposition over a single Hamming weight.

Safer Approach: Measure the Hamming weight plus deliberately-added noise of order >>√n. E.g., "Laplace noise."
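As a classical sketch of this safer approach (function names and parameter values here are illustrative, not from the paper), one can add Laplace noise of magnitude well above √n to the Hamming weight:

```python
import math
import random

random.seed(0)

def laplace_noise(scale):
    """Sample Laplace noise with average magnitude `scale`
    (exponential magnitude, random sign)."""
    return random.expovariate(1.0 / scale) * random.choice([-1.0, 1.0])

def noisy_hamming_weight(bits, scale):
    """Report the Hamming weight plus Laplace noise.

    With scale >> sqrt(n), the noise swamps any single bit's contribution
    to the count, which is the property the quantum version exploits.
    """
    return sum(bits) + laplace_noise(scale)

n = 100
bits = [random.randint(0, 1) for _ in range(n)]
scale = 10 * math.sqrt(n)  # deliberately >> sqrt(n)

estimates = [noisy_hamming_weight(bits, scale) for _ in range(20000)]
mean_estimate = sum(estimates) / len(estimates)
```

The noisy answer is still useful on average (the noise has mean zero), but any one bit flip is hidden inside the noise.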

Differential Privacy In One Slide

"CS theory applied to the social world, i.e., as far as you could possibly get from the subatomic world"

Given an algorithm A that queries a database X=(x₁,…,xₙ), we call A ε-DP if for every two databases X and X′ that differ in only a single xᵢ, and every possible output y of A,

Pr[A(X)=y] ≤ e^ε · Pr[A(X′)=y].

Bad: How many patients have prostate cancer?
Better: Return the number of patients with prostate cancer, plus Laplace noise of average magnitude λ. A simple calculation shows this is 1/λ-DP.
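The closing "1/λ-DP" calculation can be checked numerically: for neighboring databases, the counts differ by 1, so the Laplace output densities differ pointwise by a factor of at most e^(1/λ). A small sanity check (illustrative code, not from the talk):

```python
import math

def laplace_pdf(y, mu, lam):
    """Density at y of the Laplace distribution centered at mu,
    with average noise magnitude lam."""
    return math.exp(-abs(y - mu) / lam) / (2.0 * lam)

lam = 2.0        # noise magnitude
eps = 1.0 / lam  # the claimed privacy parameter

# Neighboring databases give true counts mu and mu+1. The density ratio at
# any output y is exp((|y-1| - |y|)/lam), which is at most exp(1/lam).
ys = [t / 100.0 for t in range(-1000, 1001)]
worst_ratio = max(laplace_pdf(y, 0.0, lam) / laplace_pdf(y, 1.0, lam)
                  for y in ys)
```

The worst-case ratio is attained for outputs on the far side of either count, where it equals e^(1/λ) exactly.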

Quantum Differential Privacy

"Protecting the Privacy Rights of Quantum States"

Given a quantum measurement M on n registers, let's call M ε-DP on a set of states S (e.g., product states) if for every ρ,ρ′∈S that differ on only 1 register, and every possible outcome y of M,

Pr[M(ρ)=y] ≤ e^ε · Pr[M(ρ′)=y].

Example: Once again, measuring the total Hamming weight of n unentangled qubits plus a Laplace noise term. Hmmmm….

Our Main Result: DP ⟺ Gentleness

(1) Any measurement that's α-gentle on product states (for small α) is also O(α)-DP on product states.

(2) Any measurement that's ε-DP on product states, and consists of a classical DP algorithm applied to the results of separate measurements on each register, is also O(ε√n)-gentle on product states.

Aren't DP and gentleness obviously the same? We have to restrict the set of states for part (2) to be interesting.

The "separate measurements" restriction might just be an artifact of our proof. The √n factor is tight (the Laplace noise measurement shows this).

Related Work

Dwork et al. (2014): Connection between DP and adaptive data analysis. We can safely reuse the same dataset for many scientific studies, if we're careful to query the dataset using DP algorithms only!

In our terms: DP ⟹ "classical Bayesian gentleness."

Of course, "damage" to a probability distribution is purely internal and mental, whereas damage to a quantum state can be noticed by others…

One Slide on Proof Techniques

Gentleness ⟹ DP: Easy, and little to do with QM. Consider a "converse" statement: if a measurement accepts ρ and ρ′ with very different probabilities, then by Bayes' Theorem it will damage the mixture (ρ+ρ′)/2. Then, given a product state ρ₁⊗…⊗ρₙ, apply this reasoning to each ρᵢ separately.

DP ⟹ Gentleness for Product States: The harder direction. First prove it for classical product distributions D₁×…×Dₙ (following Dwork et al.). Then "lift" to a measurement that QSamples from the output distribution of a classical DP algorithm, on each component of a superposition |ψ₁⟩⊗…⊗|ψₙ⟩. Use inequalities that relate KL-divergence, Hellinger distance, and trace distance.
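The classical inequalities invoked at the end can be spot-checked numerically. For probability distributions, trace distance becomes total variation distance; the standard facts are H² ≤ TV ≤ √2·H and Pinsker's inequality TV ≤ √(KL/2). A quick check on random distributions (illustrative code):

```python
import math
import random

random.seed(1)

def total_variation(p, q):
    """Total variation distance: the classical analogue of trace distance."""
    return 0.5 * sum(abs(a - b) for a, b in zip(p, q))

def hellinger(p, q):
    """Hellinger distance, normalized so that 0 <= H <= 1."""
    return math.sqrt(0.5 * sum((math.sqrt(a) - math.sqrt(b)) ** 2
                               for a, b in zip(p, q)))

def kl_divergence(p, q):
    """KL divergence in nats (q has strictly positive entries here)."""
    return sum(a * math.log(a / b) for a, b in zip(p, q) if a > 0)

def random_dist(k):
    w = [random.random() + 1e-9 for _ in range(k)]
    s = sum(w)
    return [x / s for x in w]

pairs = [(random_dist(6), random_dist(6)) for _ in range(500)]
```

Every random pair should satisfy all three inequalities, with equality nowhere in general.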

Application: "Shadow Tomography"

Given an unknown D-dimensional quantum state ρ, and known 2-outcome measurements E₁,…,E_M:

Estimate Pr[Eᵢ accepts ρ] to within ±ε for all i∈[M], with high probability, by measuring as few copies of ρ as possible.

k=O(D²) copies suffice (do ordinary tomography). k=O(M) copies suffice (apply each Eᵢ to separate copies). But in many applications, D and M are both enormous!

Theorem (A., STOC 2018): Shadow tomography is possible using only k = Õ((log M)⁴ · (log D) / ε⁴) copies of ρ.
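The k=O(M) baseline is easy to simulate classically for diagonal states, where a state is just a probability vector over basis outcomes and a two-outcome measurement is the set of outcomes it accepts (a toy stand-in for the quantum setting; all names illustrative):

```python
import random

random.seed(0)

# Toy classical stand-in for shadow tomography: a diagonal D-dimensional
# state is a probability vector; a two-outcome measurement E_i is the
# subset of basis outcomes it "accepts."
D, M = 8, 5
rho = [1.0 / D] * D  # maximally mixed state
measurements = [set(random.sample(range(D), 4)) for _ in range(M)]

def accept_prob(E, p):
    """Exact Pr[E accepts] for a diagonal state p."""
    return sum(p[j] for j in E)

def estimate_from_copies(E, p, copies):
    """The naive baseline: apply E to `copies` fresh copies and average."""
    outcomes = random.choices(range(D), weights=p, k=copies)
    return sum(o in E for o in outcomes) / copies

estimates = [estimate_from_copies(E, rho, 10000) for E in measurements]
```

Each measurement burns its own batch of copies, which is exactly why the total cost scales linearly in M; the theorem above replaces that with a polylog(M) dependence.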

New Shadow Tomography Result

We can do shadow tomography in a way that's also online and gentle. The sample complexity is k = Õ((log M)² · (log D)² / ε⁸).

How it works: We take a known procedure from DP, Private Multiplicative Weights (Hardt-Rothblum 2010). PMW decides whether to update our current hypothesis on each query Eᵢ using a threshold measurement with Laplace noise added. We give a quantum analogue, "QPMW."

Each iteration of QPMW is DP. Using this fact, our DP ⟹ gentleness theorem, and a lot more work, we show that we can safely apply all M of the iterations in sequence.
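For intuition, here is a heavily simplified classical sketch of the PMW idea: keep a public hypothesis, test each query against a noisy threshold, and make a multiplicative-weights update only when the test fires. All parameter values are illustrative, and this omits PMW's actual privacy accounting:

```python
import math
import random

random.seed(2)

# Simplified classical sketch of Private Multiplicative Weights:
# maintain a hypothesis h for the true distribution p, and update it
# only when a noisy threshold test says the current query is answered badly.
D = 8
p = [0.30, 0.20, 0.10, 0.10, 0.10, 0.10, 0.05, 0.05]  # true distribution
h = [1.0 / D] * D                                      # initial hypothesis
eta, threshold, noise_scale = 0.25, 0.15, 0.01

def answer(q, dist):
    """Linear query: q is a 0/1 indicator vector over the D outcomes."""
    return sum(qi * di for qi, di in zip(q, dist))

def laplace(scale):
    return random.expovariate(1.0 / scale) * random.choice([-1.0, 1.0])

updates = 0
queries = [[random.randint(0, 1) for _ in range(D)] for _ in range(50)]
for q in queries:
    err = answer(q, p) - answer(q, h)
    if abs(err) + laplace(noise_scale) > threshold:  # noisy threshold test
        # Multiplicative-weights step: nudge h toward the true answer.
        sign = 1.0 if err > 0 else -1.0
        h = [hi * math.exp(eta * sign * qi) for hi, qi in zip(h, q)]
        total = sum(h)
        h = [hi / total for hi in h]
        updates += 1
```

The privacy payoff is that most queries trigger no update at all, so only the few update rounds consume the privacy (and, in the quantum analogue, the gentleness) budget.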

Future Directions

Remove the restriction to product measurements in our DP ⟹ gentleness implication?
Characterize when DP ⟹ gentleness preserves computational efficiency?
What's the true sample complexity of shadow tomography? Must it have any log D factor at all? For gentle or online shadow tomography, we can show that the answer is yes, by porting known results from classical DP.
Composition of quantum DP algorithms?
Use quantum to say something new about classical DP?