KEG PARTY!!!!!
• Keg Party tomorrow night
• Prof. Markov will give out extra credit to anyone who attends*
*Note: This statement is a lie.

Trugenberger’s Quantum Optimization Algorithm: Overview and Application

Overview: Inspiration • Basic Idea • Mathematical and Circuit Realizations • Limitations • Future Work

Overview: Inspiration • Basic Idea • Mathematical and Circuit Realizations • Limitations • Future Work

Two Main Sources of Inspiration
• Exploiting quantum parallelism
• An analogy to simulated annealing

What is quantum parallelism?
• We can represent a superposition of specific instances of data in a single quantum state
• We can then apply a single operator to this quantum state and thereby change all instances of the data in a single step
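
As a concrete illustration (not from the slides), here is a small NumPy sketch of this idea: an n-qubit register holding a superposition over all 2^n data instances is just a length-2^n amplitude vector, and a single diagonal operator updates every instance at once.

```python
import numpy as np

n = 3                          # number of qubits
N = 2 ** n                     # number of data instances in superposition

# Uniform superposition over all N instances: |S> = (1/sqrt(N)) * sum_k |I^k>
state = np.full(N, 1 / np.sqrt(N), dtype=complex)

# A single diagonal unitary (here: an arbitrary per-instance phase)
phases = np.exp(1j * np.linspace(0, np.pi / 2, N))
U = np.diag(phases)

# One application of U updates all N instances at once
state = U @ state
print(np.round(state, 4))
```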

What is Simulated Annealing?
• Comes from physical annealing: iteratively heat and cool a material until there's a high probability of obtaining a crystalline structure
• Can be represented as a computational algorithm: iteratively make changes to your data until there is a high probability of ending up with the data you want
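
For reference, a minimal classical simulated-annealing sketch (my own illustration, not from the slides); the toy cost function, neighbor move, and geometric cooling schedule are arbitrary choices.

```python
import math
import random

def simulated_annealing(initial, cost, neighbor, t_start=10.0, t_end=1e-3, alpha=0.95):
    """Generic simulated annealing: accept worse moves with probability exp(-delta/T)."""
    current, current_cost = initial, cost(initial)
    t = t_start
    while t > t_end:
        candidate = neighbor(current)
        delta = cost(candidate) - current_cost
        if delta < 0 or random.random() < math.exp(-delta / t):
            current, current_cost = candidate, current_cost + delta
        t *= alpha          # geometric cooling schedule
    return current, current_cost

# Toy usage: minimize (x - 7)^2 over the integers with +/-1 moves
best, best_cost = simulated_annealing(
    initial=0,
    cost=lambda x: (x - 7) ** 2,
    neighbor=lambda x: x + random.choice([-1, 1]),
)
print(best, best_cost)
```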

Overview: Inspiration • Basic Idea • Mathematical and Circuit Realizations • Limitations • Future Work

Basic Idea
• Use this inspiration to build a more general quantum search algorithm
• Trugenberger's algorithm performs a heuristic search over the entire data set by applying a cost function to every element
• The goal is to find a minimal-cost solution

The high-level algorithm
• Use quantum parallelism to apply the cost function to all elements of the data set simultaneously, in one step
• Apply this cost function to the data set iteratively
• The number of iterations is analogous to an instance of simulated annealing

Overview: Inspiration • Basic Idea • Mathematical and Circuit Realizations • Limitations • Future Work

Representing the Problem: Graph Coloring
• Form a superposition of the data elements: N instances
• Use n qubits to represent the N instances
• Each instance is encoded as a binary number I^k whose value is between 0 and 2^n − 1
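
The slides do not spell out a concrete encoding, so the following sketch is purely hypothetical: a 3-vertex, 2-color instance encoded with one bit per vertex, giving n = 3 qubits and N = 8 candidate colorings.

```python
# Hypothetical encoding (not from the slides): one bit per vertex,
# with vertex 0 stored in the least significant bit of the index.
n_vertices = 3
n = n_vertices
N = 2 ** n

def coloring(index):
    """Decode basis-state index I^k into a per-vertex color assignment (0 or 1)."""
    return [(index >> v) & 1 for v in range(n_vertices)]

for k in range(N):
    print(f"|{k:0{n}b}>  ->  colors {coloring(k)}")
```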

Cost Functions in General
• Given a data element, the cost function should return a cost for that element
• In this algorithm we want to minimize cost
• Data elements with lower cost are better solutions
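
Continuing the hypothetical 3-vertex example, a graph-coloring cost function can simply count monochromatic edges; the triangle edge set below is illustrative, not taken from the slides.

```python
# Cost function for the hypothetical 3-vertex instance above.
n_vertices = 3
edges = [(0, 1), (1, 2), (0, 2)]          # illustrative triangle graph
N = 2 ** n_vertices

def coloring(index):
    return [(index >> v) & 1 for v in range(n_vertices)]

def cost(index):
    """C(I^k): number of edges whose endpoints share a color (lower is better)."""
    colors = coloring(index)
    return sum(1 for u, v in edges if colors[u] == colors[v])

print([cost(k) for k in range(N)])   # [3, 1, 1, 1, 1, 1, 1, 3]
```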

Skeleton of the U operator
• The imaginary exponential of the cost function is the main engine of the quantum optimization
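
The slide's formula is not reproduced in this transcript; judging from this description, U should be a diagonal phase operator of roughly the following form (a reconstruction, with the phase φ_k pinned down by the normalization discussed on the next slides):

```latex
U \;=\; \sum_{k} e^{\,i\varphi_k}\,\lvert I^k\rangle\langle I^k\rvert ,
\qquad
U\,\lvert I^k\rangle \;=\; e^{\,i\varphi_k}\,\lvert I^k\rangle ,
\qquad
\varphi_k \;\propto\; C(I^k).
```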

What is Cnor?
• Recall that exp(iθ) = cos θ + i sin θ
• Since U needs the imaginary exponential of the cost function, we want to normalize the cost function
• Normalizing ensures that the phase fed into the exponential lies between 0 and π/2

What is Cnor?
• C(I^k) is at most Cmax and at least Cmin
• Cnor is therefore always between 0 and 1
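
The normalization formula itself did not survive in the transcript; from the bounds stated on these two slides, the natural reconstruction is

```latex
C_{\mathrm{nor}}(I^k) \;=\; \frac{C(I^k) - C_{\min}}{C_{\max} - C_{\min}} \;\in\; [0,1],
\qquad
\varphi_k \;=\; \frac{\pi}{2}\,C_{\mathrm{nor}}(I^k) \;\in\; \Big[0, \frac{\pi}{2}\Big].
```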

And Cmin and Cmax?
• Simple to determine for graph coloring
• Cmin = 0 (no pair of connected vertices shares the same color)
• Cmax = # of edges (every pair of connected vertices shares the same color)
• A more general method for determining Cmin and Cmax will be introduced later

Fleshing out U for Graph Coloring
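
The worked-out operator on this slide is likewise missing; plugging the graph-coloring bounds from the previous slide (Cmin = 0, Cmax = |E|, the number of edges) into the reconstruction above would give

```latex
C_{\mathrm{nor}}(I^k) \;=\; \frac{C(I^k)}{\lvert E\rvert},
\qquad
U\,\lvert I^k\rangle \;=\; \exp\!\Big(i\,\frac{\pi}{2}\,\frac{C(I^k)}{\lvert E\rvert}\Big)\,\lvert I^k\rangle ,
```

where C(I^k) counts the monochromatic edges of coloring I^k.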

Still don't quite have our magic operator
• As written, U by itself will not lower the probability amplitudes of bad states and raise those of good states
• If we apply U now, the amplitudes of the best and worst data elements will have the same magnitude and differ only in phase

Take Advantage of Phase Differences
• We can accomplish the proper amplitude modifications by using a controlled form of the U gate
• It can't be an ordinary controlled gate, though

Ucs: The Answer to our Problems
• Ucs is a controlled gate that applies U to the data elements when the control bit is |0> and applies the inverse of U when the control bit is |1>
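
A minimal NumPy sketch of such a controlled gate (my construction, using the reconstructed phases φ_k = (π/2)·Cnor and the toy triangle costs from the earlier sketch): with the control as the leading qubit, Ucs is block-diagonal, applying U on the |0> control branch and U† on the |1> branch.

```python
import numpy as np

# Illustrative phases phi_k = (pi/2) * Cnor(I^k) for the toy triangle instance
# (costs 3,1,1,1,1,1,1,3 and Cmax = |E| = 3, as in the earlier cost-function sketch)
costs = np.array([3, 1, 1, 1, 1, 1, 1, 3], dtype=float)
phis = (np.pi / 2) * costs / 3.0

U = np.diag(np.exp(1j * phis))
U_dag = U.conj().T

# Control qubit as the leading qubit: |0><0| (x) U  +  |1><1| (x) U_dagger
Z = np.zeros_like(U)
Ucs = np.block([[U, Z], [Z, U_dag]])

print(np.allclose(Ucs @ Ucs.conj().T, np.eye(Ucs.shape[0])))   # True: Ucs is unitary
```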

Control Bits also need some modification
• The control bit always starts out in the |0> state
• Before applying Ucs, we run the control bit through a Hadamard gate
• After applying Ucs, we run it through another Hadamard gate
• This gives us a nice superposition of minimal- and maximal-cost elements
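
Working through the algebra of this H, Ucs, H sandwich (my derivation, using the reconstructed phases φ_k), the state evolves as

```latex
\lvert 0\rangle_c \otimes \sum_k a_k \lvert I^k\rangle
\;\xrightarrow{H_c}\;
\tfrac{1}{\sqrt{2}}\big(\lvert 0\rangle_c + \lvert 1\rangle_c\big) \otimes \sum_k a_k \lvert I^k\rangle
\;\xrightarrow{U_{cs}}\;
\tfrac{1}{\sqrt{2}} \sum_k a_k \big( e^{i\varphi_k}\lvert 0\rangle_c + e^{-i\varphi_k}\lvert 1\rangle_c \big) \lvert I^k\rangle
\;\xrightarrow{H_c}\;
\sum_k a_k \big( \cos\varphi_k \,\lvert 0\rangle_c + i \sin\varphi_k \,\lvert 1\rangle_c \big) \lvert I^k\rangle .
```

So the |0> control branch weights each data element by cos φ_k (largest for low-cost elements), while the |1> branch carries i sin φ_k (largest for high-cost elements), which is what the measurement slides below describe.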

Matlab results for Graph Coloring
[table of data elements and their probability amplitudes; the numeric values were lost in the transcript]

Measurement
• If we were to measure the control bit now and get a |0>, we'd know that the data register is left with the "first half" of the superposition:
[table of data elements and probability amplitudes; the numeric values were lost in the transcript]

Measurement
• However, if we got a |1> instead, we'd know that the data register is left with the "second half" of the superposition:
[table of data elements and probability amplitudes; the numeric values were lost in the transcript, apart from imaginary terms of magnitude 0.3536]

Measurement
• A control-qubit measurement of |0> means we have a better chance of getting a lower-cost state (a good solution)
• A control-qubit measurement of |1> means we have a better chance of getting a higher-cost state (a bad solution)

Measurement
• Assume the world is perfect and we always get a |0> when we measure the control qubit
• We can then increase the probability of getting good solutions and decrease the probability of getting bad solutions by iterating the H, Ucs, H operations
• We iterate by duplicating the circuit and adding more control qubits
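
A classical NumPy simulation of this iterated construction on the hypothetical triangle instance (my sketch; it post-selects on the "ideal" all-|0> outcome exactly as assumed here, using the 26 iterations quoted on the next slide):

```python
import numpy as np

# Costs of the 8 candidate colorings for the toy triangle instance (see earlier sketch)
costs = np.array([3, 1, 1, 1, 1, 1, 1, 3], dtype=float)
c_min, c_max = 0.0, 3.0                      # graph-coloring bounds: Cmin = 0, Cmax = |E|
phi = (np.pi / 2) * (costs - c_min) / (c_max - c_min)

N = len(costs)
amps = np.full(N, 1 / np.sqrt(N), dtype=complex)   # uniform superposition |S>

b = 26                                             # number of H, Ucs, H rounds / control qubits
# Each round multiplies the "control measured |0>" branch of element I^k by cos(phi_k).
# Post-selecting on all b controls being |0> and renormalizing:
post = amps * np.cos(phi) ** b
p_all_zero = np.sum(np.abs(post) ** 2)             # probability of the ideal outcome
post /= np.sqrt(p_all_zero)

print(f"P(all {b} controls measured |0>) = {p_all_zero:.6f}")
for k in range(N):
    print(f"|{k:03b}>  cost={costs[k]:.0f}  P(k | ideal outcome)={abs(post[k]) ** 2:.4f}")
```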

Matlab Results after 26 "Ideal" Iterations
[table of data elements and probability amplitudes; the numeric values were lost in the transcript]

Life Isn't Fair
• We don't always get a |0> for all the control qubits when we measure
• Some of the qubits are bound to be measured in the |1> state
• Upon measuring the control qubits we can at least know the quality of our computation

The Tradeoff
• If we increase the number of control qubits (b), we have a better chance of boosting the probability amplitudes of the lower-cost solutions and canceling out the amplitudes of the higher-cost solutions

The Tradeoff
• However, increasing the number of control qubits (b) ALSO lowers our chances of measuring all of the control qubits in the |0> state
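
For completeness, a short derivation of this tradeoff (mine, following from the cos φ_k amplitudes sketched earlier): starting from a uniform superposition over N elements, the probability that all b control qubits are measured in |0> is

```latex
P\big(\text{all } b \text{ controls} = \lvert 0\rangle\big)
\;=\; \frac{1}{N}\sum_{k} \cos^{2b}\varphi_k ,
```

which concentrates the post-selected state on low-cost elements as b grows, but itself shrinks with b.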

Some good news
• As mentioned earlier, measuring the control qubits tells us how good or bad a particular run was
• Trugenberger gives an equation for the expected number of runs needed for a good result

Analogy to Simulated Annealing
• We can view b, the number of control qubits, as a sort of temperature parameter
• Trugenberger gives some energy distributions based on the "effective temperature" being equal to 1/b
• This is simply an analogy for the number of iterations needed for a probabilistically good solution

A Whole New Meaning for k
• k can be seen as a certain subset of the |S> superposition of data elements
• For the graph coloring problem, k = 3
• More generally, for other problems k can vary from 1 to K, where K > 1

Equations affected by generalization
• Cnor changes:

Equations affected by generalization
• U changes (this in turn changes Ucs, which uses U):

Overview: Inspiration • Basic Idea • Mathematical and Circuit Realizations • Limitations • Future Work

U operator
• Constructing the U operator may itself take time exponential in the number of qubits
• Perhaps some physical process could get around this

Cost Function Oracle?
• Trugenberger glosses over the implementation of the cost function (in fact, no implementation is suggested)
• Some problems may still be intractable if the cost function is too complicated

Only a Heuristic
• Trugenberger's algorithm may not find the exact minimal solution
• Keeping the tradeoff in mind, though, more control qubits can be added to increase the odds of a good solution

Overview: Inspiration • Basic Idea • Mathematical and Circuit Realizations • Limitations • Future Work

Future Work
• Look into the physical feasibility of the cost function and the construction of Ucs
• Run more simulations on various problems and compare against classical heuristics
• Compare with Grover's algorithm

Reference
• C.A. Trugenberger, "Quantum Optimization," July 22, 2001 (available on the LANL arXiv preprint archive)