Stocs – A Stochastic CSP Solver
Bella Dubrov, IBM Haifa Research Lab
© Copyright IBM

Outline
- CSP solving algorithms
  - Systematic
  - Stochastic
- Limitations of systematic methods
- Stochastic approach
- Stocs algorithm
- Stocs challenges
- Summary

Constraint satisfaction problems
- Variables: Anna, Beth, Cory, Dave
- Domains: the Red, Green, Orange, and Yellow houses
- Constraints:
  - The Red and Green houses are in the city
  - The Orange and Yellow houses are in the countryside
  - The Red and Green houses are neighbors, as are the Orange and Yellow houses
  - Anna and Dave have dogs; Beth owns a cat
  - Dogs and cats cannot be neighbors
  - Dogs must live in the countryside
- Solution: Anna lives in the Orange house, Beth in the Red house, Cory in the Green house, and Dave in the Yellow house

CSP solving algorithms
- Systematic: GEC
- Stochastic: Stocs

Systematic approach
- Systematically traverse the search space
- Prune whenever possible; pruning is done by projection

Example: a projector for multiplication, a × b = c
- a ∈ [2, 20], b ∈ [3, 20], c ∈ [1, 20]
- Projection to input 1: a' = [2, 6]
- Projection to input 2: b' = [3, 10]
- Projection to the result: c' = {6, 8, 9, 10, 12, 14, 15, 16, 18, 20}
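The projections above can be computed by brute force over small integer domains. The following is a minimal sketch, not the actual GEC or Stocs projector; domains are represented as plain Python sets and `project_multiply` is an illustrative name.

```python
def project_multiply(dom_a, dom_b, dom_c):
    """Shrink each domain to the values consistent with a * b = c."""
    # Projection to the result: products that land inside c's domain
    new_c = {a * b for a in dom_a for b in dom_b} & dom_c
    # Projection to each input: values that still have a supporting pair
    new_a = {a for a in dom_a if any(a * b in new_c for b in dom_b)}
    new_b = {b for b in dom_b if any(a * b in new_c for a in new_a)}
    return new_a, new_b, new_c

a, b, c = project_multiply(set(range(2, 21)),   # a in [2, 20]
                           set(range(3, 21)),   # b in [3, 20]
                           set(range(1, 21)))   # c in [1, 20]
print(sorted(a))  # [2, 3, 4, 5, 6]                      -> a' = [2, 6]
print(sorted(b))  # [3, 4, 5, 6, 7, 8, 9, 10]            -> b' = [3, 10]
print(sorted(c))  # [6, 8, 9, 10, 12, 14, 15, 16, 18, 20]
```

Note that the brute-force enumeration here is exponential in the number of variables; real projectors exploit the structure of the constraint (here, interval arithmetic) to prune efficiently.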

Limitations of systematic methods, example 1
- Propagation can be hard: for a multiplication constraint it amounts to factoring

Limitations of systematic methods, example 2
- The Some-Different constraint: given a graph over the variables, variables connected by an edge must take different values
- Propagation is NP-hard for domains of size k ≥ 3 (k-colorability)

Limitations of systematic methods, example 3
- Only one solution
- Local consistency at the onset: probability of success is 1/N

Stochastic approach
- State: an assignment of values to all the variables
- Cost: a function from the set of states to {0} ∪ R+
- Cost = 0 iff all constraints are satisfied by the state

Stochastic approach
- General idea:
  - Start from some state
  - Find the next state and move there
  - Stop if a state with cost 0 is found
- Stochastic algorithms are usually incomplete
- Different stochastic algorithms use different heuristics for finding the next state
- Examples: simulated annealing, tabu search

Stocs algorithm overview
- Check states on length scales "typical" for the problem; hop to a new state if its cost is lower
- Learn the topography of the problem: the typical step sizes and directions
- Take domain knowledge as input strategies

Example: Social Golfer Problem
- Problem:
  - 7 groups of players, 6 members in each group
  - Play for 4 weeks
  - No two players play together (in the same group) twice
- Exponential decrease
- No sense in trying step sizes larger than 20, but may benefit strongly from step sizes of
- Reproducible: characterizes the problem

Example: LABS
- Problem: minimize the autocorrelation of a sequence of N (= 45) bits
- Non-exponential decrease, followed by saturation
- Makes sense to always try large steps
- Identifies small characteristic features
- Extremely reproducible

Example: Selection Problem
- Problem: select different values for three variables out of a given set of values (smaller than the domains)
- Easy problem: results are aggregated over many runs
- Prefers larger step sizes (up to a cutoff)
- Reproducible

Example: Selection Problem, different modeling
- Problem: same as before, modeled differently
- Prefers intermediate step sizes
- Reproducible

Stocs algorithm

At each step:
    decide attempt type: random, learned, or user-defined
    if random:
        choose a random step
    if learned:
        decide learn type: step size, direction, ...
        if step size:
            choose a step size that was previously successful (weighted)
            create a random attempt with the chosen step size
        if direction:
            choose a direction that was previously successful (weighted)
            create a random attempt with the chosen direction
    if user-defined:
        get the next user-defined attempt
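The attempt-type and weighted step-size choices above can be sketched as follows. This is a hedged reconstruction, not the actual Stocs implementation: the mixing probabilities, the success-count weighting, and the `StepChooser` interface are all assumptions made for illustration.

```python
import random

class StepChooser:
    def __init__(self, max_step, user_attempts=None, seed=0):
        self.rng = random.Random(seed)
        self.max_step = max_step
        # Success count per step size; start uniform so every size is possible
        self.successes = {s: 1 for s in range(1, max_step + 1)}
        self.user_attempts = list(user_attempts or [])

    def next_step_size(self):
        # Decide the attempt type: random, learned, or user-defined
        kind = self.rng.choices(["random", "learned", "user"],
                                weights=[0.4, 0.4, 0.2])[0]
        if kind == "user" and self.user_attempts:
            return self.user_attempts.pop(0)   # next user-defined attempt
        if kind == "learned":
            # Choose a previously successful step size, weighted by its record
            sizes = list(self.successes)
            weights = [self.successes[s] for s in sizes]
            return self.rng.choices(sizes, weights=weights)[0]
        return self.rng.randint(1, self.max_step)  # random step

    def record_success(self, size):
        # Reinforce step sizes whose attempts lowered the cost
        self.successes[size] += 1
```

Learning directions would follow the same pattern with a second table keyed by direction instead of step size.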

Optimization problems
- The constraints must be satisfied
- In addition, an objective function to optimize is given
- Example: doll houses
  - Constraints as before
  - In addition, each doll has a preferred set of houses
  - The best solution satisfies as many of the preferences as possible

Optimization with Stocs
- Last year we added an optimization capability to Stocs
- Optimization is natural for Stocs: first find a solution, then keep searching for a better state
- Implementation:
  - The cost function maps a state to a pair of non-negative numbers (c1, c2):
    - c1 is the cost of the constraints
    - c2 is the value of the objective function
  - Pairs are compared in lexicographic order:
    - a better state always improves the constraints first
    - after a state with c1 = 0 is found, Stocs continues searching for a better c2
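The lexicographic cost described above maps directly onto tuple comparison. A minimal sketch, with an illustrative constraint/objective representation (the real Stocs cost functions are not shown in this deck):

```python
def cost(state, constraints, objective):
    c1 = sum(1 for c in constraints if not c(state))  # violated constraints
    c2 = objective(state)                             # objective to minimize
    # Python tuples already compare lexicographically: c1 dominates c2
    return (c1, c2)

def better(new_cost, old_cost):
    """Accept a move only if it is lexicographically smaller."""
    return new_cost < old_cost

# A feasible state beats any infeasible one, regardless of the objective
assert better((0, 5), (1, 0))
# Among feasible states (c1 = 0), the lower objective value wins
assert better((0, 3), (0, 5))
```

This is why the search "continues" cleanly after feasibility: once c1 reaches 0, every accepted move keeps c1 = 0 and can only lower c2.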

Preprocessing and initialization
- Before the search starts, two things happen:
- Preprocessing of the problem, including:
  - finding bits that must be constant in any solution
  - removing unnecessary variables
  - simplifying constraints
  - Preprocessing has a big impact on the search: last year it improved performance by a factor of 100
- Initialization: finding the initial state
  - Starting the search from a good state is critical
  - Currently, each constraint tries to initialize its variables to a satisfying assignment while considering the "wishes" of the other constraints

Summary
- Limitations of systematic methods
- Stochastic approach: move between full assignments
- Stocs: learns the topography of the problem; allows user-defined heuristics
- Optimization with Stocs
- Preprocessing and initialization
- Variable types

Thank you