The Goldilocks Problem Tudor Hulubei Eugene C. Freuder Department of Computer Science University of New Hampshire Sponsor: Oracle.

Introduction “So away upstairs she went to the bedroom, and there she saw three beds. There was a very big bed for Father bear, but it was far too high. The middle-sized bed for Mother bear was better, but too soft. She went to the teeny, weeny bed of Baby bear and it was just right.” -- Goldilocks and the Three Bears

Introduction (continued) Much of the work in constraint satisfaction has focused on finding solutions to hard problems. Many problems of practical interest are not difficult, but have a huge number of solutions, few of which are acceptable from a practical standpoint. We propose a value ordering heuristic that guides the search towards acceptable solutions.

Motivation and Applications Provide vendors with tools that can be used to suggest upgrades (upselling) and alternative solutions to customers. This can be achieved by combining a matchmaker or deep-interview strategy with our heuristic that guides the search towards acceptable solutions. Provide a way for vendors to promote particular configurations, implement company policies, etc.

Example Variables v1, v2, v3, with constraints c12 (between v1 and v2) and c13 (between v1 and v3): v1={0=0.2, 1=0.8} v2={1=0.1, 2=0.7} v3={-1=0.8, 4=0.9} c12={(0,1)=0.1, (1,1)=0.7, (1,2)=0.9} c13={(0,-1)=0.2, (0,4)=0.3, (1,-1)=0.5} Solution: v1=1, v2=1, v3=-1 SW=(0.8+0.1+0.8)+(0.7+0.5)=2.9 MinSW=(0.2+0.1+0.8)+(0.1+0.2)=1.4 MaxSW=(0.8+0.7+0.9)+(0.9+0.5)=3.8

Definitions Weights: –Represent “goodness”. –Are associated with: every value in the domain of a variable, and every pair of allowed values in a constraint. Solution Weight (SW): –Defined as the sum, over all variables and constraints, of the weights associated with the values and pairs of values involved. –Lower and upper limits: MinSW (computed as the sum of the minimum weights of the values and pairs of values in all the variables and constraints). MaxSW (computed as the sum of the maximum weights of the values and pairs of values in all the variables and constraints).
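These definitions translate directly into code. The sketch below is illustrative: the dictionary-based encoding (per-variable value weights, per-constraint pair weights) and the function names are our assumptions, not something the slides prescribe.

```python
# Sketch of the weight definitions for a weighted binary CSP.
# Encoding (assumed): value_weights[var][val] -> weight of a value,
# pair_weights[(var_a, var_b)][(val_a, val_b)] -> weight of an allowed pair.

def solution_weight(assignment, value_weights, pair_weights):
    """SW: sum of the weights of the assigned values plus the weights of
    the value pairs selected in each constraint."""
    sw = sum(value_weights[var][val] for var, val in assignment.items())
    for (a, b), pairs in pair_weights.items():
        sw += pairs[(assignment[a], assignment[b])]
    return sw

def weight_bounds(value_weights, pair_weights):
    """MinSW / MaxSW: sums of the per-variable and per-constraint extremes."""
    all_weights = list(value_weights.values()) + list(pair_weights.values())
    min_sw = sum(min(w.values()) for w in all_weights)
    max_sw = sum(max(w.values()) for w in all_weights)
    return min_sw, max_sw
```

On the example problem (v1=1, v2=1, v3=-1), this yields SW=2.9 with bounds MinSW=1.4 and MaxSW=3.8.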

Definitions (continued) Acceptable Solutions: –Given two positive real numbers MinASW and MaxASW (the minimum/maximum acceptable solution weight) s.t.: MinSW ≤ MinASW ≤ MaxASW ≤ MaxSW, a solution is considered acceptable if: MinASW ≤ SW ≤ MaxASW. –The ideal solution weight (IdealSW) is defined as the center of the [MinASW, MaxASW] range. Active Constraints: –During the search, a constraint is considered “active” if it involves at least one variable that has not yet been instantiated.
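A minimal sketch of the acceptability test and IdealSW (function names are ours, for illustration only):

```python
def ideal_sw(min_asw, max_asw):
    """IdealSW: the center of the acceptable range [MinASW, MaxASW]."""
    return (min_asw + max_asw) / 2.0

def is_acceptable(sw, min_asw, max_asw):
    """A solution is acceptable iff MinASW <= SW <= MaxASW."""
    return min_asw <= sw <= max_asw
```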

The acceptable-weight Heuristic One way of looking for acceptable solutions is to traverse the entire search space and stop at the first solution whose weight is acceptable. We can do better than that: we can guide the search towards solutions with acceptable weights, using a special value ordering heuristic called “acceptable-weight”. The heuristic dynamically computes the average individual weight (AIW) that the remaining variables and constraints would have to contribute to the global solution weight for it to equal IdealSW: AIW = (IdealSW - SW(SolvedSubproblem)) / (number of uninstantiated variables + number of active constraints)
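The AIW formula above can be sketched directly; the signature is illustrative:

```python
def average_individual_weight(ideal_sw, sw_solved, n_uninstantiated_vars,
                              n_active_constraints):
    """AIW: the average weight that each remaining (uninstantiated) variable
    and each active constraint would have to contribute for the final
    solution weight to land exactly on IdealSW."""
    return (ideal_sw - sw_solved) / (n_uninstantiated_vars + n_active_constraints)
```

For instance, with IdealSW''=2.3, a solved-subproblem weight of 0.7, one uninstantiated variable, and three active constraints, AIW is (2.3 - 0.7) / 4 = 0.4.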

The acceptable-weight Heuristic (continued) The acceptable-weight heuristic selects the value that minimizes the absolute difference between SW” and IdealSW”. After the assignment of v4, AIW is recomputed, to compensate for the amount by which we were off compared to IdealSW”. Past variables: v1 [1], v2 [6], v3 [4]. Current variable: v4 [1,5]. Future variables: v5 [1,2], v6 [2,3]. v1={1=0.1}, v2={6=0.2}, v3={4=0.4}, v4={1=0.1, 5=0.8} c14={(1,1)=0.1, (1,5)=0.2} c24={(6,1)=0.8, (6,5)=0.9} c34={(4,1)=0.6, (4,5)=0.7} P”={{v1,v2,v3}, {c14,c24,c34}}. AIW=0.4 ⇒ IdealSW”=(0.1+0.2+0.4)+AIW+3*AIW=2.3 Options for v4: v4=1: SW”=(0.1+0.2+0.4+0.1)+(0.1+0.8+0.6)=2.3 v4=5: SW”=(0.1+0.2+0.4+0.8)+(0.2+0.9+0.7)=3.3 ⇒ The heuristic selects v4=1. Note: the future constraints c45 and c46 are active, but v5 and v6 are not considered.
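The selection step on this slide can be sketched as follows. The data layout and function names are assumptions; the weights are those of the v4 example, with AIW=0.4 taken as given rather than recomputed from the global problem.

```python
def choose_value(domain, sw_past, past_pair_weights, aiw):
    """acceptable-weight value selection: pick the value whose local
    subproblem weight SW'' is closest to
    IdealSW'' = SW(past) + AIW * (1 + number of constraints to past vars)."""
    n_links = len(past_pair_weights)
    ideal_local = sw_past + aiw * (1 + n_links)

    def sw_local(val):
        # past value weights + candidate value weight
        # + the weights of the constraints linking the current variable
        #   to the already-instantiated variables
        return sw_past + domain[val] + sum(p[val] for p in past_pair_weights)

    return min(domain, key=lambda val: abs(sw_local(val) - ideal_local))
```

With sw_past = 0.1+0.2+0.4, the v4 domain weights {1: 0.1, 5: 0.8}, the c14/c24/c34 pair weights restricted to v4's values, and aiw = 0.4, this returns 1, matching the slide's choice of v4=1.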

The acceptable-weight Heuristic (continued) The strategy behind the acceptable-weight heuristic is two-fold: –Locally, we try to make sure that each subproblem centered around the current variable has a weight that is in line with the global IdealSW. –Globally, by constantly adjusting the AIW we try to control the overall deviation of the solution weight.

Experimental Results Variables: 100, Domain size: 5, Density: 0.00, Tightness: 0.00, Range size: 0.05 (charts: MAC range vs. MAC+acceptable-weight range). Variables: 100, Domain size: 5, Density: 0.00, Tightness: 0.25, Range size: 0.05 (charts: MAC range vs. MAC+acceptable-weight range).

Experimental Results (continued) Variables: 100, Domain size: 5, Density: 0.00, Tightness: 0.25, Range size: 0.1 (charts: MAC range vs. MAC+acceptable-weight range). Variables: 100, Domain size: 5, Density: , Tightness: 0.25, Range size: 0.1 (charts: MAC range vs. MAC+acceptable-weight range).

Experimental Results (continued) Variables: 100, Domain size: 5, Density: , Tightness: 0.25, Range size: 0.1 (chart: MAC range vs. MAC+acceptable-weight range).

Experimental Results (continued) Variables: 100 Domain size: 5 Difficulty peak for tightness=0.25 is at density=

Future Work Potential Improvements: –Look ahead and skip subtrees of the search space where an acceptable weight is not achievable. –Consider future variables when estimating the solution weight of the local subproblem centered around the current variable. –Avoid getting stuck in an interval that the heuristic does not seem capable of escaping.
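The first improvement (look-ahead pruning) could be sketched as a simple interval-overlap test, assuming MinSW/MaxSW-style bounds are maintained for the remaining, uninstantiated part of the problem (names and signature are ours):

```python
def subtree_may_contain_acceptable(sw_so_far, min_remaining, max_remaining,
                                   min_asw, max_asw):
    """Look-ahead bound: a subtree can still yield an acceptable solution
    only if the achievable weight interval
    [sw_so_far + min_remaining, sw_so_far + max_remaining]
    overlaps the acceptable range [MinASW, MaxASW]; otherwise it can be skipped."""
    return (sw_so_far + min_remaining <= max_asw and
            sw_so_far + max_remaining >= min_asw)
```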

Conclusions The acceptable-weight heuristic presented here is designed to guide the search towards solutions with acceptable weights, when solutions can be ranked along a given dimension. Experiments show that MAC+acceptable-weight finds acceptable solutions very quickly, much faster than MAC alone. Even around the difficulty peak, MAC+acceptable-weight performs well, running slightly faster than MAC with very similar coverage.