
1 SVRH: Non-local stochastic CSP solver with learning of high-level topography characteristics
Yehuda Naveh
Simulation Based Verification Technologies, System Verification and Modeling
IBM Labs in Haifa, © Copyright IBM

2 CSP for simulation-based hardware verification
- CSP is a fundamental technology for Simulation-Based Verification
  - The basis of all our generators: X-Gen, GenesysPro, Piparazzi, DeepTrans, FPgen
- The Generation Core (GEC) provides a tool-box for stimuli generation
  - With focus on CSP: modeling, representation, solution
- Octopus provides an umbrella for non-GEC activities
  - SVRH
  - ...

3 This Presentation
- CSP for test generation (brief)
- The stochastic solver (using the SVRH algorithm)
- Future considerations

4 Constraints in test case generation
- Architecture: the effective address translates into the real address in a complex way.
- Testing knowledge: contention on a cache row.
- Verification task: real address in some corner memory space; effective address aligned to 64K.

Successive narrowing of the huge domains (2^64), with random uniformity as the quality measure of the solution:
  EA: 0x????????_????????   RA: 0x????????_????????
  EA: 0x????????_????0000   RA: 0x0002FF??_????????
  EA: 0x????????_????0000   RA: 0x0002FF??_??A4D???
  EA: 0x0B274FAB_0DBC0000   RA: 0x0002FFC5_90A4D000
A degenerate assignment such as EA: 0x0002FF00_00000000, RA: 0x0002FF00_00000000 is correct, but worthless! (e.g., Dechter et al., 2002)
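The partially decided addresses above (some hex digits fixed, others still `?`) can be modeled as value/mask pairs. The following is a hypothetical sketch, not the actual generator code; all names and the specific constraint values are illustrative.

```python
import random

# A partially decided 64-bit value as a (bits, mask) pair: mask has 1s on
# the positions that are already decided. Illustrative sketch only.
M64 = (1 << 64) - 1

def intersect(v1, m1, v2, m2):
    """Combine two partial assignments of one variable; None on conflict."""
    if (v1 ^ v2) & m1 & m2:          # decided bits disagree -> inconsistent
        return None
    return (v1 & m1) | (v2 & m2), m1 | m2

def sample(v, m):
    """Complete the undecided bits uniformly at random."""
    return (v & m) | (random.getrandbits(64) & ~m & M64)

# "Aligned to 64K": the low 16 bits are decided to be 0.
align = (0x0, 0xFFFF)
# Corner-space requirement on the same variable: high bits fixed to 0x0002FF.
corner = (0x0002FF0000000000, 0xFFFFFF0000000000)

ea = intersect(*align, *corner)
```

Sampling the remaining `?` digits uniformly is what gives the random-uniformity quality the slide asks for, instead of the degenerate all-zeros completion.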

5 Systematic Methods (DPLL, MAC, k-consistency, ...)
[Constraint network: user constraint "Aligned EA" on EA, user constraint "Special RA's" on RA, EA-RA translation linking them; inconsistent values crossed out]
1. Reach a level of consistency through successive reductions of sets
2. Choose an assignment for a variable, and maintain the consistency
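The two steps can be illustrated on tiny explicit domains (a sketch only: the real generators work on huge set representations, and the alignment and translation constraints here are stand-ins).

```python
import random

def revise(dom_x, dom_y, rel):
    """Step 1 ingredient: drop values of x with no support in y under rel."""
    return {x for x in dom_x if any(rel(x, y) for y in dom_y)}

# Toy domains, alignment to 64 instead of 64K, translation RA = EA + 0x100.
ea = {0, 40, 64, 128, 200}
ra = {0x100, 0x123, 0x140, 0x180}

ea = {e for e in ea if e % 64 == 0}                    # user: aligned EA
ea = revise(ea, ra, lambda e, r: r == e + 0x100)       # step 1: consistency
ra = revise(ra, ea, lambda r, e: r == e + 0x100)

e = random.choice(sorted(ea))                          # step 2: assign...
ra = revise(ra, {e}, lambda r, x: r == x + 0x100)      # ...and re-propagate
```

After step 1 every remaining value has support; after step 2 the translation constraint pins RA to the single value consistent with the chosen EA.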

6 Systematic Methods
[Same constraint network: "Aligned EA", "Special RA's", EA-RA translation; further values crossed out as the domains shrink]

7 Limitations of Systematic Methods: An example
[Example problem with only one solution]
Local consistency at onset: choose randomly, with probability 1/N of being correct (solution reached after 600 million years)

8 Limitations of Systematic Methods: Another example
Already a single reduction of the domains is hard.

9 Stochastic approaches: defining the metrics
- State: a tuple representing a single assignment to each variable
  - (a, b, c) out of 2^(64x3) states
- Cost: a function from the set of states to {0} ∪ R+
  - Cost = 0 iff all constraints are satisfied by the state
- Examples: Simulated Annealing, GSAT, ...
  All are local search.
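A minimal sketch of this cost metric (the variable names and the two example constraints are illustrative, not from the talk): the cost sums a non-negative violation measure per constraint and reaches 0 exactly when all constraints hold.

```python
# A state assigns a concrete value to every variable; each constraint maps
# a state to a non-negative violation measure (0 = satisfied).
def cost(state, constraints):
    return sum(c(state) for c in constraints)

# Example: three small integer variables with constraints a == b and b <= c.
constraints = [
    lambda s: bin(s["a"] ^ s["b"]).count("1"),   # Hamming distance for a == b
    lambda s: max(0, s["b"] - s["c"]),           # shortfall for b <= c
]
state = {"a": 0b1010, "b": 0b1010, "c": 0b1111}
assert cost(state, constraints) == 0             # all constraints satisfied
```

Any state with cost 0 is a solution; the solver's job is to drive this function to 0.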

10 SVRH: Simulated variable range hopping
- Non-local search
  - If a state is represented by bits, many bits may be flipped in a single step
  - Check states on all length-scales in the problem; hop to a new state if better
- Learn the topography
- Get domain knowledge as input strategies
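The non-local hop can be sketched as follows. This is an illustrative toy, not the production SVRH: it draws a length scale, flips a random contiguous window of bits of that size, and hops only if the cost improves.

```python
import random

NBITS = 64

def hop(state, cost):
    """One variable-range hop: flip a window at a random length scale."""
    scale = 1 << random.randrange(NBITS.bit_length())   # 1, 2, 4, ..., 64 bits
    start = random.randrange(NBITS)
    mask = (((1 << scale) - 1) << start) & ((1 << NBITS) - 1)
    candidate = state ^ mask
    return candidate if cost(candidate) < cost(state) else state

# Toy cost: Hamming distance to a hidden target assignment.
target = 0x0123456789ABCDEF
cost = lambda s: bin(s ^ target).count("1")

random.seed(0)
state = 0
for _ in range(20000):
    state = hop(state, cost)
```

Because the window size ranges over all length scales, the search can cross barriers that single-bit (purely local) moves cannot.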

11 SVRH: Learning
- Cost normalization
- Topography parameters
  - Length scales
  - Preferred directions (rigidity, correlations between variables)
- Decision-making parameters ("learning the heuristics")
  - Should the next hop be domain-knowledge-based or stochastic?
  - Should it rely on step-size learning?
  - Should it rely on correlation learning?
Never abandon totally stochastic trials!

12 SVRH: Domain-knowledge strategies
Variables: a, b, c : 128-bit integers
Constraint: a = b = c
a = …0011100101110100110101000101001010100101010
b = …0101001010101010010110111101010100010100010
c = …1100001010100101010000011111010101001010111
Strategy: check all bits, one at a time. Solution guaranteed in 384 steps.

13 SVRH: Domain-knowledge strategies
Variables: a, b, c : 128-bit integers
Constraint: a = b = c
a = …0011100101110100110101000101001010100100010
b = …0101001010101010010110111101010100010100010
c = …1100001010100101010000011111010101001010111
Strategy: check all bits, one at a time. Solution guaranteed in 384 steps.

14 SVRH: Domain-knowledge strategies
Variables: a, b, c : 128-bit integers
Constraint: a = b = c
a = …0011100101110100110101000101001010000100010
b = …0101001010101010010110111101010100010100010
c = …1100001010100101010000011111010101001010111
Strategy: check all bits, one at a time. Solution guaranteed in 384 steps.
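The one-bit-at-a-time strategy on these three slides can be sketched directly (illustrative code; the cost is total Hamming disagreement for a = b = c). With 3 x 128 bits there are at most 384 single-bit trials, matching the slide's bound.

```python
BITS = 128

def cost(a, b, c):
    """Total Hamming disagreement for the constraint a = b = c."""
    return bin(a ^ b).count("1") + bin(b ^ c).count("1")

def solve(a, b, c):
    """Try flipping each bit of each variable once; keep improving flips."""
    for var in range(3):          # a, then b, then c
        for bit in range(BITS):   # 3 * 128 = 384 trials in total
            vals = [a, b, c]
            vals[var] ^= 1 << bit
            if cost(*vals) < cost(a, b, c):
                a, b, c = vals
    return a, b, c
```

The pass over a's bits makes a agree with b everywhere (each such flip lowers the cost by 1), and the pass over c's bits then makes c agree with b, so 384 trials always suffice.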

15 SVRH: Domain-knowledge strategies
Variables: a, b, c : 128-bit integers
Constraints: c = (a * b) msb, number of 1's in c = 5
a = …0011100101110100110101000101001010100101010
b = …0101001010101010010110111101010100010100010
c = …0100000000100000010000000010001001000000000
Strategy: flip simultaneously all permutations of adjacent bits
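A sketch of this adjacent-bits strategy (illustrative, not the production engine): candidate attempts flip pairs of adjacent bits together, so that where the two bits differ, a set bit slides one position over and the number of 1's is preserved, which single-bit flips would break under the popcount constraint.

```python
def adjacent_pair_moves(x, nbits):
    """All states reachable by flipping one pair of adjacent bits."""
    return [x ^ (0b11 << i) for i in range(nbits - 1)]

def best_move(x, nbits, cost):
    """Greedy step: take the lowest-cost adjacent-pair flip."""
    return min(adjacent_pair_moves(x, nbits), key=cost)

# Flipping bits 1 and 2 of 0b0100 slides the single set bit one place right:
assert 0b0100 ^ (0b11 << 1) == 0b0010
```

In a full solver these moves would be one user-provided strategy among many, tried when the engine's decision algorithm selects it.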

16 SVRH: Decision algorithm
Decide type ('random', 'learned', 'user-defined') according to previous successes
If 'random':
    choose a random step-size
    create a random attempt with the chosen step-size
If 'learned':
    decide learn-type ('step-size', 'direction', ...) according to previous successes
    if 'step-size':
        choose a step-size which was previously successful (weighted)
        create a random attempt with the chosen step-size
    if 'direction':
        choose a direction which was previously successful (weighted)
        create a random attempt with the chosen direction
    etc.
If 'user-defined':
    get the next user-defined attempt
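One plausible way to realize "according to previous successes" is a success-weighted draw. This is a hypothetical sketch (the class and its +1 smoothing are assumptions, not the talk's implementation); the +1 keeps every option alive, in the spirit of "never abandon totally stochastic trials".

```python
import random

class Chooser:
    """Draw an attempt type with probability proportional to past successes."""
    def __init__(self, options):
        self.successes = {opt: 0 for opt in options}

    def choose(self):
        opts = list(self.successes)
        weights = [self.successes[o] + 1 for o in opts]  # +1: never zero
        return random.choices(opts, weights=weights)[0]

    def reward(self, option):
        """Call when an attempt of this type lowered the cost."""
        self.successes[option] += 1

chooser = Chooser(["random", "learned", "user-defined"])
kind = chooser.choose()
```

The same mechanism can be nested for the learn-type decision ('step-size' vs. 'direction') and for the weighted choice of a particular step-size or direction.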

17 SVRH: Usage
- Variables: a set of bits
- Constraints: cost function
- Suggested initialization: random but consistent
- Remove fully constrained bits from the problem
- Domain-knowledge strategies
  - User provides sets of bits to be flipped; the engine decides when to use them
- Commonly used constraints and strategies are provided as building blocks
See also: Nareyek '01; Michel and Van Hentenryck '02, '03.

18 Results: toy example
[Same example as before, with only one solution]
Local consistency at onset: choose randomly, with probability 1/N of being correct (solution reached after 600 million years)
SVRH: solution reached in less than a second (similarly for N = 2^128)

19 Results: FP-Gen
Comparison with ZChaff for the floating-point multiply benchmark (133 solvable tasks):

                          ZChaff                  SVRH
  Max length              64 bit                  128 bit
  Average time            200.5 sec               0.97 sec
  Best ratio              2861 sec                0.3 sec
  Worst ratio             25 sec                  5.7 sec
  Quality (extreme case)  0p0=0x43F0000000000000  0p0=0x070E342575271FFA
                          0p1=0xB180000000000000  0p1=0x9560F399ECF4E191
  Reports UNSAT           Yes                     No

20 Results: FP-Gen
Comparison with all engines for the all-instructions benchmark (150 tasks):

                                      Lion  ZChaff  Mixed  SVRH
  Number of successes                 29    59      88     93
  Average time per success (seconds)  0.36539

21 Results: Low Autocorrelation Binary Sequence (LABS)
- Hard combinatorial problem, well-known optimization benchmark
- Objective: minimize the autocorrelation on a string of N 1's and -1's
- CLS is a special-purpose algorithm designed for this problem

  N   CLS backtracks (x 10^6)  SVRH attempts (x 10^6)  CLS hours  SVRH hours
  45  368                      136                     14.70      1.32
  46  35                       179                     1.44       1.74
  47  165                      242                     7.30       2.35
  48  78                       -                       3.36       > 5.00
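The LABS objective being minimized above can be written out concretely: for a sequence s in {+1,-1}^N, the energy is the sum of squared aperiodic autocorrelations over all nonzero lags.

```python
# E(s) = sum over lags k >= 1 of C_k^2, with C_k = sum_i s[i] * s[i+k]
# (the aperiodic autocorrelation at lag k).
def labs_energy(s):
    n = len(s)
    return sum(sum(s[i] * s[i + k] for i in range(n - k)) ** 2
               for k in range(1, n))

# The length-7 Barker sequence: every off-peak correlation is 0 or -1,
# giving the optimal energy 3 for N = 7.
barker7 = [1, 1, 1, -1, -1, 1, -1]
assert labs_energy(barker7) == 3
```

Each SVRH "attempt" in the table is a candidate state evaluated against this energy; the hardness comes from the energy landscape being riddled with deep local minima.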

22 Future
- Basic
  - Input language
  - Static learning
  - More on next slide
- Test generation
  - Piparazzi
  - G-Pro
  - X-Gen
- Formal verification
  - Bounded model checking, SAT
  - FormalSim

23 Future – more basic aspects
- Pruning steps in an overall SVRH (stochastic) framework
  - Hybrid algorithms: extensive literature
- Report UNSAT (the holy grail)
  - First experiments on SAT: somewhat encouraging, but still in their infancy
- More sophisticated learning
  - Correlations (Bayesian?), dynamics, other...
  - Nothing done yet; cannot predict usefulness
