1
Sequential Equivalence Checking Across Arbitrary Design Transformation: Technologies and Applications
Viresh Paruthi, IBM Corporation
J. Baumgartner, H. Mony, R. L. Kanzelman
Formal Methods in Computer-Aided Design, 2006
11/16/2006
2
Outline
- Equivalence Checking Overview
  - Combinational Equivalence Checking (CEC)
  - Sequential Equivalence Checking (SEC)
- Use of SEC within IBM
- IBM's SEC Solution
- SEC Applications
- SEC Challenges
- Conclusion
3
Equivalence Checking
- A technique to check equivalent behavior of two designs (see the toy sketch below)
- Validates that certain design transforms preserve behavior
  - Logic synthesis or manual redesign does not introduce bugs
- Often done formally to save resources, eliminate risk
(Diagram: the same input stream {x0, x1, …} drives Logic 1 (registers R1) and Logic 2 (registers R2); their outputs are compared, expecting a difference stream of {0, 0, …}.)
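The check in the diagram can be illustrated with a toy Python example. The functions design_v1/design_v2 and the parity behavior are made up for illustration; a formal equivalence checker proves the outputs match for every possible input stream, not just one simulated stimulus.

```python
# Toy illustration of the slide's check: drive two versions of a design
# with the same input stream {x0, x1, ...} and confirm their outputs
# never differ (difference stream {0, 0, ...}).

def design_v1(xs):
    """Original design: running parity of the inputs, one output per cycle."""
    state, outs = 0, []
    for x in xs:
        state ^= x
        outs.append(state)
    return outs

def design_v2(xs):
    """Redesigned version intended to behave identically."""
    acc, outs = 0, []
    for x in xs:
        acc = (acc + x) & 1      # parity kept as a sum modulo 2
        outs.append(acc)
    return outs

stimulus = [1, 0, 1, 1, 0, 1]
diffs = [a ^ b for a, b in zip(design_v1(stimulus), design_v2(stimulus))]
assert diffs == [0] * len(stimulus)   # the "{0, 0, ...}?" check from the diagram
```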
4
Combinational Equivalence Checking (CEC)
- No sequential analysis: latches treated as cutpoints
- Equivalence check over outputs + next-state functions (see the sketch below)
- Though NP-complete, CEC is a scalable, mature technology
- CEC is the most prevalent formal verification application
  - Often mandated to validate synthesis
(Diagram: latches R1, R2 are cut into free variables; corresponding outputs and next-state functions of Logic 1 and Logic 2 are XORed, and each XOR is checked to be constant 0.)
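A minimal sketch of this CEC view in Python, assuming a toy pair of next-state functions; the function names and the exhaustive enumeration are illustrative only, as production CEC tools rely on SAT/BDD reasoning.

```python
# CEC view: latches become free variables (cutpoints), and each output /
# next-state function pair is checked for plain Boolean equivalence over
# primary inputs plus cutpoints.
from itertools import product

def next_state_old(a, b, q):      # old design: hypothetical next-state function
    return (a and q) or (b and q)

def next_state_new(a, b, q):      # redesigned version of the same function
    return q and (a or b)

def combinationally_equivalent(f, g, num_vars):
    return all(f(*v) == g(*v) for v in product([False, True], repeat=num_vars))

# a, b are primary inputs; q is the cutpoint introduced for a latch
print(combinationally_equivalent(next_state_old, next_state_new, 3))   # True
```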
5
Sequential Equivalence Checking (SEC)
- Latch cutpointing requirement severely limits CEC applicability
  - Cannot handle retimed designs, state machine re-encoding, ...
  - Cutpointing may cause mismatches in unreachable states
    - Often requires manual introduction of constraints over cutpoints
- SEC overcomes these CEC limitations
  - Supports arbitrary design changes that do not impact I/O behavior
  - Does not require a 1:1 latch or hierarchy correspondence
    - Known mappings can be leveraged to reduce problem complexity
  - Check restricted to reachable states
  - Explores sequential behavior of the design to assess I/O equivalence (see the sketch below)
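The sequential view can be sketched as an explicit product-machine exploration from the two initial states, assuming small toy state machines; the step-function interface is an assumption for illustration, and real SEC works symbolically on far larger designs.

```python
# Compose the two designs, drive them with the same inputs from their
# initial states, and check that outputs agree in every state the
# product machine can reach.
from collections import deque

def check_sequential_equivalence(step_old, step_new, init_old, init_new, input_alphabet):
    """step_*(state, input) -> (next_state, output).  Returns a mismatch
    trace (list of inputs) or None if all reached states agree."""
    seen = {(init_old, init_new)}
    queue = deque([(init_old, init_new, [])])
    while queue:
        s_old, s_new, trace = queue.popleft()
        for x in input_alphabet:
            n_old, out_old = step_old(s_old, x)
            n_new, out_new = step_new(s_new, x)
            if out_old != out_new:
                return trace + [x]                 # counterexample trace
            if (n_old, n_new) not in seen:
                seen.add((n_old, n_new))
                queue.append((n_old, n_new, trace + [x]))
    return None                                    # outputs agree on all reachable states
```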
6
SEC Is Computationally Expensive
- Sequential verification is more complex than combinational
  - Higher complexity class: PSPACE vs. NP
  - Model checking is thus less scalable than CEC
- SEC deals with 2x the model size of model checking!
  - Composite model built over both designs being equivalence checked
- However, tuned algorithms exist to scale SEC better in practice
7
SEC Paradigms
- Various SEC paradigms exist
- Initialized approaches
  - Check equivalent behavior from user-specified initial states
  - Assume that designs can be brought into known reset states
- Uninitialized approaches, e.g. alignability analysis
  - Require designs to share a common reset mechanism
  - Compute the reset mechanism concurrently with checking equivalence from a reset state
8
IBM’s Approach: Initialized SEC
- More flexible: enables checking specific modes of operation
  - Applicable even if initialization logic is altered (or not yet implemented)
  - Applicable even to designs that are not exactly equivalent
    - Pipeline stage added? Check equivalence modulo a 1-clock delay
    - data_out differs when data_valid=0? Check equivalence only when data_valid=1 (see the sketch below)
- More scalable: 1,000s to even 100,000+ state elements
  - Reset mechanism computation adds (needless) complexity
- Validation of the reset mechanism can be done independently
  - Functional verification is performed w.r.t. power-on reset states
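The two "not exactly equivalent" checks above can be sketched over recorded per-cycle traces. The signal names data_out/data_valid come from the slide, while the trace-list format is an assumption; in the tool such conditions would be expressed through checkers and drivers rather than post-processing.

```python
# Sketch of "partial equivalence" checks on per-cycle output traces.

def equal_modulo_delay(out_old, out_new, delay=1):
    """New design adds a pipeline stage: out_new should equal out_old
    shifted by `delay` cycles (the first `delay` cycles are don't-care)."""
    return all(out_new[t + delay] == out_old[t]
               for t in range(len(out_old) - delay))

def equal_when_valid(out_old, out_new, valid):
    """Compare data_out only in cycles where data_valid is asserted."""
    return all(o == n for o, n, v in zip(out_old, out_new, valid) if v)

print(equal_modulo_delay([1, 2, 3, 4], [0, 1, 2, 3]))        # True: matches with a 1-cycle delay
print(equal_when_valid([5, 9, 7], [5, 0, 7], [1, 0, 1]))      # True: differs only when invalid
```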
9
SEC Usage at IBM
- IBM's SEC toolset: SixthSense
  - Developed primarily for custom microprocessor designs
  - Also used by ASICs, and for (semi-)formal functional verification
- Use CEC to validate combinational synthesis
  - Verity is IBM's CEC toolset
  - Also used for other specific purposes, e.g. ECO verification
- Use SEC for pre-synthesis HDL comparisons
  - Sequential optimizations manually reflected in HDL
  - SEC efficiently eliminates the risk of such optimizations
10
SixthSense Horsepower
- SixthSense is a system of cooperating algorithms
  - Transformation engines (simplification/reduction algorithms)
  - Falsification engines
  - Proof engines
- Unique Transformation-Based Verification (TBV) framework
  - Exploits maximal synergy between the various algorithms
    - Retiming, redundancy removal, localization, induction, ...
  - Incrementally chops the problem into simpler sub-problems until solvable
- Transformations yield exponential speedups for bug-finding (semi-formal) as well as proof (formal) applications
11
Transformation-Based Verification (TBV)
(Diagram: Design N is passed through a chain of engines (e.g. Redundancy Removal, Retiming, Target Enlargement), producing successively reduced Designs N', N'', N'''; results are mapped back through the chain as Results N''', N'', N', and finally Result N for the original design.)
12
Transformation-Based Verification Framework
(Diagram: a Design + Driver + Checker model flows through a chain of SixthSense engines (Combinational Optimization, Retiming, Localization (132 registers in the example), then a terminal Reachability engine), each stage reducing the register count; any counter-example is lifted back through the chain (localized trace -> optimized, retimed trace -> optimized trace) into a trace consistent with the original design.)
- Problem decomposition via synergistic transformations
- These transformations are completely transparent to the user
- All results are in terms of the original design
(see the sketch below)
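A schematic Python rendering of that engine chain follows; the (reduce, lift) function-pair interface and the terminal solver are assumptions chosen to show the pattern of reducing the model forward and lifting any counterexample trace back.

```python
# TBV pattern: each engine reduces the model and can lift a trace found
# on its reduced model back onto the model it was given, so the final
# answer is always in terms of the original design.

def tbv_solve(model, engines, terminal_solver):
    """engines: list of (reduce_fn, lift_fn) pairs applied in order;
    terminal_solver(model) returns a counterexample trace or None (proof)."""
    reduced, lifts = model, []
    for reduce_fn, lift_fn in engines:
        reduced = reduce_fn(reduced)
        lifts.append(lift_fn)
    trace = terminal_solver(reduced)
    if trace is not None:
        for lift_fn in reversed(lifts):     # lift the trace back, stage by stage
            trace = lift_fn(trace)
    return trace                            # None = proof; else a trace on the original design

# trivial usage with identity engines and a solver stub that proves the target
identity_engine = (lambda m: m, lambda t: t)
print(tbv_solve("model", [identity_engine, identity_engine], lambda m: None))   # None
```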
13
Example SixthSense Engines
- Combinational rewriting
- Sequential redundancy removal
- Min-area retiming
- Sequential rewriting
- Input reparameterization
- Localization
- Target enlargement
- State-transition folding
- Isomorphic property decomposition
- Unfolding
- Semi-formal search
- Symbolic simulation: SAT + BDDs
- Symbolic reachability
- Induction
- Interpolation
- ...
- Expert System Engine automates experimentation to find an optimal engine sequence
14
Key to Scalability: Assume-then-prove framework
1. Guess redundancy candidates
   - Equivalence classes of gates
2. Create the speculatively-reduced model
   - Add a miter (XOR) over each candidate and its equivalence-class representative
   - Replace fanout references by representatives
3. Attempt to prove each miter unassertable
4. If all miters are proven unassertable, the corresponding gates can be merged
   - Else, refine to separate unproven candidates; go to Step 2
(see the sketch below)
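A schematic rendering of this loop, assuming placeholder routines for candidate guessing, model construction, and miter proof; a real implementation operates on a gate-level netlist and dispatches the proofs to induction and other TBV engines.

```python
# Assume-then-prove loop: guess equivalence classes, build the
# speculatively-reduced model with one miter per candidate, try to prove
# every miter unassertable, and refine the classes on any failure.

def assume_then_prove(gates, guess_classes, build_speculative_model, prove_unassertable):
    classes = guess_classes(gates)                       # step 1: candidate classes
    while True:
        model, miters = build_speculative_model(gates, classes)   # step 2: miters, fanouts replaced
        failed = [(cand, rep) for (cand, rep) in miters
                  if not prove_unassertable(model, cand, rep)]    # step 3
        if not failed:
            return classes                               # step 4: all proven -> merge candidates
        for cand, _rep in failed:                        # refine and go back to step 2
            classes = split_off(classes, cand)

def split_off(classes, gate):
    """Move `gate` out of its current class into a singleton class."""
    remaining = [[g for g in cls if g is not gate] for cls in classes]
    return [cls for cls in remaining if cls] + [[gate]]
```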
15
Assume-then-prove Framework
- Speculative reduction greatly enhances scalability
  - Generalizes CEC
  - Sequential analysis is only needed over the sequentially redesigned logic
- The proof step is the most costly facet
  - Most equivalences are solved by lower-cost algorithms (e.g. induction)
  - However, some equivalences can be very difficult to prove
    - Failure to prove a cutpoint often degrades into an inconclusive SEC run
- Novel SixthSense technology: leverage synergistic algorithms to solve these harder proofs
16
Causes of refinement
- Asserted miter: incorrect candidate guessing
- Resource limitations preclude a proof
  - Induction becomes expensive with depth
  - Approximation weakens the power of reachability
- Refinement weakens the induction hypothesis
  - Immediate separation of candidate gates
  - Avalanche of future resource-gated refinements
- End result?
  - Suboptimal redundancy removal
  - Inconclusive equivalence check
17
SixthSense: Enhanced Redundancy Proofs
- Use of a robust variety of synergistic transformation and verification algorithms
  - Enables the best proof strategy per miter
  - Exponential run-time improvements
    - Greater speed and scalability
    - Greater degree of redundancy identified
- Powerful use of Transformation-Based Verification
  - Synergistically leverage transformations to simplify large problems
  - Reduction in model size and number of distinct miters
  - Transformation alone is sufficient for many proofs
18
Benefits of Transformation-Based Verification
- Reduction in model size and number of distinct miters
  - Useful regardless of proof technique
- Transformations alone are sufficient for many proofs
  - Sub-circuits differing by retiming and resynthesis are solved using polynomial-resource transformations
  - Scales to aggressive design modifications
- Leverage an independent proof strategy on each miter
  - Different algorithms are suited to different problems
  - Entails exponential differences in run-times
19
TBV on Reduced Model
- Methodology restrictions
  - Retiming may render name- and structure-based candidate guessing ineffective
- Synergistic increase in reduction potential
  - TBV flows are more effective after merging
- Applying TBV before + after induction-based redundancy removal is insufficient
  - Need to avoid resource-gated refinement
  - "Exploiting Suspected Redundancy without Proving it", DAC 2005
20
Redundancy Removal Results
- Induction alone is unable to solve all properties
- TBV solves all properties, and faster than induction
21
TBV on speculatively-reduced model
Model-size reductions on the speculatively-reduced model (RET = retiming, LOC = localization, COM = combinational reduction, CUT = reparameterization):

IFU (sizes after successive engines; Initial, COM, LOC, CUT, ...):
- Registers: 33231 → 30362 → 19
- ANDs: 304990 → 276795 → 86 → 76 → 71
- Inputs: 1371 → 1329 → 23 → 10

S6669 (sizes after successive engines; Initial, COM, CUT, RET, ...):
- Registers: 325 → 186 → 138
- ANDs: 3992 → 3067 → 1747 → 2186 → 1833 → 1788
- Inputs: 83 → 61 → 40 → 24
22
Enhanced search without Proofs
- Use miters as filters (see the sketch below)
  - No miter asserted => the search remains within states for which speculative merging is correct
    - i.e., search results are also valid on the original model
  - Miters need not be proven unassertable
    - Enables exploitation of redundancy that holds only for an initial bounded time-frame
- Faster and deeper bounded falsification
- Improved candidate guessing using the speculatively-reduced model
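A sketch of the miters-as-filters idea during bounded search, with the simulation step, signal naming, and stimulus source left as assumptions.

```python
# Bounded search on the speculatively-reduced model: as long as no miter
# asserts, any target hit is also valid on the original model; once a
# miter fires, results from that point on may not carry over.

def filtered_search(sim_step, init_state, miters, targets, max_cycles, next_input):
    """sim_step(state, inp) -> (next_state, values) where `values` maps
    signal names (miters and targets included) to booleans."""
    state = init_state
    for cycle in range(max_cycles):
        state, values = sim_step(state, next_input())
        if any(values[m] for m in miters):
            return ("filtered", cycle)              # speculation violated here; stop this run
        hits = [t for t in targets if values[t]]
        if hits:
            return ("target_hit", cycle, hits)      # valid on the original model as well
    return ("no_hit", max_cycles)
```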
23
Bounded Falsification Results (% improvement)
24
Miter Validation Results (% improvement)
25
SixthSense Sequential Equivalence Checking
(Diagram: SixthSense takes the OLD Design and NEW Design plus optional Initialization Data, Drivers (stimulus), a Black-Box list, Checkers, and a Mapping file; the initialized OLD and NEW designs are driven with the same Inputs and their Outputs compared ("=?"); the result is either a Mismatch Trace or a Proof of Equality.)
26
Running the Sequential Equivalence Check
- Little manual effort to use
- Produces a counterexample showing an output mismatch
  - With respect to the specified initial state(s)
  - Trace is short and has minimal activity, to simply illustrate the mismatch
- Or, proves that no such trace exists
  - Proof of equivalence
- Mandatory inputs: the OLD and NEW versions of the design
27
Running the Seq Equiv Check: Optional Inputs
- Initialization data: equivalence checked w.r.t. the given initial values
- Mapping file
  - Indicates I/O signal renaming/polarities, adds cutpoints, omits checks, …
- Drivers: filter input stimuli to prevent spurious mismatches (see the sketch below)
- Black-Box file: to easily delete components from the design
  - Outputs correlated and driven randomly; inputs correlated and made targets
- Checkers (check equivalence of internal events)
  - Ensure that coverage obtained before the change is still valid after it
  - "Audit" known mismatches to enable meaningful proofs
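A toy sketch of what a driver does, with made-up signal names (cmd, cmd_valid) and a made-up legality rule; the point is only that both designs see the same constrained stimulus, so illegal input combinations cannot produce spurious mismatches.

```python
# Driver sketch: filter raw random stimulus down to legal stimulus before
# it is applied, identically, to the OLD and NEW designs.
import random

def driver(raw):
    """Force cmd to a no-op (0) whenever cmd_valid is low."""
    cmd_valid, cmd = raw
    return (cmd_valid, cmd if cmd_valid else 0)

def constrained_stimulus(cycles, seed=0):
    rng = random.Random(seed)
    return [driver((rng.randint(0, 1), rng.randint(0, 7))) for _ in range(cycles)]

print(constrained_stimulus(4))   # the same legal stream would drive both designs
```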
28
Sequential Equivalence Checking Applications
- Used at block/unit level on multiple projects
  - To verify remaps, retiming, synthesis optimizations, …
  - CEC is inadequate to deal with these changes
- Exposed 100s of unintended mismatches/design errors
- No need to run lengthy regression buckets for lesser coverage
  - SixthSense often provides proofs/bugs in less time
- No need to debug lengthy, cluttered traces
  - SixthSense traces are short, with minimal activity to illustrate the bug
- Quickly finds bugs before faulty logic is released
29
Example SEC Applications
- Timing optimizations: retiming, adding redundant logic, …
- Power optimizations: clock gating, logic minimization, …
- Check specific modes of design behavior
  - Backward-compatibility modes of a redesign preserve functionality
  - A BIST change must not alter functionality
- Verifying RTL vs. higher-level models
- Quantifying late design fixes
  - E.g., constrain SEC to disallow the operations affected by a fix
30
Example Applications: Clock-Gating Verification
- Clock gating disables clocks to certain state elements when they are not required to update
- Approach: equivalence-check the unit against an identical copy of itself (see the sketch below)
  - One with clock-gating enabled, one with it disabled
  - Check that design behavior does not change during the care time-frames
- Leveraged to converge upon an optimal clock-gating solution
  - Iteratively apply SEC to ascertain whether clock-gating a latch alters function
(Diagram: the same input drives a "Unit with Clock Gating Disabled" and a "Unit with Clock Gating Enabled"; their outputs are compared.)
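A toy model of this check, assuming a single register: an ungated register with a data-hold mux versus a clock-gated register that simply holds when the enable is low. Here the two agree in every cycle; in practice the comparison may be restricted to the care time-frames mentioned above.

```python
# Ungated register (hold implemented as a data mux) vs. clock-gated
# register (clock suppressed when en is low): check they compute the same
# next state for every current state and input combination.
from itertools import product

def step_ungated(q, d, en):
    """Always clocked; the mux re-loads the old value when en is low."""
    return (d and en) or (q and not en)

def step_gated(q, d, en):
    """Clock gated off when en is low, so the register holds its value."""
    return d if en else q

assert all(step_ungated(q, d, en) == step_gated(q, d, en)
           for q, d, en in product([False, True], repeat=3))
print("clock-gated and ungated registers are sequentially equivalent")
```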
31
Example Applications: Quantifying a late design fix
- Late bug involving specific commands on the target memory node
  - Fix made with a backwards-compatible "disable" chicken-switch
- Wanted to validate:
  - "Disable" mode truly disables the fix
  - The fix has no impact on other commands or on non-target nodes
- Several quick SixthSense equivalence-check runs performed:
  - With a straightforward comparison, 192/217 outputs mismatched
  - "Disabled" NEW design is equivalent to OLD
  - If configured as a non-target node, NEW is equivalent to OLD
  - If the specific commands are excluded (via a driver), NEW is equivalent to OLD
32
Example Applications: Hierarchical Design Flow
- FPU designed hierarchically
  - Conventional latch-equivalent VHDL (yellow)
  - Simple, abstract cycle-accurate VHDL (green)
  - FPU spec (blue) is a behavioral model
- Verification approach:
  - First, formally verify that the green box is equivalent to its spec using SixthSense (SEC)
  - Next, verify the yellow box equivalent to the green, macro by macro (takes minutes)
  - Finally, verify the schematics using Verity (CEC)
- FPU verification is done completely by formal methods
(Diagram: FPU spec -> [SixthSense] -> High-Level Design -> [SixthSense] -> VHDL (Latch-Equivalent) -> [Verity (CEC)] -> Schematics.)
33
SEC Future Directions: Hierarchical Design Flow
- Enables raising the level of abstraction (ESL)
  - IBM methodology requires an RTL model that is CEC-equivalent to the circuit
    - Allows for verifying self-test logic, asynchronous crossings, scan, …
  - Specification of each macro is precisely captured by the high-level model
    - Allows creativity in designing the optimal circuit for the macro
- Verification can begin without having the entire design ready
  - Verify the high-level macros, then unit/core/chip compositions
  - Verification done in parallel with circuit design; reduces the design + verification cycle
- Formal correctness eliminates the risk of late design changes
  - Efficient automated equivalence proof of high-level vs. circuit-accurate macros
34
SEC Future Directions: Sequential Optimizations
- SEC is an enabler for "safe" sequential synthesis
  - E.g. retiming, addition/deletion of sequential redundancy
  - Opens the door to automated (behavioral) synthesis
    - Results in higher-quality, more optimized designs
  - Enabler for system-level design and verification
- SEC enables sequential optimizations
  - Identify sequential redundancy, unreachable states, …
  - Validate user-specified don't-care conditions
  - Verify "global" optimizations, e.g. FSM re-encoding, clock gating, …
  - Leveraged in diverse areas such as power-gating, fencing, etc.
35
SEC Challenges: Scalability
- SEC has to scale to real-world problems
  - Large design slices, arbitrary transforms, low-level HDL specs, …
- Tighten induction to resolve miters in the speculatively-reduced model
  - TBV attempts to do just that, but further improvements are welcome!
  - Improved proof techniques are critical to improving scalability
- Improved falsification methods to help with candidate guessing
  - Helps distinguish false equivalences to converge faster
- Abstractions to reduce computational complexity
  - Leverage techniques such as uninterpreted functions, blackboxing, …
- Hierarchical proof decomposition
  - Bottom-up approach: blackbox verified portions of the logic and capture constraints at the interfaces
36
SEC Challenges: Combined CEC and SEC
- Leverage mappings of state elements obtained from CEC
  - Take advantage of the wealth of techniques to correspond latches
    - Name-based, structural, functional, scan-based, … (see the sketch below)
  - Used as cutpoints to define a boundary between CEC and SEC
    - Significantly simplifies the SEC problem via correlation hints
    - Refining a cut when a false negative is obtained is a hard problem
  - Automatically propagate constraints across mapped state elements
- Benefits to CEC
  - Improved latch-pair matching via functional analysis
    - Latch-phase determination, functional correspondence, …
  - Apply constraints derived from SEC to simplify problems
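A small sketch of how name-based latch correspondences (one of the matching techniques listed above) could seed the candidate equivalence classes for the assume-then-prove flow; the name-normalization rule and class format are assumptions for illustration.

```python
# Seed candidate equivalence classes from name-based latch matching:
# old/new latches whose normalized names collide start in one class,
# everything else starts as a singleton.
import re

def normalize(name):
    """Strip common register/latch suffixes and case so names can match."""
    return re.sub(r"_(reg|lat|q)\d*$", "", name.lower())

def seed_candidate_classes(old_latches, new_latches):
    buckets = {}
    for latch in old_latches + new_latches:
        buckets.setdefault(normalize(latch), []).append(latch)
    return list(buckets.values())

print(seed_candidate_classes(["cmd_valid_reg", "state_q1"],
                             ["CMD_VALID_lat", "state_q2", "new_mode_reg"]))
# [['cmd_valid_reg', 'CMD_VALID_lat'], ['state_q1', 'state_q2'], ['new_mode_reg']]
```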
37
Conclusion: Sequential Equivalence Checking
- Eliminates risk: SEC is exhaustive, unlike simulation regressions
- Improves design quality: enables aggressive optimizations, even late in the design flow
- Saves resources: obviates lengthy verification regressions
- Generalizes CEC, and improves productivity
- Opens the door to automated sequential synthesis
38
Conclusion: SEC at IBM
- SEC is becoming part of the standard methodology at IBM
  - Pre-synthesis HDL-to-HDL applications
  - CEC closes the gap with the combinational synthesis flow
- IBM's SEC solution is driven by scalability across arbitrary design transforms
  - Hooks for: initial values, interface constraints, "partial equivalence", …
- SixthSense: TBV-powered SEC
  - Leverages a rich set of synergistic algorithms for highly scalable SEC
39
Conclusion: References/Links
- Website (lists SixthSense publications):
- Relevant papers:
  - "Exploiting Suspected Redundancy without Proving it", DAC 2005
  - "Scalable Sequential Equivalence Checking across Arbitrary Design Transformations", ICCD 2006