Sequential Equivalence Checking Across Arbitrary Design Transformation: Technologies and Applications
Viresh Paruthi, J. Baumgartner, H. Mony, R. L. Kanzelman (IBM Corporation, Systems and Technology Group)
Formal Methods in Computer-Aided Design, 11/16/2006
© 2006 IBM Corporation

Outline
Equivalence Checking Overview
–Combinational Equivalence Checking (CEC)
–Sequential Equivalence Checking (SEC)
Use of SEC within IBM
IBM's SEC Solution
SEC Applications
SEC Challenges
Conclusion

Equivalence Checking
A technique to check that two designs exhibit equivalent behavior
Validates that certain design transforms preserve behavior
–Ensures that logic synthesis or manual redesign does not introduce bugs
Often done formally to save resources and eliminate risk
(Figure: designs Logic 1 and Logic 2, with registers R1 and R2, are driven by the same input stream {x0, x1, …}; equivalence holds if the pairwise output differences are always {0, 0, …}.)

Combinational Equivalence Checking (CEC)
No sequential analysis: latches are treated as cutpoints
Equivalence check over outputs and next-state functions (sketched below)
–Though NP-complete, CEC is a scalable, mature technology
CEC is the most prevalent formal verification application
–Often mandated to validate synthesis
(Figure: latches R1 and R2 are cut; corresponding outputs and next-state functions of Logic 1 and Logic 2 are XORed and checked to be constant 0.)

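The check below is a minimal sketch of this latch-cutpointing idea, assuming a toy netlist modeled as Python functions (the names design_a, design_b, and check_cec are illustrative, not Verity's interface). Latches are treated as free variables, and the primary outputs and next-state functions of the two designs are compared as purely combinational functions; a production CEC tool would hand the resulting miter to SAT/BDD engines rather than enumerate all valuations.

```python
from itertools import product

# Toy CEC: latches are cut points, so primary outputs and next-state
# functions are compared as combinational functions of (inputs, latch values).
# Names and the exhaustive enumeration are illustrative only.

def design_a(x, s):
    o = (x[0] & s[0]) | x[1]        # primary output
    n = x[0] ^ x[1]                 # next-state function of the single latch
    return [o], [n]

def design_b(x, s):
    # the same functions written differently (De Morgan / XOR rewrites)
    o = ~(~(x[0] & s[0]) & ~x[1]) & 1
    n = (x[0] | x[1]) & ~(x[0] & x[1]) & 1
    return [o], [n]

def check_cec(f, g, n_inputs, n_latches):
    """Compare outputs and next-state functions over all input/state values.
    A real CEC tool would discharge the pairwise XOR miter with SAT or BDDs."""
    for bits in product([0, 1], repeat=n_inputs + n_latches):
        x, s = list(bits[:n_inputs]), list(bits[n_inputs:])
        if f(x, s) != g(x, s):
            return ("mismatch", x, s)
    return ("combinationally equivalent", None, None)

print(check_cec(design_a, design_b, n_inputs=2, n_latches=1))
```
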
Sequential Equivalence Checking (SEC)
The latch-cutpointing requirement severely limits CEC applicability
–Cannot handle retimed designs, state-machine re-encoding, ...
–Cutpointing may cause mismatches in unreachable states; this often requires manual introduction of constraints over the cutpoints
SEC overcomes these CEC limitations
–Supports arbitrary design changes that do not alter I/O behavior; does not require a 1:1 latch or hierarchy correspondence
–Known mappings can be leveraged to reduce problem complexity
–The check is restricted to reachable states: SEC explores the sequential behavior of the designs to assess I/O equivalence

SEC Is Computationally Expensive
Sequential verification is more complex than combinational verification
–Higher complexity class: PSPACE vs. NP
–Model checking is thus less scalable than CEC
SEC deals with twice the model size of model checking!
–A composite model is built over both designs being equivalence checked (sketched below)
–However, tuned algorithms exist to make SEC scale better in practice

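As a concrete (if tiny) illustration of why SEC is a reachability problem over the composite model, the sketch below steps two toy machines in lockstep from their initial states and checks output equality in every reachable joint state. The two machines, an ordinary binary counter and a Gray-coded re-encoding of it, are hypothetical examples; real tools replace the explicit breadth-first search with symbolic reachability, induction, and the other engines discussed later.

```python
from collections import deque

# Toy SEC over the composite ("product") model: both machines step in
# lockstep, and every reachable joint state must produce identical outputs.

def step_old(state, inp):
    nxt = (state + inp) % 4          # 2-bit binary counter
    return nxt, nxt >> 1             # output = MSB of the next count

def step_new(state, inp):
    gray_to_bin = {0: 0, 1: 1, 3: 2, 2: 3}   # Gray-coded re-encoding
    bin_to_gray = {v: k for k, v in gray_to_bin.items()}
    b = (gray_to_bin[state] + inp) % 4
    return bin_to_gray[b], b >> 1

def sec_reachability(step_a, step_b, init_a, init_b, inputs=(0, 1)):
    seen = {(init_a, init_b)}
    queue = deque(seen)
    while queue:
        sa, sb = queue.popleft()
        for inp in inputs:
            na, oa = step_a(sa, inp)
            nb, ob = step_b(sb, inp)
            if oa != ob:
                return ("mismatch", (sa, sb), inp)
            if (na, nb) not in seen:
                seen.add((na, nb))
                queue.append((na, nb))
    return ("equivalent over reachable states", len(seen))

print(sec_reachability(step_old, step_new, init_a=0, init_b=0))
```

Note that only reachable joint states are constrained: the two encodings may disagree on unreachable states without affecting the verdict.
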
SEC Paradigms
Various SEC paradigms exist
Initialized approaches
–Check equivalent behavior from user-specified initial states
–Assume that the designs can be brought into known reset states
Uninitialized approaches, e.g. alignability analysis
–Require the designs to share a common reset mechanism
–Compute the reset mechanism concurrently with checking equivalence from a reset state

IBM's Approach: Initialized SEC
More flexible:
–Enables checking specific modes of operation
–Applicable even if the initialization logic is altered (or not yet implemented)
–Applicable even to designs that are not exactly equivalent; two such checks are sketched below:
Pipeline stage added? Check equivalence modulo a 1-clock delay
data_out differs when data_valid=0? Check equivalence only when data_valid=1
More scalable: 1,000s to even 100,000+ state elements
–Reset-mechanism computation adds (needless) complexity; validation of the reset mechanism can be done independently
–Functional verification is performed w.r.t. power-on reset states

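The two "partial equivalence" examples above can be pictured with the simulation-style checkers below. The signal names data_valid and data_out follow the slide; the wrapper code itself is only an illustration of the idea, not the tool's input format.

```python
# Sketches of the two "partial equivalence" checks, written as
# simulation-style comparisons over per-cycle traces.

def equivalent_modulo_one_cycle_delay(old_out, new_out):
    """NEW added a pipeline stage: NEW's output at cycle t+1 must match
    OLD's output at cycle t."""
    return all(new_out[t + 1] == old_out[t]
               for t in range(min(len(old_out), len(new_out) - 1)))

def equivalent_when_valid(old_trace, new_trace):
    """data_out only matters when data_valid is high, so mismatches during
    data_valid = 0 cycles are ignored."""
    return all(o_out == n_out
               for (o_valid, o_out), (n_valid, n_out) in zip(old_trace, new_trace)
               if o_valid and n_valid)

# OLD vs. a re-pipelined NEW with one extra cycle of latency
print(equivalent_modulo_one_cycle_delay([1, 2, 3, 4], [0, 1, 2, 3, 4]))   # True

# per-cycle (data_valid, data_out) pairs: values differ only while invalid
old = [(0, 7), (1, 5), (0, 9), (1, 6)]
new = [(0, 0), (1, 5), (0, 0), (1, 6)]
print(equivalent_when_valid(old, new))                                    # True
```
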
SEC Usage at IBM
IBM's SEC toolset: SixthSense
–Developed primarily for custom microprocessor designs
–Also used by ASICs, and for (semi-)formal functional verification
Use CEC to validate combinational synthesis
–Verity is IBM's CEC toolset
–Also used for other specific purposes, e.g. ECO verification
Use SEC for pre-synthesis HDL comparisons
–Sequential optimizations are manually reflected in the HDL
–SEC efficiently eliminates the risk of such optimizations

SixthSense Horsepower
SixthSense is a system of cooperating algorithms
–Transformation engines (simplification/reduction algorithms)
–Falsification engines
–Proof engines
Unique Transformation-Based Verification (TBV) framework
–Exploits maximal synergy between the various algorithms
–Retiming, redundancy removal, localization, induction, ...
–Incrementally chops the problem into simpler sub-problems until solvable
Transformations yield exponential speedups for bug-finding (semi-formal) as well as proof (formal) applications

Transformation-Based Verification (TBV)
(Figure: Design N passes through a chain of engines, e.g. a Redundancy Removal Engine producing Design N', a Retiming Engine producing Design N'', and a Target Enlargement Engine producing Design N'''; results are propagated back through the chain as Result N''', Result N'', Result N', and finally Result N on the original design.)

Transformation-Based Verification Framework
Problem decomposition via synergistic transformations, e.g.:
–Design + Driver + Checker: 140627 registers
–Combinational Optimization Engine: 119147 registers
–Retiming Engine: 100902 registers
–Localization Engine: 132 registers
–Reachability Engine: produces a counterexample on the reduced model
The trace is lifted back through the chain (optimized, retimed, localized trace; optimized, retimed trace; optimized trace) into a counterexample trace consistent with the original design
These transformations are completely transparent to the user; all results are in terms of the original design (a structural sketch of this chaining follows)

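The sketch below captures the structure of this flow under simplifying assumptions: each transformation engine returns a reduced model plus a trace-lifting function, so a counterexample found by a terminal engine can be translated back, step by step, into the coordinates of the original design. The engines here are stand-ins (they only shrink a register count, with the reductions borrowed from the figures above), not SixthSense's actual algorithms.

```python
# Structural sketch of TBV engine chaining with trace lift-back.

def combinational_opt(model):
    reduced = {"name": model["name"] + "+COM", "registers": model["registers"] - 21480}
    def lift(trace):
        return ["COM-lifted:" + step for step in trace]   # stand-in trace translation
    return reduced, lift

def retiming(model):
    reduced = {"name": model["name"] + "+RET", "registers": model["registers"] - 18245}
    def lift(trace):
        return ["RET-lifted:" + step for step in trace]
    return reduced, lift

def reachability(model):
    # terminal engine: pretend it found a counterexample on the reduced model
    return {"status": "counterexample", "trace": ["cycle0", "cycle1"]}

def run_tbv(model, transform_engines, terminal_engine):
    lifts = []
    for engine in transform_engines:
        model, lift = engine(model)
        lifts.append(lift)
    result = terminal_engine(model)
    if result["status"] == "counterexample":
        trace = result["trace"]
        for lift in reversed(lifts):       # undo each transformation in turn
            trace = lift(trace)
        result["trace"] = trace            # now phrased against the original design
    return result

print(run_tbv({"name": "design", "registers": 140627},
              [combinational_opt, retiming], reachability))
```
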
Example SixthSense Engines
Combinational rewriting
Sequential redundancy removal
Min-area retiming
Sequential rewriting
Input reparameterization
Localization
Target enlargement
State-transition folding
Isomorphic property decomposition
Unfolding
Semi-formal search
Symbolic simulation: SAT + BDDs
Symbolic reachability
Induction
Interpolation
…
An Expert System Engine automates experimentation with optimal engine sequences

Key to Scalability: Assume-then-Prove Framework (sketched below)
1. Guess redundancy candidates: equivalence classes of gates
2. Create the speculatively reduced model: add a miter (XOR) over each candidate and its equivalence-class representative, and replace fanout references by the representatives
3. Attempt to prove each miter unassertable
4. If all miters are proven unassertable, the corresponding gates can be merged
5. Else, refine to separate the unproven candidates and go to Step 2

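A toy rendering of this loop for purely combinational signals follows, with the speculative fanout substitution and the iterative refinement of Step 5 collapsed: candidates are guessed by random simulation, each miter is "proved" by exhaustive enumeration, and unproven candidates are simply dropped. Sequential redundancy removal layers initial states and induction on top of this skeleton; all names are illustrative.

```python
import itertools
import random

# Toy assume-then-prove loop over combinational signals.

def guess_classes(signals, n_inputs, rounds=32):
    """Step 1: guess candidate equivalence classes from random simulation."""
    random.seed(0)
    vectors = [[random.randint(0, 1) for _ in range(n_inputs)] for _ in range(rounds)]
    classes = {}
    for name, fn in signals.items():
        signature = tuple(fn(v) for v in vectors)
        classes.setdefault(signature, []).append(name)
    return [cls for cls in classes.values() if len(cls) > 1]

def miter_unassertable(fa, fb, n_inputs):
    """Steps 2-3 stand-in: show the XOR miter of the pair can never assert.
    A real tool proves this on the speculatively reduced model using
    induction, interpolation, etc.; here we just enumerate all inputs."""
    return all(fa(list(v)) == fb(list(v))
               for v in itertools.product([0, 1], repeat=n_inputs))

def assume_then_prove(signals, n_inputs):
    merges = []
    for cls in guess_classes(signals, n_inputs):
        representative = cls[0]
        for candidate in cls[1:]:           # Step 4: proven pairs may be merged
            if miter_unassertable(signals[representative], signals[candidate], n_inputs):
                merges.append((candidate, representative))
    return merges

signals = {
    "a": lambda v: v[0] & v[1],
    "b": lambda v: v[1] & v[0],     # genuinely redundant with "a"
    "c": lambda v: v[0] ^ v[1],
}
print(assume_then_prove(signals, n_inputs=2))     # [('b', 'a')]
```
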
Assume-then-Prove Framework
Speculative reduction greatly enhances scalability
–Generalizes CEC: sequential analysis is only needed over the sequentially redesigned logic
The proof step is the most costly facet
–Most equivalences are solved by lower-cost algorithms (e.g. induction)
–However, some equivalences can be very difficult to prove; failure to prove a cutpoint often degrades into an inconclusive SEC run
Novel SixthSense technology: leverage synergistic algorithms to solve these harder proofs

Causes of Refinement
Asserted miter: incorrect candidate guessing
Resource limitations preclude a proof
–Induction becomes expensive with depth
–Approximation weakens the power of reachability
Refinement weakens the induction hypothesis
–Immediate separation of candidate gates
–Avalanche of future resource-gated refinements
–End result: suboptimal redundancy removal, or an inconclusive equivalence check

SixthSense: Enhanced Redundancy Proofs
Uses a robust variety of synergistic transformation and verification algorithms
–Enables the best proof strategy per miter
Exponential run-time improvements
–Greater speed and scalability
–Greater degree of redundancy identified
Powerful use of Transformation-Based Verification
–Synergistically leverages transformations to simplify large problems
–Reduces model size and the number of distinct miters; transformation alone is sufficient for many proofs

Benefits of Transformation-Based Verification
Reduction in model size and number of distinct miters
–Useful regardless of proof technique
Transformations alone are sufficient for many proofs
–Sub-circuits differing by retiming and resynthesis are solved using polynomial-resource transformations
–Scales to aggressive design modifications
Leverage an independent proof strategy on each miter
–Different algorithms are suited to different problems
–This can entail an exponential difference in run-times

TBV on the Reduced Model
Methodology restrictions
–Retiming may render name- and structure-based candidate guessing ineffective
Synergistic increase in reduction potential
–TBV flows are more effective after merging
–Applying TBV only before and after induction-based redundancy removal is insufficient; need to avoid resource-gated refinement
"Exploiting Suspected Redundancy without Proving It", DAC 2005

Redundancy Removal Results
Induction alone is unable to solve all properties
TBV solves all properties, and faster than induction

TBV on the Speculatively Reduced Model
(Table: register, AND-gate, and input counts for two designs, IFU and S6669, after successive engine applications. Engine legend: RET = retiming, LOC = localization, COM = combinational reduction, CUT = reparameterization.)

Enhanced Search without Proofs
Use miters as filters (sketched below)
–As long as no miter has asserted, the search remains within states for which the speculative merging is correct, i.e., search results are also valid on the original model
–Miters need not be proven unassertable
–Enables exploitation of redundancy that holds only for an initial bounded time-frame
Faster and deeper bounded falsification
Improved candidate guessing using the speculatively reduced model

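The filtering idea can be pictured with the toy random-simulation harness below: states explored on the speculatively reduced model are trusted only while no miter has fired, since up to that point the speculative merges agree with the original design. The reduced model, miters, and targets here are hypothetical stand-ins.

```python
import random

# Toy harness for "miters as filters" during bounded search.

def bounded_search_with_miter_filter(step_reduced, miters, targets,
                                     init_state, depth, n_inputs, runs=20):
    hits = []
    for _ in range(runs):
        state = init_state
        for t in range(depth):
            inputs = [random.randint(0, 1) for _ in range(n_inputs)]
            state = step_reduced(state, inputs)
            if any(m(state, inputs) for m in miters):
                break                         # a merge assumption broke: stop trusting this run
            hit = [name for name, tgt in targets.items() if tgt(state, inputs)]
            if hit:
                hits.append((hit[0], t + 1))  # target hit within a trusted time-frame
                break
    return hits

# toy reduced model: a saturating 3-bit counter driven by one input bit
def step_reduced(state, inputs):
    return min(state + inputs[0], 7)

miters = [lambda state, inputs: state == 7]                        # merge assumed valid only below 7
targets = {"counter_reaches_5": lambda state, inputs: state == 5}

random.seed(1)
print(bounded_search_with_miter_filter(step_reduced, miters, targets,
                                       init_state=0, depth=20, n_inputs=1))
```
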
Bounded Falsification Results (% improvement)

Miter Validation Results (% improvement)

SixthSense Sequential Equivalence Checking
(Figure: SixthSense takes the OLD design, the NEW design, initialization data, a mapping file, drivers (stimulus), a black-box list, and checkers; it builds the initialized OLD and NEW designs, drives them with the same inputs, and compares their outputs. The result is either a mismatch trace or a proof of equality.)

Running the Sequential Equivalence Check
Little manual effort to use
Produces a counterexample showing an output mismatch
–With respect to the specified initial state(s)
–The trace is short, with minimal activity, to simply illustrate the mismatch
Or proves that no such trace exists
–A proof of equivalence
Mandatory inputs:
–The OLD and NEW versions of the design

Running the Sequential Equivalence Check: Optional Inputs
Initialization data: equivalence is checked w.r.t. the given initial values
Mapping file
–Indicates I/O signal renaming/polarities, adds cutpoints, omits checks, ...
Drivers: filter input stimuli to prevent spurious mismatches
Black-box file: easily deletes components from the design
–Outputs are correlated and driven randomly; inputs are correlated and made targets
Checkers (check equivalence of internal events)
–Ensure that coverage obtained before the change is still valid after it
–"Audit" known mismatches to enable meaningful proofs
A conceptual illustration of these inputs follows.

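Conceptually, the optional inputs carry the kinds of information illustrated by the hypothetical data structure below. This is not SixthSense's actual file format or syntax; every file name and signal name is made up purely to show what each input conveys.

```python
# Hypothetical illustration of what the optional SEC inputs convey; not the
# tool's real interface. All names below are invented for this sketch.

sec_job = {
    "old_design": "unit_old.vhdl",
    "new_design": "unit_new.vhdl",
    "initial_values": {"old.mode_reg": 0, "new.mode_reg": 0},    # initialization data
    "mapping": [                                                 # I/O renaming / polarity
        {"old": "data_out(0 to 63)", "new": "dout(0 to 63)", "invert": False},
    ],
    "drivers": [                                                 # constrain input stimulus
        "hold cmd_valid at 0 for the first 2 cycles",
    ],
    "black_boxes": ["old.array_core", "new.array_core"],         # outputs driven randomly,
                                                                 # inputs become targets
    "checkers": [                                                # internal-event equivalence
        {"old": "old.issue_valid", "new": "new.issue_valid"},
    ],
}
print(len(sec_job), "categories of SEC job information")
```
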
Sequential Equivalence Checking Applications
Used at the block/unit level on multiple projects
–To verify remaps, retiming, synthesis optimizations, ...
–CEC is inadequate to deal with these changes
Has exposed hundreds of unintended mismatches/design errors
–No need to run lengthy regression buckets for lesser coverage; SixthSense often provides proofs or bugs in less time
–No need to debug lengthy, cluttered traces: SixthSense traces are short, with minimal activity to illustrate the bug
–Quickly finds bugs before faulty logic is released

Example SEC Applications
Timing optimizations: retiming, adding redundant logic, ...
Power optimizations: clock gating, logic minimization, ...
Checking specific modes of design behavior
–Backward-compatibility modes of a redesign preserve functionality
–A BIST change must not alter functionality
Verifying RTL vs. higher-level models
Quantifying late design fixes
–E.g., constrain SEC to disallow the operations affected by a fix

Example Applications: Clock-Gating Verification
Clock gating disables the clocks to certain state elements when they are not required to update
Approach: equivalence-check two copies of the identical unit (sketched below)
–One with clock gating enabled, one with it disabled
–Check that design behavior does not change during the care time-frames
Leveraged to converge upon an optimal clock-gating solution
–Iteratively apply SEC to ascertain whether clock-gating a latch alters its function
(Figure: the unit with clock gating disabled and the unit with clock gating enabled are fed the same input and compared.)

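A minimal sketch of this check, assuming a toy unit whose single register is only consumed while data_valid is high: the same unit is simulated with gating disabled and enabled, and outputs are compared only in the care time-frames. The unit and signal names are illustrative.

```python
# Toy clock-gating check: compare the gated and ungated copies of the same
# unit, but only in cycles where the register contents are actually used.

def unit(gated, stimulus):
    reg, outputs = 0, []
    for data, valid in stimulus:
        if not gated or valid:          # gated copy only clocks the register
            reg = data                  # when its contents will actually be used
        outputs.append((valid, reg))    # (data_valid, data_out) per cycle
    return outputs

def equivalent_in_care_frames(out_a, out_b):
    # ignore cycles where data_valid is low in the reference run
    return all(a == b for a, b in zip(out_a, out_b) if a[0])

stimulus = [(3, 1), (9, 0), (7, 1), (2, 0)]     # (data, data_valid) per cycle
print(equivalent_in_care_frames(unit(gated=False, stimulus=stimulus),
                                unit(gated=True, stimulus=stimulus)))   # True
```

In the gated run the register holds stale data during the idle cycles, but the comparison ignores those cycles, which is exactly why the gating is safe.
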
Example Applications: Quantifying a Late Design Fix
A late bug involved specific commands on the target memory node
–The fix was made with a backwards-compatible "disable" chicken switch
Wanted to validate that:
–The "disable" mode truly disables the fix
–The fix has no impact on other commands or on non-target nodes
Several quick SixthSense equivalence-check runs were performed:
–With a straightforward comparison, 192/217 outputs mismatched
–The "disabled" NEW design is equivalent to the OLD
–If configured as a non-target node, NEW is equivalent to OLD
–If the specific commands are excluded (via a driver), NEW is equivalent to OLD

Example Applications: Hierarchical Design Flow
The FPU is designed hierarchically
–Conventional latch-equivalent VHDL (yellow)
–Simple, abstract cycle-accurate VHDL (green)
–The FPU spec (blue) is a behavioral model
Verification approach:
–First, the green box is formally verified equivalent to its spec using SixthSense (SEC)
–Next, the yellow box is verified equivalent to the green, macro by macro (takes minutes)
–Finally, the schematics are verified using Verity (CEC)
–FPU verification is done completely formally
(Figure: FPU spec, High-Level Design VHDL, Latch-Equivalent VHDL, and Schematics, linked by SixthSense (SEC) and Verity (CEC) checks.)

SEC Future Directions: Hierarchical Design Flow
Enables raising the level of abstraction (ESL)
–IBM methodology requires an RTL model that is CEC-equivalent to the circuit; this allows verifying self-test logic, asynchronous crossings, scan, ...
–The specification of each macro is precisely captured by the high-level model, which allows creativity in designing the optimal circuit for the macro
Verification can begin without having the entire design ready
–Verify the high-level macros and the unit/core/chip compositions
–Verification is done in parallel with circuit design, reducing the design+verification cycle
Formal correctness eliminates the risk of late design changes
–Efficient automated equivalence proof of high-level vs. circuit-accurate macros

SEC Future Directions: Sequential Optimizations
SEC is an enabler for "safe" sequential synthesis
–E.g. retiming, addition/deletion of sequential redundancy
–Opens the door to automated (behavioral) synthesis, resulting in higher-quality, more optimized designs
–An enabler for system-level design and verification
SEC enables sequential optimizations
–Identify sequential redundancy, unreachable states, ...
–Validate user-specified don't-care conditions
–Verify "global" optimizations, e.g. FSM re-encoding, clock gating, ...
–Leveraged in diverse areas such as power gating, fencing, etc.

SEC Challenges: Scalability
SEC has to scale to real-world problems
–Large design slices, arbitrary transforms, low-level HDL specs, ...
–Tightened induction is needed to resolve miters in the speculatively reduced model; TBV attempts to do just that, but further improvements are welcome!
–Improved proof techniques are critical to improving scalability
–Improved falsification methods help with candidate guessing, distinguishing false equivalences to converge faster
Abstractions to reduce computational complexity
–Leverage techniques such as uninterpreted functions, blackboxing, ...
–Hierarchical proof decomposition: a bottom-up approach that blackboxes verified portions of the logic and captures constraints at the interfaces

SEC Challenges: Combined CEC and SEC
Leverage mappings of state elements obtained from CEC
–Take advantage of the wealth of techniques for corresponding latches: name-based, structural, functional, scan-based, ...
–Use the mappings as cutpoints to define a boundary between CEC and SEC; the correlation hints significantly simplify the SEC problem, though refining a cut when a false negative is obtained is a hard problem
–Automatically propagate constraints across mapped state elements
Benefits to CEC
–Improved latch-pair matching via functional analysis: latch-phase determination, functional correspondence, ...
–Apply constraints derived from SEC to simplify problems

Conclusion: Sequential Equivalence Checking
Eliminates risk:
–SEC is exhaustive, unlike simulation regressions
Improves design quality:
–Enables aggressive optimizations, even late in the design flow
Saves resources:
–Obviates lengthy verification regressions
Generalizes CEC and improves productivity
Opens the door to automated sequential synthesis

Conclusion: SEC at IBM
SEC is becoming part of the standard methodology at IBM
–Pre-synthesis HDL-to-HDL applications
–CEC closes the gap with the combinational synthesis flow
IBM's SEC solution is driven by scalability across arbitrary design transforms
–Hooks for initial values, interface constraints, "partial equivalence", ...
SixthSense: TBV-powered SEC
–Leverages a rich set of synergistic algorithms for highly scalable SEC

Conclusion: References/Links
Website (lists SixthSense publications): www.research.ibm.com/sixthsense
Relevant papers:
–"Exploiting Suspected Redundancy without Proving It", DAC 2005
–"Scalable Sequential Equivalence Checking across Arbitrary Design Transformations", ICCD 2006