Simulation Management

- Pass or Fail?
- Managing Simulations
- Regression
- Behavioral Models

Pass or Fail?
- The goal of a testcase is to determine whether the DUV passes or fails given a certain stimulus. What is the determining factor for a pass?
- You need proof-positive of a successful simulation: include a termination message in the output log file. If the message is not present, assume failure.
- Guard against false-positive situations where the testbench does not detect certain conditions:
  - Provide error injection to ensure they are caught.
  - Provide logging messages for all activity (this can be verbose).
  - Use bracketing messages around regions of injected errors.
- Keep track of successes and failures using a common log package (sketched below); it makes end-of-test determination easy!
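
A minimal sketch of such a common log package in Verilog; the module and task names are illustrative, not from the original:

    // log_pkg.v -- shared logging module (hypothetical names)
    module log_pkg;
      integer error_count = 0;

      // Every checker in the testbench records failures through this task.
      task log_error(input [8*64-1:0] msg);
        begin
          error_count = error_count + 1;
          $display("[%0t] ERROR: %0s", $time, msg);
        end
      endtask

      // Called once at the end of the test. The termination message below
      // is the proof-positive the log scan looks for; no message => failure.
      task end_of_test;
        begin
          if (error_count == 0)
            $display("[%0t] TEST PASSED: simulation completed", $time);
          else
            $display("[%0t] TEST FAILED: %0d error(s) logged", $time,
                     error_count);
          $finish;
        end
      endtask
    endmodule

Checkers call log_pkg.log_error(...) hierarchically, and the testcase's final step calls log_pkg.end_of_test.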

Managing Simulations
Are you simulating the right model?
- Configuration Management
- Verilog Configuration Management
- VHDL Configuration Management

Configuration Management
- A configuration is the set of models used in a simulation.
- It is different from source management (revision control): revision control deals with source files, while configuration deals with which models you are using (behavioral vs. RTL).
- System-level tests may use a mix of both.
- You want an easy way to specify a particular configuration; a script used to submit runs can provide it.

Verilog Configuration Management
There are many ways to include source files:
- On the command line
- A file containing a list of filenames (the -f option; called a manifest)
- A directory to search for missing module name(s)
- The name of a file that may contain definitions of missing modules (the -v option)
- `include directives, resolved via the +incdir command-line option

Verilog Configuration Management (Cont)
- The manifest is the only method that can be source-controlled and reliably reproduced (see the example below).
- It is not constrained to just filenames; it can include all required command-line options.
- Manifests can be hierarchical: one manifest can reference another.
- Use relative path names, not absolute ones. This assumes everyone on the project has a similar setup.
- Some simulators have a -F option that prepends the manifest file's own path information to the relative names; the alternative is a preprocessing script that does the same.
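
A hypothetical manifest illustrating relative paths, embedded options, and hierarchy; the file and directory names are invented, and the comment syntax is that of common simulators:

    // run.f -- top-level manifest, kept under revision control
    +incdir+../rtl/include
    +define+USE_BEHAVIORAL_CACHE   // select the behavioral cache model
    -f ../rtl/core/core.f          // hierarchical: pulls in a sub-manifest
    ../tb/testbench.v
    ../models/cache_beh.v

The whole configuration is then reproduced with a single option, e.g. an invocation of the form simulator -f run.f.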

VHDL Configuration Management
VHDL is compiled (Verilog can be compiled or interpreted), so how do you know what you are simulating?
- Makefiles
- Reporting metrics
- Configuration units

VHDL Configuration Management (Cont)
Makefiles
- The most effective way (see the sketch below).
- If a compiled unit is found to be older than its dependencies, it is recompiled.
- Can be invoked from the submission scripts for regressions, ensuring everything is up to date.
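
A minimal sketch of the idea; vcom is the compiler of ModelSim-style simulators, the VHDL file names are invented, and the touched timestamp files stand in for the compiled library units so make can compare dates (recipe lines begin with a tab):

    WORK = work

    all: $(WORK)/testbench.done

    $(WORK)/cpu.done: cpu_entity.vhd cpu_rtl.vhd
    	vcom -work $(WORK) cpu_entity.vhd cpu_rtl.vhd
    	touch $@

    $(WORK)/testbench.done: testbench.vhd $(WORK)/cpu.done
    	vcom -work $(WORK) testbench.vhd
    	touch $@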

VHDL Configuration Management (Cont)
Reporting Metrics
- The environment should report the name (and version) of the files in use.
- Report from within the testcase run using assert statements, or from a makefile.
Configuration Units
- Use configuration declarations, which bind architectures to entities (see the sketch below).
- Each testcase uses whichever configuration it needs.
- Use a configuration of configurations at the top level.
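
A small sketch of a configuration declaration; the entity, architecture, and instance names are invented. Swapping "behavioral" for "rtl" selects the other model without touching the testbench:

    -- Bind the behavioral architecture of cpu into the testbench.
    configuration tb_cpu_behavioral of testbench is
      for bench                    -- architecture of the testbench
        for dut : cpu              -- component instance to bind
          use entity work.cpu(behavioral);
        end for;
      end for;
    end configuration tb_cpu_behavioral;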

Output File Management
- Simulations create output files: a log file and a wave file.
- When running massively parallel jobs, file-name collisions are a problem, especially if hard-coded names are used.
- You want the ability to create unique names. Scripts can create the names, utilizing the simulator's command-line options.
- Use Verilog/VHDL conventions to pass the names in:
  - Verilog: use the manifest file and parameters set by a script (a plusargs alternative is sketched below).
  - VHDL: use generics and pass in values from a script.
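
In Verilog, $value$plusargs offers another way to receive a script-generated name; the +logfile convention below is invented for illustration:

    module logfile_ctrl;
      reg [8*128-1:0] fname;
      integer fd;

      initial begin
        // e.g. simv +logfile=run_0042.log, where the submission script
        // generates a unique name per job
        if (!$value$plusargs("logfile=%s", fname))
          fname = "default.log";    // fall back to a fixed name
        fd = $fopen(fname);         // wave files can be named the same way
        $fdisplay(fd, "log opened at time %0t", $time);
      end
    endmodule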

Regression
A regression suite ensures that modifications to the design remain backward compatible with previously verified functionality.
- Running Regressions
- Regression Management

Running Regressions
- Regressions must be run at regular intervals, typically nightly.
- Testcases are added to a master list called the regression suite.
- If the suite is too large to run overnight, it can be split up in different ways:
  - Keep two lists: one run nightly, and one run over the weekend (which includes the nightly run).
  - Include a "fast mode" (see the sketch below) that pre-configures things or enables only certain functions in the stimulus models.
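
One common way to implement a fast mode in a Verilog stimulus model is a plusarg switch; the +fastmode name and the transaction counts are invented:

    module stimulus;
      integer num_transactions;

      initial begin
        // +fastmode selects the pre-configured, reduced run
        if ($test$plusargs("fastmode"))
          num_transactions = 10;     // enable only a subset of the stimulus
        else
          num_transactions = 1000;   // full nightly/weekend run
      end
    endmodule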

Regression Management
Ensure you're using a version of the environment that is regression certified!
- Simulation Run Time
- Automatic Classification of Regressions

Simulation Run Time
- You want to maximize simulation resources! Minimize wasted cycles due to runaway simulations.
- Use a time bomb (sketched below). It should go off after enough time has elapsed to allow all operations to have completed, and the timer could be reset on an event.
- The time bomb flags a failure condition, but it cannot determine whether the condition is due to a deadlock, a runaway, or a successful simulation.
- Also consider wasted simulation cycles: running for 100 us when the test only needs to run to 10 us.
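
A sketch of a Verilog time bomb whose timer is reset on an activity event; the limit and event names are illustrative:

    module time_bomb;
      parameter LIMIT = 10000;
      event kick;    // the testbench triggers this on meaningful activity

      initial begin
        forever begin
          fork : window
            @(kick) disable window;   // activity resets the timer
            begin
              #LIMIT;
              $display("[%0t] ERROR: time bomb expired with no activity",
                       $time);
              $finish;
            end
          join
        end
      end
    endmodule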

Simulation Run Time (Continued)
- You don't want to run any longer than necessary, but randomization causes run times to vary.
- Create a BFM for clock generation (sketched below). It runs for a time specified by the testcase, then stops the clocks (thus shutting down the simulation); this is the time bomb.
- Coordinate this with the generators and monitors:
  - If the generator is done sending in transactions and the checkers are done validating output, stop the simulation!
  - If the run time is reached and the generator is not done sending in transactions, or the monitors still have checking to perform, flag a failure.
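
A sketch of such a clock-generation BFM (names and period invented). With no free-running always block left, an event-driven simulator simply runs out of events and exits when the clock stops:

    module clock_bfm #(parameter PERIOD = 10) (output reg clk);
      reg running;

      initial begin
        clk = 1'b0;
        running = 1'b0;
      end

      // The testcase calls run_for(limit); the clock stops at the limit.
      task run_for(input integer limit);
        begin
          running = 1'b1;
          fork
            while (running) #(PERIOD/2) clk = ~clk;
            #limit running = 1'b0;
          join
        end
      endtask
    endmodule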

Automatic Classification of Regressions
- Using an output-log scan script (sketched below), determine the success or failure of each test.
- For any given regression suite, a summary could be e-mailed to everyone on the team and used for status and discussions in team meetings.
- The summary should include:
  - Time/date
  - Design environment (unit name)
  - Testcase name
  - Random number seed
  - Simulation time
  - Real time (wall clock)
  - System run on
  - Operating system version
  - Memory in the system
  - Paging space on the system
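
A sketch of such a log-scan script, here in Python; the PASSED/FAILED message conventions are the ones assumed in the log-package sketch earlier, not fixed by the original:

    #!/usr/bin/env python
    # Classify simulation logs: a pass requires an explicit termination
    # message; its absence is treated as a failure.
    import sys

    def classify(logfile):
        status = "FAIL (no termination message)"
        with open(logfile) as f:
            for line in f:
                if "TEST FAILED" in line or "ERROR" in line:
                    return "FAIL"
                if "TEST PASSED" in line:
                    status = "PASS"
        return status

    if __name__ == "__main__":
        for log in sys.argv[1:]:
            print("%-40s %s" % (log, classify(log)))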

Behavioral Models
- Benefits of Behavioral Models
- Behavioral vs. Synthesizable Models
- Example of Behavioral Modeling
- Characteristics of Behavioral Models
- Modeling Reset
- Writing Good Behavioral Models
- Behavioral Models are Faster
- Cost of Behavioral Models
- Demonstrating Equivalence

Benefits of Behavioral Models
- Audit of the specification: missing functional details of the specification are uncovered earlier, not during the debug of the RTL.
- Development and debug of testbenches in parallel with the RTL:
  - You don't have to wait for unstable RTL, and since the model is behavioral, debug turnaround is faster.
  - When the RTL is available, you already have a debugged regression suite.
- System verification can start earlier, with the same benefits as above; in addition, if the behavioral model is validated as equivalent to the RTL, system tests will run faster.
- The model can also serve as an evaluation tool for customers.

Behavioral vs. Synthesizable Models
- Behavioral models may not be synthesizable.
- Behavioral models are not just for testbenches.
- Behavioral models describe functionality, not implementation specifics.
- They require a different mindset: focus on functionality. When the implementation starts shaping the behavioral model, you are writing "RTL++".

Characteristics of Behavioral Models
- They are partitioned for maintenance.
- RTL is partitioned for synthesis:
  - Usually decided along implementation lines.
  - Produces a wide and shallow structure.
- A behavioral model is partitioned at the whim of the author:
  - Usually along main functional boundaries.
  - Avoids one large file, and allows multiple people to work on the model concurrently.
  - Produces a narrow and shallow structure.

Characteristics of Behavioral Models (Cont)
- Should not have a clock: only perform computations when necessary (see the sketch below).
- Should not contain an FSM: that is a synchronous-design implementation of control logic, not behavior.
- Data should remain at a high level of abstraction; these structures are designed for ease of use, not implementation.
- Use BFMs for the physical bus connections.
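
A minimal sketch of this clockless, event-driven style: a behavioral memory (all names invented) that computes only when a request arrives instead of evaluating on every clock edge:

    // Behavioral memory model: no clock, no FSM -- it reacts to requests.
    module beh_memory;
      reg [31:0] mem [0:1023];   // high-level storage, not an implementation

      // A BFM translates physical bus activity into calls to these tasks.
      task write(input [9:0] addr, input [31:0] data);
        mem[addr] = data;
      endtask

      task read(input [9:0] addr, output [31:0] data);
        data = mem[addr];
      endtask
    endmodule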

Modeling Reset
- Behavioral models must reset their variables and the state of the model.
- Partition the model with reset in mind: have one process monitoring the physical signal(s) (see the sketch below).
- That process could communicate the reset to all other portions of the model through procedures, or through a boolean flag.
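
A Verilog sketch of one process watching the physical reset and broadcasting it to the rest of the model; the signal and event names are illustrative:

    module beh_device(input wire rst_n);
      reg in_reset = 1'b0;   // the shared boolean other processes may test
      event reset_ev;        // or broadcast an event instead

      // The one process that monitors the physical signal.
      always @(rst_n) begin
        in_reset = !rst_n;
        if (in_reset) -> reset_ev;   // tell everyone to clear their state
      end

      // Example consumer: a reset aborts its work in progress.
      initial forever begin
        fork : job
          @(reset_ev) disable job;   // reset kills the current activity
          begin
            // ... normal behavioral activity here ...
            #100;
            disable job;             // normal completion also exits the fork
          end
        join
      end
    endmodule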

Writing Good Behavioral Models
- Models that are not done correctly get thrown out, and the RTL is used as soon as it is available, forfeiting the benefits of behavioral modeling.
- Writing good models requires specialized skills:
  - Think at a higher level of abstraction.
  - Focus on the relevant functional details.
  - Do not let the testbench dictate what is functionally relevant.

Behavioral Models are Faster
- Faster to write: no implementation constraints (timing, synthesis) and fewer lines of code; you only have to worry about function.
- Faster to debug: fewer statements mean fewer bugs.
- Faster to simulate: not sensitive to every clock cycle, just to what is pertinent to the function.
- Faster to bring to "market": since everything is faster, system-level testing can start earlier than the RTL will be available.

Cost of Behavioral Models
- They require additional resources: someone has to write them!
  - This may mean the RTL is delayed (if a designer writes the model as a deliverable), or additional people must be hired.
- Maintenance requires additional effort: the model needs ongoing support, and when the architecture changes, the model must reflect it.

Demonstrating Equivalence
- How do you know the RTL implements the function that was described in the behavioral model?
- You can't "prove" it using mathematics, and you can't use equivalence-checking tools: they can't take behavioral models as inputs.
- Instead, run the same test suite that was used against the behavioral model against the RTL.
  - This is most applicable to black-box testing; most system-level tests are either black-box or gray-box.