Verification Plan. This is the specification for the verification effort: it answers what am I verifying, and how am I going to do it!



Role of the Verification Plan
- Specifies the verification effort
- Defines 1st-time success (ideally)

Specifying the Verification
- When are you done? Metrics.
- The specification determines "what has to be done"!
- The specification does NOT answer the following:
  - How many resources will it take?
  - How long will it take?
  - Is what you are doing what needs to be done?
- The verification plan defines that!

Specifying the Verification (continued)
- The plan starts from the design specification. Why? The reconvergence model!
- The specification must exist in written form. "The same thing as before, but at twice the speed, with these additions…" is not an acceptable specification.
- The specification is a golden document: it is the "law". It is the common source for both the verification effort and the implementation effort.

Defining 1st-time success
- The plan determines what is to be verified. If, and only if, it is in the plan will it be verified.
- The plan provides the forum for the entire design team to define 1st-time success.
- For 1st-time success, every feature must be identified, along with the conditions under which it must work and the expected response.
- The plan documents these features (as well as optional ones) and prioritizes them. This way, risk can be consciously mitigated to pull in the schedule or reduce cost.
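One way to picture the feature list described above is as a small, prioritized data structure that schedule decisions can be driven from. This is a hedged sketch only: the feature names, priority labels, and fields below are invented for illustration, not taken from any particular plan format.

```python
from dataclasses import dataclass, field

@dataclass
class Feature:
    ident: str          # short label, cross-referenced to the spec
    description: str
    priority: str       # "must-have" gates tape-out; "optional" may slip
    conditions: list = field(default_factory=list)  # conditions under which it must work

plan = [
    Feature("F1", "Reset brings all registers to defaults", "must-have",
            ["cold reset", "warm reset"]),
    Feature("F2", "Back-to-back transfers at full rate", "must-have",
            ["min inter-packet gap"]),
    Feature("F3", "Optional parity mode", "optional",
            ["parity enabled"]),
]

# Only features in the plan get verified; must-haves define 1st-time success.
gating = [f.ident for f in plan if f.priority == "must-have"]
print(gating)
```

Keeping priorities explicit like this is what lets optional features be consciously dropped under schedule pressure instead of silently slipping.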

Defining 1st-time success (continued)
- The plan draws a well-defined line that, when crossed, can endanger the whole project and its success in the market.
- It defines:
  - How many test scenarios (testbenches/testcases) must be written
  - How complex they need to be
  - Their dependencies
- From the plan, a detailed schedule can be produced:
  - The number of resources it will take (people, machines, etc.)
  - The number of tasks to be performed
  - The time it will take with current resources

Verification Plan
- The team owns it! Everyone involved in the project has a stake in it.
- Anybody who identifies deficiencies should have input so they are fixed.
- The process used to write the plan is not new. It has been used in industry for decades; it is just now finding its way to hardware. Others who use it: NASA, the FAA, software teams.

Verification Levels
Verification can be performed at various granularities:
- Designer/unit/sub-unit
- ASIC/FPGA/reusable component
- System/sub-system/SoC/board

Verification Levels (continued)
How do you decide which levels? There are trade-offs:
- Smaller pieces offer more control and observability, but more environments must be constructed, which takes more effort.
- With larger pieces, the integration of the smaller partitions is implicitly verified, at the cost of lower control and observability; fewer environments mean less effort.
- Whatever levels are decided, those pieces should remain stable.
- Each verifiable piece should have its own specification document.

Unit Level
- Usually pretty small.
- Interfaces and functionality tend to change often.
- Units usually don't have an independent specification associated with them.
- The environment (if there is one) is usually ad hoc.
- Not intended to gain full coverage, just enough to verify basic operation (no boundary conditions).
- A high number of units in a design usually means unit-level verification is not reasonable, due to the high effort of building environments.

Reusable Component Level
- Independent of any particular use; intended to be used as-is in many different parts of a design. Saves verification time.
- BFMs can be coded for these pieces, for use by anyone who communicates with the reusable component.
- They necessitate a regression suite.
- Components will not be reused if users do not have confidence in them; confidence is gained by a well-documented and well-executed process.
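The BFM idea above can be sketched in a few lines: a bus functional model hides pin-level detail behind transaction-level calls, so every team that talks to the reusable component drives it the same way. This is an illustrative sketch, not any real framework; the "bus" here is just a dict standing in for design signals, and all names are invented.

```python
class SimpleWriteBFM:
    """Drives a toy valid/data handshake, one transaction per call."""

    def __init__(self, bus):
        self.bus = bus      # shared signal record: {"valid": 0, "data": 0}
        self.log = []       # transactions issued, useful for a scoreboard

    def write(self, value):
        # Pin-level wiggling lives here; callers only see write(value).
        self.bus["data"] = value
        self.bus["valid"] = 1
        self.log.append(value)

    def idle(self):
        # Deassert the handshake between transactions.
        self.bus["valid"] = 0

bus = {"valid": 0, "data": 0}
bfm = SimpleWriteBFM(bus)
bfm.write(0xA5)
bfm.idle()
```

Because users of the component call `write()` rather than toggling signals directly, the handshake can change without breaking every testbench that uses it: that is the reuse payoff.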

ASIC/FPGA Level
- The physical partition provides a natural boundary for verification.
- Once specified, very little changes, because of the ramifications downstream.
- FPGAs used to get debugged at integration; now they are so dense that a more ASIC-like flow is required.

System Level
- What is a system? Everyone's definition is a little different.
- Verification at this level focuses on interaction, not on function buried in one piece.
- A large ASIC may take this approach (assuming that all pieces that make up the ASIC are specified, documented, and fully verified).
- Assume that individual components are functionally correct.
- The testcase defines the system.

Board Level
- This can be the same as system level; think of an adapter for a PC.
- The purpose is to confirm that the connectivity produced by the board design tool is correct.
- When doing board-level verification, one must ensure that what is in the "system" is what will be manufactured.
- Things to consider:
  - Many board-level components don't fit into a digital simulation: analog!
  - What are suitable models (3rd party?)
  - How to generate the netlist (hand-stitch or tool)

Current Practices for Verifying a System
- Designer-level sim: verification of a macro (or a few small macros)
- Unit-level sim: verification of a group of macros
- Chip (element) sim: verification of an entire logical function (i.e., processor, storage controller, Ethernet MAC, etc.)
- System-level sim: multiple-chip verification; sometimes done at the 'board' level; can utilize a mini operating system

In an Ideal World…
- Every macro would have perfect verification performed.
- All permutations would be verified based on legal inputs.
- All outputs would be checked on the small chunks of the design.
- Unit, chip, and system level would then only need to check interconnections: ensure that macros used the correct input/output assumptions and protocols.

Reality
- Macro verification across an entire system is not feasible:
  - There may be over 100 macros on a chip, which would require 200 verification engineers
  - That number of skilled verification engineers does not exist
  - The business can't support the development expense
- Verification leaders must make reasonable trade-offs:
  - Concentrate on the unit level
  - Use designer-level verification on the riskiest macros

Verification Strategies
Must decide the following:
- White box, black box, or grey box
- Types of testcases
- The level of abstraction where the majority of verification takes place
- How to verify the response
- How random it will be

Verifying the Response
- Deciding how to apply stimulus is usually the easy part; deciding the response is the hard part.
- Must plan how to determine the expected response, and how to determine that the design provided the response you expected.
- Detect errors and stop simulation as early as possible. Why?
  - The error is identified when it takes place, which makes it easier to debug
  - No wasted simulation cycles
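The "detect errors as early as possible" point can be sketched as a scoreboard that compares each actual output against the predicted one the moment it arrives, instead of diffing a full trace after the run. This is a hedged sketch with invented names; the reference model here (DUT adds one to its input) is a stand-in for whatever prediction the plan calls for.

```python
class Scoreboard:
    def __init__(self):
        self.expected = []      # queue of predicted outputs
        self.checked = 0

    def predict(self, value):
        self.expected.append(value)

    def check(self, actual):
        want = self.expected.pop(0)
        if actual != want:
            # Failing immediately points at the moment things diverged
            # and avoids burning further simulation cycles on a broken run.
            raise AssertionError(
                f"mismatch at item {self.checked}: got {actual}, expected {want}")
        self.checked += 1

sb = Scoreboard()
for stim in [1, 2, 3]:
    sb.predict(stim + 1)        # reference model: the design should add one
for out in [2, 3, 4]:           # pretend these outputs came from the design
    sb.check(out)
```

The key design choice is that `check()` raises at the first mismatch rather than logging and continuing, so the failing stimulus is right next to the failure in the log.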

Random Verification
- Does not mean randomly applying '0's and '1's to the inputs; the stimulus must still be valid.
- The sequence of operations and the content of the data transferred are random!
- This creates conditions that one does not think of:
  - Hits corner cases
  - Hits unexpected conditions
  - Reduces bias from the engineer

Random Verification (continued)
- Very complex to specify and code.
- Must be a fair representation of the operating conditions for the design; must include distributions and ranges.
- More complicated to determine the expected response.
- With proper constraints, a random environment can produce directed tests; the converse is not true.
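The constrained-random idea above can be sketched as follows: operations and payloads are random, but drawn only from legal values, with weights shaping the distribution and a deliberate bias toward boundary addresses (corner cases). The operation names, weights, and address range are invented for illustration.

```python
import random

random.seed(1234)               # reproducibility: a failing seed can be rerun

LEGAL_OPS = ["read", "write", "flush"]
WEIGHTS   = [45, 45, 10]        # make rare operations rare but not absent

def random_transaction():
    op = random.choices(LEGAL_OPS, weights=WEIGHTS, k=1)[0]
    # Constrain the address range and bias toward boundaries (corner cases).
    addr = random.choice([0x0000, 0xFFFF, random.randrange(0x10000)])
    data = random.randrange(256) if op == "write" else None
    return {"op": op, "addr": addr, "data": data}

seq = [random_transaction() for _ in range(5)]
```

Note the seed: a constrained-random run is only useful if a failure can be reproduced, so the seed should always be logged with the result.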

Specifications to Features
- The 1st step in planning is to identify the features to test.
- Label each feature with a short description.
- Ideally, the specification and the verification plan are cross-referenced.
- Describe where each function will be verified:
  - Unit-level features: the bulk of verification
  - System-level features
- Concentrate on functional errors, not tool-induced errors.

Features to Testcases
- Prioritize the features:
  - Must-have features gate tape-out; less important ones receive less attention
  - This helps project managers make informed decisions when schedule pressures arise
- Group the features; these groups become the testcases.
- Testcases should have descriptions and cross-references to the features list. If a feature does not have a testcase assigned, it is not being verified.
- Define dependencies; this helps prioritize.
- Specify the testcase stimulus (or the constraints, for random verification).
- Also specify how each feature will be checked; inject errors to ensure that the checks are working.
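The cross-reference rule above ("if a feature does not have a testcase assigned, it is not being verified") is mechanical enough to check automatically. A minimal sketch, with made-up feature and testcase identifiers:

```python
# Features identified from the specification.
features = ["F1", "F2", "F3", "F4"]

# Each testcase lists the features it exercises.
testcases = {
    "TC_reset":   ["F1"],
    "TC_traffic": ["F2", "F3"],
}

# Any feature no testcase claims is, by definition, not being verified.
covered = {f for fs in testcases.values() for f in fs}
unverified = [f for f in features if f not in covered]
print(unverified)       # these need a testcase or a conscious waiver
```

Run as part of plan reviews, a check like this turns a silent coverage hole into an explicit decision.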

Testcases to Testbenches
- Testcases naturally fall into groups with a similar configuration or the same abstraction level.
- Cross-reference testbenches with testcases.
- Assign testbenches to verification engineers.
- Verifying the testbench:
  - Use a broken model for each feature to be verified
  - Hold peer reviews
  - Provide logging messages in the testbench
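The "use a broken model" bullet above deserves a concrete sketch: run the testbench's check against a deliberately wrong reference and demand that the check fails. If the check passes on broken logic, the testbench is not really checking anything. The function under test here (an increment) is an invented stand-in for a real feature.

```python
def check_increment(dut_fn):
    """Testbench check: the device function must add one to its input."""
    for x in [0, 7, 255]:
        if dut_fn(x) != x + 1:
            return False        # the check fired (good, if the model is broken)
    return True

good_model = lambda x: x + 1    # behaves like the intended design
broken_model = lambda x: x      # injected bug: forgets to increment

# The check must pass on correct logic AND fail on the broken model;
# a check that cannot fail is worthless.
ok_on_good = check_increment(good_model)
ok_on_broken = check_increment(broken_model)
```

This is the same error-injection discipline the previous slide asks for at the feature level, applied to the testbench itself.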

Three Verification Commandments
1. Thou shalt stress thine logic harder than it will ever be stressed again!
2. Thou shalt place checking upon all things!
3. Thou shalt not move onto a higher platform until the bug rate has dropped off!

Verification Do's and Don'ts
Do:
- Talk to the designers about the function
- Spend time thinking about all the pieces that need to be verified
- Talk to "other" designers about the signals that interface to the DUV
Don't:
- Rely on the designer's word for the input/output specification
- Allow a chip to go out without being properly verified: pay now, or pay later (with inflation!)