Some Course Info Jean-Michel Chabloz
Main idea
This is a course on writing efficient testbenches.
A very lab-centric course:
–You are supposed to learn through the labs – the best way to understand and consolidate the concepts
–Lectures give you the concepts and the tools (SystemVerilog) to run the labs
–One complete verification project runs through the entire course, designed to be as industry-like as possible
–Lectures include discussions about the labs and hands-on tutorials by the teacher
–The exam should be easy if you can do the labs on your own
Verification Basics Jean-Michel Chabloz
ASIC design flow
Goes in order through the following steps:
–Specifications/System Design
–Register Transfer Level (RTL)
–Netlist
–Layout
–Physical chip
ASIC design flow
From specs to RTL:
–human translation
From RTL to netlist to layout to physical chip:
–mostly automated translation (considered a solved, or mostly solved, problem)
–If the RTL is bug-free, the physical chip will work
The main source of bugs is the specs-to-RTL translation
Software design flow
Goes in order through the following steps:
–Specifications
–High-level language (C, ...)
–Low-level language (assembly)
–Machine language
Only the specs-to-high-level-language translation is done mostly by humans
If the high-level language code is correct, the machine language will work
Software implementation flow
The high-level-language-to-machine-language flow is:
–Fast
–Inexpensive
Software flow:
1)Write HL language code
2)Translate the HL language to machine language
3)Run the machine language and find bugs
4)Fix the HL language code
5)Translate the HL language to machine language
6)Run the machine language and find bugs
7)Repeat from 4 until finished
FPGA implementation flow
Can use something similar to software:
1)Write RTL code
2)Translate the RTL to gates and implement in the FPGA
3)Run the FPGA and find bugs
4)Fix the RTL code
5)Translate the RTL to gates and implement in the FPGA
6)Run the FPGA and find bugs
7)Repeat from 4 until finished
ASIC implementation flow
1)Write RTL code
2)Simulate the RTL and find bugs
3)Fix the RTL code
4)Simulate the RTL and find bugs
5)Repeat from 3 until finished
6)Synthesize, make the layout, implement the chip
Validation vs Verification
Validation:
–are we making the right thing, one that answers the needs of the user?
–are the specs right?
Verification:
–are we making what we wanted to make?
–is the model equivalent to the specs?
We consider only verification
Verification plan
A plan of everything that should be verified in a DUT
Should include all possible features and potential sources of bugs
When all tests in the verification plan have been run and no bugs were found, the verification work is over
Checking RTL-to-spec equivalence
The person (team) who wrote the RTL must not be the one who checks its correctness
Otherwise, bugs might go unnoticed: the same mistake in interpreting the specs ends up in both the design and its verification
With separate designer and verifier, a spec-interpretation error would have to be made by both for a bug to slip through
Formal vs simulation-based verification
Formal verification is a newer paradigm:
–Tools prove that the RTL is equivalent to a high-level model, or that it satisfies certain properties
–No need for simulation
–Might one day completely replace simulation-based verification
We consider only simulation-based verification
Verification models
Basic model: give inputs, check outputs – black-box verification
We might use white-box verification (give inputs, check internal signals)
Or grey-box verification (give inputs, check some internal signals specifically inserted for debug purposes)
Verification
~70% of the effort when developing RTL – and the trend is growing
Testbenches are more complex than RTL models
Testbench complexity grows more than linearly with RTL complexity. Example:
–10 state machines with 4 states each: 4^10 total states
–20 state machines with 4 states each: 4^20 total states
–10 state machines with 8 states each: 8^10 total states
We have to test the RTL model under situations similar to those the manufactured chip will encounter in use (we have to develop a “model of the universe”)
Needs of a verification language
Very different needs from RTL:
–RTL: the language must be simple enough for the “stupid” synthesis tools to understand it and know how to synthesize it
example – fifo: instantiate a RAM, use pointer A and pointer B, increment the pointers when reading or writing
–Verification: the language must be powerful enough to implement the “model of the universe” quickly and efficiently; no need to be understood by synthesis tools
example – fifo: virtual storage with push and pop (see the sketch below)
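A minimal sketch of the verification-style fifo in SystemVerilog: abstract storage backed by a queue, with push and pop, no RAM or pointers (the class name and parameterization are illustrative, not from the slides):

  // Abstract fifo model: unbounded virtual storage, no synthesis concerns
  class fifo_model #(type T = byte);
    T q[$];                       // SystemVerilog queue holds the contents

    function void push(T item);
      q.push_back(item);          // write side: append to the tail
    endfunction

    function T pop();
      return q.pop_front();       // read side: remove from the head
    endfunction

    function int size();
      return q.size();
    endfunction
  endclass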
Hardware description languages
VHDL/Verilog were developed in the 1980s
Good languages for RTL design – synthesizable code
Very few non-synthesizable constructs for writing testbenches
Hardware verification languages
e/Vera were developed in the 1990s
Only high-level constructs; impossible to write RTL code
People had to mix VHDL/Verilog RTL models with e/Vera testbenches
HVLs cover three main features lacking in HDLs:
–Random constrained stimuli generation
–Assertions
–Functional coverage
Random Constrained Stimuli Generation
Not to be confused with the “random” constructs in Verilog/VHDL
Main idea:
–define random variables and constraints
–ask the “random solver” to find a random set of values that satisfies the constraints
–constraints can be added or disabled to create different tests
Random Constrained Stimuli Generation
Example (see the sketch below):
–random bit a
–random integers b, c between 0 and 255
–constraint CA: if a=1 then (b+c)!=256
–constraint CB: b>c
It would be hard to write a routine that randomly generates one of the legal combinations using only direct randomization of the variables
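A minimal sketch of this example in SystemVerilog (constraints CA and CB are from the slide; the class name stim and the harness around it are illustrative):

  class stim;
    rand bit a;
    rand int unsigned b, c;

    constraint range { b inside {[0:255]}; c inside {[0:255]}; }
    constraint CA    { (a == 1) -> (b + c) != 256; }   // if a=1 then (b+c)!=256
    constraint CB    { b > c; }
  endclass

  module tb_stim;
    initial begin
      stim s = new();
      repeat (5) begin
        // the random solver finds a legal combination satisfying all constraints
        if (!s.randomize()) $error("randomization failed");
        $display("a=%0d b=%0d c=%0d", s.a, s.b, s.c);
      end
    end
  endmodule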
Assertions
Tools for the automatic checking of properties – an “automated waveform checker”
Examples (see the sketch below):
–when req goes to one, grant must be 1 between 2 and 3 cycles later
–req must never be at one for more than two consecutive cycles
Can be used in testbenches or “bundled” with the RTL to check input correctness
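A minimal sketch of these two properties as SystemVerilog assertions (the module wrapper and the signal names clk, req, grant are assumptions):

  module req_grant_checker (input logic clk, req, grant);
    // When req rises, grant must be 1 between 2 and 3 cycles later
    assert property (@(posedge clk) $rose(req) |-> ##[2:3] grant);

    // req must never stay at 1 for three or more consecutive cycles
    assert property (@(posedge clk) not (req [*3]));
  endmodule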
Functional Coverage
Testing whether the testbench is good enough
Did we do all the tests that we wanted to do, based on our verification plan?
Example (see the sketch below):
–A crossbar can put any input in correspondence with any output
–Did we try all combinations of inputs and outputs?
–With functional coverage we can record how many times each input/output combination was tested, and see the results in a report
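A minimal sketch of the crossbar example as a SystemVerilog covergroup (the signal names in_sel/out_sel and the 2-bit widths are assumptions):

  module xbar_cov (input logic clk, input logic [1:0] in_sel, out_sel);
    covergroup cg @(posedge clk);
      cp_in    : coverpoint in_sel;
      cp_out   : coverpoint out_sel;
      in_x_out : cross cp_in, cp_out;   // records every input/output combination seen
    endgroup
    cg cov = new();
  endmodule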
Code Coverage
Besides functional coverage, there are other coverage metrics; these are tool features and can be used independently of the language
They do not require writing any coverage code
Statement coverage: has every statement in the DUT been executed?
Path coverage: have all paths been followed?
Expression coverage: have all causes for a control-flow change been tried?
FSM coverage: has every state in an FSM been visited?
Code coverage – Statement coverage
Statements marked “y” were executed in this run; those marked “n” were not:

  if (a>1 || b>1) begin    // y
    c <= d;                // y
    d <= d+1;              // y
  end                      // y
  else begin               // y
    if (a==2) begin        // y
      d <= d-1;            // n – never executed (in the else branch a cannot be 2)
    end                    // y
    else                   // y
      d <= d-2;            // y
  end                      // y
Code coverage – Path coverage

  if (a>1 && b>1) begin
    c <= d;
    d <= d+1;
  end
  if (a>2) begin
    if (a==3) begin
      d <= d-1;
    end
    else
      d <= d-2;
  end

Run 1 and Run 2 each exercise a different path through this code. At the end of all the runs, we find out that one legal path was never exercised: path coverage tracks which combinations of branch outcomes have been followed, not just which statements have run.
Code coverage – Expression coverage

  if ((a>1 && b>1) || (a<0) || (b<0)) begin
    c <= d;
    d <= d+1;
  end

All statements were executed (100% statement coverage), but not every sub-expression of the condition was made true; expression coverage reports which causes for the control-flow change were never tried.
Code coverage – FSM coverage
Checks whether all the states in a state machine were visited
Directed tests
Testbenches without randomness, targeting a specific item in the verification plan
Example (see the sketch below):
–write to the fifo for 16 consecutive cycles, check that the fifo is full, then read all 16 elements and check that it is empty
If the design is complex enough, it is impossible to cover all features with directed testbenches
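A minimal sketch of this directed test in SystemVerilog (the fifo module, its port names wr_en, rd_en, full, empty, din, and its depth of 16 are all assumptions):

  module tb_fifo_directed;
    logic clk = 0, wr_en = 0, rd_en = 0, full, empty;
    logic [7:0] din;

    // assumed fifo DUT with these ports and a depth of 16
    fifo dut (.clk(clk), .wr_en(wr_en), .rd_en(rd_en),
              .full(full), .empty(empty), .din(din));

    always #5 clk = ~clk;

    initial begin
      repeat (16) @(negedge clk) begin   // write for 16 consecutive cycles
        wr_en = 1;
        din   = $urandom_range(0, 255);
      end
      @(negedge clk) wr_en = 0;
      assert (full)  else $error("fifo not full after 16 writes");

      repeat (16) @(negedge clk) rd_en = 1;   // read all 16 elements back
      @(negedge clk) rd_en = 0;
      assert (empty) else $error("fifo not empty after 16 reads");
      $finish;
    end
  endmodule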
Random verification
1)Generate random tests using random constrained stimuli generation
2)Check for bugs and correct any that are found
3)Check the coverage values. If they are not satisfactory, add constraints and repeat from 1 (see the sketch below)
Note: some directed testbenches might still be necessary to cover the corner cases
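A minimal sketch of step 3, reusing the stim class from the earlier randomization sketch: inline constraints steer a new test toward cases the coverage report shows as missing (the specific constraint values are illustrative):

  module tb_random_test;
    initial begin
      stim s = new();
      // extra inline constraints for this test only, on top of CA and CB
      if (!s.randomize() with { a == 1; b > 200; })
        $error("randomization failed");
      $display("a=%0d b=%0d c=%0d", s.a, s.b, s.c);
    end
  endmodule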
SystemVerilog
A hardware description and verification language
Superset of Verilog – all Verilog code works in SystemVerilog
First standardized by the IEEE in 2005
2013: IEEE standard 1800-2012
–Download the standard
–The “holy book of SystemVerilog”
–The answers to all of your questions are inside
–Good for reference, not for learning
SystemVerilog “is” the standard IEEE 1800-2012
–Simulator/synthesizer implementations might be incomplete
SystemVerilog
RTL subset of the language:
–a small superset of the Verilog RTL subset – some constructs have been added to simplify various things
Verification subset of the language:
–a very, very big superset of Verilog, adding:
Object-oriented constructs
Random constrained stimuli generation
Assertions
Functional coverage
SystemVerilog
SystemVerilog testbenches can be used to test SystemVerilog RTL models, but also to test VHDL/Verilog RTL models – mixed-language simulation
Testbenches
Basic model: give inputs, read outputs
The element to test is called the DUT (Design Under Test) or DUV (Design Under Verification)
Inputs generator -> RTL model -> Outputs checker
Testbench structuring
It is important to conceptually divide testbenches into blocks, depending on their function:
–Generator of high-level input data (example: we decide to send 4 packets of 1024 bytes followed by 2 packets of 64 bytes)
–Driver: reads the high-level input data and drives the DUT input ports
–Output monitor: reads data from the DUT’s output ports
–Output checker: checks the correctness of the output
This allows easier readability and reuse: if the DUT input protocol changes, only the driver must change
This matters most for big systems, with a lot of reuse and many people working on them
A class-based sketch of this division is shown below.
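A minimal sketch of this structure in SystemVerilog, with a generator handing high-level packets to a driver through a mailbox (the packet fields and sizes are illustrative; the monitor and checker would be analogous):

  class packet;
    rand byte payload[];
    constraint c_len { payload.size() inside {64, 1024}; }
  endclass

  class generator;
    mailbox #(packet) out;
    task run(int n);
      repeat (n) begin
        packet p = new();
        void'(p.randomize());
        out.put(p);               // hand high-level data to the driver
      end
    endtask
  endclass

  class driver;
    mailbox #(packet) in;
    task run();
      forever begin
        packet p;
        in.get(p);
        // drive the DUT input ports according to the input protocol
      end
    endtask
  endclass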
Structured testbenches
Generator of high-level inputs -> Inputs driver (gives the inputs to the DUT) -> RTL model -> Outputs monitor (reads the outputs and translates them into high-level data) -> Outputs checker (checks the outputs against the expected high-level outputs)
Testbenches
Often a golden model can be used (Matlab, TLM, timed, untimed, ...): the generator’s high-level inputs feed both the DUT and the golden model, and the outputs checker compares the DUT’s outputs against the golden model’s expected outputs
Golden model and RTL must be developed by different teams; errors might be in both
SoC Verification
A SoC is a collection of IPs
Each IP must first be verified at block level; top-level verification follows
Verification systems for IPs are packaged into VIPs (Verification IPs), with drivers, monitors, assertions to check input correctness, high-level models, etc.
A scoreboard keeps track of which tests have been run and of the coverage
It is possible to build a chip model in which only some components are RTL and the others are golden models
Using VIPs it is easy to build fast, complex models of what is around a block or a chip
UVM
Universal Verification Methodology
A methodology on top of SystemVerilog that automates all this
Key focus: reuse
Components are enclosed into agents, containing checkers, monitors, drivers, etc.
A chip model can be built by connecting the different VIPs together
We do not consider UVM; it is only suited to complex systems – for the testbench complexity level of the course labs, some form of “conceptual” division between testbench blocks is considered sufficient