
Eternal Sunshine of the Verifier's Mind
Wednesday, 03 March 2010 16:39

To be successful in verification you need not only the right technical skills but also the right mindset. Here are 3 things that I’ve found very important to keep in mind in everything you do in verification:


Verification vs. Debugging – Verification may be defined as “proving or establishing authenticity or validity” or “evidence that establishes or confirms the accuracy or truth of something”. Perhaps the lexical meaning of the word is to blame, but the verification process sometimes starts from the wrong premise: that the design process delivers more-or-less working RTL, and verification is merely a safety net, there to make 100% sure that everything works, a kind of Quality Control. Experienced verifiers and managers are unlikely to err here, but others may easily fall into this conceptual pitfall, including many designers and project managers. This premise is half true at best, and quite dangerous. In reality, the RTL design process produces RTL code. That’s it. The verification process produces RTL code that works. And there’s a big difference.

As a matter of fact, if I had my way, I would split verification into 3 distinct areas: A) VIP Development – the part where verification IP is developed; this is a pure software engineering effort. B) Simulation – the part where fresh or immature RTL is simulated and debugged. C) Closure – the part where the RTL is in good shape and regression suites are available to test and cover corner cases. Phase A is pretty much a standalone, self-contained, bounded effort. The other two, and especially phase B, are the exact opposite: design and verification need to work hand in hand during these phases. Program the right mindset into your brain – that verification and design cannot be separated – and you’re on your way to success.


Automation – “Manual verification” and “features verified by visual inspection” are history. Verification is all about building an automated system that will eventually check your design and find bugs in a push-button fashion. Keep that thought in mind with everything you do in verification. Interestingly enough, tool vendors promote automation in different ways, always wanting to take it to the next level (e.g. vManager, VMM Planner, etc.), but many users are not so enthusiastic about going there and prefer to keep it simple. Some of their concerns are valid: full-blown automation requires a significant ramp-up effort and may be more suitable for big companies. Nevertheless, even in small teams, keeping automation in mind can do wonders for overall productivity, and Perl and similar scripting languages can do the job well. Don’t settle for “visual inspection” or “manual testing”. Instead, make your environment smart and robust, so that every single test provides a definitive pass/fail indication. Don’t leave holes in your checkers, and use scripts to help you launch and manage simulations (see the sketch below). Remember, your most valuable asset is your time – let the machine do the hard and repetitive work.
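
Here’s a minimal sketch of what such a script might look like, in Python (Perl would do just as well). It launches every test on a list, greps each log for a definitive result, and prints a summary. The simulator wrapper (run_sim), the test list file and the log markers are assumptions for illustration only; adapt them to your own flow.

#!/usr/bin/env python3
"""Minimal regression launcher sketch: run each test, scan its log for a
definitive pass/fail, and print a summary. SIM_CMD, test_list.txt and the
log markers are placeholders, not any real tool's interface."""

import re
import subprocess
from pathlib import Path

SIM_CMD = "run_sim"                   # hypothetical wrapper around your simulator
PASS_RE = re.compile(r"TEST PASSED")  # adjust to your environment's pass message
FAIL_RE = re.compile(r"\*ERROR|TEST FAILED")

def run_test(test_name: str, log_dir: Path) -> str:
    """Launch one test and return 'PASS', 'FAIL' or 'UNKNOWN'."""
    log_file = log_dir / f"{test_name}.log"
    with open(log_file, "w") as log:
        subprocess.run([SIM_CMD, "-test", test_name],
                       stdout=log, stderr=subprocess.STDOUT, check=False)
    text = log_file.read_text()
    if FAIL_RE.search(text):
        return "FAIL"
    if PASS_RE.search(text):
        return "PASS"
    return "UNKNOWN"                  # no explicit result is treated as suspicious

def main() -> None:
    log_dir = Path("logs")
    log_dir.mkdir(exist_ok=True)
    tests = Path("test_list.txt").read_text().split()
    results = {t: run_test(t, log_dir) for t in tests}
    for test, status in sorted(results.items()):
        print(f"{status:8s} {test}")
    failed = [t for t, s in results.items() if s != "PASS"]
    print(f"\n{len(tests) - len(failed)}/{len(tests)} tests passed")

if __name__ == "__main__":
    main()

The script itself is trivial; the habit is what matters: every test ends in an explicit PASS or FAIL, and nobody has to stare at waveforms to decide.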


Safety Nets – Verification is and always will be a matter of redundancy. Why? Because it’s done by humans, and holes may exist in any verification layer. Coverage, for instance, is in many ways a safety net that validates and measures the quality of your generation (in the early days, before coverage became a metric for progress, it was used merely as feedback on generation). When you’re building a verification environment, keep safety nets in the back of your mind. For example, say you have a generic scoreboard that compares each item that comes out of the DUT with the respective item that went in, and issues an error on a mismatch. But what if, for some reason, the input channel to the DUT is blocked in such a way that no items ever enter the scoreboard and nothing ever comes out of the DUT? If you don’t have an orthogonal checker that makes sure at least one item has entered and exited the DUT, you might fall into the infamous pitfall of “0 items in, 0 items out, TEST PASSED”. There are, of course, less trivial examples. So what to do? A good methodology (not base class library) should be able to compensate for some user mistakes – for example, sampling coverage from monitors rather than generators helps make sure that the data you’re sampling has really entered the DUT. But safety nets can also be much less sophisticated, albeit no less powerful. For example, in the regression phase, add a script that makes sure no test has been left unrun after a full regression (see the sketch below). Simple, but powerful. Remember, the more safety nets you add, the more confidence you have in your verification system.
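
As an illustration of that last safety net, here’s a sketch in Python that cross-checks the planned test list against the logs the regression actually produced. The file locations (test_list.txt, logs/<test>.log) are assumptions; the principle is simply “did everything we planned to run actually run?”

#!/usr/bin/env python3
"""Regression-completeness safety net sketch: flag any planned test that never
ran (no log) or died silently (empty log). The file locations are assumptions."""

import sys
from pathlib import Path

TEST_LIST = Path("test_list.txt")   # one test name per line
LOG_DIR = Path("logs")              # the regression writes <test>.log here

def main() -> int:
    planned = {t for t in TEST_LIST.read_text().split() if t}
    ran = {p.stem for p in LOG_DIR.glob("*.log")}

    missing = sorted(planned - ran)
    empty = sorted(t for t in planned & ran
                   if (LOG_DIR / f"{t}.log").stat().st_size == 0)

    for t in missing:
        print(f"MISSING: {t} was never run")
    for t in empty:
        print(f"EMPTY:   {t} produced an empty log")

    if missing or empty:
        return 1                    # make the regression summary fail loudly
    print(f"All {len(planned)} planned tests ran.")
    return 0

if __name__ == "__main__":
    sys.exit(main())

A few lines of script, and the “we forgot to run half the suite” failure mode is closed off for good.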


 

