Verification Guild
A Community of Verification Professionals
Question: Know-How on setting up functional regression

     
SAHO
Senior

Joined: Oct 16, 2004
Posts: 24

Posted: Fri Nov 05, 2004 4:06 pm    Post subject: Question: Know-How on setting up functional regression

    Hello experts:

I am wondering if someone can provide a sample regression script that automates the functional verification process.

I am afraid mine is too ugly to show. Basically, I create another macro file (a DO file for ModelSim) which in turn calls many tests (also DO files).

    SAHO
Janick
Site Admin

Joined: Nov 29, 2003
Posts: 1394
Location: Ottawa, ON Canada

Posted: Fri Nov 05, 2004 4:47 pm

    They all have more or less the same structure and are rather easy to implement using your favorite scripting language.

    Code:
    - Check out appropriate version of design and verification from source control system

    - Compile static portion of the design and verification environment

    - For each test (either automatically identified or specified on the command line or in a config file)

      - Compile source for test

      - Repeat N times (if directed test, N == 1)

        - Generate seed

        - Create test + seed-specific output directory

        - Save command used to invoke simulation

        - Invoke simulation (optionally using concurrent machines)

        - Wait for simulator/machine to be available

    - Summarize results from all output files

    - Merge coverage databases

    - Email summary
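That skeleton is small enough to sketch in a few lines of Python. The sketch below only builds the ordered command list rather than executing it, and every tool name (svn, vlog, vsim, vcover), file name, and log-scanning convention is a placeholder to adapt to your own flow:

```python
# Minimal sketch of the regression skeleton above. It *plans* the run as
# a list of shell commands; all tool names and file names are placeholders.
import random

def plan_regression(tests, runs_per_test=1):
    """Return the ordered shell commands a regression run would execute."""
    cmds = [
        "svn update",                         # check out design + env
        "vlib work && vlog -f static.f",      # compile static portion once
    ]
    for test in tests:
        cmds.append("vlog %s.sv" % test)      # compile source for this test
        for _ in range(runs_per_test):        # N == 1 for a directed test
            seed = random.getrandbits(31)     # generate seed
            outdir = "results/%s_%d" % (test, seed)
            cmds.append("mkdir -p %s" % outdir)
            sim = ("vsim -c -sv_seed %d %s_top -do 'run -all; quit -f'"
                   % (seed, test))
            # Save the exact invocation so the run can be recreated,
            # then run it, logging into the per-seed directory
            cmds.append('echo "%s" > %s/cmd' % (sim, outdir))
            cmds.append("%s > %s/sim.log 2>&1" % (sim, outdir))
    cmds.append("grep -L 'TEST PASSED' results/*/sim.log")   # summarize
    cmds.append("vcover merge results/*/cov.ucdb")            # merge coverage
    cmds.append("mail -s 'regression summary' team < summary.txt")
    return cmds
```

In a real driver each command would be handed to subprocess.run() with its exit status checked, and the simulation step handed to a job dispatcher for the concurrent-machines part.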
srini
Senior

Joined: Jan 23, 2004
Posts: 442
Location: Bengaluru, India

Posted: Fri Nov 05, 2004 6:02 pm    Post subject: Re: Question: Know-How on setting up functional regression

    SAHO wrote:
    Hello experts:

I am wondering if someone can provide a sample regression script that automates the functional verification process.

I am afraid mine is too ugly to show. Basically, I create another macro file (a DO file for ModelSim) which in turn calls many tests (also DO files).

    SAHO

    Hi,
    You may want to try:

    http://www.verifica.org/library.item.htm?object=tr014

This requires a free registration. I forgot my password, so I have not checked the associated paper/script myself, but it may be close to what you are looking for.

    Good Luck
    Srinivasan
    _________________
    Srinivasan Venkataramanan
    Chief Technology Officer, CVC www.cvcblr.com
    A Pragmatic Approach to VMM Adoption
    SystemVerilog Assertions Handbook
    Using PSL/SUGAR 2nd Edition.
    Contributor: The functional verification of electronic systems
asif
Senior

Joined: Oct 20, 2004
Posts: 18

Posted: Mon Nov 08, 2004 1:48 am

Quote:
- For each test (either automatically identified or specified on the command line or in a config file)
  - Compile source for test
  - Repeat N times (if directed test, N == 1)
    - Generate seed
    - Create test + seed-specific output directory

Regression means stressing the design under all possible test scenarios. Talking to other folks in the industry, I have come across two flavors of regression:

1. Run all the possible/available tests in some sequential order, with each test resetting the DUT.

2. Run all the possible tests without resetting the DUT between them.

In the second case, one needs to design an entirely new environment for regression.

The first case is the one Janick has pointed out, but it is not regression in the true sense: it is just running the same old tests with different seeds. Once the design is reset, the state created by the previous test is lost.
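The difference between the two flavors is only where the reset goes, which a small sequencer sketch makes concrete (the reset and test callables stand in for whatever your environment actually provides):

```python
def run_sequence(tests, reset, reset_between=True):
    """Run tests back-to-back.
    Flavor 1 (reset_between=True): reset the DUT before every test.
    Flavor 2 (reset_between=False): reset once, so the DUT state left
    by one test is the starting state of the next."""
    schedule = []                      # what happened, in order
    for i, test in enumerate(tests):
        if i == 0 or reset_between:
            reset()                    # drive DUT reset sequence
            schedule.append("reset")
        test()                         # run the test scenario
        schedule.append(test.__name__)
    return schedule
```

Flavor 2 is the same loop with the reset hoisted out of it; the hard part is not the sequencing but making every test tolerate an arbitrary starting state.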

    -Asif
Janick
Site Admin

Joined: Nov 29, 2003
Posts: 1394
Location: Ottawa, ON Canada

Posted: Mon Nov 08, 2004 4:05 pm

Quote:
In the second case, one needs to design an entirely new environment for regression.

I do not think a different environment should be used between test development and regression; the same script and environment should serve both. During test development, you are simply running a trivial regression composed of a single test: the one you are developing.


Quote:
1. Run all the possible/available tests in some sequential order, with each test resetting the DUT.
2. Run all the possible tests without resetting the DUT between them.

The environment required by #2 will indeed be very different from the one required by #1. Thus the decision to concatenate tests in a regression (with or without reset between them) or to run each test individually must be taken at the beginning of the project.

    To be considered:

- Is test concatenation going to highlight bugs? In some classes of design, initial state may be very significant (e.g. processors). In others, it may actually render the subsequent test meaningless (e.g. FIFOs). Can a better verification planning process or formal technology accomplish the same thing?

- Is test concatenation simply used as a proxy for a long-running test to try and find deep-state bugs? How likely are those deep-state bugs to be hit? Is there a way to shorten the path to reaching that deep state (e.g. by pre-loading some registers)?

    - Are we really talking about test concatenation or simply the addition of random scenarios to the mix of an existing random test? The new scenario is randomly injected (potentially multiple times) in the middle of other random or directed scenarios in one big random test. Different seeds should produce a different mix.

    - How to quickly recreate a bug? Easy in separate simulations. Difficult in concatenated tests with a simulated reset in between. Hard in concatenated tests without reset.

    - How is the code for each test kept separate so it can be concatenated or run stand-alone? If tests are implemented via AOP structural extensions, how are these kept orthogonal?

    - Ability to farm out concurrent simulations. By concatenating tests, you create a serial process. Separate simulations can be farmed out concurrently.
asif
Senior

Joined: Oct 20, 2004
Posts: 18

Posted: Wed Nov 10, 2004 8:19 am

Quote:
To be considered:
- Is test concatenation going to highlight bugs? [...]
- Ability to farm out concurrent simulations. [...]

We have now come down to the most basic question of verification: is verification about testing one single feature of the design, which a given test targets at one time, or about testing a complete system where the traffic is a concatenation of all the directed test cases?

My answer is that real-world traffic is going to be a concatenation of the test cases, and that concatenation of course generates a valid/real test case.

So each test case should be written in such a way that it does not depend on the initial setup: the test case first configures, then targets/checks only one feature of the design.

The environment has to be designed with something like an "intelligent router/concatenator" that understands how real-life traffic can be generated. I assume the checkers in the environment are independent of whether tests are run stand-alone or concatenated.

Regarding re-creating a bug: this would definitely be possible, since the concatenation is intelligent.

Running a single real-life test over parallel tests on a farm would add more value.

    -Asif
alexg
Senior

Joined: Jan 07, 2004
Posts: 586
Location: Ottawa

Posted: Thu Nov 11, 2004 2:05 pm

    Janick wrote:
    They (scripts) all have more or less the same structure and are rather easy to implement using your favorite scripting language.


In my opinion, effective organization of a verification project is one of the most complex tasks in the chip development process. Design teams that underestimate the importance of this task may face serious problems and lots of wasted time over the project lifetime.

First, there is a huge amount of data which has to be organized: designs, verification components, environments, tests, scripts, tools, etc.
Second, there are many people involved. A verification project involves both design and verification teams working concurrently and modifying the project's sources. These teams have different goals, which may result in disturbances and synchronization problems during the project lifecycle. For example, designers update the design, which may reduce its functional stability. Verification engineers developing test cases or verification components, however, prefer a stable design, and may not update their local design copies when new design modifications appear.

There are different ways of organizing a project, some of them presented by Janick. I don't want to recommend any specific one (since solutions may be specific to the design, the experience accumulated in R&D, etc.), but rather give a few pieces of advice:
1. Develop the concept before developing scripts. Define the project database structure, the main script functions, and the script interface.
2. Define owners for each of the database components.
3. Write scripts in a modular way, similar to good old C program organization: a main block and a library of procedures/functions. Divide the script into a common stable part (fully reused across projects) and a part which has to be modified across projects or project configurations.
4. It is not enough to write good scripts; it is also necessary to develop organizational rules and use them all the time.
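Point 3 above in sketch form: a stable core that never changes between projects, and a small per-project part behind a fixed interface (the hook names here are illustrative, not a standard):

```python
# Stable, project-independent core: fully reused across projects.
def run_regression(cfg):
    """Generic main block; the flow is fixed, the details come from cfg."""
    cfg.checkout()
    cfg.compile_static()
    return {test: cfg.run_test(test) for test in cfg.tests()}

# Per-project part: the only piece rewritten for a new project or
# configuration. Any object providing these four hooks will do.
class ProjectConfig:
    def checkout(self):       pass                   # e.g. "svn update"
    def compile_static(self): pass                   # e.g. "vlog -f static.f"
    def tests(self):          return ["smoke_test"]  # or scan a test directory
    def run_test(self, test): return "PASS"          # invoke the simulator here
```

The ownership rule in point 2 then falls out naturally: the core has one owner, and each project config has its own.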

    Regards,
    Alexander Gnusin
richardbradley
Senior

Joined: Feb 10, 2004
Posts: 73
Location: St Louis, MO

Posted: Thu Nov 11, 2004 4:29 pm

I am not exactly sure what it is you are trying to do. But if what you want is to run a bunch of pre-defined tests in order, check out the parallel script under the download section of the Verification Guild:

http://www.verificationguild.com/modules.php?name=Downloads&d_op=viewdownload&cid=2

It is actually meant to farm out tests across any number of Unix/Linux machines, with each test discrete. It is not really meant for the Windows environment, but I guess it could work with Cygwin? (Well, maybe not.)

    ~Rich
    Powered by phpBB © 2001, 2005 phpBB Group
    Verification Guild (c) 2006-2014 Janick Bergeron