Hi Andy,
Al,
For interaction with the rest of the testbench from within the
wrapper, you have a few different options that don't require adding
signals to the DUT hierarchy and port maps.
You can use global signals or shared variables (including method
calls on those shared variables) declared in a simulation package.
These are often good for maintaining a global pass/fail status and/or
error count.
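As a minimal sketch of that idea, a simulation-only package can hold a global status signal that any wrapper or checker drives (all names here, like sim_status_pkg and test_failed, are illustrative, not from this thread). Note that a global signal with several drivers needs a resolved type, hence std_logic below; a shared variable avoids the resolution issue altogether.

```vhdl
-- Illustrative simulation package with a global pass/fail signal.
library ieee;
use ieee.std_logic_1164.all;

package sim_status_pkg is
  -- 'Z' default; any checker pulls it to '1' on error, and std_logic
  -- resolution makes multiple drivers legal.
  signal test_failed : std_logic := 'Z';
end package sim_status_pkg;

-- In any wrapper or checker process:
--   use work.sim_status_pkg.all;
--   ...
--   test_failed <= '1';  -- record a failure from anywhere in the TB
```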
My experience with global signals is quite bad in general, and even when
writing software, every time I had to deal with a large number of global
variables I ended up regretting that choice...
I'm not familiar with protected types, but I guess that at least they
provide some sort of encapsulation with their methods and their private
data structures. In this case the wrappers might update coverage (write
access to data structures) while the TB can steer its course accordingly
(read access to data structures). Keeping data structures separate for
each interface (or wrapper) might facilitate the effort.
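A rough sketch of that split, assuming illustrative names throughout: a protected type keeps the coverage counts private, wrappers get write access through a method, and the TB gets read access through another.

```vhdl
-- Illustrative protected type: private coverage data, public methods.
package coverage_pkg is
  type coverage_t is protected
    procedure cover_hit(bin : in natural);           -- wrappers write
    impure function hits(bin : natural) return natural;  -- TB reads
  end protected coverage_t;
end package coverage_pkg;

package body coverage_pkg is
  type coverage_t is protected body
    type count_array is array (0 to 31) of natural;
    variable counts : count_array := (others => 0);  -- private data
    procedure cover_hit(bin : in natural) is
    begin
      counts(bin) := counts(bin) + 1;
    end procedure;
    impure function hits(bin : natural) return natural is
    begin
      return counts(bin);
    end function;
  end protected body coverage_t;
end package body coverage_pkg;

-- One instance per interface, e.g. in a simulation package:
--   shared variable uart_cov : coverage_t;
```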
In VHDL-2008, you can also hierarchically access signals without
going through formal ports to get them.
Ok, this is something I did not know; I should keep reading about the
main differences between 2008 and previous versions of the standard.
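For reference, the VHDL-2008 feature in question is the external name. A sketch, with a purely illustrative hierarchy path and signal names:

```vhdl
-- Illustrative use of a VHDL-2008 external name to observe an internal
-- DUT signal without routing it through any port map.
library ieee;
use ieee.std_logic_1164.all;

architecture monitor of wrapper_tb is
  -- Absolute path into the design hierarchy (names are made up here)
  alias fifo_full is << signal .tb.dut_inst.u_fifo.full : std_logic >>;
begin
  watch : process (fifo_full)
  begin
    if fifo_full = '1' then
      report "internal FIFO went full at " & time'image(now);
    end if;
  end process watch;
end architecture monitor;
```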
I have also simply used assert/report statements to put messages in
the simulation log that can be post-processed.
Yep, that's something that already gives you more observability.
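One way to make that post-processing easy is to tag the messages; the "COV:"/"ERR:" prefixes below are just an illustrative convention a script could grep for in the log.

```vhdl
-- Illustrative checker emitting greppable, tagged log messages.
checker : process (clk)
begin
  if rising_edge(clk) then
    if valid = '1' then
      report "COV: transfer seen at " & time'image(now) severity note;
      assert data = expected
        report "ERR: data mismatch at " & time'image(now)
        severity error;
    end if;
  end if;
end process checker;
```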
I don't use vmkr, so I don't know how it might work with wrapper
architectures.
FYI I'm using vmk, not vmkr. I tried to use the latter but I had
problems compiling it.
I used to use make to reduce compile time for
incremental changes, but that does not seem to be as big an issue as
it used to be.
I agree, but I'm used to incremental compilation and do not see any
pitfall in it; then again, it is possible that my understanding of the
process is somewhat limited.
There is something to be said for a script that
compiles the simulation (or the DUT) from scratch, the same way every
time, regardless of what has or has not changed.
The compilation order has to be taken care of anyhow, and this is
something that so far the tools have left to the user (AFAIK). If I
have to insert a new entity in my code I simply run vmk first and then
'make'. The dependencies are found automatically and I do not need to
know where to put my new file in the list of files to be compiled.
Moreover, if I need to add a component that has several other
components in its hierarchy, the hassle grows if everything has to be
handled manually, but not if there's a tool handy.
What is the benefit of running the simulation from scratch the same way
every time?
Wrapper architectures are not compatible with gate level simulations
(at least not wrappers for entities within the DUT).
After synthesis optimizations, retiming, etc. specific internal
interface features may not even exist at the gate level.
However, a wrapper can instantiate the gate level model in place of
the RTL.
Uhm, that's interesting indeed. It means that when integrating several
IPs you may still use wrappers and benefit from their advantages. The
simulation would still be a functional one, but some of the elements
might be gate level models.
You could in principle have behavioral models (even non-synthesizable
ones), just to proceed with the functional verification and get the
simulation framework in place before the RTL model is ready. Using RTL
libraries instead of behavioral ones would then be sufficient to switch.
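A configuration is one way to make that switch a one-line change; this is only a sketch with made-up entity, architecture, and library names.

```vhdl
-- Illustrative configuration: bind the behavioral, RTL, or gate-level
-- model of the core without touching the testbench or wrapper code.
configuration tb_rtl_cfg of tb is
  for bench                            -- architecture of the testbench
    for dut_inst : my_core
      use entity work.my_core(rtl);    -- swap for (behavioral), or for
    end for;                           -- gates_lib.my_core(netlist)
  end for;
end configuration tb_rtl_cfg;
```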
Wrappers can also create or instantiate a different model of an RTL
entity for improving the simulation performance, or providing
internal stimulus, etc.
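As a sketch of that last point, an alternative architecture of an existing entity can replace the real logic with canned internal stimulus (entity and port names below are illustrative):

```vhdl
-- Illustrative stub architecture: drives a fixed pattern instead of the
-- real receive logic, e.g. to feed downstream blocks or speed up sim.
architecture sim_stub of uart_rx is
begin
  stim : process
  begin
    rx_data  <= x"55";   -- canned pattern instead of real decoding
    rx_valid <= '1';
    wait for 100 ns;
    rx_valid <= '0';
    wait for 900 ns;
  end process stim;
end architecture sim_stub;
```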
I guess at this point I have no more excuses for not using wrappers! ;-)