Why are you converting a numeric constant to a std_logic_vector in the
first place, though? Presumably you have yet more code that uses that
constant, and it is likely in a mathematical context where you can
simply use 'my_natural_constant' as is, without any conversion.
Example:
  some_unsigned_signal <= some_other_unsigned_signal + my_natural_constant;
Why? When I began writing VHDL, the concept of std_logic was something
of a mystery to me. Since all (or at least most) of the introductory
tutorials from Xilinx used std_logic, I thought it was proper to use
it as well. I think most beginners will use std_logic within their
architectures. Then the arithmetic starts getting difficult, first
with signed and unsigned binaries, then multiplication, and so on.
There is also an issue with indexed arrays, which want a natural or
integer index argument. I recall that complicating matters further,
variables did not show up in ModelSim without some cryptic procedure
that I have since forgotten, so I would assign the desired variable
to a std_logic signal instead. I haven't seen any book or article
address these issues, only a few posts here and there.
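(For the array-indexing issue mentioned above, the usual fix is a
to_integer conversion at the point of use. A minimal sketch, with
hypothetical names:)

```vhdl
library ieee;
use ieee.std_logic_1164.all;
use ieee.numeric_std.all;

entity rom_lookup is
  port (
    addr : in  std_logic_vector(3 downto 0);
    data : out std_logic_vector(7 downto 0)
  );
end entity;

architecture rtl of rom_lookup is
  type rom_t is array (0 to 15) of std_logic_vector(7 downto 0);
  constant rom : rom_t := (others => (others => '0'));
begin
  -- a std_logic_vector index must be converted to an integer first
  data <= rom(to_integer(unsigned(addr)));
end architecture;
```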
In my current design there are a lot of constants that are modified
for synthesis as opposed to simulation, due to the inordinate amount
of video data that the synthesis hardware deals with. Following the
advice from a previous post, I use an array for each pair of constants:
type syn_sim is array(0 to 1) of natural;
constant start_row_syn_sim : syn_sim := (20,2);
. . . and more arrays
constant start_row : natural := start_row_syn_sim(sim);
. . . and more constants
sim is a generic natural that defaults to 0 for synthesis, but is set
to 1 for simulation by the testbench. I had thought of using a boolean
for sim, but I figured that down the road I might want different sets
of simulation constants. Hence I have a number of these
my_std_logic_vector <= my_const_natural; assignments in my architecture.
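(The generic-selected constant pattern described above might look like
this in context; the entity and port names are hypothetical:)

```vhdl
library ieee;
use ieee.std_logic_1164.all;

entity video_top is
  generic (
    sim : natural := 0  -- 0 selects synthesis constants, 1 simulation
  );
  port (
    clk : in std_logic
  );
end entity;

architecture rtl of video_top is
  type syn_sim is array (0 to 1) of natural;
  constant start_row_syn_sim : syn_sim := (20, 2);
  -- start_row is 20 when built for synthesis, 2 when the testbench
  -- instantiates this entity with sim => 1
  constant start_row : natural := start_row_syn_sim(sim);
begin
end architecture;
```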
Except at the top-level design ports, you would be better off using
the proper data type internally (i.e. natural, integer, my_type, etc.)
for all signals and constants.
That sounds good. I have never declared signal my_natural : natural;
however, there are quite a few instances in my code where I have to
pass std_logic and std_logic_vector signals to other modules, FIFOs
and the like, so it seems there would be a similar amount of mess
untangling all of that. True?
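(Not necessarily that much mess: the idiomatic approach is to keep
the arithmetic in natural/unsigned internally and convert only at the
module boundary. A minimal sketch around a hypothetical 8-bit FIFO
data port:)

```vhdl
library ieee;
use ieee.std_logic_1164.all;
use ieee.numeric_std.all;

entity fifo_wrap is
  port (
    clk       : in  std_logic;
    fifo_dout : in  std_logic_vector(7 downto 0);
    fifo_din  : out std_logic_vector(7 downto 0)
  );
end entity;

architecture rtl of fifo_wrap is
  signal my_count : natural range 0 to 255 := 0;
begin
  -- internal arithmetic stays in natural; conversion happens only
  -- at the module boundary
  fifo_din <= std_logic_vector(to_unsigned(my_count, fifo_din'length));

  process (clk)
  begin
    if rising_edge(clk) then
      my_count <= to_integer(unsigned(fifo_dout));
    end if;
  end process;
end architecture;
```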
You'll be better off sticking with numeric_std.
That does seem to be the prevailing wind. However, I was wondering if
there was a severe gotcha in using both. I find it so easy to drop in
a counter using std_logic_unsigned and not have to worry about
converting it at all.
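(For comparison, a counter under numeric_std is just as drop-in if the
signal is declared unsigned rather than std_logic_vector. A minimal
sketch:)

```vhdl
library ieee;
use ieee.std_logic_1164.all;
use ieee.numeric_std.all;

entity counter is
  port (
    clk   : in  std_logic;
    rst   : in  std_logic;
    count : out std_logic_vector(7 downto 0)
  );
end entity;

architecture rtl of counter is
  signal cnt : unsigned(7 downto 0) := (others => '0');
begin
  process (clk)
  begin
    if rising_edge(clk) then
      if rst = '1' then
        cnt <= (others => '0');
      else
        cnt <= cnt + 1;  -- numeric_std defines unsigned + integer
      end if;
    end if;
  end process;

  count <= std_logic_vector(cnt);
end architecture;
```

As for the gotcha: the usual trouble with mixing comes when
std_logic_arith (which std_logic_unsigned is commonly paired with) and
numeric_std are both visible in one file, since each declares its own
unsigned and signed types and the names clash.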
Brad Smallridge
AiVision