About set_input_delay and set_output_delay.

Mike

Hi,

I'm new to Synopsys Design Compiler. I have some questions:
1. Should I set the drive strength and load for the ports
before optimization? I don't know what kind of values
to use.

2. Should I always set input_delay and output_delay? The
tutorial manual for Synopsys Design Compiler only sets an
output delay relative to the clock. Why don't we need to set
input_delay as well?
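To make the questions concrete, the kind of dc_shell-t constraint script I mean is something like the following. The cell name, load, and delay values here are just placeholders taken from examples, not values I know to be correct for any library:

```tcl
# Question 1: model the external environment of the ports.
# DFF1 and the 0.5 load value are placeholders.
set_driving_cell -lib_cell DFF1 [all_inputs]
set_load 0.5 [all_outputs]

# Question 2: constrain port timing relative to the clock.
create_clock -period 20 [get_ports Clock]
set_input_delay  5 -clock Clock \
    [remove_from_collection [all_inputs] [get_ports Clock]]
set_output_delay 5 -clock Clock [all_outputs]
```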

Thanks.

Liang
 
Mike

Hi Jon,

Thanks for your detailed reply.

Here I have another question, about post-synthesis simulation of the
synthesized code, and I would really appreciate any hints:

I declared an inout entity port of type std_logic_vector(7 downto 0)
in my VHDL program. In the testbench, I instantiate this entity,
assign some values to this port, do some calculations, and output the
result to this port.

The testbench works properly before synthesis. But after synthesis with
Synopsys Design Compiler, I substituted the synthesized code for the
original VHDL code, and I found that this port would no longer accept
the value from the signal assignment in the testbench: the value of the
port becomes "XX" (the initial value is "UU") after the assignment
takes effect.

BTW, even if I initialize this port to "ZZZZZZZZ" in the entity
definition of the synthesized VHDL program, it still doesn't work.

Thanks a lot.

Liang
 
Jon

Hi Liang,
I assume that the inout port was mapped to a bi-directional IO cell.
Make sure the tri-state control signal is driven correctly, or the
driver may always be driving the output. In most simulators you can
inspect the drivers on the net to see whether the DUT is driving when
you apply your input. Another possible problem is the initial condition
of 'U', which can propagate unknowns into the design and cause 'X' to
appear on the outputs.

Initialize all the inputs to a valid logic level, either '0' or '1',
to check whether the undefined inputs are propagating to the outputs.
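As a sketch of what I mean by the tri-state control (the port and
signal names below are made up, not from your design): a bidirectional
port is normally driven through an explicit output-enable, and released
to 'Z' whenever the external side is supposed to drive it:

```vhdl
library ieee;
use ieee.std_logic_1164.all;

entity bidir_example is
  port (
    DATA : inout std_logic_vector(7 downto 0);
    oe   : in  std_logic;                       -- tri-state control
    dout : in  std_logic_vector(7 downto 0);    -- value to drive out
    din  : out std_logic_vector(7 downto 0));   -- value read from the pin
end entity bidir_example;

architecture rtl of bidir_example is
begin
  -- Drive the pad only when oe = '1'; otherwise release it to 'Z'
  -- so an external driver (e.g. the testbench) can take the bus.
  DATA <= dout when oe = '1' else (others => 'Z');
  din  <= DATA;
end architecture rtl;
```

If both the DUT and the testbench drive the same std_logic line with a
strong value at the same time, the resolution function yields 'X',
which matches the "XX" you are seeing.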

Jon
 
Liang Yang

Hi Jon,

Thanks for your reply.

I tried to initialize this inout port to all '0's, but it still shows
"UU" when I start simulation. During pre-synthesis simulation, the
initial value of DATA is also "UU", but it changes to the correct value
after the assignment takes effect in the testbench.

I tried to figure out which driver is driving the DATA port when its
value changes, but I don't know how to do this in Active-HDL. The
breakpoint is located in the lsi_10k library, and I can't trace it
until control comes back to my program.

Thanks.

Liang


Here is my code:

--design.vhd
library lsi_10k;
use lsi_10k.all;
....
entity DESIGN_UNIT is

  port ( DATA : inout std_logic_vector (7 downto 0) := "ZZZZZZZZ"; ... );

end DESIGN_UNIT;

--testbench.vhd
library ieee;
use ieee.std_logic_1164.all;

entity test_bench is
end entity test_bench;

architecture behav of test_bench is
  signal DATA  : std_logic_vector(7 downto 0);
  signal Clock : bit := '0';
  ...
begin
  DUT: entity work.DESIGN_UNIT
    port map (DATA, ctl_data, ctl_op, Reset, Clock);

  simulation: process is
  begin
    Clock <= '0';
    wait for 20 ns;

    DATA  <= "00000001";
    Clock <= '1';
    wait for 20 ns;  -- DATA port will become "XX" at 20+1 ns

    wait;
  end process;
end architecture behav;
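One thing I notice in my own process above: the testbench assigns
"00000001" to DATA and then never releases it, so the testbench driver
stays active while the synthesized DESIGN_UNIT drives the same lines,
and the two drivers resolve to 'X'. A sketch of the process with the
bus released before the DUT drives it (keeping my original 20 ns
timings, which are arbitrary) would be:

```vhdl
simulation: process is
begin
  Clock <= '0';
  wait for 20 ns;

  DATA  <= "00000001";       -- testbench drives the bus
  Clock <= '1';
  wait for 20 ns;

  DATA  <= (others => 'Z');  -- release the bus before the DUT drives it
  wait;
end process;
```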
 
