Hello!
Recently I wrote a testbench for my design.
It should work like this: after a rising edge, certain input signals change their values, and depending on those values I get the corresponding bits on the output.
In simulation there is a delay of a couple of clock cycles before I get the answer, and I am wondering what causes this delay and how to measure it.
Here is the simulation window; the clock period is 10 ns. After the first rising edge the testbench changes the input values and the design should initialize its output, but it then waits for about 110 ns before the output changes from ZZZ to actual bits. Where does this 110 ns come from? I don't have any delays in my code.
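For what it's worth, here is a minimal sketch of how the latency could be measured directly in the testbench, assuming VHDL and hypothetical signal names (`clk` for the clock and `result` for the output that starts out at ZZZ):

```vhdl
-- Hypothetical measurement process: records the time of the stimulus edge,
-- then waits until the output leaves its 'U'/'X'/'Z' state and reports the
-- elapsed time. Signal names clk and result are assumptions, not from the
-- original design.
library ieee;
use ieee.std_logic_1164.all;

-- ... inside the testbench architecture ...
measure : process
    variable t_start : time;
begin
    wait until rising_edge(clk);   -- the edge on which inputs are driven
    t_start := now;
    wait until not is_x(result);   -- is_x is true while result is 'Z'/'X'/'U'
    report "output latency = " & time'image(now - t_start);
    wait;                          -- measure only once
end process;
```

The `report` line prints the measured delay in the simulator console, which would show exactly where the roughly 110 ns goes once compared against the design's register stages.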