Ray Andraka wrote:
One more thing: another way to reduce the set while retaining the
uniform probability is to discard samples that fall outside of the
desired range. That, however, requires that the random numbers be
generated faster than they are consumed, with the in-range ones stored
in a FIFO until needed.
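A minimal sketch of that rejection approach in Python (the FIFO is replaced by a simple retry loop, and Python's random.getrandbits stands in for the hardware generator; names here are my own, not from any particular design):

```python
import random
from collections import Counter

def uniform_below(n, rng_bits=8, rng=random.getrandbits):
    """Rejection sampling: draw raw rng_bits-wide uniform samples and
    discard any that are >= n.  The accepted samples are exactly uniform
    on 0..n-1.  In hardware the accepted values would be queued in a
    FIFO; here we simply retry until one is in range."""
    while True:
        x = rng(rng_bits)          # raw uniform sample, 0 .. 2**rng_bits - 1
        if x < n:                  # keep only in-range samples
            return x

# Quick check: the six outcomes should occur roughly equally often.
counts = Counter(uniform_below(6) for _ in range(60000))
```

Each outcome should land near 10000 hits, with no systematic bias toward any value.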
Using a table to remap the data to a smaller set will generally result
in a probability distribution that is no longer uniform, which can mess
up your application pretty badly.
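To see why a simple remap skews the distribution, consider mapping a 0..255 uniform source onto 0..5 with a modulo (equivalent to a remap table). Since 256 is not a multiple of 6, some outputs get one extra input value:

```python
from collections import Counter

# 256 = 6*42 + 4, so residues 0..3 each absorb 43 input values while
# residues 4 and 5 absorb only 42 -- the output is no longer uniform.
counts = Counter(x % 6 for x in range(256))
# counts: 0..3 -> 43 each, 4..5 -> 42 each
```

The skew here is small, but for narrower sources or more output values it can be large enough to matter.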
Note that the probability distribution of the bit output of an
unmodified LFSR is not quite 50-50, because there are always an odd
number of states in the sequence (2^n - 1 for a maximal-length n-bit
LFSR, since the lockup state is excluded). There is some gating you can
do to extend the sequence to include the "illegal" state, which also
balances the distribution: basically, you detect the state adjacent to
the lockup state and gate the feedback so that the otherwise-unreachable
state (all '0's for XOR feedback, all '1's for XNOR feedback) is
inserted into the sequence. I believe one of the Xilinx app notes also
has a note regarding this (might be XAPP052, or it might be another one,
there were two in the late '90s addressing LFSRs).
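Here is a small software model of that extension, using a 3-bit XOR-feedback LFSR (polynomial x^3 + x^2 + 1) so the whole cycle is easy to inspect. For XOR feedback the lockup state is all zeros, and the usual gating is to invert the feedback bit when the low n-1 bits are zero, which splices the all-zeros state into the cycle (this is my sketch of the technique, not code from the app note):

```python
def cycle(step, start):
    """Collect the full state cycle of an LFSR stepping function."""
    states, s = [], start
    while True:
        states.append(s)
        s = step(s)
        if s == start:
            return states

def lfsr(s):
    # Plain XOR Fibonacci LFSR, taps 3 and 2: 2**3 - 1 = 7 states,
    # with the all-zeros lockup state excluded.
    b = ((s >> 2) ^ (s >> 1)) & 1
    return ((s << 1) | b) & 0b111

def lfsr_ext(s):
    # Extended ("de Bruijn") form: invert the feedback when the low
    # two bits are zero, which inserts the all-zeros state.
    b = (((s >> 2) ^ (s >> 1)) & 1) ^ (s & 0b011 == 0)
    return ((s << 1) | b) & 0b111

plain = cycle(lfsr, 1)       # 7 states: odd count, so each output bit
ext   = cycle(lfsr_ext, 1)   # is '1' in 4 of 7 states (4/7, not 1/2)
ones_plain = sum(s & 1 for s in plain)   # 4 ones out of 7 bits
ones_ext   = sum(s & 1 for s in ext)     # 4 ones out of 8 bits: balanced
```

The extended cycle visits all 8 states, and each output bit is now '1' exactly half the time.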
If you need some distribution other than a uniform distribution, you
need to perform some transformation, either by using the uniform
samples to address a lookup table (an inverse-CDF table) or by
performing some computation (for example, an approximately gaussian
distribution can be derived by summing samples from a uniform
distribution, per the central limit theorem).
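The central-limit-theorem approach can be sketched in a few lines. Summing k uniform(0,1) samples gives mean k/2 and variance k/12, so normalizing yields an approximately standard-normal variate (with the classic choice k = 12 the normalization is just a subtraction of 6):

```python
import random

def approx_gauss(k=12, rng=random.random):
    """Approximate a standard normal variate by summing k uniform(0,1)
    samples (central limit theorem).  The sum has mean k/2 and variance
    k/12; normalizing gives mean 0 and variance 1."""
    s = sum(rng() for _ in range(k))
    return (s - k / 2) / (k / 12) ** 0.5

samples = [approx_gauss() for _ in range(100000)]
mean = sum(samples) / len(samples)
var = sum(x * x for x in samples) / len(samples)
```

In hardware this is attractive because it needs only adders, though the tails are truncated (with k = 12 the output can never exceed +/-6 sigma, and the approximation degrades well before that).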
I address these techniques to some degree in my 1998 paper "An FPGA
based processor yields a real time high fidelity radar environment
simulator", which is available for free on my website at
http://www.andraka.com/papers.htm. That design was done on eight
XC4025's, without the benefit of on-chip BRAM or multipliers. The whole
thing could probably be put into a single XC4VSX55 today, with
considerably less total design effort.