Richard Heathfield said:
In <[email protected]>,
websnarf wrote:
You can mean that if you like, but it's a very humpty-dumpty way of
looking at randomness.
"indeterministic" is the opposite of "deterministic" as I see it.
a deterministic system is one in which the results can be
generated from a finite (and typically known) set of rules.
hence, if you could know the complete state of the system at a given moment,
you could also know all possible future states as well (whether or not this
"can" be done in practice is another issue).
however, a deterministic system can be "chaotic", as is typically the case
with PRNGs.
granted, there is also the philosophical issue of "global" or "universal"
determinism vs. indeterminism; however, that is not my focus here. we can
just assume that chaotic patterns in physical reality are non-deterministic
(or, even if not, that they represent a MUCH larger state than is typically
available to a computer...).
But you can't, reasonably speaking, see /all/ an algorithm's possible
inputs, because (for any non-trivial algorithm) there are so many of
them.
agreed.
That sentence that no grammar with. But if you mean that randomness
depends on how much information the user has or can gain about the
sequence that assists in predicting future numbers, you are correct.
never mind that none of this makes it "random" in a strict sense.
determinism vs non-determinism is not a matter of observation (how much
the observer knows or does not know).
The problem with a /seeder/ is that it is used as a start point for a
PRNG. PRNGs are, practically by definition, not random. Given
knowledge of the start state and the algorithm, you have 100%
predictive power.
<snip>
yep, this is a flaw of PRNGs...
TRNGs, however, do not generally have this problem, but as noted, a TRNG is
not purely an algorithm (since, invariably, it depends on some sort of
external entropy source).
however, as I see it, the issue is not "as" difficult as people are making
it out to be, since there are plenty of useful sources which already exist
in a modern PC, and there are ways to both accumulate and "amplify" this
entropy (such that, the more often and longer the program runs, the more
entropy it has accumulated, and thus the better its RNG state).
so, my approach is a hybrid: a TRNG component to continuously mine entropy,
and a PRNG-like part to essentially "hold" the RNG state and work as an
"amplifier" (basically, to get usable random numbers out of it).
the reason is that most natural sources tend to resemble a weak noise
pattern, which in its pure form is not very usable as random numbers
(usually we want pure chaos, not a noise pattern).
however, having done some amount of statistical modeling on these noise
patterns, I have reason to suspect that they are, in fact, random...
now, what are such sources?...
one I have used is a partially "filtered" version of the 'rdtsc' opcode,
which reads the CPU's time-stamp counter (in clock cycles). most of the
chaos, I think, comes from all the internal goings-on in a computer (bus
activity, interrupts, activity in different apps and threads, ...), all of
which disrupt the uniformity of this value (granted, this is on a modern
system; on a DOS box it would be somewhat less impressive).
I can then use a thread which mostly just sleeps, periodically reads and
filters the value (subtract last value), and adds it into the RNG state.
this thread will also occasionally write this state to a file (this file is
read-in when the RNG is first initialized).
other sources are possible, but this is just the one I mostly use at
present.
the exact sources, ... depend a lot on OS/... for example, for DOS there are
a few other sources I would likely attempt to use (specific common hardware
devices), and others which are specific to Windows or Linux (both of which
provide their own TRNGs, AFAIK based on entropy sources available to the OS
kernel).
'rdtsc' is just something available via the CPU, and so is essentially
"free"...
(and, yes, I didn't trust it at first, but it seems to hold up under what
statistical analysis I have applied).
note: my analysis was based mostly on signal filtering and entropy
measurement (such as FIR / LPC based filtering, or DCT based analysis),
rather than bitwise tests (as usually used for PRNGs), given the specific
"signal" I was getting (a particularly weak source could be filtered into
being "silence").
No no no no no. One of the most important uses of "truly" random
processes for generating numbers is that of producing crypto keys.
You really, really don't want to offer any part of that process to a
remote machine.
yep...
granted, it does depend some on what he meant by this.
granted, a "random number server" would be a bad idea, but an app working as
a server would likely have enough "unique stuff" comming and going which it
could essentially "mine" as an entropy source, and this would be essentially
non-predictable from the outside (what all information has been seen, and
just how it has been all combined within the innards of the server).