How do you guys limit rand() results to a range?

Rosario193

One last question:

Why is it "rand() / ((RAND_MAX / N) + 1)" and not "rand() / (RAND_MAX / N)"?

If we have 7 letters:

0 1 2 3 4 5 6
 1 2 3 4 5 6

there are 6 intervals between them, not 7.

#include <stdio.h>
#include <stdlib.h>
#include <time.h>

int randomLetter(int runType)
{
    int i, r;
    int countLetters = 7;
    int Acounter = 0, Bcounter = 0, Ccounter = 0, Dcounter = 0;
    int Ecounter = 0, Fcounter = 0, Gcounter = 0;
    int step = RAND_MAX / countLetters;

    srand(time(NULL));
    for (i = 1; i <= 100000; i++) {
        if (runType == 1)
            r = rand() % countLetters;   /* method 1: modulo */
        else
            r = rand() / step;           /* method 2: divide by interval width */

        if (r == 0) Acounter++;
        else if (r == 1) Bcounter++;
        else if (r == 2) Ccounter++;
        else if (r == 3) Dcounter++;
        else if (r == 4) Ecounter++;
        else if (r == 5) Fcounter++;
        else if (r == 6) Gcounter++;
        else if (r == 7) { Gcounter++; printf("#7#"); }   /* out-of-range value */
        else printf("findE %d, ", r);
    }

    if (runType == 1)
        printf("r = rand() %% %d\n", countLetters);
    else
        printf("r = rand() / ((RAND_MAX / %d))\n", countLetters);

    printf("A=%d\n", Acounter);
    printf("B=%d\n", Bcounter);
    printf("C=%d\n", Ccounter);
    printf("D=%d\n", Dcounter);
    printf("E=%d\n", Ecounter);
    printf("F=%d\n", Fcounter);
    printf("G=%d\n", Gcounter);

    printf("Total=%d\n",
           Acounter + Bcounter + Ccounter + Dcounter + Ecounter + Fcounter + Gcounter);
    return 0;
}

int main(void)
{
    randomLetter(1);
    randomLetter(2);
    return 0;
}

Here #7# does not appear 1/7 of the time; it appears only 3 or 4 times in the 100000 iterations...
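
A quick worked check of why the +1 matters. The value 32767 for RAND_MAX below is an assumption (common on many Windows C libraries), not something stated in the post:

#include <stdio.h>

/* With N = 7 and RAND_MAX assumed to be 32767, RAND_MAX / N truncates to 4681,
   and the largest possible rand() result maps to 32767 / 4681 == 7, one past
   the last letter.  Adding 1 to the divisor keeps every result in 0..6. */
int main(void)
{
    const int N = 7;
    const int max = 32767;   /* stand-in for RAND_MAX on such systems */

    printf("%d / (%d / %d)       = %d\n", max, max, N, max / (max / N));        /* prints 7 */
    printf("%d / ((%d / %d) + 1) = %d\n", max, max, N, max / ((max / N) + 1));  /* prints 6 */
    return 0;
}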
 
Richard Bos

Walter Banks said:
What we did to create a random number for a range in the MC
simulator was first create a dependable random number generator
with a distribution from 0..1 and then for a range used

range(n,m) = n + ((m-n) * rand0_1);

A random number with a distribution of 0..1 can be a linear
integer random number but treated like a fractional number.

Myeah. But it's still a row of bits, so the same problem remains: you
_will_ have a slightly uneven distribution whenever m-n is not a power
of two. It doesn't matter how you conceive of the random number, inside
the computer it always remains a row of bits.

Richard
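
A quick worked illustration of the power-of-two point (the 31-bit generator size below is an assumption, not something stated in the post): mapping 2^31 = 2147483648 equally likely outputs onto 7 buckets gives 2147483648 = 7 * 306783378 + 2, so two buckets receive 306783379 source values and five receive 306783378 - a tiny but nonzero unevenness that disappears only when the bucket count divides the generator's range.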
 
Stefan Ram

James Kuyper said:
Those articles also don't seem particularly connected to your comments
in your first paragraph - were they intended to be?

One cannot just assume that there are »real« random numbers,
because the statistical properties of random numbers obtained
from real quantum random number generators are different
from the statistical properties of random numbers obtained
from real classical random number generators, even if neither
is a pseudo-random number generator.
 
James Kuyper

One cannot just assume that there are »real« random numbers,

It should not be "just" assumed - but that assumption is in fact the
default in quantum physics.
because the statistical properties of random numbers obtained
from real quantum random number generators are different
from the statistical properties of random numbers obtained
from real classical random number generators, even if neither
is a pseudo-random number generator.

Nothing is more normal than for two different random number generators
(whether really random or pseudo-random) to have detectably different
statistical properties. That doesn't make it any less accurate to
describe the really-random numbers as really-random.
 
Rosario193

#include <stdio.h>
#include <stdlib.h>
#include <time.h>

int randomLetter(int runType)
{
    int i, r;
    int countLetters = 7;
    int Acounter = 0, Bcounter = 0, Ccounter = 0, Dcounter = 0;
    int Ecounter = 0, Fcounter = 0, Gcounter = 0;
    int step = RAND_MAX / countLetters;

    srand(time(NULL));
    for (i = 1; i <= 100000; i++) {
        if (runType == 1) {
            r = rand() % countLetters;          /* method 1: modulo */
        } else {
        li: r = rand() / step;                  /* method 2: divide by interval width */
            if (r >= countLetters) goto li;     /* out of range: draw another value */
        }

        if (r == 0) Acounter++;
        else if (r == 1) Bcounter++;
        else if (r == 2) Ccounter++;
        else if (r == 3) Dcounter++;
        else if (r == 4) Ecounter++;
        else if (r == 5) Fcounter++;
        else if (r == 6) Gcounter++;
        else if (r == 7) { Gcounter++; printf("#7#"); }
        else printf("findE %d, ", r);
    }

    if (runType == 1)
        printf("r = rand() %% %d\n", countLetters);
    else
        printf("r = rand() / ((RAND_MAX / %d))\n", countLetters);

    printf("A=%d\n", Acounter);
    printf("B=%d\n", Bcounter);
    printf("C=%d\n", Ccounter);
    printf("D=%d\n", Dcounter);
    printf("E=%d\n", Ecounter);
    printf("F=%d\n", Fcounter);
    printf("G=%d\n", Gcounter);

    printf("Total=%d\n",
           Acounter + Bcounter + Ccounter + Dcounter + Ecounter + Fcounter + Gcounter);
    return 0;
}

The last try: [0, RAND_MAX] is decomposed into 6 intervals with

step = RAND_MAX / 6
[0, step) [step, 2*step) ... [RAND_MAX - step, RAND_MAX)

plus one extra element, {RAND_MAX}.

When rand() returns RAND_MAX I check and draw some other value.
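
For comparison, the same rejection idea can be packaged as a stand-alone helper. This is only a sketch; the function name random_in_range and the do/while form are editorial choices, not code from the post:

#include <stdlib.h>

/* Editorial sketch of the rejection approach.
   Returns a value uniformly distributed over 0 .. n-1 (requires 0 < n <= RAND_MAX).
   Each accepted interval [k*step, (k+1)*step) contains exactly step values of
   rand(), and anything in the leftover tail is simply redrawn, so no bucket is
   favoured. */
static int random_in_range(int n)
{
    int step = RAND_MAX / n;   /* width of each accepted interval */
    int r;

    do {
        r = rand() / step;
    } while (r >= n);          /* reject the tail above n*step and try again */

    return r;
}

With this helper, randomLetter() could call random_in_range(countLetters) instead of the labelled retry.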
 
Stefan Ram

James Kuyper said:
That doesn't make it any less accurate to
describe the really-random numbers as really-random.

This /is/ not only not accurate, but even meaningless,
because you have not defined »really-random numbers« yet!
Given a sequence of numbers, when do you call them
»really-random numbers«?
 
Walter Banks

Richard said:
Myeah. But it's still a row of bits, so the same problem remains: you
_will_ have a slightly uneven distribution whenever m-n is not a power
of two. It doesn't matter how you conceive of the random number, inside
the computer it always remains a row of bits.

Potentially that is true (when m-n is not a power of two), but it depends
on what the grain of the desired random number is. If you are looking for
an integer, you have a random number that is a fract multiplied by an integer,
creating a fixed-point number of k.l bits: k is the number of bits needed to
describe (m-n) and l is the number of bits in the random number. The error is
going to be in the fraction part, and if you are looking for an integer result
it probably doesn't matter.

In my case most of the work I have done with random number generators
has been with Monte Carlo simulators. As long as the errors in the
random numbers don't affect the results within three standard
deviations from the mean, they are adequate for most of my work.

Treating a random number as a fract makes creating random numbers
with various statistical distributions a lot easier and tends not to create
as many distribution errors.
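
A minimal sketch of that fract-style scaling, using rand() as the bit source. The helper name range_uniform and the unsigned long long arithmetic are editorial assumptions, not code from Walter's simulator:

#include <stdlib.h>

/* Editorial sketch: range(n,m) = n + ((m-n) * rand0_1), done in fixed point.
   The raw rand() bits are treated as a fraction of (RAND_MAX + 1), multiplied
   by the span of the range, and only the integer part of the product is kept. */
static int range_uniform(int n, int m)
{
    unsigned long long span = (unsigned long long)(m - n) + 1;  /* values wanted */
    unsigned long long frac = (unsigned long long)rand();       /* the "fraction" */

    /* As discussed in this thread, the result still inherits a slight
       unevenness whenever span does not divide RAND_MAX + 1. */
    return n + (int)((span * frac) / ((unsigned long long)RAND_MAX + 1));
}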

There is a good online reference PDF that has many of the same random-number
distribution algorithms found in "Statistical Distributions" by Hastings et al.,
mentioned earlier: http://ftp.arl.mil/random/random.pdf

As I said earlier random number generator algorithms try very hard
not to be random. The best we can do in most cases is create a
random number that is "good enough" for the intended application.

w..
 
Martin Shobe

This /is/ not only not accurate, but even meaningless,
because you have not defined »really-random numbers« yet!
Given a sequence of numbers, when do you call them
»really-random numbers«?

Huh? "really-random numbers" isn't about which sequence is produced but
about how that sequence is produced.

(BTW, if you want to talk about a different notion of randomness that is
about which sequence is produced you could look into Kolmogorov randomness.)

Martin Shobe
 
James Kuyper

This /is/ not only not accurate, but even meaningless,
because you have not defined »really-random numbers« yet!
Given a sequence of numbers, when do you call them
»really-random numbers«?

When they're generated in an appropriate fashion by making use of
quantum-mechanical uncertainty effects. That's not a definition of
"really random", it's just the only way I know of to generate "really
random" numbers.

I have no intention of attempting a mathematically precise definition of
what I mean by "really random" - it's not my specialty, and any
definition I could attempt would be easily picked apart by those who do
make it their specialty. But whoever invented the term "pseudo-random"
had some idea of what he meant by attaching the "pseudo-" prefix; by
"really random", I basically mean something for which the "pseudo-"
prefix is unnecessary.
 
Jorgen Grahn

To my uninformed eye, both ways produce well-distributed random results
within a 10 million size test loop.

Modulo doesn't do that. Suppose you have a very good source of 8 bit random
numbers in the range [0,256), and you reduce these, say, modulo 200 to the
range [0, 200). The problem is that basically, this operation
chops off the values in the range [200,256) and folds them into the range
[0,56). So these values are twice as likely to occur.

Can you write a C program to demonstrate that?

If you think about it, it's obviously true. Kaz made up a convincing
example.

An analogy is a 256-character text, broken at column 200:

    |<------------- 200 ------------->|
    **********************************
    *********
    |<- 56 ->|

The problem is smaller if the PRNG range is much larger than the range
you want, but it really goes away only if the former is an even
multiple of the latter.

/Jorgen
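
In the spirit of the quoted request for a C program, here is one possible demonstration. It enumerates every value an ideal 8-bit source can produce instead of sampling, so the doubling in the folded range shows up exactly; the byte range [0,256) and the modulus 200 are taken from the example quoted above:

#include <stdio.h>

/* Reduce all 256 equally likely byte values modulo 200 and count how many
   source values land in each output bucket: buckets 0..55 receive two source
   values each, buckets 56..199 only one, so the low outputs are twice as likely. */
int main(void)
{
    int counts[200] = {0};
    int v;

    for (v = 0; v < 256; v++)
        counts[v % 200]++;

    printf("bucket   0: %d source values\n", counts[0]);    /* 2 */
    printf("bucket  55: %d source values\n", counts[55]);   /* 2 */
    printf("bucket  56: %d source values\n", counts[56]);   /* 1 */
    printf("bucket 199: %d source values\n", counts[199]);  /* 1 */
    return 0;
}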
 
Keith Thompson

This /is/ not only not accurate, but even meaningless,
because you have not defined »really-random numbers« yet!
Given a sequence of numbers, when do you call them
»really-random numbers«?

Off the top of my head, I'd call a sequence really random if knowing
the first N-1 numbers in the sequence gives you no information
about the Nth number in the sequence. (Likewise for any subset of
N-1 numbers.)

Or I might go beyond that: knowing the values of the first N-1
numbers *and* knowing the mechanism used to generate them gives
you no such information.

(This disqualifies the digits of pi, for example, even if they pass
all statistical tests for randomness.)
 
glen herrmannsfeldt

It should not be "just" assumed - but that assumption is in fact the
default in quantum physics.

(snip)

Except that at some point one needs to convert the quantum
randomness to ordinary (classical) randomness, such as bits.

In the case of Quantum Cryptography, one wants to delay that
as long as possible, but for more ordinary uses, it happens
earlier.

-- glen
 
James Kuyper

(snip)

Except that at some point one needs to convert the quantum
randomness to ordinary (classical) randomness, such as bits.

True, but that conversion doesn't make the numbers pseudo-random, which
is the key point as far as I'm concerned. I'm not quite sure what
Stefan's point was.
 
glen herrmannsfeldt

(snip on quantum randomness)
When they're generated in an appropriate fashion by making use of
quantum-mechanical uncertainty effects. That's not a definition of
"really random", it's just the only way I know of to generate "really
random" numbers.
I have no intention of attempting a mathematically precise definition of
what I mean by "really random" - it's not my specialty, and any
definition I could attempt would be easily picked apart by those who do
make it their specialty. But whoever invented the term "pseudo-random"
had some idea of what he meant by attaching the "pseudo-" prefix; by
"really random", I basically mean something for which the "pseudo-"
prefix is unnecessary.

OK, I will try anyway. (Even though I am also not a specialist.)

You want statistical independence.

If you have the previous N bits, can you predict the next bit,
especially as N gets larger.

The favorite example is radioactivity, where at a given time,
a radioactive nucleus has some probability to decay. The assumption
is that the decay of any nucleus is statistically independent of
any other. That is, that there is no overlap in the wave function.

But there is overlap in the wave function; it is just extremely low.
Wave function overlap decreases exponentially with distance,
(or maybe exponentially with the square of the distance).
The ratio of atomic radius to nuclear radius is about 1e5, so
the overlap might be exp(-1e5).

Way simplifying things, if you have the previous exp(1e5) bits,
you might be able to predict something about the next one.
(Especially, as that is only for two atoms, and you want a lot
more of them.)

If you take two atoms that are 1 cm apart, the coupling decreases
to something like exp(-1e13) (or, if squared, exp(-1e26)).

If you want another macroscopic quantum problem, how many different
velocities can a baseball pitcher pitch?

To simplify the question, how many different quantum states are
there for a baseball in a baseball stadium that have velocity
less than 100 mi/h?


-- glen
 
glen herrmannsfeldt

Keith Thompson said:
Stefan Ram writes:
(snip)
Off the top of my head, I'd call a sequence really random if knowing
the first N-1 numbers in the sequence gives you no information
about the Nth number in the sequence. (Likewise for any subset of
N-1 numbers.)
Or I might go beyond that: knowing the values of the first N-1
numbers *and* knowing the mechanism used to generate them gives
you no such information.

And for what values of N?

How about for N=exp(1e5) or even more?

-- glen
 
Stefan Ram

James Kuyper said:
True, but that conversion doesn't make the numbers pseudo-random, which
is the key point as far as I'm concerned. I'm not quite sure what
Stefan's point was.

The point is: The English noun phrase »real random number«
has a meaning in the realm of a technical discourse only
after the point of its definition.

In colloquial everyday language we can use wordings such as
»a real poncho«, because for terms in everyday language it's
ok to have some vagueness. But in the case of random numbers
in this thread, one needs to be less vague.

When one does not give such a definition, it might not be
appropriate to describe this with the attribute »accurate«.
 
James Kuyper

(snip on quantum randomness)



OK, I will try anyway. (Even though I am also not a specialist.)

You want statistical independence.

Actually, no.
If you have the previous N bits, can you predict the next bit,
especially as N gets larger.

The favorite example is radioactivity, where at a given time,
a radioactive nucleus has some probability to decay. The assumption
is that the decay of any nucleus is statistically independent of
any other. That is, that there is no overlap in the wave function.

But there is overlap in the wave function; it is just extremely low.
Wave function overlap decreases exponentially with distance,
(or maybe exponentially with the square of the distance).
The ratio of atomic radius to nuclear radius is about 1e5, so
the overlap might be exp(-1e5).

Way simplifying things, if you have the previous exp(1e5) bits,
you might be able to predict something about the next one.
(Especially, as that is only for two atoms, and you want a lot
more of them.)

You can predict something based upon that overlap, but what you can
predict is only a shift in the probability distribution. The actual
decay time is still a perfectly random selection from that distribution.
You cannot predict the actual time until the next decay. No matter how
much information you have, the time until the next decay could be either
arbitrarily long or arbitrarily short, without violating any of the laws
of quantum physics as they are currently understood. That is the
fundamental distinction between quantum randomness and
pseudo-randomness. If you knew the full internal state of a
pseudo-random number generator, and the algorithm it uses, you could
determine the next random number precisely.

It's not just a matter of some of the universe's state information being
hidden from us. Einstein, Podolsky and Rosen (EPR) tried to interpret
quantum uncertainty as being due to "hidden variables" - state
information about the universe that we were unaware of (and which we
might inherently be incapable of being aware of). They deliberately left
the details of what that state information was and how it influences the
measurements completely unspecified. Despite leaving it unspecified,
they were able to describe a quantum-mechanical experiment, and a
statistic that could be calculated from measurements that could be taken
while running that experiment. They rigorously derived a requirement
that this statistic must be greater than or equal to 1, regardless of
how the hidden variables actually worked. Quantum mechanics, on the
other hand, predicted that the value of that statistic should be 0.5.
From this, EPR concluded that quantum mechanics was unrealistic, and
could therefore be, at best, only an approximation to reality.

At the time their paper was published, it was not possible to conduct
the experiment with sufficient precision to clearly distinguish a value
of 1 from a value of 0.5. Many years later, when scientists were finally
able to perform it, reality decided not to cooperate with EPR's concept
of "realism". The measured value unambiguously confirmed the quantum
mechanical prediction, violating the constraint that EPR had derived
from assuming that hidden variables were involved.

Scientists still believe that quantum mechanics can only be an
approximation to reality - but it's no longer because of the fundamental
role that true randomness plays in the theory.

I don't want to start an extended discussion of EPR - even experts get
into long pointless arguments talking about it. I just want to say that,
when I talk about "really random", I'm talking about the kind of thing
that EPR were implicitly assuming was inherently impossible when they
derived their limit equation.
 
Stefan Ram

glen herrmannsfeldt said:
But there is overlap in the wave function, it is just extrememly low.

The wave function of a quantum system with locational degrees
of freedom in space is always defined on the whole space R³.

The overlap of the domains of two functions defined on all of
R³ is always 100 %.
 
Keith Thompson

glen herrmannsfeldt said:
And for what values of N?

How about for N=exp(1e5) or even more?

For any value of N. If knowing the first 1e100 numbers tells
you *anything* about the 1e100+1th, then the sequence is not
"really-random". If the numbers are bits, knowing that the next bit
is 1 with a probability of 50.001% means it's not "really-random".

Do you think otherwise?
 
Stefan Ram

Martin Shobe said:
Huh? "really-random numbers" isn't about which sequence is produced but
about how that sequence is produced.

I have only reacted to the wording »really-random numbers«,
wherein »really-random« is an attribute of the numbers, not of
their source, otherwise the wording would be »numbers from
a really-random source«.

Tests for randomness usually test the numbers, not their source.

But anyone here is free to post his definition of how that
sequence is produced.
 
