Re: /dev/random vs. /dev/urandom

From: Paulo Marques
Date: Mon Jan 10 2005 - 08:05:36 EST


linux-os wrote:
> [...]
> One is free to use any number of samples. The short number of samples
> was DELIBERATELY used to exacerbate the problem, although a number
> of nay-sayers jumped on this in an attempt to prove that I don't
> know what I'm talking about.

It seems to me that you actually don't.

Since this is a *uniform* distribution over the range [0..2^N[, then each of those N bits must also show a uniform distribution; otherwise the number formed by those bits couldn't be uniform. (Isn't this obvious?)
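A quick sanity check of this claim (my own illustration, not code from the thread): draw uniform samples from [0..2^N[ and count how often each bit position is set. If the numbers are uniform, every bit should come up set in about half of the samples.

```python
import random

N = 3            # bits per sample
SAMPLES = 100_000

# Count how often each bit position is 1 across uniform samples.
counts = [0] * N
for _ in range(SAMPLES):
    x = random.randrange(2 ** N)   # uniform on [0 .. 2^N[
    for bit in range(N):
        counts[bit] += (x >> bit) & 1

for bit, ones in enumerate(counts):
    print(f"bit {bit}: {ones / SAMPLES:.3f}")   # each ratio is ~0.5
```

Each printed ratio hovers around 0.5, as the argument above predicts for a uniform source.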

It would be different, of course, if the distribution were not uniform, or if the range were not a power of 2...

Of course, I agree that throwing away 5 bits in every byte of perfect entropy that the kernel worked so hard to gather is just wrong, but the randomness of the result is not the reason why.

> In the first place, the problem was to display the error of using
> an ANDing operation to truncate a random number. In the limit,
> one could AND with 0 and show that all randomness has been removed.

Not really... you just get a perfectly uniform random distribution on the range [0..0]. :)
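The AND-truncation point can be sketched the same way (again my own illustration, not linux-os's code): masking a uniform byte with 0x07 leaves a uniform distribution on [0..7], while masking with 0x00 collapses everything onto the single value 0, which is the degenerate "uniform on [0..0]" case.

```python
import random
from collections import Counter

SAMPLES = 80_000

# Mask uniform bytes down to their low 3 bits: still uniform, on [0..7].
masked = Counter(random.randrange(256) & 0x07 for _ in range(SAMPLES))
# Mask with 0: every sample becomes 0.
zeroed = Counter(random.randrange(256) & 0x00 for _ in range(SAMPLES))

for value in sorted(masked):
    print(f"{value}: {masked[value] / SAMPLES:.3f}")   # each ~1/8
print(dict(zeroed))                                    # only the value 0 remains
```

The mask 2^k - 1 keeps the low k bits intact, so it maps the uniform range [0..2^N[ onto a uniform [0..2^k[; the "all randomness removed" limit is just k = 0.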

--
Paulo Marques - www.grupopie.com

"A journey of a thousand miles begins with a single step."
Lao-tzu, The Way of Lao-tzu
