Re: Intel 810 Random Number Generator

From: Theodore Y. Ts'o (tytso@MIT.EDU)
Date: Tue Jan 25 2000 - 20:15:41 EST


There've been a lot of comments on the subject, so I waited until I was
completely caught up on e-mail to send one reply to the entire
thread....

First of all, /dev/random is supposed to be a very high quality random
number generator. Its use is *supposed* to be for generating critical
cryptographic keys, particularly long-term keys, and for seeding
pseudo-random number generators. It's not particularly optimized for
generating large numbers of random numbers for use in Monte Carlo
applications, for example, and in such applications you usually want a
statistically good PRNG, since being able to replicate your results is
*good*. (As opposed to crypto key generation, where being able to
replicate your results is *bad*. :-)
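To make the distinction concrete, here's a small user-space sketch (Python,
purely illustrative): a seeded PRNG gives you a replayable stream for Monte
Carlo work, while key material should come from the kernel pool, with no
seed anyone could replay.

```python
import os
import random

# A seeded PRNG replays the same stream on demand -- exactly what you
# want for a Monte Carlo run you may need to reproduce later.
rng = random.Random(12345)
run1 = [rng.random() for _ in range(5)]

rng = random.Random(12345)          # same seed, same stream
run2 = [rng.random() for _ in range(5)]
print(run1 == run2)                 # replicable results: good here

# For key material you want the opposite: bytes drawn from the kernel
# pool, which nobody can re-derive from a seed.
key = os.urandom(16)
```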

As such, using a true random number generator is a good thing. If done
well, it can do a much better job than some of the sampling tricks
currently used in /dev/random. The caveat is in the "if done well"...
Hardware random number generators are notoriously hard to get right.
All sorts of environmental inputs (60 Hz hum, harmonics of CPU clocks,
sampling artifacts, etc.) can potentially swamp the effects of the
quantum noise generated by the device.

For this reason, it's a really good idea to wash the outputs from the
hardware random number generator through some kind of real-time verifier
to make sure (to some level of confidence) that the hardware generator
hasn't suddenly started screwing up. This could happen in any number of
ways --- a transistor in the RNG chip getting damaged by a power spike
or a cosmic ray, an NSA/FBI-executed black bag job, etc.
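The cheapest form of such a verifier is a continuous check in the spirit of
the FIPS continuous test: every fresh block must differ from the one before
it, so a generator that dies and starts repeating itself is caught on the
very next read. A sketch (the class and its names are mine, not any
existing API):

```python
class ContinuousCheck:
    """Minimal real-time sanity check: each fresh block from the
    hardware RNG must differ from the previous one, so a generator
    that gets stuck repeating a value is flagged immediately."""

    def __init__(self, block_size=8):
        self.block_size = block_size
        self.last = None

    def ok(self, block):
        if len(block) != self.block_size:
            raise ValueError("short read from RNG")
        good = block != self.last
        self.last = block
        return good
```

This obviously catches only the grossest failure mode; the statistical
tests discussed below are still needed for subtler problems.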

This verifier can be done in the kernel, but that limits the number of
statistical tests you can run, and you generally don't want to put too
much of that in the kernel. This is especially true if you want to be
uber-paranoid and run spectral analysis over the data looking for
potential problems. Even if you're just running the FIPS tests over
each batch of data before feeding it into /dev/random, that requires
enough CPU and involves enough complexity that simply shoving it into
the kernel probably isn't the right answer.
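For a sense of scale, even the simplest of the FIPS 140-1 tests is a few
lines on its own (the thresholds below are the published monobit bounds;
the surrounding code is just an illustrative harness):

```python
import os

def monobit_ok(bits):
    """FIPS 140-1 monobit test: in a 20,000-bit sample, the number
    of ones must fall strictly between 9,654 and 10,346."""
    if len(bits) != 20000:
        raise ValueError("monobit test wants exactly 20,000 bits")
    ones = sum(bits)
    return 9654 < ones < 10346

# Feed it one batch; here os.urandom() stands in for raw RNG output.
sample = [(byte >> i) & 1 for byte in os.urandom(2500) for i in range(8)]
print(monobit_ok(sample))
```

Add the poker, runs, and long-run tests on top of that, per batch, and the
case for doing the work in user space gets stronger.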

Of course, you can simply be very conservative about how many bits of
credit you give the inputs from the 810 RNG. The Intel paper about the
810 RNG (by Jun and Kocher) claims that 1/2 bit of entropy per output
bit is a good conservative estimate. People who like to be more careful
can simply give even less credit.
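The arithmetic is trivial, but worth spelling out (function name is mine;
the 1/2-bit default is the Jun/Kocher figure):

```python
def entropy_credit(nbytes, bits_per_output_bit=0.5):
    """Bits of entropy to credit for nbytes of raw generator output,
    at the Jun/Kocher figure of 1/2 bit per output bit by default."""
    return int(nbytes * 8 * bits_per_output_bit)

print(entropy_credit(32))        # 32 raw bytes credited as 128 bits
print(entropy_credit(32, 0.25))  # more paranoid: only 64 bits
```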

However, I think that past a certain point, it does make more sense to
have a user-level daemon which handles the verification and sampling
tests for the chip. If it's done right, it can work across multiple
hardware RNGs, simply by reading the randomness from the device
drivers, doing the appropriate pre-processing, and then feeding it
into /dev/random with the appropriate entropy credits. Perhaps there
should be a "simple mode" which simply takes inputs from the driver and
feeds them directly into /dev/random without the intervention of the
user-mode process, and it can simply be very conservative about how much
entropy credit it gives to the entropy count. But there should be a way
for a user-mode process to disable this and take control of the
post-RNG processing.
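The "feeding it into /dev/random with entropy credits" step is the
RNDADDENTROPY ioctl. A sketch of what that daemon's write path looks like
(the ioctl number below is the usual encoding of _IOW('R', 0x03, int[2]) --
verify it against your <linux/random.h> -- and the call needs root):

```python
import fcntl
import os
import struct

# RNDADDENTROPY takes a struct rand_pool_info:
#     int entropy_count;   /* bits of entropy being credited */
#     int buf_size;        /* bytes of data that follow */
#     __u32 buf[];
RNDADDENTROPY = 0x40085203   # _IOW('R', 0x03, int[2]); check your headers

def pack_rand_pool_info(data, entropy_bits):
    """Build the rand_pool_info blob; buf is padded to 32-bit words."""
    if len(data) % 4:
        data += b'\x00' * (4 - len(data) % 4)
    return struct.pack('ii', entropy_bits, len(data)) + data

def add_entropy(data, entropy_bits):
    """Mix `data` into the pool, crediting `entropy_bits` (needs root)."""
    fd = os.open('/dev/random', os.O_WRONLY)
    try:
        fcntl.ioctl(fd, RNDADDENTROPY,
                    pack_rand_pool_info(data, entropy_bits))
    finally:
        os.close(fd)
```

A plain write() to /dev/random, by contrast, mixes the data in but credits
no entropy at all -- which is exactly the conservative "simple mode"
behavior described above.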

                                                - Ted

P.S. If you look at the Jun and Kocher paper (thanks to Colin Plumb for
giving me a pointer to it):

        http://developer.intel.com/design/security/rng/CRIwp.htm

It's clear that there is a hardware whitener (a von Neumann bias
eliminator) to remove 0 vs 1 biases. There are hints that it's
possible to turn off the whitener, so that you get access to the raw
stream of bits from the RNG before any whitening is done.
Unfortunately, how to actually do this (probably some kind of debug
mode) doesn't seem to be published anywhere. If anyone knows how to do
this, please let me know. Ideally, if the software is going to be doing
real-time verification of the RNG's soundness, it should be doing so on
the pre-whitened data stream. As an example, the following string of
numbers is anything but random:

        1 2 3 4 5 6 7 8 9 10

However, if this is run through an MD5 or SHA whitener, the result would
*look* random, even though the source material is anything but. So you
really want to look for patterns and do any analysis on the raw data
stream.
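The point is easy to demonstrate: hash that counter sequence and the output
balances out statistically, even though the input carries almost no
entropy at all.

```python
import hashlib

raw = b'1 2 3 4 5 6 7 8 9 10'      # anything but random
digest = hashlib.md5(raw).digest()

# The digest *looks* fine at a glance -- roughly half its bits are
# ones -- but it has no more entropy than the counter that went in.
ones = sum(bin(byte).count('1') for byte in digest)
print(ones, '/', len(digest) * 8, 'bits set')
```

Any test run downstream of the whitener would happily pass this stream.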

So, if anyone can figure out (and tell me) how to turn off the 810's
hardware whitener circuits, that would be really useful. Thanks!!
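For reference, the von Neumann eliminator itself is tiny -- which is also
why the verification wants to happen before it, not after. A sketch:

```python
def von_neumann(bits):
    """Von Neumann bias eliminator: walk non-overlapping bit pairs,
    emit the first bit of each unequal pair (01 -> 0, 10 -> 1), and
    drop 00/11 pairs.  If the input bits are independent, the output
    is unbiased no matter how lopsided the input was -- and whatever
    structure the raw stream had is hidden from later analysis."""
    return [a for a, b in zip(bits[::2], bits[1::2]) if a != b]

print(von_neumann([0, 1, 1, 0, 1, 1, 0, 0]))   # -> [0, 1]
```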




This archive was generated by hypermail 2b29 : Mon Jan 31 2000 - 21:00:16 EST