Re: netscape's response
| Of course none of this reduces the magnitude of the screw up/bug/design
| flaw/whatever. I really can't say which of these it was since I wasn't
| around at the time that this code was being written. I must admit that
| the RNG seed code was not an area that I thought to examine when I took
| over our security library.
It isn't really easy. I guess you were around to see the pointer to
RFC 1750, "Randomness Recommendations for Security"?
| This was a bad mistake on our part, and we are working hard to fix it.
| We have been trying to identify sources of random bits on PCs, Macs, and
| all of the many unix platforms we support. We are looking at stuff that
| is system dependent, user dependent, hardware dependent, random external
| sources such as the network and the user. If anyone has specific
| suggestions I would love to hear them so that we can do a better job.
* I think you should use as much user-generated randomness as possible,
  like mouse movement patterns, interarrival times of user-interface
  events, etc.
* You can also gather statistics from the networking card, like number
of collisions, packets in/out, number of passing packets etc.
* Measuring the interarrival times of requests/responses from a remote
  server should also be a good one, I guess. It depends on the network
  in between, the actual processes executing on it, the scheduling
  algorithm, etc.
* And finally, mix in some sampling of the noise from the sound card.
* And try to reseed it, as often as possible and convenient. Make it
depend on the previous value of the random generator seed, somehow.
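Folding sources like these into one pool, with each reseed chained on the previous seed, might look like the following minimal sketch. This is my illustration, not Netscape's code; the source names and sample bytes are made up, and SHA-256 is a modern stand-in for whatever one-way hash you would actually use:

```python
import hashlib

class EntropyPool:
    """Toy entropy pool: hash together samples from many sources.

    Each reseed chains in the previous seed, so the state never
    depends on the most recent source alone.
    """

    def __init__(self):
        self._seed = b"\x00" * 32  # replaced as soon as samples arrive

    def add_sample(self, source_name, data):
        # Mix a raw sample (mouse deltas, packet timings, sound
        # card noise, ...) into the pool via a one-way hash.
        h = hashlib.sha256()
        h.update(self._seed)            # chain in the previous seed
        h.update(source_name.encode())
        h.update(data)
        self._seed = h.digest()

    def seed(self):
        # Never hand out the raw pool state; derive a value from it.
        return hashlib.sha256(b"output" + self._seed).digest()

pool = EntropyPool()
pool.add_sample("mouse", b"\x01\x7f\x03")      # movement deltas
pool.add_sample("net", b"\x00\x2a")            # packet interarrivals
pool.add_sample("audio", b"\x9c\x11\x05\xee")  # sound card noise
print(pool.seed().hex())
```

Because the hash chains, the order of samples matters and no single source can cancel out the others' contributions.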
The difficult part is to verify the quality of the random seeding and
reseeding. How does it behave on an unloaded system? Could someone put
your system under some strain, and hence force the random generator
to lock down into a small subspace or even onto a fixed value?
How independent are the values, anyway? And when you start to talk
about ergodicity etc., I'm lost anyway. :-)
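One rough check along these lines is to estimate the entropy of the samples you collect and watch whether it collapses when an attacker loads the system. A minimal sketch, assuming a naive Shannon estimate over interarrival times (the sample values and the "warning" threshold are illustrative, and a real test suite would do much more):

```python
import math
from collections import Counter

def shannon_entropy(samples):
    """Naive Shannon entropy estimate, in bits per sample.

    If strain on the system makes the timings lock onto a few
    values, this estimate drops toward zero -- a warning sign.
    """
    counts = Counter(samples)
    total = len(samples)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# Interarrival times (ms) on a lively system: spread-out values.
lively = [17, 43, 5, 92, 31, 64, 8, 77, 50, 23]
# The same measurement on a loaded, "jammed" system: stuck values.
jammed = [10, 10, 10, 11, 10, 10, 11, 10, 10, 10]

print(shannon_entropy(lively))  # close to log2(10), about 3.32 bits
print(shannon_entropy(jammed))  # well under 1 bit per sample
```

A plain Shannon estimate is optimistic (an attacker cares about the worst case, i.e. min-entropy), but even this crude check would catch a source locking onto a fixed value.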
I think it is important to bring together factors of the user _and_
the environment, preferably an environment that reaches as far from
the local site as possible. This makes "jamming" of the random seed
selection process harder.
The other problem in gathering random bits for a seed is that most
bits are visible to someone else close enough within your environment.
Interarrival times of packets are fine, but anyone nearby can observe
them with quite good accuracy. How do you escape the "local
environment problem"?
. - .
One wild idea that I just got was to have servers and clients exchange
random numbers (not seeds of course), in a kind of chaining way. Since
most viewers connect to a number of servers, and all servers are
connected to by many clients, they would mix "randomness sources" with
each other, making it impossible to observe the local environment
only. And the random values would of course be encrypted under the
session key, making it impossible to "watch the wire".
Problems:
* watch out for "multiply by zero" attacks by a rogue server/client.
* watch out for "almost singular values" in the same way.
* only let any one source contribute a limited amount of randomness,
  like (key length)/(average # of peers).
* never reveal your current seed, only a non-trivially derived random
value from it. (of course)
* make sure your initial seed is good enough, or the whole thing is
broken.
* perhaps save part of the previous session state into a protected
file, to be able to keep up the quality of the initial seed.
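Put together, the exchange with the per-source cap might look something like this sketch. The cap formula follows the bullet above; everything else (the constants, the peer IDs, the use of SHA-256) is my assumption, not a worked protocol:

```python
import hashlib

KEY_BITS = 128
AVG_PEERS = 8
CAP_BYTES = (KEY_BITS // AVG_PEERS) // 8  # (key length)/(avg # of peers)

class PeerMixer:
    def __init__(self, initial_seed):
        # The whole scheme stands or falls with this initial seed.
        self._seed = initial_seed
        self._contributed = {}  # bytes accepted per peer so far

    def mix_from_peer(self, peer_id, value):
        # Cap what any single peer may contribute, so a rogue
        # server/client cannot dominate the pool. Mixing by hashing
        # (rather than multiplying or XORing raw values) also blunts
        # "multiply by zero" / degenerate-value attacks.
        used = self._contributed.get(peer_id, 0)
        take = value[: max(0, CAP_BYTES - used)]
        if not take:
            return
        self._contributed[peer_id] = used + len(take)
        self._seed = hashlib.sha256(
            self._seed + peer_id.encode() + take
        ).digest()

    def value_for_peer(self):
        # Never reveal the seed itself; send a derived value only.
        return hashlib.sha256(b"share" + self._seed).digest()

mixer = PeerMixer(b"\x17" * 32)  # initial seed: must itself be good!
mixer.mix_from_peer("server.example", b"\x42\x99\x03")
print(mixer.value_for_peer().hex())
```

The derived `value_for_peer()` is what travels (encrypted) over the wire; the internal seed never leaves the machine, and saving a hash of it to a protected file would cover the last bullet.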
I think I like it, perhaps not so much from a practical point of view
as for the 'non-attackability' of it. It's quite cypherpunk-ish.
But I bet someone has already done this a long time ago. My usual
luck! :-(
If not, I want a 'I saved Netscape!' t-shirt from you, Jeff!
/Christian
PS. I'm a Swede, I don't know if I'm allowed to reveal these state
secrets. So please shut your eyes, ok?