
What is the maximum # of bits a key could ever need to be?

This occurred to me this morning in the inner sanctum of inspired cogitation,
aka the shower.  "What is the maximum # of bits a public key would ever need
to be, given no breakthroughs in factoring?"

I came up with an answer, but it depends on some numbers that I don't have
handy; perhaps other people on the list can fill in the blanks.

First, we need an equation that tells us how difficult it is, in # of operations,
to factor a number of N bits, e.g. N_ops(N) = # of operations it will take.
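
For concreteness, here is a rough Python sketch of one candidate N_ops(N),
assuming the general number field sieve (GNFS) is the best factoring algorithm
we know of; it drops the o(1) term and all constant factors, so treat the
output as an order-of-magnitude guess, nothing more.

import math

def n_ops(bits):
    """Rough GNFS operation count for factoring a number of `bits` bits."""
    ln_n = bits * math.log(2)                # natural log of the modulus
    c = (64.0 / 9.0) ** (1.0 / 3.0)          # GNFS constant, roughly 1.923
    return math.exp(c * ln_n ** (1.0 / 3.0) * math.log(ln_n) ** (2.0 / 3.0))

print(f"1024-bit: about 2^{math.log2(n_ops(1024)):.0f} ops")
print(f"2048-bit: about 2^{math.log2(n_ops(2048)):.0f} ops")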

Then all we need to do is find the smallest N for which N_ops(N) is greater than

U_Duration * U_Particles * (1 / P_Time)

Where U_Duration is the expected duration of the universe, U_Particles is the
number of particles in the universe (I am assuming that every particle can
be used as a processor; the programming I leave as an exercise to the
alert reader), and P_Time is the Planck time (damned if I can remember it)
in seconds, whose reciprocal ought to be a good upper bound for the clock
speed of the Universal CPU.
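
And here is a sketch that plugs some numbers into that bound.  Every input is
an assumption on my part: roughly 1e80 particles in the observable universe, a
Planck time of about 5.39e-44 seconds, and a frankly arbitrary 1e100 seconds
for U_Duration (substitute your favorite cosmology).  The cost model is the
same rough GNFS estimate as above, and the comparison is done in log space so
nothing overflows.

import math

U_DURATION = 1e100      # assumed lifetime of the universe, in seconds (arbitrary)
U_PARTICLES = 1e80      # commonly quoted particle count for the observable universe
P_TIME = 5.39e-44       # Planck time in seconds; one op per particle per tick

# Total operations the Universal CPU could ever perform, as a natural log.
ln_total_ops = math.log(U_DURATION) + math.log(U_PARTICLES) - math.log(P_TIME)

def ln_n_ops(bits):
    """Natural log of the rough GNFS cost for a `bits`-bit modulus."""
    ln_n = bits * math.log(2)
    c = (64.0 / 9.0) ** (1.0 / 3.0)
    return c * ln_n ** (1.0 / 3.0) * math.log(ln_n) ** (2.0 / 3.0)

# Walk upward until factoring costs more operations than the universe can do.
bits = 512
while ln_n_ops(bits) <= ln_total_ops:
    bits += 1

print(f"Universal budget: about 2^{ln_total_ops / math.log(2):.0f} operations")
print(f"Smallest key that outlasts it, under these assumptions: {bits} bits")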

A most likely useless number, but it would be interesting to know what it
comes out to.

Best,
R