Re: New directions in anonymity (needed)
Mike Ingle wrote:
>
> There has been a lot of talk about anonymity and the best ways of providing
> it recently. Anonymity now is about where cryptography was before Diffie and
> Hellman's paper, or untraceable transactions before Chaum. To use it, you
> have to trust someone, or at least a group of people.
>
> All of our anonymous systems boil down to only two techniques: indirection
> and broadcast. Indirection is sending a message through one or more
> intermediate nodes to conceal its point of origin. Broadcast is sending a
> message to multiple recipients to conceal the intended recipient.
First, I think it important to clearly distinguish between "sender
anonymity," where the physical identity or sending site is hidden, and
"receiver anonymity," where the same is true of the receiving site.
Chaumian digital mixes--what you Americans call "remailers"--mainly
solve the sender anonymity problem. Message pools, or broadcast to a
group or site that includes the receiver, mainly deals with receiver
anonymity. The combination of the two deals with both.
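To make the mix/remailer side concrete, here's a rough sketch in
Python--purely illustrative, with a made-up "encrypt" standing in for
PGP or whatever cipher the remailers actually use, and made-up
addresses--of a sender wrapping a message in nested layers for a chain
of remailers, so that no single remailer sees both the origin and the
final destination:

# Purely illustrative sketch; "encrypt" is a stand-in for a real
# public-key cipher, and the addresses and keys below are invented.

def encrypt(public_key, payload):
    # placeholder for real public-key encryption
    return ("sealed-for-" + public_key, payload)

def wrap_for_chain(message, final_address, remailers):
    # remailers: list of (address, public_key), first hop first
    payload = (final_address, message)
    # wrap from the last hop outward, so each remailer peels one layer,
    # learning only the next address and nothing about the origin
    for address, pubkey in reversed(remailers):
        payload = (address, encrypt(pubkey, payload))
    first_hop, sealed = payload
    return first_hop, sealed          # send 'sealed' to 'first_hop'

chain = [("remailer1@siteA", "keyA"), ("remailer2@siteB", "keyB")]
print(wrap_for_chain("meet at noon", "recipient@siteC", chain))

The sender hands the sealed blob to the first hop only; each remailer
strips one layer and learns nothing but the next address.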
Both are solved elegantly with the Dining Cryptographers Protocol,
about which much is written on this list every few months. Messages
are "sent" in an Ouija-board fashion and received by the person who
can successfully decrypt a public message sent over the system.
The process can be made information-theoretically secure, with
one-time pads used instead of ciphers. But, as Mike Ingle points out
here, the process can be compromised if _enough collusion_ exists. Ah,
but so can all known systems!
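For the curious, a toy one-round DC-Net among three participants can be
sketched in a few lines of Python. This is only an illustration of the
XOR trick--real DC-Nets need shared coin flips, a reservation scheme to
handle collisions, and so on:

import os

# Toy one-round DC-Net, three participants, purely illustrative.
# Each pair shares a secret pad; each participant announces the XOR of
# the pads it shares, and whoever wants to "send" also XORs in the
# message. The XOR of all three announcements is the message.

def xor(a, b):
    return bytes(x ^ y for x, y in zip(a, b))

msg = b"anonymous hello "        # the message one participant will send
pad_ab = os.urandom(len(msg))    # shared secret between A and B
pad_bc = os.urandom(len(msg))    # shared secret between B and C
pad_ca = os.urandom(len(msg))    # shared secret between C and A

announce_a = xor(xor(pad_ab, pad_ca), msg)  # A is the (hidden) sender
announce_b = xor(pad_ab, pad_bc)
announce_c = xor(pad_bc, pad_ca)

recovered = xor(xor(announce_a, announce_b), announce_c)
assert recovered == msg          # the pads cancel; only the message remains

The pads cancel, the message pops out, and nothing in the three
announcements says which participant supplied it. Collusion of the
other two unmasks the sender, of course--exactly the collusion caveat
above.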
Digital money is not untraceable if you're the only person in the
universe (or in _your_ universe...same difference) who is using it, or
if the banks, shops, and other customers are in collusion, leaving you
as the one and only user. (Proof: The banks and shops are not actually
using digital cash, leaving the "mark"--pun intended--as the only
person merrily spending digital cash. Ergo, all purchases made with
digital cash are traceable to this mark. I can imagine contrived
situations in the early days of digital cash when this kind of set-up
occurs.)
My intuition is that any cryptographic situation in which an
individual/entity can be "isolated" so that all other individuals and
entities, and all connections, can be compromised (added to the set of
colluders) will be "broken." How else could it be?
Anyway, I think DC Nets are every bit as secure as e-cash is.
Obviously the security depends on a bunch of things, but the concept
of anonymous communication is not weaker in any fundamental way than
is digital cash. Full collusion by everyone who is not "you" causes
big problems with any cryptographic system that I know of.
(Stuff about reply blocks elided. I agree that more work is needed to
understand attacks on reply block methods. However, I'm not at all
convinced that "tracking back" is any different from the conventional
approach of demanding input-output mappings at each level, by
subpoenaing the remailer's records, etc. But I'll think about this
some more and maybe comment at another time.)
> Broadcast is exactly as secure as it is nonscalable. If you broadcast to 100
> people, an attacker's uncertainty is one in 100. The security grows linearly
> with the overall bandwidth. For cryptographic-level security, it would need
> to grow exponentially with bandwidth.
This gets to the important point of _proving_ vs. _suspecting_. To
make this concrete, consider that 10,000 people may download the
entire contents of, say, "alt.anonymous," a message pool newsgroup
created largely for the publication of anonymous messages.
The Internetpol may _suspect_ that one of these 10,000 readers is the
person for whom the message is intended, but cannot _prove_ it. Even
if the list of readers is down around 100, I know of no legal systems
that would accept this as proof. (There are many issues here, and many
more detailed scenarios. For example, over the course of time, as
readership changes but some readers remain readers, patterns may
emerge from the noise. And there are issues of bandwidth. I am not too
worried about running out of bandwidth soon, since an awful lot of text
messages can fit into just a single one of the MPEG or JPEG images
that flood the *.pictures newsgroups...and these *.pictures
newsgroups are sent to and locally stored on tens of thousands of
machines....)
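To make the message pool idea concrete, here's a toy Python sketch. The
"seal" and "open_" routines are stand-ins for a real public-key cipher,
and the keys are toys (a key here "matches" only itself); the point is
that every reader downloads the whole pool and tries every message, so
an outside observer learns only the set of readers:

# Toy message pool, purely illustrative; seal/open_ stand in for a
# real public-key cipher, and the keys and messages are invented.

def seal(recipient_key, text):
    return (recipient_key, text)

def open_(my_key, sealed):
    recipient_key, text = sealed
    return text if recipient_key == my_key else None

pool = [
    seal("key-bob",   "package is in the usual place"),
    seal("key-carol", "abort the meeting"),
    seal("key-dave",  "nothing to report"),
]

my_key = "key-carol"
for item in pool:                  # every reader scans the whole pool
    text = open_(my_key, item)
    if text is not None:
        print("message for me:", text)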
> Anonymity needs something fundamentally new, something comparable to public
> key for cryptography or blind signatures for digital cash. Suppose a server
I think you ought to carefully look at Chaum's work on Dining
Cryptographers. It does all this. (It ain't perfect, and it ain't been
implemented in practical terms, a la a "Pretty Good Dining
Cryptographers," but it's at least as basic a concept as the other
things are....some might say that all are variations on the same theme.)
> has a large file. A message comes in and is combined into this file. Another
> message comes in with a key to retrieve data. The server processes the
> retrieval key against this large file and comes up with an output, which it
> returns to the person doing the retrieving. This output contains the input
> message, transformed in such a way that even the server cannot match it
> to the input that produced it. This is what we need.
But how is this any less "suspicious" in terms of pointing to the
receiver than is the more general and easier process of simply
retrieving the file from the server? Why would doing some sort of
computation on the distant server make the act of access any less
incriminating? If 100 people all retrieve the file, they're the
"suspects," despite nothing being provable. And if those 100 people
first tell the server to do some particular transformation, they're
still equally suspect. (Unless they tell the server to use a chain of
remailers to do the retransmission to them, which is different from my
understanding here, and is in fact the "anonymous anonymous ftp" we've
long talked of.) And if fewer than the 100 people actually send the
retrieval message, then of course the list of suspects is even smaller.
(The advantages of passive, do-nothing broadcast are lost, a net loss.)
> It would require three keys: an encryption key, a selection key, and a
> decryption key. The sender uses the recipient's encryption key to encrypt
[stuff elided]
Be careful also not to assume the server is trustable, as I gather it
is presumed to be here. (Doing the transformations, and all.)
> A method like this would permit anonymity without trust, much as public key
> allows secrecy without trust and digicash allows transaction privacy without
> trust. Is there any way to do it? There are functions like the Fourier
> transform that can distribute data over a large file, but the inverse
> function gives you back the same data. We need the not-so-inverse function
> to give back a different piece of data, but one that can be converted, by a
> private key, into the original, which the server never sees.
I expect broadcast models, or anonymous anonymous ftp models, to be
more sophisticated than simple "publish everything in alt.anonymous"
strategies. That is, multiple routes, multiple places, hierarchies,
etc. All involve trade-offs in time, space, security, cost,
convenience.
Some methods may involve splitting files into multiple pieces for
independent retrieval. However, I am not really convinced that the
alleged problems are solved this way.
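One splitting method, purely as a back-of-the-envelope sketch of my own
and not anything Mike has proposed: XOR a file into n "shares" that can
be fetched independently (perhaps from different servers). No single
share says anything by itself; only someone holding all of them can
reconstruct the file:

import os

# Toy XOR-share splitting, my own illustration, not Mike's scheme.

def split(data, n):
    shares = [os.urandom(len(data)) for _ in range(n - 1)]
    last = data
    for s in shares:                       # fold the random shares in
        last = bytes(a ^ b for a, b in zip(last, s))
    return shares + [last]                 # n pieces, each alone is noise

def join(shares):
    out = bytes(len(shares[0]))
    for s in shares:
        out = bytes(a ^ b for a, b in zip(out, s))
    return out

pieces = split(b"the secret file contents", 3)
assert join(pieces) == b"the secret file contents"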
The point about "which the server never sees" is confusing to me.
Surely the server _could_ see the transformed data if he wanted to, as
he was the one who did the remote transformation (as I understand
Mike's scheme). (If the scheme involves a server which can do
transformations but not see the output, then this is a much harder
problem, dealing with models of computation, secure hardware, etc.)
(The idea of a sealed box which executes a model in which no records
are kept, etc., is of course Chaum's original 1981 concept of a
digital mix. No "key" need be sent by the recipient if in fact the box
executes the proper mix function, keeps no records, is not
compromised, etc. If Mike is suggesting that his "server" behave more
like a true digital mix and thus not be as dependent on human
trust, then of course this is a good idea. We need more mix-like
boxes, for our remailings. This has nothing to do with the broadcast
model per se, dealing as it does with the general mix issue.)
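For reference, the mix behavior I mean is roughly the following--a toy
Python sketch of the Chaum 1981 idea, with "decrypt_layer" standing in
for the mix's private-key operation. Collect a batch, strip one layer,
reorder, flush, and keep no records:

import random

# Toy mix node, purely illustrative; decrypt_layer is a placeholder for
# the mix's private-key decryption, and a real mix would also pad
# messages to uniform size, drop duplicates, etc.

def decrypt_layer(mix_private_key, envelope):
    # placeholder: pretend the layer is already off, exposing
    # (next_address, inner_payload)
    next_address, inner_payload = envelope
    return next_address, inner_payload

def flush_batch(mix_private_key, batch):
    out = [decrypt_layer(mix_private_key, env) for env in batch]
    random.shuffle(out)            # destroy the arrival ordering
    return out                     # forward each item; keep no in/out log

incoming = [("alice@siteA", "blob-1"), ("bob@siteB", "blob-2"),
            ("carol@siteC", "blob-3")]
for next_address, payload in flush_batch("mix-private-key", incoming):
    print("forward", payload, "to", next_address)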
In any case, what difference does it make if the server sees the data
or file? He surely can track his internal CPU processes and know which
incoming file was sent out, and to which address! (Just the remailer
in-out mapping problem again, which has always been unconcerned with
the internal contents of files.)
Sorry for the length here.
--Tim May
--
..........................................................................
Timothy C. May | Crypto Anarchy: encryption, digital money,
[email protected] | anonymous networks, digital pseudonyms, zero
| knowledge, reputations, information markets,
W.A.S.T.E.: Aptos, CA | black markets, collapse of governments.
Higher Power: 2^859433 | Public Key: PGP and MailSafe available.
Cypherpunks list: [email protected] with body message of only:
subscribe cypherpunks. FAQ available at ftp.netcom.com in pub/tc/tcmay