
steganography and delayed release of keys (Re: Eternity Services)

Tim May <[email protected]> writes:
> News spool services are already showing signs of getting into this "Usenet
> censorship" business in a bigger way. Some news spool services honor
> cancellations (and some don't).  Some don't carry the "sensitive"
> newsgroups. And so on. Nothing in their setup really exempts them from
> child porn prosecutions--no more so than a bookstore or video store is
> exempted, as the various busts of bookstores and whatnot show, including
> the "Tin Drum" video rental case in Oklahoma City.

One tactic which could protect a USENET newsgroup operator from child
porn prosecutions is for him to have no practical way to recognize such
material until after it has been distributed to downstream sites.
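One way to get that property is to hide the material in innocuous cover data. A minimal least-significant-bit sketch, operating on a raw byte buffer (a real system would target image formats posted to alt.binaries.*; the buffer here just stands in for pixel data, and the function names are mine, not from the original scheme):

```python
import os

def embed(cover, payload):
    """Hide each bit of `payload` in the low bit of one cover byte."""
    cover = bytearray(cover)
    # LSB-first bit stream of the payload.
    bits = [(byte >> i) & 1 for byte in payload for i in range(8)]
    assert len(bits) <= len(cover), "cover too small for payload"
    for pos, bit in enumerate(bits):
        cover[pos] = (cover[pos] & 0xFE) | bit
    return bytes(cover)

def extract(stego, length):
    """Read back `length` bytes from the low bits of `stego`."""
    out = bytearray()
    for i in range(length):
        byte = 0
        for j in range(8):
            byte |= (stego[i * 8 + j] & 1) << j
        out.append(byte)
    return bytes(out)

cover = os.urandom(1024)        # stand-in for image pixel data
stego = embed(cover, b"hidden")
assert extract(stego, 6) == b"hidden"
```

The stego image differs from the cover only in low-order bits, so a spool operator forwarding it has nothing obvious to recognize.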

Using steganography, we could for example adopt a strategy such as:

1) Cross-post, and/or post to random newsgroups

2) Threshold secret split your posts so that only N of M are required
   to reconstruct.

3) Steganographically encode the eternity traffic.  Pornographic images
   in alt.binaries.* would be suitable because there are lots of them

4) Encrypt the original steganographically encoded posting (encrypt
   the eternity document and hide it inside the image file posted)

5) Post the decryption key a day or two later to ensure we get the full
   feed before a censor can recognize the traffic
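Step 2) above can be sketched with Shamir's threshold scheme over a prime field; the prime, share counts, and function names below are illustrative choices of mine, not part of the original proposal:

```python
import random

PRIME = 2**127 - 1  # a Mersenne prime, large enough for a 16-byte secret

def split(secret, n, m):
    """Split `secret` (an int < PRIME) into m shares, any n of which
    suffice to reconstruct it."""
    # Random polynomial of degree n-1 with the secret as constant term.
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(n - 1)]
    shares = []
    for x in range(1, m + 1):
        y = sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
        shares.append((x, y))
    return shares

def reconstruct(shares):
    """Recover the secret by Lagrange interpolation at x = 0."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * -xj) % PRIME
                den = (den * (xi - xj)) % PRIME
        # pow(den, PRIME - 2, PRIME) is den's inverse mod the prime.
        secret = (secret + yi * num * pow(den, PRIME - 2, PRIME)) % PRIME
    return secret

shares = split(123456789, 3, 5)       # 3-of-5 split
assert reconstruct(shares[:3]) == 123456789
assert reconstruct(shares[2:]) == 123456789
```

Each share would go out as a separate stego post, so losing M - N of them to patchy propagation or cancels costs nothing.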

The attacker is now forced to delay USENET posts until the key is
posted if he wishes to censor eternity articles.
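Steps 4) and 5) can be sketched as encrypt-now, publish-the-key-later. The keystream construction below (SHA-256 in counter mode) is a toy cipher I am using for illustration, not a cipher specified in the proposal:

```python
import hashlib, os

def keystream_xor(key, data):
    """XOR `data` with a SHA-256-in-counter-mode keystream (toy cipher;
    XORing twice with the same key decrypts)."""
    out = bytearray()
    for block in range(0, len(data), 32):
        pad = hashlib.sha256(key + block.to_bytes(8, 'big')).digest()
        chunk = data[block:block + 32]
        out.extend(b ^ p for b, p in zip(chunk, pad))
    return bytes(out)

key = os.urandom(32)
document = b"eternity document contents"

# Day 0: only the ciphertext is steganographically posted; a censor
# scanning the feed sees nothing recognizable.
ciphertext = keystream_xor(key, document)

# Day 2: the key is posted; by now the posts have already propagated
# to downstream sites, so decryption comes too late to censor.
assert keystream_xor(key, ciphertext) == document
```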

Measures 1) and 2) address the problem of newsgroups not being
carried everywhere.  2) also improves reliability, as distribution can
be patchy and only N of the M posts need to arrive.

Cancellations can be discouraged by liberal abuse of cancellation
forgeries, which Dimitri Vulis aided greatly by providing easy-to-use
cancelbot software.

A worrying trend is the use of NoCeMs to filter whole news feeds,
whereas the NoCeM rating system as I understood it was designed for
third-party ratings applied by individuals.  NoCeMs could become a
negative if used in this way, because news admins may use them as a
tool to censor large parts of the USENET distribution, in too
centralised a fashion.

> >The solution I am using is to keep reposting articles via remailers.
> >Have agents which you pay to repost.  This presents the illusion of
> This of course doesn't scale at all well. It is semi-OK for a tiny, sparse
> set of reposted items, but fails utterly for larger database sets. (If and
> when Adam's reposted volumes begin to get significant, he will be viewed as
> a spammer. :-) )

The best criticism of my eternity design to date!  I agree.

But this limitation is difficult to avoid while retaining this level of
availability.  Trade-offs improving efficiency will tend to move away
from an existing widespread broadcast medium (USENET) towards
specialised protocols and pull technology (the web hosting model),
leading to identifiable machines serving materials.

We can probably arrange that these servers do not know what they are
serving; however, if the whole protocol is set up specifically for the
purpose of building an eternity service, it will be shut down.

Longer term, perhaps something could be achieved by slowly building up
to larger numbers of servers, but this does not seem such a
main-stream service that it would be easy to get this degree of
deployment.

That is to say, this problem is more than designing protocols which
would be resilient _if_ they were installed on 10,000 servers around
the world; the problem is as much to do with coming up with a
plausible plan to deploy those servers.