
how to release code if the programmer is a target for coercion




Tim May brought this issue up recently -- if someone develops a greatest-
thing-since-sliced-bread Eternity package and then releases it, it's pretty
likely they will eventually be approached by someone (MI6/Mossad/CIA/KCIA/etc.).

What's likely to happen?

Certainly they could kill you.  They could make it look like random street
crime, or an accident, or kill you with £16,000 in your pocket just to make
it clear what their reasons were (Gerald Bull, Mossad, Brussels, .32 ACP).
At best, you could force them to make it very obvious it was murder, by
having paranoid levels of security, but they could still kill you.  If they
kill the programmer (s/programmer/programmers/, or just enough of them that
the project becomes disorganized, etc.), the worst that does is keep new
releases of the software from coming out -- at least until another group
starts working on things, assuming the protocols and/or source are public.
Perhaps that new group would be a front for the NSA, though.  A risk.

More likely, they'd try to coerce you.  This could include threats of death,
which are best responded to by ignoring them, since they don't gain anything
if they kill you.  Or torture, which is equally ineffective if they end up
killing you.  Or slander/etc. to try to discredit you (unlikely to work, at
least among cypherpunks, unless it is backed by technical attacks as well).

Most likely, they would try to buy you.  This could be by outright offering
money for back doors, which would be great if it worked, but is unlikely to
happen in the first place.  If offered a bribe, you could go public with
that fact (preferably after taking the money :), and worst case publicly
distance yourself from the project.  More likely, they could try to infiltrate
the technical team covertly, or ask for features which happen to be 
non-public security compromises, perhaps under the guise of security 
improvements (this presumes they are beyond the state of the art).


Worst case, one can say that if the intelligence community is far beyond
the state of the art, you have no chance of protecting against covertly
modified source code, modified to break in a subtle way which is not known
to the general community.  However, the world is already in this position --
if they have solved the discrete logarithm problem (DLP), or even know of
subtle bugs in a processor, or whatever, existing tools would already be
weak.

Assuming the intelligence community is not so advanced that thorough
inspection by the world at large would fail to reveal back doors or
weaknesses, the solution is code review by the public.  It was brought up
that verifying PGP was just about at the limit of the world's capability.
I agree that it is certainly beyond the ability of one person, or even a
small group, to verify -- plus, you want code verified by enough
non-interacting entities that it is unlikely they can all be bought.
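
As a rough illustration of the "enough non-interacting entities" idea,
here's a sketch (mine, not anyone's actual tooling) that refuses to trust
a release tarball unless a quorum of distinct reviewers have produced
valid detached PGP signatures over it.  The file names, directory layout,
and quorum value are made up for illustration; it just shells out to
stock gpg.

    #!/usr/bin/env python
    # Sketch: require a quorum of independent reviewers' detached
    # signatures over a release tarball before trusting it.  Assumes one
    # signature per reviewer named <reviewer>.sig in sigs/; all names and
    # the quorum value are illustrative only.

    import subprocess
    import sys
    from pathlib import Path

    RELEASE = "eternity-0.1.tar.gz"     # hypothetical release file
    SIG_DIR = Path("sigs")              # one detached signature per reviewer
    QUORUM = 5                          # distinct signers we demand

    def signature_is_valid(sig: Path, data: str) -> bool:
        """True if gpg accepts `sig` as a good detached signature over `data`."""
        result = subprocess.run(
            ["gpg", "--verify", str(sig), data],
            capture_output=True,
        )
        return result.returncode == 0

    # Simplification: we treat the signature file name as the signer's
    # identity.  Real code would parse the signing key fingerprint out of
    # gpg's output instead, so one key can't masquerade as several people.
    valid_signers = {
        sig.stem
        for sig in SIG_DIR.glob("*.sig")
        if signature_is_valid(sig, RELEASE)
    }

    print(f"{len(valid_signers)} independent valid signatures: {sorted(valid_signers)}")
    if len(valid_signers) < QUORUM:
        sys.exit(f"refusing to trust {RELEASE}: need at least {QUORUM} signers")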

I was looking around for a solution to this -- Lenny Foner at the MIT
Media Lab has something for his agents project which might be a solution:
a system by which sections of source code are verified and signed by
individuals, other sections are verified by others, etc.  Then, during
compilation, you could say something like "all of the security-critical
code must be verified by at least one of these 12 people, everything else
by at least 5 people per section from this list of distinct people".  It
would then show where the "threadbare" sections are, and you could
optionally verify them yourself, pay someone else to verify them, or post
to cypherpunks asking for help.
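
To make that compile-time policy concrete, here's a minimal sketch of the
kind of check described above.  This is not Foner's actual system -- the
manifest format, reviewer lists, and thresholds are all invented for
illustration, and it assumes the signatures over each file have already
been verified, so we are only checking coverage.

    # Sketch of a build-time review-coverage check.  All names and
    # thresholds are made up; "reviewed" stands in for a manifest of
    # already-verified signatures.

    CORE_REVIEWERS = {"alice", "bob", "carol"}     # the "12 people" list, shortened
    MIN_GENERAL_REVIEWERS = 5                      # everything else: 5 distinct people

    SECURITY_CRITICAL = {"rng.c", "keymgmt.c"}     # files needing core sign-off

    # reviewed: file -> set of reviewers with verified signatures over it
    reviewed = {
        "rng.c":     {"alice", "dave"},
        "keymgmt.c": {"eve"},
        "ui.c":      {"dave", "eve", "frank", "grace", "heidi"},
    }

    def threadbare(reviewed):
        """Return the files whose review coverage fails the policy."""
        bad = []
        for path, signers in reviewed.items():
            if path in SECURITY_CRITICAL:
                if not (signers & CORE_REVIEWERS):
                    bad.append((path, "no core reviewer sign-off"))
            elif len(signers) < MIN_GENERAL_REVIEWERS:
                bad.append((path, f"only {len(signers)} reviewers"))
        return bad

    for path, reason in threadbare(reviewed):
        print(f"threadbare: {path}: {reason}")

A build could refuse to proceed on any "threadbare" output, or just flag
those sections so you can review them yourself or farm them out.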

Security-critical code should be encapsulated fairly well anyway -- I'd
feel a lot worse about having a bug in such a part of my code than about
a screen-refresh bug.  One could potentially limit unintended side effects
by running the security code on its own processor, in its own sandbox, or
in a separate instance of the JVM.
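
For the isolation idea, here's a toy sketch of keeping the security-
critical piece in its own OS process behind a narrow pipe, so a memory-
scribbling bug in the untrusted code can't reach the key material.  The
key handling and message format are placeholders, not a real design.

    # Toy sketch only: the security-critical "signer" lives in its own OS
    # process and is reachable only over a pipe.

    import hmac, hashlib
    from multiprocessing import Process, Pipe

    def signer(conn):
        # Real code would load or generate the key *inside* this process,
        # so it never exists in the untrusted process's address space.
        key = b"demo-key-not-for-real-use"
        while True:
            msg = conn.recv()
            if msg is None:              # shutdown request
                break
            conn.send(hmac.new(key, msg, hashlib.sha256).hexdigest())
        conn.close()

    if __name__ == "__main__":
        untrusted_end, signer_end = Pipe()
        p = Process(target=signer, args=(signer_end,))
        p.start()

        # Untrusted side: can ask for MACs, never sees the key.
        untrusted_end.send(b"release announcement")
        print("tag:", untrusted_end.recv())

        untrusted_end.send(None)         # tell the signer to exit
        p.join()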

I think technical means can solve both the problem of source code review
and the problem of intelligence-community rubber-hose tactics.  (Preferably
they'll still be handing out free money anyway, though.)

-- 
Ryan Lackey
[email protected]
http://mit.edu/rdl/