
Re: DVD legal maneuvers (fwd)

...
> > In a society based on trust, respect and mutual agreement, anti reverse
> > engineering provisions are entirely appropriate and enforceable. 
> 
> Anti-reverse engineering provisions would indeed be enforceable in a society in
> which one can enforce agreements that restrict that party's ability to
> privately manipulate and then reveal information, but there is a problem --
> that kind of society can't have the anonymity-based protection of freedom of
> speech for which you, O illustrious [email protected], can thank
> cryptography and related disciplines -- at least, it could not have that
> protection in conjunction with the protection of privacy that those same fields
> offer you.
> 
> ----- End of forwarded message from Randall Farmer -----
> 
> Why does trust, respect, and mutual agreement eliminate anonymity?

Those things do not eliminate anonymity and privacy tools; in Anonymous's ideal
society, s/he says, "anti reverse engineering provisions are
entirely...enforceable," and the enforceability of the provisions is the
characteristic with which our anonymity and privacy tools cannot coexist. Lose
the enforceability of anti-reverse-engineering provisions; keep or scrap the
basis in trust, respect, and mutual agreement; and Mix-style anonymity and
cryptographically-ensured privacy may exist. (For some justification, read the
original post and the text below.)

> 
> The reality is that in such a society you wouldn't need to reverse engineer
> in the first place. So your point is incorrect as worded

Under your interpretation and logic, yes -- but your interpretation is not the
only one, not the one that uncovers the intent behind the words, and arguably
not the one that yields the most practically useful results.

> and moot within the sup[p]osed context.

Anonymous's point as quoted at the top of this message is moot within the
directly stated context as well -- s/he says that anti-reverse-engineering
provisions could be enforceable in the society based on trust, etc., but the
provisions' enforceability or unenforceability would be moot because, as you
note, reverse-engineering would be unnecessary.

However, both of our points apply once the reader extends the arguments from
Anonymous's ideal to the more practically attainable states[1]. (It's
reasonable to try to make that extension as a reader, or to expect the reader
to make that extension as a poster like Anonymous or me, since the attainable
states, not the Utopias, are of practical interest to most readers.) In those
attainable states that lie between our sorry present state and Anonymous's
ideal, trust would not be absolute. So, I suspect Anonymous expected the reader
not only to see that anti-reverse-engineering provisions would be enforceable
(if gratuitous) in a society based absolutely on trust, respect, and mutual
agreement, but also to infer that they would be enforceable in an attainable
society with a high degree of trust but still enough mistrust that the
enforceability of anti-reverse-engineering provisions would be important.
(There are other possibilities, of course; one is that Anonymous assumes that
even in the society based absolutely on trust, respect, and mutual agreement,
enough mistrust would exist that anti-reverse-engineering provisions would be
necessary and that, as a result, their enforceability would be important.)

[1] I use "state" above to mean "situation" or "state of the world," not
"nation" or "government."


Part of my argument was that anti-reverse-engineering provisions would not be
enforceable in a society if that society had anonymity and privacy protected by
cryptography and related fields[2]. (This is logically equivalent to the
statement "a society would not have anonymity and privacy protected by
cryptography and related fields if anti-reverse-engineering provisions were
enforceable in that society," which is more like the way I've stated my claim
other times.) I did not compose my argument to be applied only to the
hypothetical situation. As I suspect Anonymous did, I hoped the reader would
apply the argument to something of practical importance -- an attainable
state.

[2] Some logical paperwork: if everyone in the society is trustworthy -- i.e.,
in a Utopian sort of world -- the provisions are as unenforceable as ever in
the presence of strong privacy/anonymity, even if the trustworthiness means
that no failed attempt to enforce them need be made.


A reader approaching the arguments with an eye for the practical can walk away
with a usable conclusion; a literal interpretation and narrow application of
the logic may produce different results -- and, as always, your mileage may
vary.