Toolkits, Bugs, and Interfaces
Ten days ago Tim May ended a post on Toolkits:
"For digital money to succeed, there had better not be flaws and
loopholes that allow attackers to drain your money away or to cause
confusion and doubt amongst your customers!..."
I think near certainty of correct function is needed for all
cryptographic software to find acceptance with the general public.
Of the aspects needed, algorithmic correctness has received the
most attention here thus far. I want to second Tim's call for a
Toolkit, particularly in relation to two other needs: a facile user
interface and freedom from bugs. These are necessary so that
when Alice Anyone feels the need for crypto, she can get software
that is easy to use, prevents foolish misuse, and is free of both
bugs and weaknesses open to attack.
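To make "prevents foolish misuse" concrete, here is the sort of
discipline I have in mind, sketched in C. The names, error codes,
and the toy cipher are my own illustration, not from any existing
toolkit; the point is that the routine refuses a bad call outright
instead of limping along:

    #include <stddef.h>

    /* Illustrative error codes -- invented for this sketch.   */
    typedef enum {
        CRYPT_OK = 0,
        CRYPT_ERR_BADARG,   /* caller passed a null pointer    */
        CRYPT_ERR_WEAKKEY   /* key too short to be worth using */
    } crypt_status;

    #define MIN_KEY_BYTES 16    /* refuse keys under 128 bits  */

    /* Stand-in for a real cipher; only the checks matter.     */
    static void toy_cipher(unsigned char *buf, size_t len,
                           const unsigned char *key, size_t keylen)
    {
        size_t i;
        for (i = 0; i < len; i++)
            buf[i] ^= key[i % keylen];  /* NOT a real cipher   */
    }

    /*
     * Encrypt in place, but refuse foolish calls outright: no
     * null pointers, no weak keys.  The caller gets an explicit
     * status instead of a silent half-success.
     */
    crypt_status safe_encrypt(unsigned char *buf, size_t len,
                              const unsigned char *key, size_t keylen)
    {
        if (buf == NULL || key == NULL)
            return CRYPT_ERR_BADARG;
        if (keylen < MIN_KEY_BYTES)
            return CRYPT_ERR_WEAKKEY;
        toy_cipher(buf, len, key, keylen);
        return CRYPT_OK;
    }

A library built this way makes the easy call the safe call, which
is what Alice needs.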
At the current state of the art, we cannot guarantee these any more than
we can assert the future security of our algorithms. But our best
approach is to get working tools into the hands of testers and
critical users to begin the process of debugging and revision.
I would suggest that cypherpunks both write and test code.
I recommend two books to stimulate thought on debugging and interface
design, both of which I enjoyed reading. "Digital Woes: Why We
Should Not Depend on Software" by Lauren Ruth Wiener is a new
(first printing, September 1993) work about bugs. In 209 pages, backed
by 365 citations to the literature (often comp.risks), it offers a
view of the range of software failures that have occurred.
Perhaps we can attend to history and not need to repeat it.
Donald Norman's "The Design of Everyday Things" is an outstanding work
on interface design. An excerpt that I read in Dr. Dobb's Journal one
morning made me rush to a bookshop and buy it before noon!
HOW TO DO THINGS WRONG
If you set out to make something difficult to use, you could
probably do no better than to copy the designers of modern
computer systems...:
* Make things invisible. Widen the Gulf of Execution:
give no hints to the operations expected. Establish a Gulf
of Evaluation: give no feedback, no visible results of the
actions just taken. Exploit the tyranny of the blank screen.
...
* Be inconsistent: change the rules. Let something be done
one way in one mode and another way in another mode. This is
especially effective where it is necessary to go back and forth
between these modes.
...
* Make operations dangerous. Allow a single erroneous action
to destroy invaluable work. Make it easy to do disastrous
things. But put warnings in the manual; then when people
complain, you can ask, "But didn't you read the manual?"
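Turning Norman's last point around for our purposes: a destructive
operation, such as wiping a secret key, should never ride on a
single keystroke. Here is one way to guard it, a sketch of my own
in C (the key id and the wording are invented for illustration):

    #include <stdio.h>
    #include <string.h>

    /*
     * Guard a destructive action behind a typed confirmation.
     * A lone 'y' is easy to hit by accident; typing the word
     * "delete" is not.  Returns 1 only on explicit consent.
     */
    static int confirm_destroy(const char *keyid)
    {
        char reply[32];

        printf("About to destroy secret key %s.\n", keyid);
        printf("This cannot be undone.\n");
        printf("Type \"delete\" to proceed, anything else aborts: ");
        if (fgets(reply, sizeof reply, stdin) == NULL)
            return 0;                        /* EOF: abort     */
        reply[strcspn(reply, "\n")] = '\0';  /* strip newline  */
        return strcmp(reply, "delete") == 0;
    }

    int main(void)
    {
        if (confirm_destroy("0xC0FFEE42"))   /* invented id    */
            puts("(key would be wiped here)");
        else
            puts("Aborted; key untouched.");
        return 0;
    }

The dangerous path is still available, but it must be taken
deliberately; no single slip of the finger destroys invaluable
work.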