forward: Cu Digest, #5.43 -- 2600 & CPSR House Subcommittee Testimony
- To: [email protected]
- Subject: forward: Cu Digest, #5.43 -- 2600 & CPSR House Subcommittee Testimony
- From: [email protected] (J. Eric Townsend)
- Date: Mon, 14 Jun 93 08:59:51 -0700
Emmanuel's comments are somewhat disturbing...
>
>
> Computer underground Digest Sun June 13 1993 Volume 5 : Issue 43
> ISSN 1004-043X
>
> Editors: Jim Thomas and Gordon Meyer ([email protected])
> Archivist: Brendan Kehoe
> Shadow-Archivists: Dan Carosone / Paul Southworth
> Ralph Sims / Jyrki Kuoppala
> Ian Dickinson
> Copy Editor: Etaoin Shrdlu, Seniur
>
> CONTENTS, #5.43 (June 13 1993)
> File 1--Hacker testimony to House subcommittee largely unheard
> File 2--CPSR Clipper Testimony (6-9-93) in House Subcommittee
>
> Cu-Digest is a weekly electronic journal/newsletter. Subscriptions are
> available at no cost electronically from [email protected]. The
> editors may be contacted by voice (815-753-6430), fax (815-753-6302)
> or U.S. mail at: Jim Thomas, Department of Sociology, NIU, DeKalb, IL
> 60115.
>
> Issues of CuD can also be found in the Usenet comp.society.cu-digest
> news group; on CompuServe in DL0 and DL4 of the IBMBBS SIG, DL1 of
> LAWSIG, and DL0 and DL12 of TELECOM; on GEnie in the PF*NPC RT
> libraries and in the VIRUS/SECURITY library; from America Online in
> the PC Telecom forum under "computing newsletters;"
> On Delphi in the General Discussion database of the Internet SIG;
> on the PC-EXEC BBS at (414) 789-4210; and on: Rune Stone BBS (IIRG
> WHQ) 203-832-8441 NUP:Conspiracy
> CuD is also available via Fidonet File Request from 1:11/70; unlisted
> nodes and points welcome.
> EUROPE: from the ComNet in LUXEMBOURG BBS (++352) 466893;
> In ITALY: Bits against the Empire BBS: +39-461-980493
>
> ANONYMOUS FTP SITES:
> UNITED STATES: ftp.eff.org (192.88.144.4) in /pub/cud
> uglymouse.css.itd.umich.edu (141.211.182.53) in /pub/CuD/cud
>                  halcyon.com (202.135.191.2) in /pub/mirror/cud
> AUSTRALIA: ftp.ee.mu.oz.au (128.250.77.2) in /pub/text/CuD.
> EUROPE: nic.funet.fi in pub/doc/cud. (Finland)
> ftp.warwick.ac.uk in pub/cud (United Kingdom)
>
> COMPUTER UNDERGROUND DIGEST is an open forum dedicated to sharing
> information among computerists and to the presentation and debate of
> diverse views. CuD material may be reprinted for non-profit as long
> as the source is cited. Authors hold a presumptive copyright, and
> they should be contacted for reprint permission. It is assumed that
> non-personal mail to the moderators may be reprinted unless otherwise
> specified. Readers are encouraged to submit reasoned articles
> relating to computer culture and communication. Articles are
> preferred to short responses. Please avoid quoting previous posts
> unless absolutely necessary.
>
> DISCLAIMER: The views represented herein do not necessarily represent
> the views of the moderators. Digest contributors assume all
> responsibility for ensuring that articles submitted do not
> violate copyright protections.
>
> ----------------------------------------------------------------------
>
> Date: Thu, 10 Jun 1993 16:53:48 -0700
> From: Emmanuel Goldstein <[email protected]>
> Subject: File 1--Hacker testimony to House subcommittee largely unheard
>
> What follows is a copy of my written testimony before the House
> Subcommittee on Telecommunications and Finance. The June 9th hearing
> was supposed to have been on the topic of network security, toll
> fraud, and the social implications of the rapidly emerging
> technologies. I was asked to speak for those who had no voice, which
> translates to hackers and consumers. Instead I found myself barraged
> with accusations from the two representatives in attendance (Rep. Ed
> Markey D-MA and Rep. Jack Fields R-TX) who considered 2600 Magazine
> (of which I'm the editor) nothing more than a manual for computer
> crime. One article in particular that Markey latched upon was one in
> our Spring issue that explained how a cable descrambler worked.
> According to Markey, there was no use for this information outside of
> a criminal context. Fields claimed we were printing cellular "codes"
> that allowed people to listen in on cellular calls. In actuality, we
> printed frequencies. The difference didn't seem to matter - after
> explaining it to him, he still said he was very disturbed by the fact
> that I was allowed to keep publishing. It soon became apparent to me
> that neither one had read my testimony as there seemed to be no
> inclination to discuss any of the issues I had brought up. In a way,
> it was very much like being on the Geraldo show. Somehow I thought
> elected representatives would be less sensationalist and more
> interested in learning but this was not the case here. We got
> absolutely nowhere. Markey in particular was rude, patronizing, and
> not at all interested in entertaining any thought outside his narrow
> perception. It's too bad this opportunity was lost. There is a real
> danger in elected officials who don't listen to all relevant opinions
> and who persist in sticking to old-fashioned, outdated notions that
> just don't apply to high technology. You can look forward to more
> restrictive regulations and higher penalties for violating them if
> this mentality continues to dominate.
>
> +++++++++++++++++++
> WRITTEN TESTIMONY FOLLOWS:
>
> Mr. Chairman, members of the Committee, thank you for the
> opportunity to speak on the issue of the rapid growth and changes in
> the telecommunications industry.
>
> My name is Emmanuel Goldstein and I am the publisher of 2600
> Magazine, which is a journal for computer hackers as well as anyone
> else who happens to be interested in the direction that technology is
> taking us. We tend to be brutally honest in our assessments and, as a
> result, we do get some corporations quite angry at us. But we've also
> managed to educate a large number of people as to how their telephone
> system works, what kinds of computers may be watching them, and how
> they can shape technology to meet their needs, rather than be forced
> to tailor their existence to meet technology's needs.
>
> I am also the host of a weekly radio program called Off The Hook
> which airs over WBAI in New York. Through that forum we have
> discovered the eagerness and curiosity that many "ordinary people on
> the street" possess for technology. At the same time we have seen
> fears and suspicions expressed that would be unwise to ignore.
>
> HOW TO HANDLE RAPIDLY CHANGING TECHNOLOGY
>
> The next few years will almost certainly go down in history as
> those in which the most change took place in the least amount of time.
> The computer and telecommunications revolution that we are now in the
> midst of is moving full speed ahead into unknown territory. The
> potential for amazing advances in individual thought and creativity is
> very real. But so is the potential for oppression and mistrust the
> likes of which we have never before seen. One way or the other, we
> will be making history.
>
> I think we can imagine it best if we think of ourselves speeding
> down a potentially dangerous highway. Perhaps the road will become
> slick with ice or fraught with sharp curves. It's a road that nobody
> has gone down before. And the question we have to ask ourselves is
> what kind of a vehicle would we prefer to be in if things should start
> getting out of control: our own automobile where we would have at
> least some chance of controlling the vehicle and bringing it down to a
> safe speed, or a bus where we, along with many others, must put all of
> our trust behind a total stranger to prevent a disaster. The answer is
> obviously different depending on the circumstances. There are those of
> us who do not want the responsibility of driving and others who have
> proven themselves unworthy of it. What's important is that we all have
> the opportunity at some point to choose which way we want to go.
>
> Rapidly changing technology can also be very dangerous if we
> don't look where we're going or if too many of us close our eyes and
> let someone else do the driving. This is a ride we all must stay awake
> for.
>
> I am not saying we should be overly suspicious of every form of
> technology. I believe we are on the verge of something very positive.
> But the members of this committee should be aware of the dangers of an
> uninformed populace. These dangers will manifest themselves in the
> form of suspicion towards authority, overall fear of technology, and
> an unhealthy feeling of helplessness.
>
> HOW NEW TECHNOLOGY CAN HURT US
>
> The recent FBI proposal to have wiretap capabilities built into
> digital telephone systems got most of its publicity because American
> taxpayers were expected to foot the bill. But to many of the
> non-technical people I talked to, it was just another example of Big
> Brother edging one step closer. It is commonly believed that the
> National Security Agency monitors all traffic on the Internet, not to
> mention all international telephone calls. Between Caller ID, TRW
> credit reports, video cameras, room monitors, and computer
> categorizations of our personalities, the average American feels as if
> life no longer has many private moments. Our Social Security numbers,
> which once were for Social Security, are now used for everything from
> video rentals to driver's licenses. These numbers can easily be used
> to track a person's location, expenses, and habits - all without any
> consent. If you know a person's name, you can get their telephone
> number. If you have their phone number, you can get their address.
> Getting their Social Security number is not even a challenge anymore.
> With this information, you can not only get every bit of information
> about this person that exists on any computer from Blockbuster Video
> to the local library to the phone company to the FBI, but you can
> begin to do things in this poor person's name. It's possible we may
> want a society like this, where we will be accountable for our every
> movement and where only criminals will pursue privacy. The American
> public needs to be asked. But first, they need to understand.
>
> In Germany, there is a fairly new computerized system of identity
> cards. Every citizen must carry one of these cards. The information
> includes their name, address, date of birth, and nationality - in
> other words, the country they were originally born in. Such a system
> of national identity can be quite useful, but in the wrong hands it
> can be extremely scary. For example, if a neo-Nazi group were to
> somehow get their hands on the database, they could instantly find out
> where everyone of Turkish nationality lived. A malevolent government
> could do the same and, since not carrying the card would be a crime,
> it would be very hard to avoid its wrath.
>
> Before introducing a new technology that is all-encompassing, all
> of its potential side-effects and disadvantages should be discussed
> and addressed. Opportunities must exist for everyone to ask questions.
> In our own country, nobody was ever asked if they wanted a credit file
> opened on them, if they wanted to have their phone numbers given to
> the people and companies they called through the use of Caller ID and
> ANI, or if they wanted to be categorized in any manner on numerous
> lists and databases. Yet all of this has now become standard practice.
>
> This implementation of new rules has resulted in a degree of
> cynicism in many of us, as well as a sense of foreboding and dread. We
> all know that these new inventions will be abused and used to
> somebody's advantage at some point. There are those who would have us
> believe that the only people capable of such misdeeds are computer
> hackers and their ilk. But it just isn't that simple.
>
> UNDERSTANDING COMPUTER HACKERS
>
> To understand computer hackers, it helps to think of an alien
> culture. We have such cultures constantly around us - those with
> teenage children ought to know what this means. There are alien
> cultures of unlimited varieties throughout the globe, sometimes in the
> most unexpected places. I'm convinced that this is a good thing.
> Unfortunately, all too often our default setting on whatever it is we
> don't understand is "bad". Suspicion and hostility follow and are soon
> met with similar feelings from the other side. This has been going on
> between and within our cultures for as long as we've existed. While we
> can't stop it entirely, we can learn to recognize the danger signs.
> The best way that I've found to deal with an alien culture, whether
> it's in a foreign country or right here at home, is to try and
> appreciate it while giving it a little leeway. There is not a single
> alien culture I've encountered that has not been decidedly friendly.
> That includes deadheads, skateboarders, Rastafarians, and hackers.
>
> When we talk about computer hackers, different images spring to
> mind. Most of these images have come about because of perceptions
> voiced by the media. Too often, as I'm sure the members of this
> committee already suspect, the media just doesn't get it. This is not
> necessarily due to malice on their part but rather a general lack of
> understanding and an overwhelming pressure to produce a good story.
> Hence we get an abundance of sensationalism and, when the dust clears,
> hackers are being compared with bank robbers, mobsters, terrorists,
> and the like. It's gotten to the point that the word hacker is almost
> analogous to the word criminal.
>
> Fortunately, the media is learning. Reporters now approach
> hackers with a degree of technological savvy. For the most part, they
> have stopped asking us to commit crimes so they can write a story
> about it. As the technology envelops us, journalists are developing
> the same appreciation and curiosity for it that hackers have always
> had. Any good reporter is at least part hacker because what a hacker
> does primarily is relentlessly pursue an answer. Computers naturally
> lend themselves to this sort of pursuit, since they tend to be very
> patient when asked a lot of questions.
>
> WHAT CONSTITUTES A HI-TECH CRIME?
>
> So where is the boundary between the hacker world and the
> criminal world? To me, it has always been in the same place. We know
> that it's wrong to steal tangible objects. We know that it's wrong to
> vandalize. We know that it's wrong to invade somebody's privacy. Not
> one of these elements is part of the hacker world.
>
> A hacker can certainly turn into a criminal and take advantage of
> the weaknesses in our telephone and computer systems. But this is
> rare. What is more likely is that a hacker will share knowledge with
> people, one of whom will decide to use that knowledge for criminal
> purposes. This does not make the hacker a criminal for figuring it
> out. And it certainly doesn't make the criminal into a hacker.
>
> It is easy to see this when we are talking about crimes that we
> understand as crimes. But then there are the more nebulous crimes; the
> ones where we have to ask ourselves: "Is this really a crime?" Copying
> software is one example. We all know that copying a computer program
> and then selling it is a crime. It's stealing, plain and simple. But
> copying a program from a friend to try it out on your home computer --
> is this the same kind of crime? It seems obvious to me that it is not,
> the reason being that you must make a leap of logic to turn such an
> action into a crime. Imagine if we were to charge a licensing fee
> every time somebody browsed through a magazine at the local bookshop,
> every time material was borrowed from a library, or every time a phone
> number was jotted down from the yellow pages. Yet, organizations like
> the Software Publishers Association have gone on record as saying that
> it is illegal to use the same computer program on more than one
> computer in your house. They claim that you must purchase it again or
> face the threat of federal marshals kicking in your door. That is a
> leap of logic.
>
> It is a leap of logic to assume that because a word processor
> costs $500, a college student will not try to make a free copy in
> order to write and become a little more computer literate. Do we
> punish this student for breaking a rule? Do we charge him with
> stealing $500? To the hacker culture on whose behalf I am speaking
> today, the only sensible answer is to make it as easy as possible for
> that college student to use the software he needs. And while we're at
> it, we should be happy that he's interested in the first place.
>
> Of course, this represents a fundamental change in our society's
> outlook. Technology as a way of life, not just another way to make
> money. After all, we encourage people to read books even if they can't
> pay for them because to our society literacy is a very important goal.
> I believe technological literacy is becoming increasingly important.
> But you cannot have literacy of any kind without having access.
>
> If we continue to make access to technology difficult,
> bureaucratic, and illogical, then there will also be more computer
> crime. The reason being that if you treat someone like a criminal,
> they will begin to act like one. If we succeed in convincing people
> that copying a file is the same as physically stealing something, we
> can hardly be surprised when the broad-based definition results in
> more overall crime. Blurring the distinction between a virtual
> infraction and a real-life crime is a mistake.
>
> LEGISLATION FOR COMPUTER AGE CRIME
>
> New laws are not needed because there is not a single crime that
> can be committed with a computer that is not already defined as a
> crime without a computer. But let us not be loose with that
> definition. Is mere unauthorized access to a computer worthy of
> federal indictments, lengthy court battles, confiscation of equipment,
> huge fines, and years of prison time? Or is it closer to a case of
> trespassing, which in the real world is usually punished by a simple
> warning? "Of course not," some will say, "since accessing a computer
> is far more sensitive than walking into an unlocked office building."
> If that is the case, why is it still so easy to do? If it's possible
> for somebody to easily gain unauthorized access to a computer that has
> information about me, I would like to know about it. But somehow I
> don't think the company or agency running the system would tell me
> that they have gaping security holes. Hackers, on the other hand, are
> very open about what they discover which is why large corporations
> hate them so much. Through legislation, we can turn what the hackers
> do into a crime and there just might be a slim chance that we can stop
> them. But that won't fix poorly designed systems whose very existence
> is a violation of our privacy.
>
> THE DANGERS OF UNINFORMED CONSUMERS
>
> The concept of privacy is something that is very important to a
> hacker. This is so because hackers know how fragile privacy is in
> today's world. Wherever possible we encourage people to protect their
> directories, encrypt their electronic mail, not use cellular phones,
> and whatever else it takes to keep their lives to themselves. In 1984
> hackers were instrumental in showing the world how TRW kept credit
> files on millions of Americans. Most people had never even heard of a
> credit file until this happened. Passwords were very poorly guarded -
> in fact, credit reports had the password printed on the credit report
> itself. More recently, hackers found that MCI's Friends and Family
> program allowed anybody to call an 800 number and find out the numbers
> of everyone in a customer's "calling circle". As a bonus, you could
> also find out how these numbers were related to the customer: friend,
> brother, daughter-in-law, business partner, etc. Many times these
> numbers were unlisted yet all that was needed to "verify" the
> customer's identity was the correct zip code. In both the TRW and MCI
> cases, hackers were ironically accused of being the ones to invade
> privacy. What they really did was help to educate the American
> consumer.
>
> Nowhere is this more apparent than in the telephone industry.
> Throughout the country, telephone companies take advantage of
> consumers. They do this primarily because the consumer does not
> understand the technology. When we don't understand something
> complicated, we tend to believe those who do understand. The same is
> true for auto mechanics, plumbers, doctors, and lawyers. They all
> speak some strange language that the majority of us will never
> understand. So we tend to believe them. The difference with the phone
> companies, and here I am referring to the local companies, is that you
> cannot deal with somebody else if you happen to disagree with them or
> find them untrustworthy. The phone companies have us in a situation
> where we must believe what they say. If we don't believe them, we
> cannot go elsewhere.
>
> This is the frustration that the hacker community constantly
> faces. We face it especially because we are able to understand when
> the local phone companies take advantage of consumers. Here are a few
> examples:
>
> Charging a fee for touch tone service. This is a misnomer. It
> actually takes extra effort to tell the computer to ignore the tones
> that you produce. Everybody already has touch tone capability but we
> are forced to pay the phone company not to block it. While $1.50 a
> month may not seem like much, when added together the local companies
> that still engage in this practice are making millions of dollars a
> year for absolutely nothing. Why do they get away with it? Because too
> many of us don't understand how the phone system works. I try to draw
> an analogy in this particular case - imagine if the phone company
> decided that a fee would be charged to those customers who wanted to
> use the number five when dialing. They could argue that the five takes
> more energy than the four but most of us would see through this flimsy
> logic. We must seek out other such dubious practices and not blindly
> accept what we are told.
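>
> To make that arithmetic concrete, a small sketch follows; the
> subscriber count is an assumed, illustrative figure (the testimony
> gives only the $1.50 monthly charge, not a tally of lines still
> paying it):
>
>     # touch_tone_fee.py -- back-of-the-envelope estimate of annual revenue
>     monthly_fee = 1.50          # dollars per line per month (figure cited above)
>     lines_paying = 1_000_000    # assumed, illustrative number of lines
>
>     annual_revenue = monthly_fee * 12 * lines_paying
>     print(f"Annual touch tone fee revenue: ${annual_revenue:,.0f}")
>     # -> Annual touch tone fee revenue: $18,000,000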
>
> Other examples abound: being charged extra not to have your name
> listed in the telephone directory, a monthly maintenance charge if you
> select your own telephone number, the fact that calling information to
> get a number now costs more than calling the number itself.
>
> More recently, we have become acquainted with a new standard
> called Signalling System Seven or SS7. Through this system it is
> possible for telephones to have all kinds of new features: Caller ID,
> Return Call, Repeat Calling to get through a busy signal, and more.
> But again, we are having the wool pulled over our eyes. For instance,
> if you take advantage of Call Return in New York (which will call the
> last person who dialed your number), you are charged 75 cents on top
> of the cost of the call itself. Obviously, there is a cost involved
> when new technologies are introduced. But there is no additional
> equipment, manpower, or time consumed when you dial *69 to return a
> call. It's a permanent part of the system. As a comparison, we could
> say that it also costs money to install a hold button. Imagine how we
> would feel if we were charged a fee every time we used it.
>
> The local companies are not the only offenders but it is
> particularly bad in their case because, for the vast majority of
> Americans, there is no competition on this level. The same complaints
> are being voiced concerning cable television companies.
>
> Long distance telephone companies are also guilty. AT&T, MCI, and
> Sprint all encourage the use of calling cards. Yet each imposes a
> formidable surcharge each and every time they're used. AT&T, for
> example, charges 13 cents for the first minute of a nighttime call
> from Washington DC to New York plus an 80 cent surcharge. Since a
> calling card can only be used to make telephone calls, why are
> consumers expected to pay an extra fee as if they were doing something
> above and beyond the normal capability of the card? Again, there is no
> extra work necessary to complete a calling card call - at least not on
> the phone company's part. The consumer, on the other hand, must enter
> up to 25 additional digits. But billing is accomplished merely by
> computers sending data to each other. Gone are the days of tickets
> being written up by hand and verified by human beings. Everything is
> accomplished quickly, efficiently, and cheaply by computer. Therefore,
> these extra charges are outdated.
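>
> Using only the rates just cited, the arithmetic of a single
> one-minute calling card call looks like this:
>
>     # calling_card_surcharge.py -- share of a one-minute call taken by the surcharge
>     first_minute_rate = 0.13   # AT&T nighttime rate, Washington DC to New York
>     card_surcharge = 0.80      # flat surcharge added to every calling card call
>
>     total = first_minute_rate + card_surcharge
>     print(f"Total for a one-minute call: ${total:.2f}")               # $0.93
>     print(f"Surcharge share of the bill: {card_surcharge / total:.0%}")   # 86%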
>
> SOCIAL INJUSTICES OF TECHNOLOGY
>
> The way in which we have allowed public telephones to be operated
> is particularly unfair to those who are economically disadvantaged. A
> one minute call to Washington DC can cost as little as 12 cents from
> the comfort of your own home. However, if you don't happen to have a
> phone, or if you don't happen to have a home, that same one minute
> call will cost you $2.20. That figure is the cheapest rate there is
> from a Bell operated payphone. With whatever kind of logic was used to
> set these prices, the results are clear. We have made it harder and
> more expensive for the poor among us to gain access to the telephone
> network. Surely this is not something we can be proud of.
>
> A direct result of this inequity is the prevalence of red boxes.
> Red boxes are nothing more than tone generators that transmit a quick
> burst of five tones which convince the central office that a quarter
> has been deposited. It's very easy and almost totally undetectable.
> It's also been going on for decades. Neither the local nor long
> distance companies have expended much effort towards stopping red
> boxes, which gives the impression that the payphone profits are still
> lucrative, even with this abuse. But even more troubling is the
> message this is sending. Think of it. For a poor and homeless person
> to gain access to something that would cost the rest of us 12 cents,
> they must commit a crime and steal $2.20. This is not equal access.
>
> CORPORATE RULES
>
> Hackers and phone phreaks, as some of us are called, are very
> aware of these facts. We learn by asking lots of questions. We learn
> by going to libraries and doing research. We learn by diving into
> phone company trash dumpsters, reading discarded material, and doing
> more research. But who will listen to people like us who have been
> frequently characterized as criminals? I am particularly grateful that
> this committee has chosen to hear us. What is very important to us is
> open communications. Freedom of information. An educated public.
>
> This puts us at direct odds with many organizations, who believe
> that everything they do is "proprietary" and that the public has no
> right to know how the public networks work. In July of 1992 we were
> threatened with legal action by Bellcore (the research arm of the
> Regional Bell Operating Companies) for revealing security weaknesses
> inherent in Busy Line Verification (BLV) trunks. The information had
> been leaked to us and we did not feel compelled to join Bellcore's
> conspiracy of silence. In April of this year, we were threatened with
> legal action by AT&T for printing proprietary information of theirs.
> The information in question was a partial list of the addresses of
> AT&T offices. It's very hard for us to imagine how such information
> could be considered secret. But these actions are not surprising. They
> only serve to illustrate the wide disparities between the corporate
> mindset and that of the individual. It is essential that the hundreds
> of millions of Americans who will be affected by today's
> all-encompassing inventions not be forced to play by corporate rules.
>
> In 1990 a magazine similar to 2600 was closed down by the United
> States government because Bell South said they printed proprietary
> information. Most people never found out about this because Phrack
> Magazine was electronic, i.e., only available on computer bulletin
> boards and networks. This in itself is wrong; a publication must have
> the same First Amendment rights regardless of whether it is printed
> electronically or on paper. As more online journals appear, this basic
> tenet will become increasingly critical to our nation's future as a
> democracy. Apart from this matter, we must look at what Bell South
> claimed - that a document discussing the Enhanced 911 system which was
> worth $79,449 had been "stolen" and printed by Phrack. (Some newspaper
> accounts even managed to change it into an E911 program which gave the
> appearance that hackers were actually interfering with the operation
> of an E911 system and putting lives at risk. In reality there has
> never been a report of a hacker gaining access to such a system.) It
> was not until after the publisher of Phrack was forced to go to trial
> that the real value of the document was revealed. Anyone could get a
> copy for around $14. The government promptly dropped its case against
> the publisher who, to this day, is still paying back $100,000 in legal
> fees. As further evidence of the inequity between individual justice
> and corporate justice, Bell South was never charged with fraud for its
> claim that a $14 document was worth nearly $80,000. Their logic, as
> explained in a memo to then Assistant U.S. Attorney Bill Cook, was
> that including the full salaries of everyone who helped write the document, as
> well as the full cost of all hardware and software used in the
> endeavor ($31,000 for a Vaxstation II, $6,000 for a printer), was
> perfectly acceptable. It is very disturbing that the United States
> government agreed with this assessment and moved to put a pre-law
> student behind bars for violating corporate rules.
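>
> Taking the figures above at face value, the arithmetic behind the
> claim works out as follows; the salary portion is inferred by
> subtraction and is not itemized in the memo:
>
>     # e911_valuation.py -- Bell South's claimed value vs. the actual price
>     claimed_value   = 79_449    # value Bell South claimed for the document
>     vaxstation_cost = 31_000    # hardware billed to the "endeavor"
>     printer_cost    = 6_000
>     actual_price    = 14        # approximate price anyone could pay for a copy
>
>     implied_salaries = claimed_value - vaxstation_cost - printer_cost
>     print(f"Salaries implied by the claim: ${implied_salaries:,}")               # $42,449
>     print(f"Claimed value vs. actual price: {claimed_value / actual_price:,.0f}x")  # 5,675x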
>
> MISGUIDED AUTHORITY
>
> I wish I could stand before this committee and say that we have
> been successful in stopping all such miscarriages of justice. While
> the Phrack case may have been the most bizarre, there are many more
> instances of individuals being victimized in similar manners. A
> teenager in Chicago was jailed for a year for copying a file that was
> worth millions, according to AT&T, but was utterly worthless and
> unusable to a kid. A bulletin board operator in California, along with
> his entire family, was held at gunpoint for hours while authorities
> seized his equipment in an unsuccessful attempt to find child
> pornography. Three hackers in Atlanta, after being imprisoned up to a
> year for dialing into a Bell South computer system that had no
> password, were forced to pay $233,000 in restitution so the company
> could install a password system. More recently, a student at the
> University of Texas at Houston was suspended from school for a year
> because he accessed a file that merely listed the users of the system
> (a file which the system allows all users to access). In increasing
> numbers, young people are being sent to jail, not necessarily for
> something they did, but rather for something they could have done in a
> worst-case scenario. Again this indicates fear and misunderstanding of
> technology and its applications. But this time those feelings emanate
> from those in authority.
>
> Locally, an ominous happening occurred at a 2600 monthly meeting
> last November. (These meetings occur in public areas in cities
> throughout the nation on the first Friday of every month.) Shortly
> after it began, the Washington meeting was broken up by Pentagon City
> Mall security guards. Without any provocation, people were forced to
> submit to searches and everybody's name was taken down. One of the
> attendees who was writing down an officer's name had the paper ripped
> from his hand; another had his film taken from his camera as he tried
> to document what was going on. Upon questioning by a reporter from
> Communications Daily, the mall security chief claimed that he was
> acting under orders from the United States Secret Service. Subsequent
> Freedom of Information Act requests by Computer Professionals for
> Social Responsibility have yielded more evidence implicating the
> Secret Service in this illegal and unwarranted action. Nothing of a
> criminal nature was ever found in any of the bags that were searched.
> But a full list of the attendees wound up in the possession of the
> Secret Service. It seems ironic that while hackers are conducting an
> open gathering in the middle of a shopping mall in order to share
> knowledge and welcome new people, agents of the Secret Service are
> lurking in the shadows trying to figure out ways to stop them.
>
> How can we move forward and talk about exciting new applications
> of technology when we're off to such a bad start? The people that are
> being arrested, harassed, and intimidated are the people who will be
> designing and running these new systems. They are the ones who will
> appreciate their capabilities and understand their weaknesses. Through
> our short-sightedness and eagerness to listen to the loudest voices,
> we are alienating the promises of the future. How many here, who grew
> up in decades past, remember hearing teenagers talk of how the
> government is after them, watching their every move, listening to
> their phone calls, doing everything one might expect in a totalitarian
> regime? Such feelings are the sure sign of an ailing society. It does
> not matter if these things are not actually occurring - their mere
> perception is enough to cause lasting harm and mistrust.
>
> PROMISE OF THE INTERNET
>
> The future holds such enormous potential. It is vital that we not
> succumb to our fears and allow our democratic ideals and privacy
> values to be shattered. In many ways, the world of cyberspace is more
> real than the real world itself. I say this because it is only within
> the virtual world that people are really free to be themselves - to
> speak without fear of reprisal, to be anonymous if they so choose, to
> participate in a dialogue where one is judged by the merits of their
> words, not the color of their skin or the timbre of their voice.
> Contrast this to our existing "real" world where we often have people
> sized up before they even utter a word. The Internet has evolved, of
> its own volition, to become a true bastion of worldwide democracy. It
> is the obligation of this committee, and of governments throughout the
> world, not to stand in its way.
>
> This does not mean we should stand back and do nothing. Quite
> the contrary, there is much we have to do if accessibility and
> equality are our goals. Over-regulation and commercialization are two
> ways to quickly kill these goals. A way to realize them is to have a
> network access point in every house. Currently, network access is
> restricted to students or professors at participating schools,
> scientists, commercial establishments, and those who have access to,
> and can afford, local services that link into the Internet. Yes, a lot
> of people have access today. But a far greater number do not and it
> is to these people that we must speak. The bigger the Internet gets,
> the better it gets. As it exists today, cultures from around the globe
> are represented; information of all kinds is exchanged. People are
> writing, reading, thinking. It's potentially the greatest educational
> tool we have. Therefore, it is essential that we not allow it to
> become a commodity that only certain people in society will be able to
> afford. With today's technology, we face the danger of widening the
> gap between the haves and the have-nots to a monumental level. Or we
> can open the door and discover that people really do have a lot to
> learn from each other, given the opportunity.
>
> It is my hope that this committee will recognize the importance
> of dialogue with the American public, in order to answer the questions
> so many are asking and to address the concerns that have been
> overlooked. I thank you for this opportunity to express those issues
> that I feel relevant to this hearing.
>
> ------------------------------
>
> Date: Sat, 12 Jun 1993 12:30:38 EST
> From: Dave Banisar <[email protected]>
> Subject: File 2--CPSR Clipper Testimony (6-9-93) in House Subcommittee
>
> CPSR Clipper Testimony 6/9
>
> On June 9, 1993, Congressman Edward Markey, Chairman of the
> House Subcommittee on Telecommunications and Finance held an
> oversight hearing on encryption and telecommunications network
> security. Panelists were Whitfield Diffie of Sun Microsystems, Dr.
> Dorothy Denning, Steven Bryen of Secure Communications, Marc
> Rotenberg of the CPSR Washington Office and E.R. Kerkeslager of AT&T.
>
> Congressman Markey, after hearing the testimony presented,
> noted that the Clipper proposal had "raised an arched eyebrow among
> the whole committee" and that the committee viewed the proposal
> skeptically. This statement was the latest indication that the Clipper
> proposal has not been well received by policy makers. Last Friday,
> the Computer Systems Security and Privacy Advisory Board of NIST
> issued two resolutions critical of the encryption plan, suggesting
> that further study was required and that implementation of the plan
> should be delayed until the review is completed.
>
> At the Third CPSR Cryptography and Privacy Conference on
> Monday, June 7, the Acting Director of NIST, Raymond Kammer, announced
> that the implementation of the proposal will be delayed and that a
> more comprehensive review will be undertaken. The review is due in
> the fall. Kammer told the Washington Post that "maybe we won't
> continue in the direction we started out."
>
> +-------------------------------------------------
>
> Prepared Testimony
> and
> Statement for the Record
> of
> Marc Rotenberg, director
> CPSR Washington Office
> on
> Encryption Technology and Policy
> Before
> The Subcommittee on Telecommunications and Finance.
> Committee on Energy and Commerce
>
> U.S. House of Representatives
> June 9, 1993
>
> SUMMARY
>
> The cryptography issue is of particular concern to CPSR.
> During the past several years CPSR has pursued an extensive study of
> cryptography policy in the United States. CPSR has organized public
> conferences, conducted litigation under the Freedom of Information Act,
> and has emphasized the importance of cryptography for privacy
> protection and the need to scrutinize carefully government proposals
> designed to limit the use of this technology.
> To evaluate the Clipper proposal it is necessary to look at a
> 1987 law, the Computer Security Act, which made clear that in the area
> of unclassified computing systems, the National Institute of Standards
> and Technology (NIST) and not the National Security Agency (NSA), would
> be responsible for the development of technical standards. The Act
> emphasized public accountability and stressed open decision-making.
> In the spirit of the Act, in 1989 NIST set out to develop a
> public key cryptography standard. According to documents obtained by
> CPSR through the Freedom of Information Act, NIST recommended that the
> algorithm be "public, unclassified, implementable in both hardware or
> software, usable by federal Agencies and U.S. based multi-national
> corporation." However, the Clipper proposal and the full-blown Capstone
> configuration that resulted are very different: the Clipper algorithm,
> Skipjack, is classified; public access to the reasons underlying the
> proposal is restricted; Skipjack can be implemented only in
> tamper-proof hardware; it is unlikely to be used by multi-national
> corporations, and the security of Clipper remains unproven.
> The Clipper proposal undermines the central purpose of the
> Computer Security Act. Although intended for broad use in commercial
> networks, it was not developed at the request of either U.S. business
> or the general public. It does not reflect public goals.
> The premise of the Clipper key escrow arrangement is that the
> government must have the ability to intercept electronic
> communications. However, there is no legal basis to support this
> premise. In law there is nothing inherently illegal or suspect about
> the use of a telephone. The federal wiretap statute says only that
> communication service providers must assist law enforcement in executing a
> lawful warrant.
> CPSR supports the review of cryptography policy currently
> underway at the Department of Commerce. CPSR also supports the efforts
> undertaken by the Subcommittee on Telecommunications and Finance to
> study the full ramifications of the Clipper proposal. However, we are
> not pleased about the review now being undertaken at the White House.
> That effort has led to a series of secret meetings, has asked that
> scientists sign non-disclosure agreements and accept restrictions on
> publication, and has attempted to resolve public concerns through
> private channels. This is not a good process for the evaluation of a
> technology that is proposed for the public switched network.
> Even if the issues regarding Clipper are resolved favorably,
> privacy concerns will not go away. Rules still need to be developed
> about the collection and use of transactional data generated by
> computer communications. Several specific steps should be taken.
> First, the FCC should be given a broad mandate to pursue privacy
> concerns. Second, current gaps in the communications law should be
> filled. The protection of transactional records is particularly
> important. Third, telecommunications companies should be encouraged to
> explore innovative ways to protect privacy. "Telephone cards", widely
> available in other countries, are an ideal way to protect privacy.
>
>
> TESTIMONY
>
> Mr. Chairman, members of the Subcommittee, thank you for the
> opportunity to testify today on encryption policy and the Clipper
> proposal. I especially wish to thank you Congressman Markey, on behalf
> of CPSR, for your ongoing efforts on the privacy front as well as your
> work to promote public access to electronic information.
> The cryptography issue is of particular concern to CPSR.
> During the past several years we have pursued an extensive study of
> cryptography policy in the United States. We have organized several
> public conferences, conducted litigation under the Freedom of
> Information Act, and appeared on a number of panels to discuss the
> importance of cryptography for privacy protection and the need to
> scrutinize carefully government proposals designed to limit the use of
> this technology.
> While we do not represent any particular computer company or
> trade association we do speak for a great many people in the computer
> profession who value privacy and are concerned about the government's
> Clipper initiative.
> Today I will briefly summarize our assessment of the Clipper
> proposal. Then I would like to say a few words about the current
> status of privacy protection.
>
> CLIPPER
> To put the Clipper proposal in a policy context, I will need to
> briefly describe a law passed in 1987 intended to address the roles
> of the Department of Commerce and the Department of Defense in the
> development of technical standards. The Computer Security Act of 1987
> was enacted to improve computer security in the federal government, to
> clarify the responsibilities of the National Institute of Standards and
> Technology (NIST) and the National Security Agency, and to ensure that
> technical standards would serve civilian and commercial needs.
> The law made clear that in the area of unclassified computing
> systems, NIST and not NSA, would be responsible for the development of
> technical standards. It emphasized public accountability and stressed
> open decision-making. The Computer Security Act also established the
> Computer System Security and Privacy Advisory Board (CSSPAB), charged
> with reviewing the activities of NIST and ensuring that the mandate of
> the law was enforced.
> The Computer Security Act grew out of a concern that classified
> standards and secret meetings would not serve the interests of the
> general public. As the practical applications for cryptography have
> moved from the military and intelligence arenas to the commercial
> sphere, this point has become clear. There is also clearly a conflict
> of interest when an agency tasked with signal interception is also
> given authority to develop standards for network security.
> In the spirit of the Computer Security Act, NIST set out in
> 1989 to develop a public key standard FIPS (Federal Information
> Processing Standard). In a memo dated May 5, 1989, obtained by CPSR
> through the Freedom of Information Act, NIST said that it planned:
>
> to develop the necessary public-key based security standards. We
> require a public-key algorithm for calculating digital signatures and
> we also require a public-key algorithm for distributing secret keys.
>
> NIST then went on to define the requirements of the standard:
>
> The algorithms that we use must be public, unclassified, implementable
> in both hardware or software, usable by federal Agencies and U.S. based
> multi-national corporation, and must provide a level of security
> sufficient for the protection of unclassified, sensitive information
> and commercial propriety and/or valuable information.
>
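> The "public-key algorithm for distributing secret keys" that the memo
> calls for can be pictured with a toy textbook Diffie-Hellman exchange.
> The sketch below uses deliberately tiny, insecure parameters and is
> illustrative only; it is not the algorithm NIST or NSA considered:
>
>     # toy_key_exchange.py -- textbook sketch of public-key secret-key distribution
>     p, g = 23, 5                    # public modulus and generator (toy values)
>
>     alice_secret = 6                # private values, never transmitted
>     bob_secret = 15
>
>     alice_public = pow(g, alice_secret, p)   # exchanged over the open network
>     bob_public = pow(g, bob_secret, p)
>
>     # each side combines its own secret with the other's public value
>     alice_key = pow(bob_public, alice_secret, p)
>     bob_key = pow(alice_public, bob_secret, p)
>
>     assert alice_key == bob_key     # both sides derive the same secret key
>     print("shared secret key:", alice_key)
>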
> The Clipper proposal and the full-blown Capstone configuration,
> which incorporates the key management function NIST set out to develop
> in 1989, are very different from the one originally conceived by NIST.
>
> % The Clipper algorithm, Skipjack, is classified,
> % Public access to the reasons underlying the proposal is
> restricted,
> % Skipjack can be implemented only in tamper-proof hardware,
> % It is unlikely to be used by multi-national corporations, and
> % The security of Clipper remains unproven.
>
> The Clipper proposal undermines the central purpose of the
> Computer Security Act. Although intended for broad use in commercial
> networks, it was not developed at the request of either U.S. business
> or the general public. It does not reflect public goals. Rather it
> reflects the interests of one secret agency with the authority to
> conduct foreign signal intelligence and another government agency
> responsible for law enforcement investigations.
> Documents obtained by CPSR through the Freedom of Information
> Act indicate that the National Security Agency dominated the meetings
> of the joint NIST/NSA Technical Working group which made
> recommendations to NIST regarding public key cryptography, and that a
> related technical standard for message authentication, the Digital
> Signature Standard, clearly reflected the interests of the NSA.
> We are still trying to determine the precise role of the NSA in
> the development of the Clipper proposal. We would be pleased to
> provide to the Subcommittee whatever materials we obtain.
>
> LEGAL AND POLICY ISSUES
> There are also several legal and constitutional issues raised
> by the government's key escrow proposal. The premise of the Clipper
> key escrow arrangement is that the government must have the ability to
> intercept electronic communications, regardless of the economic or
> societal costs. The FBI's Digital Telephony proposal, and the earlier
> Senate bill 266, were based on the same assumption.
> There are a number of arguments made in defense of this
> position: that privacy rights and law enforcement needs must be
> balanced, or that the government will be unable to conduct criminal
> investigations without this capability.
> Regardless of how one views these various claims, there is one
> point about the law that should be made very clear: currently there is
> no legal basis -- in statute, the Constitution or anywhere else --
> that supports the premise which underlies the Clipper proposal. As the
> law currently stands, surveillance is not a design goal. General
> Motors would have a stronger legal basis for building cars that could
> go no faster than 65 miles per hour than AT&T does in marketing a
> commercial telephone that has a built-in wiretap capability. In law
> there is simply nothing about the use of a telephone that is inherently
> illegal or suspect.
> The federal wiretap statute says only that communication
> service providers must assist law enforcement in the execution of a
> lawful warrant. It does not say that anyone is obligated to design
> systems to facilitate future wire surveillance. That distinction is
> the difference between countries that restrict wire surveillance to
> narrow circumstances defined in law and those that treat all users of
> the telephone network as potential criminals. U.S. law takes the first
> approach. Countries such as the former East Germany took the second
> approach. The use of the phone system by citizens was considered
> inherently suspect and for that reason more than 10,000 people were
> employed by the East German government to listen in on telephone calls.
> It is precisely because the wiretap statute does not contain
> the obligation to incorporate surveillance capability -- the design
> premise of the Clipper proposal -- that the Federal Bureau of
> Investigation introduced the Digital Telephony legislation. But that
> legislation has not moved forward and the law has remained unchanged.
> The Clipper proposal attempts to accomplish through the
> standard-setting and procurement process what the Congress has been
> unwilling to do through the legislative process.
> On legal grounds, adopting the Clipper would be a mistake.
> There is an important policy goal underlying the wiretap law. The
> Fourth Amendment and the federal wiretap statute do not so much balance
> competing interests as they erect barriers against government excess
> and define the proper scope of criminal investigation. The purpose of
> the federal wiretap law is to restrict the government, it is not to
> coerce the public.
> Therefore, if the government endorses the Clipper proposal, it
> will undermine the basic philosophy of the federal wiretap law and the
> fundamental values embodied in the Constitution. It will establish a
> technical mechanism for signal interception based on a premise that has
> no legal foundation. The assumption underlying the Clipper proposal is
> more compatible with the practice of telephone surveillance in the
> former East Germany than it is with the narrowly limited circumstances
> that wire surveillance has been allowed in the United States.
>
> UNANSWERED QUESTIONS
> There are a number of other legal issues that have not been
> adequately considered by the proponents of the key escrow arrangement
> that the Subcommittee should examine. First, not all lawful wiretaps
> follow a normal warrant process. The proponents of Clipper should make
> clear how emergency wiretaps will be conducted before the proposal goes
> forward. Second, there may be civil liability issues for the escrow
> agents, if they are private parties, if there is abuse or compromise of
> the keys. Third, there is a Fifth Amendment dimension to the proposed
> escrow key arrangement if a network user is compelled to disclose his
> or her key to the government in order to access a communications
> network. Each one of these issues should be examined carefully.
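>
> The escrow arrangement at issue can be pictured as a key split into
> two components held by separate agents, with both components needed
> to reconstruct the key. The sketch below is a conceptual illustration
> using a simple XOR split; it is not the actual Clipper escrow format:
>
>     # escrow_split.py -- conceptual sketch of a two-agent key escrow split
>     import secrets
>
>     unit_key = secrets.token_bytes(10)    # the device key to be escrowed
>
>     # split the key so that neither component alone reveals anything
>     component_1 = secrets.token_bytes(10)              # held by escrow agent 1
>     component_2 = bytes(a ^ b for a, b in
>                         zip(unit_key, component_1))    # held by escrow agent 2
>
>     # only by combining both components -- whether under a lawful order
>     # or through abuse or compromise of both agents -- is the key recovered
>     recovered = bytes(a ^ b for a, b in zip(component_1, component_2))
>     assert recovered == unit_key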
>
>
> CPSR CONFERENCE
> At a conference organized by CPSR this week at the Carnegie
> Endowment for International Peace we heard presentations from staff
> members at NIST, FBI, NSA and the White House about the Clipper
> proposal. The participants at the meeting had the opportunity to ask
> questions and to exchange views.
> Certain points now seem clear:
>
> % The Clipper proposal was not developed in response to any
> perceived public or business need. It was developed solely to address
> a law enforcement concern.
> % Wire surveillance remains a small part of law enforcement
> investigations. The number of arrests resulting from wiretaps has
> remained essentially unchanged since the federal wiretap law was enacted
> in 1968.
> % The potential risks of the Clipper proposal have not been
> assessed and many questions about the implementation remain unanswered.
> % Clipper does not appear to have the support of the business or
> research community.
>
> Many comments on the Clipper proposal, both positive and
> negative, as well as the materials obtained by CPSR through the Freedom of
> Information Act, are contained in the Source book compiled by CPSR for
> the recent conference. I am pleased to make a copy of this available to
> the Subcommittee.
>
>
> NETWORK PRIVACY PROTECTION
> Communications privacy remains a critical test for network
> development. Networks that do not provide a high degree of privacy are
> clearly less useful to network users. Given the choice between a
> cryptography product without a key escrow and one with a key escrow, it
> would be difficult to find a user who would prefer the key escrow
> requirement. If this proposal does go forward, it will not be because
> network users or commercial service providers favored it.
> Even if the issues regarding the Clipper are resolved
> favorably, privacy concerns will not go away. Cryptography is a part
> of communications privacy, but it is only a small part. Rules still
> need to be developed about the collection and use of transactional data
> generated by computer communications. While the federal wiretap law
> generally does a very good job of protecting the content of
> communications against interception by government agencies, large holes
> still remain. The extensive use of subpoenas by the government to
> obtain toll records and the sale of telephone records by private
> companies are just two examples of gaps in current law.
> The enforcement of privacy laws is also a particularly serious
> concern in the United States. Good laws without clear mechanisms for
> enforcement raise over-arching questions about the adequacy of legal
> protections in this country. This problem is known to those who have
> followed developments with the Privacy Act since passage in 1974 and
> the more recent Video Privacy Protection Act of 1988. I make this
> point because it has been the experience in other countries that
> agencies charged with the responsibility for privacy protection can be
> effective advocates for the public in the protection of personal
> privacy.
>
> RECOMMENDATIONS
> Regarding the Clipper proposal, we believe that the national
> review currently underway by the Computer System Security and Privacy Advisory
> Board at the Department of Commerce will be extremely useful and we
> look forward to the results of that effort. The Panel has already
> conducted a series of important open hearings and compiled useful
> materials on Clipper and cryptography policy for public review.
> We are also pleased that the Subcommittee on Telecommunications
> and Finance has undertaken this hearing. This Subcommittee can play a
> particularly important role in the resolution of these issues. We also
> appreciate the Chairman's efforts to ensure that the proper studies are
> undertaken, that the General Accounting Office fully explores these
> issues, and that the Secretary of Commerce carefully assesses the
> potential impact of the Clipper proposal on export policy.
> We are, however, less pleased about the White House study
> currently underway. That effort, organized in large part by the
> National Security Council, has led to a series of secret meetings, has
> asked that scientists sign non-disclosure agreements and accept
> restrictions on publication, and has attempted to resolve public
> concerns through private channels. This is not a good process for the
> evaluation of a technology that is proposed for the public switched
> network. While we acknowledge that the White House has been reasonably
> forthcoming in explaining the current state of affairs, we do not think
> that this process is a good one.
> For these reasons, we believe that the White House should
> properly defer to the recommendations of the Computer System Security
> and Privacy Advisory Board and the Subcommittee on Telecommunications
> and Finance. We hope that no further steps in support of the Clipper
> initiative will be taken. We specifically recommend that no further
> purchase of Clipper chips be approved.
> Speaking more generally, we believe that a number of steps
> could be taken to ensure that future communications initiatives could
> properly be viewed as a boost to privacy and not a set-back.
>
> % The FCC must be given a strong mandate to pursue privacy
> concerns. There should be an office specifically established to
> examine privacy issues and to prepare reports. Similar efforts in
> other countries have been enormously successful. The Japanese Ministry
> of Post and Telecommunications developed a set of privacy principles to
> ensure continued trade with Europe. The Canadian Ministry of
> Communications developed a set of communications principles to address
> public concerns about the privacy of cellular communications. In
> Europe, the EC put forward an important directive on privacy protection
> for the development of new network services.
>
> % Current gaps in the communications law should be filled. The
> protection of transactional records is particularly important.
> Legislation is needed to limit law enforcement access to toll record
> information and to restrict the sale of data generated by the use of
> telecommunication services. As the network becomes digital, the
> transaction records associated with a particular communication may
> become more valuable than the content of the communication itself.
>
> % Telecommunications companies should be encouraged to explore
> innovative ways to protect privacy. Cryptography is a particular
> method to seal electronic communications, but far more important for
> routine communications could be anonymous telephone cards, similar to
> the metro cards here in the District of Columbia, that allow consumers
> to purchase services without establishing accounts, transferring
> personal data, or recording personal activities. Such cards are widely
> available in Europe, Japan, and Australia.
>
> I thank you very much for the opportunity to appear before the
> Subcommittee and would be pleased to answer your questions.
>
> Computer Professionals for Social Responsibility
>
> CPSR is a national membership organization, established in
> 1982, to address the social impact of computer technology. There are
> 2,500 members in 20 chapters across the United States, and offices in
> Palo Alto, California, Cambridge, Massachusetts, and Washington DC. The
> organization is governed by a board of elected officers and meetings
> are open to the public. CPSR sponsors an annual meeting and the
> biennial conference on Directions and Implications of Advanced
> Computing. CPSR sponsored the first conference on Computers, Freedom,
> and Privacy in 1991. CPSR also operates the Internet Library at
> cpsr.org. The library contains documents from the White House on
> technology policy and a wide range of public laws covering privacy,
> access to information, and communications law and is available free of
> charge to all users of the Internet.
>
> Marc Rotenberg is the director of the CPSR Washington office
> and an adjunct professor at Georgetown University Law Center. He is
> chairman of the ACM Committee on Scientific Freedom and Human Rights,
> an editor for the Computer Law and Security Report (London), and the
> secretary of Privacy International, an organization of human rights
> advocates and privacy scholars in forty countries. He received an A.B.
> from Harvard College and a J.D. from Stanford Law School, and is a
> member of the bar of the United States Supreme Court. His forthcoming
> article "Communications Privacy: Implications for Network Design" will
> appear in the August 1993 issue of Communications of the ACM.
>
> ------------------------------
>
> End of Computer Underground Digest #5.43
> ************************************