Technophobia & Intelligence: Forwarded article from Information Week
- To: [email protected]
- Subject: Technophobia & Intelligence: Forwarded article from Information Week
- From: [email protected] (Paul Ferguson)
- Date: Wed, 07 Jul 93 16:53:33 EDT
- Organization: Sytex Communications, Inc
reprinted from:
Information Week
July 5, 1993
(cover story)
pages 31 through 38
The Intelligence Test
Do tight funds and technophobia impede the CIA's ability to
gather information?
by Francis Hamit
As the United States turns 217 years old this week, the officials
responsible for the computers and communications of the nation's
intelligence agencies are in no mood for a party. Many of their
systems are antiquated, inefficient, and sometimes dangerously
ineffective. Their resources are being taxed by the changing
demands of post-Cold War politics. They need money to update
their systems, yet a Democratic Congress appears intent on
cutting the overall intelligence budget by more than $1 billion.
To top it all off, IS officials in the intelligence community
face an internal cultural bias against computers; some CIA
employees see the machines as little more than electronic
security leaks. "They just don't get it," says industry analyst
Esther Dyson, who recently visited the CIA with an Electronic
Frontier Foundation delegation. "It's depressing."
Yet, the U.S. intelligence community, under the leadership of the
CIA, is undergoing a quiet revolution in culture and
methodology. The IT component of the effort is being led by
Michael L. Dillard, chairman of the information policy board in
the Office of the Director of Central Intelligence, essentially
the intelligence community's CIO.
Dillard has the authority to do the job. He reports directly to
the director of central intelligence, R. James Woolsey. Dillard
and Woolsey's charter includes the CIA -- which is in the process
of trying to fill a new CIO position of its own -- as well as
government departments such as the Bureau of Intelligence and
Research in the State Department, the intelligence elements
of the various Armed Forces, the Energy Department's intelligence
component, the National Security Agency, even units of the
Treasury Department. Factor in the ad hoc task forces and working
groups set up to handle specific areas of concern such as
terrorism, narcotics, and transnational criminal activities, and
it's a potentially cacophonous collection of sources to manage
in a real-time environment -- and with an extremely limited
margin for error.
The Agency That Knew Too Much?
The intelligence community's work is breathtaking in scope. Raw
data floods in daily from every conceivable source. Technical
collection efforts such as signals interception and high-
resolution imaging from spy satellites and other sources are
combined with the reports of agents and secret sources around the
world and "open sources," such as newspaper articles and radio
broadcasts. All this information flows like a river into a system
that must select, analyze, and evaluate significant data, and
then turn it into easy-to-understand digests for policymakers.
But the overall system is not working as well as it should, and
the need for reform has long been acknowledged by members of the
intelligence community. The CIA alone runs 10 data processing
systems; under the current classification and
compartmentalization, there is virtually no interoperability
between them (see related story below). This has led to some
public embarrassments. Recently, for example, the agency was
accused of covering up part of the BNL scandal, in which an
Italian bank used U.S. Agriculture Department guarantees to help
Saddam Hussein finance Iraq's arms buildup before the Gulf War.
This accusation came after the CIA first denied knowledge of the
affair, then later found the requested documents in a file box
under a staff member's desk.
The current reforms began last year under former director of
central intelligence Robert Gates and have continued under
Woolsey, who was a member of the committee that made the
original reform recommendations. Late last year, before the
annual convention of the Association of Former Intelligence
Officers, Gates identified the targets for intelligence community
reform as nothing less than "our mission, our structure, our
long-term size and budget, and our culture."
These changes come at a time when intelligence consumers are
demanding interactive, multimedia systems that better meet their
needs and time constraints. Given the current climate of budget
cutbacks and growing demands, the community may undergo a major
restructuring that will force wider use of distributed,
multimedia computer and communications systems.
CIO Dillard is unable to detail precise changes to the
intelligence community's IS effort because information such as
the various agencies' IS budgets and staff sizes is strictly
classified. But he shared his five goals for IS in the
intelligence community:
o Increase the volume of data, especially from "open sources."
The first Open Source Coordinator has been appointed.
o Attain true connectivity and interoperability among the systems
used in the intelligence community. While some are PC- and
workstation-based and use commercially available software,
traditional approaches to security had mandated that they not
be linked.
o Reduce the growing cost of operating and maintaining legacy
systems. Today, 82 cents of every dollar spent by IS groups in
intelligence goes to maintain and operate existing systems.
"This," says Dillard,"is using up our resources and driving out
our ability to recapitalize and meet new requirements."
o Downsize systems.
o Create an equal infusion of technology throughout the
community. While some computers in use are leading edge,
others date back to the 1960s. Some software is 25 years old.
These initiatives would be difficult in any environment. But the
intelligence community also harbors a cultural bias against
electronic systems. It stems, in part, from the need to secure
information in such a way as to protect sources and methods. "In the
paper-based world, this is not a problem," Dillard says. "In the
electronic one, the ability to connect and compare data can lead
to unintended compromises of security."
Indeed, the intelligence community has seen an explosion of
databases -- literally thousands of them. Open sources alone command
4,000 databases of all kinds; the most sensitive are kept
offline. Many paper files are never converted to digital form.
With the intelligence community creating an estimated 20,000
digital records a day, the job of digitizing and transferring
older paper files is relegated to the to-do pile.
The agencies are researching and developing software tools to
break through this logjam by helping analysts search very large
databases. This effort is being managed by the Intelligence
Community Management Staff, a separate entity charged with
implementing much of the reform.
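In the simplest terms, such a search tool rests on an inverted
index that maps each term to the documents containing it. The short
Python sketch below is purely illustrative -- the report text,
names, and structure are invented here and describe no actual
agency system.

    # Illustrative only: a toy inverted index of the kind a
    # large-database search tool might build. Data are invented.
    from collections import defaultdict

    def build_index(documents):
        """Map each term to the set of document IDs that contain it."""
        index = defaultdict(set)
        for doc_id, text in documents.items():
            for term in text.lower().split():
                index[term].add(doc_id)
        return index

    def search(index, query):
        """Return IDs of documents containing every term in the query."""
        terms = query.lower().split()
        if not terms:
            return set()
        hits = set(index.get(terms[0], set()))
        for term in terms[1:]:
            hits &= index.get(term, set())
        return hits

    reports = {
        "r1": "arms shipment observed at the port",
        "r2": "bank guarantees used to finance arms purchases",
        "r3": "open-source radio broadcast on port traffic",
    }
    idx = build_index(reports)
    print(search(idx, "arms port"))   # -> {'r1'}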
Congress has had much to say about the intelligence community's
need to eliminate redundant computer systems. But unlike in
private businesses, redundant sources in intelligence may
actually help clarify information by providing additional checks
on incoming data. Redundant information also helps guard against
deception schemes by adversary intelligence services.
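The value of that redundancy can be reduced to a toy example: a
claim is treated as corroborated only when more than one independent
source reports it. The Python sketch below is hypothetical -- the
sources and claims are invented, and real analytic tradecraft is
far more involved.

    # Hypothetical sketch: a claim counts as corroborated only when
    # it is reported by at least `min_sources` distinct sources.
    from collections import defaultdict

    def corroborated(reports, min_sources=2):
        """reports: iterable of (source, claim) pairs."""
        seen = defaultdict(set)
        for source, claim in reports:
            seen[claim].add(source)
        return {claim for claim, srcs in seen.items()
                if len(srcs) >= min_sources}

    reports = [
        ("satellite imagery", "convoy moving north"),
        ("agent report",      "convoy moving north"),
        ("radio intercept",   "convoy halted at border"),
    ]
    print(corroborated(reports))   # -> {'convoy moving north'}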
In addition, while the community's rapidly growing stream of data
demands the use of the latest technology, the open systems
approach that works best in the business world is unfamiliar,
possibly even threatening, to those in the intelligence community.
Past attempts to cut one type of collection in favor of another
generally have been damaging. In the late '70s, director of
central intelligence Stansfield Turner emphasized technical means
over human intelligence sources -- he was uncomfortable with
spies and forced out many veteran covert operatives. Turner's
critics say the efforts may have led to an inability to respond
to anti-American terrorist operations in the Middle East,
such as the 1983 bombing of the U.S. Marine barracks in Beirut, a
blow aggravated by the bombing of the U.S. Embassy and the
subsequent kidnapping and murder of the CIA's local station chief.
Satellites Alone Don't Fly
Only 30% to 40% of all intelligence gathering is the result of
technical means such as satellite surveillance and signals
interception. Another 30% comes from open sources, while an
overwhelming 80% is derived from human sources (the total exceeds
100% to account for overlap between sources). Many in the
intelligence community believe there is no substitute for the
human analyst.
Funding for the intelligence community's new IT efforts may be
scarce. Despite Clinton administration efforts to expand the
overall intelligence budget to more than $28 billion in order to
cope with the changes caused by the collapse of the Soviet
Union, Congress seems intent upon cutting more than $1 billion
from current levels.
Not surprisingly, intelligence professionals are horrified by
this prospect in the midst of the agencies' most profound
cultural change and organizational restructuring since World War
II. They fear that vital programs may be damaged, eroding the
nation's ability to cope with new challenges.
At the same time, the intelligence community is trying to
downsize by attrition and has cut expenditures by 17.5%. Hiring
has been cut back both for career and contract agents, and many
veterans are being offered early retirement.
Some intelligence officers feel budget cuts could interfere with
the community's recruiting ability. "The lifeblood of the
intelligence community is bringing in new people and giving them
experience and training," says David Whipple, a CIA veteran and
now executive director of the Association of Former Intelligence
Officers.
The demands upon the intelligence community since the end of the
Cold War have grown more complex. Veterans of the Cold War era
sometimes even wax nostalgic. "The Cold War simplified things
into a bipolar world," says one CIA veteran analyst. "It froze a
lot of things, like the situation in the Balkans, which have now
erupted with a vengeance."
In the 1980s, nearly 60% of the overall intelligence budget was
focused upon the Soviet Union and the Warsaw Pact nations. At
first glance, it would seem that this amount could now be cut.
But with the fluid geopolitical situation and the emergence of
dozens of new players, the requirements in Eastern Europe are
increasing, the agencies argue.
Not surprisingly, so is the use of computing. "We've all had to
develop an understanding of computing and how to use it in our
day-to-day work," says a CIA public affairs officer. While
mainframes still dominate, PCs are appearing on intelligence
desktops, joining older systems rather than replacing them.
There's still a long way to go for real change. And the
intelligence community's wary attitude could mean necessary
changes are made later rather than sooner. "There's this sort of
intellectual understanding of change, but there's none of that
understanding, somewhere between emotional and intellectual,
where you 'get it,'" says analyst Dyson. "Some of them do, but to
me a good intelligence service is smarter than everybody else."
[ related story ]
Downsizing: Is It Safe?
A debate is raging within the U.S. intelligence
community about large-scale computer systems. Michael Dillard,
chairman of the information policy board in the Office of the
Director of Central Intelligence, talks about reviewing
standalone systems to see if they can be combined, or at least
made co-resident, with other systems on similar hardware. This
would cut operations and maintenance staffing, but it would also
make such systems more vulnerable to compromise. Such a melding
of data sets violates the well-established culture of keeping
secrets by separating them on a "need-to-know" basis.
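Stripped to its essentials, need-to-know compartmentation is a
set-containment check: a reader must hold every marking a document
carries. The brief Python sketch below is illustrative only; the
markings and clearances shown are invented.

    # Invented illustration of compartmented, need-to-know access:
    # the reader must hold every marking that appears on the document.
    def can_read(reader_clearances, document_markings):
        return document_markings <= reader_clearances

    analyst = {"SECRET", "COMPARTMENT-A"}
    report  = {"SECRET", "COMPARTMENT-A", "COMPARTMENT-B"}
    print(can_read(analyst, report))   # -> False (lacks COMPARTMENT-B)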
Given the literally millions of people who have Confidential,
Secret, Top Secret, and higher clearances, the real surprise is
not that there is an occasional traitor such as Jonathan Pollard
or John Walker, but that there are not more such breaches of
security. Of course, for the intelligence community, one is too
many. Pollard, for instance, is said to have given 85,000
documents to his Israeli handlers. And the full extent of the
damage done by Walker during his 20 years of spying for the
Soviets may never be known, but certainly codes and other vital
intelligence sources and methods were compromised.
"Sources and methods" are, of course, the most closely held
secrets of any intelligence service. While former director of
central intelligence Robert Gates initiated a vigorous
declassification program, a National Archives official recently
complained that the review and declassification of documents from
the 1960s alone would take nearly 20 years to complete at the
present rate. In fact, the U.S. government still holds classified
documents that date back to World War I.
"Why shouldn't there be one national policy concerning the
protection of valuable national assets?" asks Maynard C.
Anderson, an assistant deputy undersecretary of defense, in a
recent letter to Security Management magazine. He notes that laws
such as the Atomic Energy Act, the Arms Control Act, and the
Privacy Act have added categories of information to be protected
but not a mechanism for the overall administration of information
security. "The lack of a single, coordinated, national
information policy has resulted in the fragmentation of
policy-making and prevented the allocation of resources where
they are needed most."
Such issues are consciously avoided by both civilian and military
intelligence officers who view themselves as the implementors
rather than the makers of policy. The highly compartmentalized
approach of sharing information only with those who "need to
know" is the ultimate protection of sources and methods.
Dire Consequences
More important, it saves lives. An agent-in-place can be run for
years with his or her true identity known only to a handful of
people within one agency. In such a circumstance, the data from
the source must be heavily filtered to avoid compromising the
source's identity, which could have fatal consequences for the
operation and the agent. The downside is that it allows ad hoc
operations to take place, such as Iran-Contra, which was mounted
from within the basement of the National Security Council offices
in the White House. (It also explains why Robert Gates was not
informed about the operation despite his position at the time as
deputy director of intelligence.)
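In its crudest form, that filtering amounts to stripping
identifying detail before a source's reporting is passed along, as
in the hypothetical Python sketch below; the report text and
identifying phrases are invented, and real sanitization is an
analytic judgment, not a string substitution.

    # Hypothetical example: remove identifying details before a
    # source's report is disseminated more widely.
    import re

    def sanitize(report, identifying_phrases):
        """Replace each identifying phrase with a generic marker."""
        for phrase in identifying_phrases:
            report = re.sub(re.escape(phrase), "[a source]", report,
                            flags=re.IGNORECASE)
        return report

    raw = "GRANITE, a clerk in the ministry, says the shipment left Tuesday."
    print(sanitize(raw, ["GRANITE", "a clerk in the ministry"]))
    # -> "[a source], [a source], says the shipment left Tuesday."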
Computer networks have not proven themselves to be absolutely
secure, so the creation of an electronic system vulnerable to
compromise goes very much against the grain of senior officers.
But the need for quicker processing is apparent, as is the need
for absolute security. It is a big problem not easily resolved.
In fact, resolution may depend upon software yet to be
developed, possibly by a new generation of programmers who will
be offered well-paying jobs by private enterprise at a time
when government research dollars are being absorbed by current
program needs.
-F.H.
8<------- End forwarded article --------
Paul Ferguson | "Confidence is the feeling you get
Network Integrator | just before you fully understand
Centreville, Virginia USA | the problem."
[email protected] | - Murphy's 7th Law of Computing
Quis Custodiet Ipsos Custodes?