Sci-Tech and the Future of Intelligence
WikiLeaks, Tunisia’s Jasmine Revolution, Egypt, and how to narrow the range of uncertainty in an ever-uncertain world
Retired General Brent Scowcroft, National Security Adviser to former US President George H.W. Bush, once remarked that the “role of intelligence is to narrow the range of uncertainty when difficult national security decisions have to be made.” By that standard, the job of intelligence is getting dramatically more difficult as international uncertainties multiply.
One of the underlying forces contributing to this state of affairs is a ‘revolution’ in science and technology. Technological advances throughout history have continually altered the landscape for this ancient craft. Something as basic as the invention of the wheel or gunpowder had an immediate impact on conflict and power projection, and therefore altered the focus of espionage.
But until the invention of the telegraph in the mid-19th century, the basic techniques of intelligence collection remained essentially unchanged from their description by Sun Tzu in the sixth century BC. His writings still provide the core precepts for the oldest part of the profession – the recruitment and handling of human spies.
Once adversaries could move information rapidly and invisibly through scientific means such as the telegraph, the telephone and radio, intelligence officers faced new, scientifically based challenges. These challenges grew exponentially during the 20th century, with unprecedented advances in physics, engineering, communications and photography.
In the 21st century, intelligence will be aided and challenged as never before by technological and scientific advances – advances distinguished by their accelerating pace, the growing synergy among disciplines, and the shrinking time between scientific discoveries and their application.
These trends are evident in fields like information technology, biology and nanotechnology. Computing power is doubling roughly every 18 months, and the world has gone from about 5,000 computers in the 1950s to an Internet (‘wired’) population today of more than 1 billion people with access to more than 300 billion webpages. In biology, work that might have earned a doctorate 10 years ago is now routine for technicians.
Undergirding much of the progress in these and other fields is the revolution in nanotechnology, where miniaturization has yet to reach the limits posed by the laws of physics. A metaphor for all of this is the miniaturization of electronic circuitry: a microchip that contained only 29,000 transistors in the 1980s now houses more than a billion – thereby powering many of today’s technological ‘miracles,’ from communications to precision weaponry.
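The transistor figures above are consistent with an 18-month doubling cycle, as a back-of-the-envelope check shows. (The specific chip dates below – roughly 1982 for the 29,000-transistor chip and 2008 for the billion-transistor one – are illustrative assumptions, not figures from this article.)

```python
# Rough check of the doubling arithmetic cited in the text.
# Assumed dates: ~1982 (29,000 transistors) to ~2008 (over a billion).

start_transistors = 29_000
years = 2008 - 1982            # ~26 years between the two chips
doublings = years / 1.5        # one doubling every 18 months
projected = start_transistors * 2 ** doublings

print(f"~{doublings:.1f} doublings -> ~{projected:,.0f} transistors")
```

Roughly 17 doublings over that span projects the 29,000-transistor chip into the billions – in line with the "more than a billion" figure in the text.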
The enhancements that these changes bring to intelligence capabilities are offset by myriad challenges. The most basic challenge is one of innovation, because the intelligence discipline must always strive to be a lap ahead, technologically – something much harder in an era when more and more sophisticated technology is commercially available, and when adversaries have easy access to it.
An important corollary is that today’s technology gives adversaries the potential to acquire new weapons more easily and rapidly. The advances in biology, for example, mean that the traditional barriers to non-state creation of biological weapons – strain availability, weaponization technology, and means of delivery – have fallen away.
Along with communications advances come problems of volume. Today’s intelligence officers run the risk of missing clues as they struggle to mine important nuggets buried in thousands of messages daily. If they are lucky enough to capture, say, a terrorist’s electronic media, they will probably have the digital equivalent of a small public library, and will need sophisticated algorithms to isolate the key data.
A related challenge calls for greater precision in many parts of the discipline, especially those operating in direct support of the military. Case in point: today’s B-2 bomber can deliver sixteen 2,000-pound bombs with pinpoint accuracy on 16 different targets in a single pass – making it mandatory that supporting intelligence be accurate to a degree of precision never before imagined.
In the end, intelligence is often about assessing and affecting power relationships in the world, so practitioners of the craft must be aware of how technology is redefining these relationships. When historians look back 100 years hence, some may call this the ‘age of asymmetry,’ because the central impact of modern technology has been to erode conventional means of exerting influence, putting greater power for good and evil – the power to persuade, to create and destroy – into the hands of smaller numbers of people. This may be the single most important thing for intelligence officers to keep in mind as they strive to ‘narrow the range of uncertainty’ in an increasingly uncertain world.
John E. McLaughlin, formerly Deputy Director and Acting Director of the Central Intelligence Agency (CIA), is Distinguished Practitioner in Residence at the Paul H. Nitze School of Advanced International Studies (SAIS) of the Johns Hopkins University.