CNAS report recommends new approach to US national security

One person's intelligent prediction is another's guess, Infosecurity notes, but this 'fuzzy science' of prediction is widely used by intelligence agencies the world over.

The paper, however, has some strong credentials, as it is written by Richard Danzig, Chairman of the CNAS Board of Directors and former Secretary of the US Navy.

Entitled 'Driving in the Dark: Ten Propositions About Prediction and National Security', Danzig's paper is billed as an examination of the nature of prediction in national security, and it offers a number of recommendations on how the US Department of Defense can improve its predictive capabilities.

According to the report, the DoD relies on predictions about future threats and potential scenarios to forecast needs and to select and acquire major weapons systems. History has shown, however, that an uncertain national security environment demands adaptability and flexibility, and that the US military must therefore be better prepared for when its predictions prove wrong.

Danzig recommends that the DoD adopt new strategies to improve its predictive abilities while also preparing to be unprepared. He suggests narrowing the time between conceptualizing programs and bringing them to realization, as well as building more for the short term and designing operationally flexible equipment.

He also advises valuing diversity and competition: while policymakers will always 'drive in the dark', adopting these recommendations may help them respond better to unpredictable conditions and prepare the US for unforeseen threats.

The report concludes that the DoD's systems for selecting and designing major weapons systems rely too heavily on successful prediction. Based on both the Department's track record and social science research, it argues, frequent error should be expected in decisions premised on long-term predictions about the future.

“This high rate of error is unavoidable. Indeed, it is inherent in predictions about complex, tightly intertwined technological, political and adversarial activities and environments in a volatile world. Accordingly, we should balance efforts to improve our predictive capabilities with a strong recognition of the likelihood of important predictive failures. We should identify, improve and implement strategies to design processes, programs and equipment to prepare us for those failures”, the paper notes.

“Policymakers will always drive in the dark. However, they must stop pretending that they can see the road. A much better course is to adopt techniques to compensate for unpredictable conditions and, in so doing, better prepare us for perils that we will not have foreseen”, the report concludes.
