Part 1 -- Introduction
There are a number of good analytic methods available. If you are ever at a loss for a method (or just want to see a really good taxonomy of methods), check out the Principles of Forecasting site. Specifically, look at the methodology tree and the selection tree. (You can see a screenshot below, but you really owe it to yourself to look at the interactive site or, at least, download the PDF.)
While I strongly support the International Institute of Forecasters and all of their good work, the real-world intelligence problems I have worked on have rarely provided the kind of data that would make me comfortable using many of the methods they list. I'll be honest: these guys have spent a lifetime thinking about forecasting and deriving a taxonomy of methods, so I am probably the one who is wrong, but the methods I find most useful -- over and over again -- are simply not on their list.
What makes for a useful intelligence analysis method? Based primarily on my experience with real-world intelligence problems and with teaching entry-level analysts a wide variety of methods, I think there are four primary factors: validity, simplicity, flexibility, and the method's ability to work with unstructured data.
- Validity. There needs to be at least some evidence suggesting that the method actually improves the intelligence estimate, and there should not be strong evidence suggesting that the method does not work. Many of today's "generally accepted" methods and multipliers fail this test; developing and analyzing scenarios and devil's advocacy are two examples. Tetlock took a hard look at one common kind of scenario development method and found it wanting, yet this research is almost universally unknown to intelligence analysts. As Steve Rieber has pointed out, there is no real research to support the use of devil's advocacy despite its endorsement by the Senate Select Committee on Intelligence. It is surprising to find that many of today's commonly used intelligence analysis "methods" are, in reality, little more than tribal lore passed down from one generation to another.
- Simplicity. All successful intelligence analysts are smart, but even when they have PhDs, you find a reluctance to use complex and, more importantly, time-consuming methods. Given the error inherent in the data available to most intelligence professionals, the benefit derived from using these methods simply doesn't appear, to most analysts, to outweigh their costs. To be "simple" by my definition, a method should be teachable in a reasonable amount of time, and the analyst should be able to see themselves using it in real-world situations. Analytic methods that also help communicate the analysis to the decisionmaker, or that help evaluate the intelligence process after the fact, get extra credit.
- Flexibility. Analysts consistently find themselves in a wide variety of situations. Sometimes these situations are tactical and sometimes they are strategic; sometimes the analyst is a subject matter expert and sometimes they are not. In this post-Cold War world, it seems to me that national security analysts are getting dragged from one portfolio to another at an accelerating pace. I remember, for example, when all sorts of Russian analysts were re-branded as newly minted Balkans analysts in the 1990s, and I suspect that several months ago a number of African or Korean analysts suddenly found themselves on a Georgia-Russia Analytic Team trying to figure out what was likely to happen next in South Ossetia. A really good method should work in all of these situations and across all the disciplines of intelligence as well.
- Works With Unstructured Data. One of the things that distinguishes intelligence work from other analytic work, in my mind, is that intelligence deals primarily in unstructured data. Intelligence data does not come in neat columns and rows on Excel spreadsheets. It comes in a variety of forms and is often wrong, incomplete, or even deliberately deceptive. A method that fails to acknowledge this, that needs "clean" data to get good results, is a less useful one.
I am sure that there are other factors that one should consider when selecting an analytic method (and, please, put yours in the comments!) but these are the ones that seem most important to me.
Monday: Method #5...
1 comment:
Another good factor when considering an analysis tool may be whether it inherently invites critique from others. In other words, it puts the analysis in a collaborative environment and invites efforts to disprove the forecast (like ACH, where evidence is marked as consistent or inconsistent with each hypothesis) in order to test the assessment's validity.
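For readers who have not seen ACH (Analysis of Competing Hypotheses) in action, here is a minimal sketch of the scoring idea the commenter describes. The hypotheses and evidence below are invented purely for illustration; the scoring follows the standard "count the inconsistencies" approach, in which analysts try to disprove hypotheses rather than confirm them, and the hypothesis with the fewest inconsistencies is the strongest survivor.

```python
# Illustrative ACH-style consistency matrix (all names and data are hypothetical).
# Each piece of evidence is scored against each hypothesis as
# "C" (consistent), "I" (inconsistent), or "N" (neutral / not applicable).

hypotheses = ["H1: Attack planned", "H2: Defensive posture", "H3: Routine exercise"]

# evidence -> one score per hypothesis, in the same order as `hypotheses`
matrix = {
    "Troop movements near border":    ["C", "C", "C"],
    "Leave cancelled for officers":   ["C", "C", "I"],
    "No logistics buildup observed":  ["I", "N", "C"],
    "Diplomatic talks still ongoing": ["I", "C", "C"],
}

def inconsistency_counts(matrix, hypotheses):
    """Count 'I' scores per hypothesis; fewer inconsistencies = more viable."""
    counts = [0] * len(hypotheses)
    for scores in matrix.values():
        for i, score in enumerate(scores):
            if score == "I":
                counts[i] += 1
    return dict(zip(hypotheses, counts))

# Rank hypotheses from fewest to most inconsistencies.
for hypothesis, count in sorted(inconsistency_counts(matrix, hypotheses).items(),
                                key=lambda kv: kv[1]):
    print(f"{count} inconsistencies: {hypothesis}")
```

Note that the matrix itself, not the code, is where the collaborative value lives: teammates can challenge any individual cell, which is exactly the kind of structured invitation to critique the commenter is describing.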