Monday, January 4, 2010

Heuer: How To Fix Intelligence Analysis With Structured Methods (NationalAcademies.org)

Richards Heuer (of Psychology of Intelligence Analysis fame) spoke last month at the National Academy of Sciences on how to improve intelligence analysis through the increased use of structured methods.

In the aftermath of the attempted bombing of Flight 253 on Christmas Day, it is worth reading Dick's words on how to improve the analytic side of the intelligence equation. I don't agree with everything he says (and say so in italicized parenthetical comments below) but he has been thinking clearly about these kinds of things for far longer than most of us. If you are concerned at all with reforming the way we do analysis, then this is a much better place to start than with all the noise being generated by the talking heads on TV.

I have embedded the whole document below this post, or you can go to the National Academies site and listen to Dick's speech yourself. For those of you with too much to do and too little time, I have tried to pull out some of the highlights from Dick's paper below. I am not going to do it justice, though, so if you have the time, please read the entire document.
  • "If there is one thing you take away from my presentation to day, please let it be that structured analytic techniques are enablers of collaboration. They are the process by which effective collaboration occurs. Structured techniques and collaboration fit together like hand in glove, and they need to be promoted and developed together." (This is very consistent with what we see with our students here at Mercyhurst. Dick reports some anecdotal evidence to support his claim and it is exactly the same kinds of things we see with our young analysts).
  • "Unfortunately, the DNI leadership has not recognized this. For example, the DNI’s National Intelligence Strategy, Enterprise Objective 4 on improving integration and sharing, makes no mention of improving analytic methods."
  • "CIA, not the DNI, is the agency that has been pushing structured analysis. One important innovation at CIA is the development in various analytic offices of what are called tradecraft cells. These are small groups of analysts whose job it is to help other analysts decide which techniques are most appropriate, help guide the use of such techniques by inexperienced analysts, and often serve as facilitators of group processes. These tradecraft cells are a very helpful innovation that should spread to the other agencies." (Interesting. We called these "analytic coaches" and tried to get funding for them in our contract work for the government in 2005 -- and failed).
  • "I understand you are all concerned about evaluating whether these structured techniques actually work. So am I. I’d love to see our methods tested, especially the structured analytic techniques Randy and I have written about. The only testing the Intelligence Community has done is through the experience of using them, and I think we all agree that’s not adequate." (I suppose this is the comment that bothers me at the deepest level. It implies, to me, at least, that the IC doesn't know if any of its analytic methods work. What other 75 billion dollar a year enterprise can say that? What other 75 billion dollar a year enterprise wants to say that?)
  • "Some of you have emphasized the need to test the accuracy of these techniques. That would certainly be the ideal, but ideals are not always achievable." (Here I have to disagree with Dick. Philip Tetlock and Bruce Bueno De Mesquita have both made progress in this area and there is every reason to think that, with proper funding, such an effort would ultimately be successful. Tetlock recommended as much in a recent article. The key is to get started. The amount of money necessary to conduct this research is trivial compared to the amount spent on intel overall. Likewise, the payoff is enormous. As an investment it is a no-brainer, but until you try, you will not know)
  • "Unfortunately, there are major difficulties in testing structured techniques for accuracy, (for an outline of some of these, see my series of posts on evaluating intelligence) and the chances of such an approach having a significant favorable impact on how analysis is done are not very good. I see four reasons for this."
  • "1. Testing for accuracy is difficult because it assumes that the accuracy of intelligence judgments can be measured."
  • "2. There is a subset of analytic problems such as elections, when a definitive answer will be known in 6 or 12 months. Even in these cases there is a problem in measuring accuracy, because intelligence judgments are almost always probabilistic."
  • "3. A third reason why a major effort to evaluate the accuracy of structured analytic techniques may not be feasible stems from our experience that these techniques are most effective when used as part of a group process."

  • "4. If you are trying to change analysts’ behavior, which has to be the goal of such research, you are starting with at least one strike against you, as much of your target audience already has a firm opinion, based on their personal experience that they believe is more trustworthy than your research." (Here I have to disagree with Dick again. I think the goal of this research has to be to improve forecasting accuracy. If you can show analysts a method that has been demonstrated to improve forecasting accuracy -- to improve the analyst's "batting average" -- in real world conditions, I don't think you will have any problem changing their behavior.)
  • "As with the other examples, however, the Intel Community has no organizational unit that is funded and qualified to do that sort of testing."
  • "It (a referenced Wall St. Journal article) suggested that instead of estimating the likelihood that their plans will work, financial analysts should estimate the probability they might fail. That’s a good idea that could also be applied to intelligence analysis." (I am not sure why we can't do both. We currently teach at Mercyhurst that a "complete" estimate consists of both a statement of probability (i.e. the likelihood that X will or will not happen) and a statement of analytic confidence (i.e. how likely is that you, the analyst, are wrong in your estimate.)
  • "The kind of research I just talked about can and should be done in-house with the assistance of those who are directly responsible for implementing the findings." (I think that Dick is correct, that, at some point, it has to be done in-house. I do think, however, that the preliminary testing could be effectively done by colleges, universities and other research institutions. This has three big benefits. First, it means that many methods could be tested quickly and that only the most promising would move forward. Second, it would likely be less expensive to do the first stage testing in the open community than in the IC. Third, it allows the IC to extend its partnering and engagement activities with colleges, universities and research institutions.)
  • "Our forthcoming book has two major recommendations for DNI actions that we believe are needed to achieve the analytic transformation we would all like to see."
  • "1. The DNI needs to require that the National Intelligence Council set an example about the importance of analytic tradecraft. NIC projects are exactly the kind of projects for which structured techniques should always be used, and this is not happening now."
  • "2. The second recommendation is that the DNI should create what might be called a center for analytic tradecraft."


Complete text below:

The Evolution of Structured Analytic Techniques -- Richards Heuer -- 8 DEC 2009
