Monday, January 4, 2010

Heuer: How To Fix Intelligence Analysis With Structured Methods (NationalAcademies.org)

Richards Heuer (of Psychology Of Intelligence Analysis fame...) spoke last month at the National Academy of Sciences regarding his thoughts on how to improve intelligence analysis through the increased use of structured methods.

In the aftermath of the attempted bombing of Flight 253 on Christmas Day, it is worth reading Dick's words on how to improve the analytic side of the intelligence equation. I don't agree with everything he says (and say so in italicized parenthetical comments below) but he has been thinking clearly about these kinds of things for far longer than most of us. If you are concerned at all with reforming the way we do analysis, then this is a much better place to start than with all the noise being generated by the talking heads on TV.

I have embedded the whole document below this post or you can go to the National Academies site and listen to Dick's speech yourself. For those of you with too much to do and too little time, I have tried to pull out some of the highlights from Dick's paper below. I am not going to do it justice though, so, if you have the time, please read the entire document.

  • "If there is one thing you take away from my presentation today, please let it be that structured analytic techniques are enablers of collaboration. They are the process by which effective collaboration occurs. Structured techniques and collaboration fit together like hand in glove, and they need to be promoted and developed together." (This is very consistent with what we see with our students here at Mercyhurst. Dick reports some anecdotal evidence to support his claim and it is exactly the same kinds of things we see with our young analysts).
  • "Unfortunately, the DNI leadership has not recognized this. For example, the DNI’s National Intelligence Strategy, Enterprise Objective 4 on improving integration and sharing, makes no mention of improving analytic methods."
  • "CIA, not the DNI, is the agency that has been pushing structured analysis. One important innovation at CIA is the development in various analytic offices of what are called tradecraft cells. These are small groups of analysts whose job it is to help other analysts decide which techniques are most appropriate, help guide the use of such techniques by inexperienced analysts, and often serve as facilitators of group processes. These tradecraft cells are a very helpful innovation that should spread to the other agencies." (Interesting. We called these "analytic coaches" and tried to get funding for them in our contract work for the government in 2005 -- and failed).
  • "I understand you are all concerned about evaluating whether these structured techniques actually work. So am I. I’d love to see our methods tested, especially the structured analytic techniques Randy and I have written about. The only testing the Intelligence Community has done is through the experience of using them, and I think we all agree that’s not adequate." (I suppose this is the comment that bothers me at the deepest level. It implies, to me at least, that the IC doesn't know if any of its analytic methods work. What other 75-billion-dollar-a-year enterprise can say that? What other 75-billion-dollar-a-year enterprise wants to say that?)
  • "Some of you have emphasized the need to test the accuracy of these techniques. That would certainly be the ideal, but ideals are not always achievable." (Here I have to disagree with Dick. Philip Tetlock and Bruce Bueno de Mesquita have both made progress in this area and there is every reason to think that, with proper funding, such an effort would ultimately be successful. Tetlock recommended as much in a recent article. The key is to get started. The amount of money necessary to conduct this research is trivial compared to the amount spent on intel overall. Likewise, the payoff is enormous. As an investment it is a no-brainer, but until you try, you will not know.)
  • "Unfortunately, there are major difficulties in testing structured techniques for accuracy (for an outline of some of these, see my series of posts on evaluating intelligence), and the chances of such an approach having a significant favorable impact on how analysis is done are not very good. I see four reasons for this."
  • "1. Testing for accuracy is difficult because it assumes that the accuracy of intelligence judgments can be measured."
  • "2. There is a subset of analytic problems such as elections, when a definitive answer will be known in 6 or 12 months. Even in these cases there is a problem in measuring accuracy, because intelligence judgments are almost always probabilistic."
  • "3. A third reason why a major effort to evaluate the accuracy of structured analytic techniques may not be feasible stems from our experience that these techniques are most effective when used as part of a group process."

  • "4. If you are trying to change analysts’ behavior, which has to be the goal of such research, you are starting with at least one strike against you, as much of your target audience already has a firm opinion, based on their personal experience that they believe is more trustworthy than your research." (Here I have to disagree with Dick again. I think the goal of this research has to be to improve forecasting accuracy. If you can show analysts a method that has been demonstrated to improve forecasting accuracy -- to improve the analyst's "batting average" -- in real world conditions, I don't think you will have any problem changing their behavior.)
  • "As with the other examples, however, the Intel Community has no organizational unit that is funded and qualified to do that sort of testing."
  • "It (a referenced Wall St. Journal article) suggested that instead of estimating the likelihood that their plans will work, financial analysts should estimate the probability they might fail. That’s a good idea that could also be applied to intelligence analysis." (I am not sure why we can't do both. We currently teach at Mercyhurst that a "complete" estimate consists of both a statement of probability (i.e. the likelihood that X will or will not happen) and a statement of analytic confidence (i.e. how likely it is that you, the analyst, are wrong in your estimate).)
  • "The kind of research I just talked about can and should be done in-house with the assistance of those who are directly responsible for implementing the findings." (I think Dick is correct that, at some point, it has to be done in-house. I do think, however, that the preliminary testing could be effectively done by colleges, universities and other research institutions. This has three big benefits. First, it means that many methods could be tested quickly and that only the most promising would move forward. Second, it would likely be less expensive to do the first stage testing in the open community than in the IC. Third, it allows the IC to extend its partnering and engagement activities with colleges, universities and research institutions.)
  • "Our forthcoming book has two major recommendations for DNI actions that we believe are needed to achieve the analytic transformation we would all like to see."
  • "1. The DNI needs to require that the National Intelligence Council set an example about the importance of analytic tradecraft. NIC projects are exactly the kind of projects for which structured techniques should always be used, and this is not happening now."
  • "2. The second recommendation is that the DNI should create what might be called a center for analytic tradecraft."
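On the "complete estimate" point above, the pairing of a probability with a separate statement of analytic confidence can be made concrete in a small sketch. The field names, the low/medium/high scale, and the example estimate below are my own illustrative choices, not Mercyhurst's actual format:

```python
from dataclasses import dataclass

@dataclass
class Estimate:
    """A 'complete' analytic estimate: a probability that the event occurs,
    plus a separate statement of how confident the analyst is in that
    probability. All names and scales here are illustrative assumptions."""
    statement: str
    probability: float        # likelihood the event happens, 0.0 to 1.0
    analytic_confidence: str  # e.g. "low", "medium", or "high"

    def summary(self) -> str:
        # Render both halves of the estimate in one sentence.
        return (f"It is {self.probability:.0%} likely that {self.statement} "
                f"(analytic confidence: {self.analytic_confidence}).")

est = Estimate("the election will be postponed", 0.7, "low")
print(est.summary())
```

The point of keeping the two fields separate is that a 70% probability backed by thin sourcing (low confidence) should read very differently to a decision maker than the same 70% backed by deep expertise and good data.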


Complete text below:

The Evolution of Structured Analytic Techniques -- Richards Heuer -- 8 DEC 2009

Tuesday, December 29, 2009

Change Blindness Plus (YouTube via Schneier)

The Schneier On Security blog points today to an interesting video (embedded below) that demonstrates and discusses the phenomenon of Change Blindness:



There are several good videos and resources on this effect and you can find them all here.

Monday, December 28, 2009

The Biggest News Stories Of The Year Are Not What You Think (GOOD via Neatorama)

If you are a regular reader of this blog...well, thank you!

More importantly, though, you are probably better informed about what is going on in the world than the average Joe or Jane.

While this is a good thing, it is not a universal good. It probably also creates a cognitive blind spot. What do I mean? Well, most people think that other people think the same way they do. This applies particularly to what is "important".

For example, I remember coming back to the US after having worked the Balkans issue for a number of years and being totally taken aback by the lack of news on the situation there. It was the most important thing in my life and I did not understand why it was not important to people in the US as well.

To put things into perspective, then, (and, note, I did not say proper perspective...) comes the "Biggest Stories Of The Year" infographic from the online magazine, GOOD (via Neatorama). The screenshot below just gives you a taste of the level of detail embedded in this image (you can either click on the image or go to the site to see the chart in all its full, glorious interactivity-ness).



The data for this chart comes from US media exclusively but it includes a wide variety of sources (Fox News and MSNBC, The Washington Post and the Wall Street Journal, Limbaugh's radio show, Beck, O'Reilly, Hannity and Olbermann's TV shows, etc.).

Check out the relative importance placed on various stories. It is no surprise that the Economic Crisis and Health Care received the lion's share of the time but look at how they dominated print and the airwaves. Eyeballing it, I would say that those two topics alone accounted for 40% of the stories. Worth it? Maybe...

Some of the comparisons are even more revealing. Iraq got only a quarter to a third of the coverage given to the death of Michael Jackson, Russia got about as much coverage as the White House Party Crashers and Tiger Woods's Adultery was about as important as the Mexican Drug War in the eyes of the press.

This is not, in my mind, the result of some vast international conspiracy of either the left or the right. It represents, I think, what Americans wanted. These media outlets are supported by advertising and advertising is supported by eyeballs and ears. If people aren't reading, watching or listening then these outlets can't survive.

It is, then, a reasonably accurate reflection of what Americans were interested in and, arguably, what they cared about in 2009.

Something worth thinking about in the new year...

Monday, December 21, 2009

MethHunter: Find, Fix And Incarcerate! (DagirCo)

Want to cut your town's meth problem in half next year? You can. Really.

The guys over at DagirCo have finally done it! MethHunter is on the streets and check out these early stats: One of our small, local police departments is using it and has had over a dozen busts and over a half dozen convictions in the last six months. The best part? The program has cut the time it takes to conduct an investigation in half!

One of the local TV stations (WICU) recently did a five part special series on meth in NW Pennsylvania in general and on the MethHunter in particular. The video below is only one part of the five part series. The other four parts can be accessed at the end of this video:



I am particularly proud to say I know the team at DagirCo that put this piece of software together. Most of them, including Mark Blair, the CEO, are Mercyhurst grads and I have had the great good fortune to have many of them in class over the years.

Beyond this, though, I am particularly impressed with the way they tackled the problem. They have essentially created an expert system that looks at pill shopping patterns and "thinks" about them the same way a meth expert might think about them. It automates, as Mark puts it in one of the videos, the "80% of the tedious, time-consuming" work of analyzing purchase records. It even examines the shopping patterns for evidence of denial and deception on the part of the pill shoppers!
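The post does not describe MethHunter's actual rules, but the general idea of an expert system scanning purchase logs for suspicious patterns can be sketched. The record fields, thresholds, and the "smurfing" rule below (many sub-limit pseudoephedrine buys spread across stores in a short window) are my own illustrative assumptions, not DagirCo's algorithm:

```python
from collections import defaultdict
from datetime import date, timedelta

# Illustrative purchase records: (buyer_id, store_id, purchase_date, grams_pse)
PURCHASES = [
    ("B1", "S1", date(2009, 12, 1), 3.6),
    ("B1", "S2", date(2009, 12, 2), 3.6),
    ("B1", "S3", date(2009, 12, 4), 3.6),
    ("B2", "S1", date(2009, 12, 1), 2.4),
]

def flag_smurfing(purchases, window_days=7, max_grams=9.0, min_stores=3):
    """Flag buyers whose pseudoephedrine purchases within a rolling window
    exceed a gram threshold or are spread across many stores -- a classic
    'smurfing' pattern. Thresholds are illustrative, not legal limits."""
    by_buyer = defaultdict(list)
    for buyer, store, day, grams in purchases:
        by_buyer[buyer].append((day, store, grams))

    flagged = {}
    for buyer, recs in by_buyer.items():
        recs.sort()  # chronological order
        for i, (start, _, _) in enumerate(recs):
            # All purchases by this buyer inside the window starting here.
            window = [r for r in recs[i:]
                      if r[0] <= start + timedelta(days=window_days)]
            grams = sum(g for _, _, g in window)
            stores = {s for _, s, _ in window}
            if grams > max_grams or len(stores) >= min_stores:
                flagged[buyer] = {"grams": grams, "stores": sorted(stores)}
                break
    return flagged

print(flag_smurfing(PURCHASES))  # B1 is flagged; B2's single small buy is not
```

A real system would layer many more rules on top of this (including, as the post notes, checks for deliberately deceptive shopping patterns) and then translate the firing rules into plain-language explanations an untrained officer can act on.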

It is easy for you to imagine this all getting very technical and difficult to interpret. That is where this program really shines. The DagirCo team has worked particularly hard to make the program user friendly for small town and rural police forces. Their thinking was that large cities often have dedicated meth experts but that small and rural police forces may not have the resources for such a position. This meant that the program had to produce something that was understandable and actionable by a police officer who had received no special training.

It turns out that the straightforward, cop-friendly way the program generates output benefits both small and large police forces. It gives the smaller police force a capability where it had none before and it saves the experts in the larger forces time that can be better spent looking at aspects of the meth problem that the software cannot address.

The DagirCo team is also particulaly proud of the "engine" they developed to drive the MethHunter. They think it can be used in a broad range of applications. Currently, they are thinking about how it might be applied to other problem drugs and even other crimes in general (they are working on an anti-graffitti version right now). Mark, a former marine, also believes a modified version of this software could be useful in analyzing insurgent attack patterns in Afghanistan and elsewhere.