Part 1 -- Introduction
Part 2 -- What Makes A Good Method
(Note: Bayesian statistical analysis is virtually unknown to most intelligence analysts. This is unfortunate but true. At its core, Bayes is simply a way to rationally update previous beliefs when new information becomes available. That sounds like what intelligence analysts do all the time, but it has the word "statistics" associated with it, so even analysts who have heard of Bayes often decide to give it a miss. If you are interested in finding out more about Bayes, you can always check out Wikipedia, but I find even that article a bit dense. I prefer Bayes For Beginners -- which is what I am.)
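For readers who have never seen the math behind that "rational updating," it fits in a few lines. Here is a minimal sketch in Python; the numbers are invented purely for illustration, not drawn from any real assessment:

```python
# A minimal sketch of Bayes' rule: updating a prior belief with new evidence.
# All numbers below are illustrative assumptions.

def bayes_update(prior, p_evidence_if_true, p_evidence_if_false):
    """Return P(hypothesis | evidence) via Bayes' theorem."""
    # P(E) = P(E|H)P(H) + P(E|not H)P(not H)
    p_evidence = (p_evidence_if_true * prior
                  + p_evidence_if_false * (1 - prior))
    return p_evidence_if_true * prior / p_evidence

# Prior belief: a 10% chance the hypothesis is true.
# The new report is 4x more likely to appear if the hypothesis
# is true (0.8) than if it is false (0.2).
posterior = bayes_update(0.10, 0.80, 0.20)
print(round(posterior, 3))  # 0.308
```

Note the counterintuitive result mentioned above: even fairly strong evidence only moves a 10% prior to about 31%, not to "near certain" -- which is exactly the sort of answer that can "seem" wrong to a decisionmaker.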
Bayes is the “Gold Standard” for analytic conclusions under conditions of uncertainty and probably ought to be closer to -- if not at the -- top of this list. It provides a rigorous, logical and thoroughly validated process for generating a very specific estimative judgment. It is also enormously flexible and can, theoretically, be applied to virtually any type of problem.
Theoretically. Ahhh... There, of course, is the rub. The problem with Bayes lies in its perceived complexity and, to a lesser degree, the difficulty in using Bayes with large sets of unstructured, dynamic data.
- Bayes, for many people, is difficult to learn. While the equation is relatively simple, its results are often counterintuitive. This is true, unfortunately, for both the analysts and the decisionmakers that intelligence analysts support. It doesn't really matter how good the intelligence analyst is at using Bayes if the decisionmaker will not trust the results at the end of the process because they come across as a lot of statistical hocus-pocus or, even worse, simply "seem" wrong.
- While Sedlmeier and Gigerenzer have had some luck teaching Bayes using so-called natural frequencies (and we, at Mercyhurst, have had some luck replicating their experiments using intelligence analysts instead of doctors), the seeming complexity of Bayes is one of the major hurdles to overcome in using this method effectively.
- In addition to the complexities of Bayes, it appears that this method, which works well with small, well-defined sets of any kind of data, does not handle large volumes of dynamic, unstructured data very well.
- Bayes seems to me to work best as an intelligence analysis method when an analyst is confronted with a situation where a new piece of information appears to significantly alter the perceived probabilities of an event occurring. For example, an analyst thinks that the odds of a war breaking out between two rival countries are quite low. Suddenly, a piece of information comes in that suggests that, contrary to the analyst’s belief, the countries are, in fact, at the brink of war. A Bayesian mindset helps to ratchet back those fears (which are actually best described by the recency and vividness cognitive biases).
- The real world doesn't present its data in ones, however, and not all data should be weighted the same. When analysts try to go beyond having a "Bayesian mindset" and apply Bayes literally to real world problems (as we have on several occasions), they run into problems. Think about the recent terrorist attacks in Mumbai. Arguably, the odds of war between India and Pakistan were quite low before the attacks. As each new piece of data rolled in, how did it change the odds? More importantly, how much weight should the analyst give that piece of data (particularly given that the analyst does not know how much data, ultimately, will come in before the event is "over")? Bayes is easier to apply if we treat "Mumbai Attack" as a single data point but does that make sense given that new data on the attack continues to come in even now?
- Bayes, in essence, is digital but life is analog. Figuring out how to "bin" or group the data rationally with respect to real-world intelligence problems is, in my estimation, one of the biggest hurdles to overcome in using Bayes routinely for intelligence analysis.
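The weighting problem in the bullets above can be made concrete in odds form, where each incoming report multiplies the prior odds by a likelihood ratio. Here is a sketch with an entirely invented prior and ratios -- assigning real weights to real reports is, of course, exactly the hard part:

```python
# Sequential Bayesian updating in odds form. The prior odds and the
# likelihood ratios are invented for illustration only.

def update_odds(prior_odds, likelihood_ratios):
    """Multiply prior odds by each report's likelihood ratio in turn."""
    odds = prior_odds
    for lr in likelihood_ratios:
        odds *= lr
    return odds

def odds_to_prob(odds):
    """Convert odds back to a probability."""
    return odds / (1 + odds)

prior_odds = 0.02 / 0.98    # analyst starts at ~2% chance of war

# Three incoming reports, each weighted by how much more likely it is
# under "war" than under "no war":
reports = [5.0, 3.0, 2.0]   # strongly, moderately, weakly indicative

posterior_odds = update_odds(prior_odds, reports)
print(round(odds_to_prob(posterior_odds), 3))   # 0.38

# Treating the same event as a single data point with one combined
# weight gives a different answer -- the "binning" problem in action:
print(round(odds_to_prob(update_odds(prior_odds, [10.0])), 3))  # 0.169
```

The same stream of reports lands at roughly 38% if counted piece by piece but only about 17% if lumped into one weaker data point -- which is why the Mumbai question ("one data point or many?") matters so much.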
Bayesian statistical analysis has enormous potential for the intelligence community. Twenty years from now, we will all likely have Bayesian widgets on our desktops that help us figure out the odds -- the real odds, not some subjective gobbledy-gook -- of specific outcomes in complex problems (in much the same way that Bayes powers some of the most effective spam filters today). The research agenda to get closer to this "Golden Age Of Bayesian Reasoning" is straightforward (but difficult):
- Figure out how to effectively and efficiently teach the basics of Bayes to a non-technical audience.
- Actually teach those basics to both analysts and decisionmakers so that both will have an appropriate "comfort level" with the fundamental concepts.
- Develop Bayesian-based tools (that are reasonably simple to use and in which analysts can have confidence) that deal with large amounts of unstructured information in a dynamic environment.
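The spam-filter comparison is worth unpacking, since those filters really are just Bayes at work, applied word by word. Here is a toy sketch of the naive Bayes scoring they use; the word probabilities are made up for illustration, where a real filter would estimate them from a corpus of training mail:

```python
import math

# A toy naive Bayes spam score. The per-word probabilities below are
# invented; a real filter learns them from labeled training messages.

# P(word | spam) and P(word | ham):
p_word_spam = {"free": 0.30, "meeting": 0.02, "winner": 0.25}
p_word_ham  = {"free": 0.03, "meeting": 0.20, "winner": 0.01}

def spam_probability(words, p_spam=0.5):
    """Combine per-word evidence in log space (naive independence assumption)."""
    log_odds = math.log(p_spam / (1 - p_spam))
    for w in words:
        if w in p_word_spam:
            # Each known word shifts the odds by its likelihood ratio.
            log_odds += math.log(p_word_spam[w] / p_word_ham[w])
    odds = math.exp(log_odds)
    return odds / (1 + odds)

print(round(spam_probability(["free", "winner"]), 3))  # 0.996
print(round(spam_probability(["meeting"]), 3))         # 0.091
```

The appeal for intelligence work is obvious: the same machinery that flags "free winner" as spam could, in principle, score streams of unstructured reporting -- which is precisely the tool-building problem in the third bullet above.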
Anyone got any extra grant money lying around?
Tomorrow -- Method #4...
6 comments:
I came across an old de-class description of some of the IC's first uses of Bayesian methods. You may find it interesting: http://blogs.nyu.edu/blogs/agc282/zia/2008/11/inital_forays_into_bayesian_an_1.html
Nice article
I know here in Canada they teach Bayesian reasoning to first and second-year Psychology majors as part of their statistics.
Perhaps this would qualify as a "non-technical" audience and the training methods could be expanded to help Analysts?
Dustin,
Many thanks! I will look into it. What texts do they use?
Kris
Hi, thanks for these posts on IAMs. I am a soldier, and in my experience, there are emerging scenarios to which the IPB doesn't lend itself efficiently or effectively (because they don't involve engaging an 'enemy' on a 'battlefield'). I now have different ways of examining a situation.
Thanks for these posts on IAMs. I have found them very useful in my military career.