"The most important failure was one of imagination." -- 9/11 Report
This sentence, and the reforms it (and others like it) compelled after the attacks on the Twin Towers, have driven many of the changes in the way intelligence analysts have done their jobs over the last 13 years.
Fundamental to these changes were (and are) attempts to get analysts to think differently. Specifically, most of the discussion and many of the efforts were aimed at increasing divergent thinking abilities among intelligence professionals. Red teaming, brainstorming, and the ubiquitous informal encouragement to "think outside the box" are all, to one degree or another, divergent thinking strategies.
There are good reasons, however, for analysts to master the flip side of divergent thinking -- convergent thinking -- as well.
Quite a bit of excellent research suggests that having a strong divergent thinking skillset is not enough. In fact, the research goes further: having only strong divergent thinking skills likely lowers forecasting accuracy.
That's right - lowers.
Psychologists, for example, have long known that having too many choices is not only unproductive but counterproductive. In 2000, Sheena Iyengar and Mark Lepper showed the effects of too many options with respect to consumer products. Participants in their experiments showed more initial interest in a huge selection of jams but were more likely to actually make a decision and buy one (and to be more satisfied with their purchase) when presented with a smaller assortment. Don't understand how this works? Just take a look at the clip of Robin Williams as a recent Soviet emigre in the movie Moscow On The Hudson at the top of this post...
Beyond the realm of jam and much more directly relevant to intel professionals, Philip Tetlock, in his groundbreaking work on the correlates of forecasting accuracy, Expert Political Judgment, found that one popular analytic methodology, scenario analysis, doesn't work at all. Generating more and more plausible scenarios is actually counterproductive. His experiments showed that "such exercises will often fail to open the mind of inclined-to-be-closed-minded hedgehogs but succeed in confusing already-inclined-to-be-open-minded foxes" (p. 199 of the 2005 edition, for those interested in such things).
Finally, research conducted by Mercyhurst's own Shannon (Ferrucci) Wasko, using a real-world intelligence problem and a controlled experiment, showed much the same effect: divergent thinking alone lowers forecasting accuracy.
What's an analyst to do?
While divergent thinking is useful for developing concepts, ideas or hypotheses, convergent thinking is useful for focusing the analytic effort. I have found that there are three crucial convergent thinking techniques:
- Grouping. Grouping (and its corollary, Establishing Relationships) is probably the most useful of the convergent thinking techniques. In order to get a handle on all of the ideas that typically emerge from any divergent thinking exercise, it is important to be able to group similar ideas or hypotheses together. Critical to this effort are the labels assigned to the various groups. All sorts of cultural and cognitive biases can easily come into play with poorly chosen group names (for example, think how easily the labels "terrorist," "freedom fighter," "good," or "evil" can influence future analysis). Mindmapping and other concept mapping techniques are very useful when attempting to use grouping as a way to deal with an overabundance of ideas.
- Prioritizing. Deciding which ideas, concepts, or hypotheses deserve the most emphasis is crucial if collection and analytic resources are to be used efficiently. Treating every idea as if it is equal to all the others generated by the divergent thinking process makes no sense. Yet, as with any convergent thinking process, the decision regarding which concept is first among the putative equals should be made carefully. Problems typically arise when the team setting the priorities is not diverse enough. For example, a team of economists might well give economic issues undue emphasis.
- Filtering. Filtering, as a convergent thinking technique, explicitly recognizes the awful truth of intelligence analysis - there is never enough time. Filtering can be used, in its extreme application, to eliminate some possibilities entirely from further consideration. Typically, however, analysts will use filtering to limit the level and extent of collection activities. For example, intel professionals looking at pre-election activity in a certain country might decide to focus their collection activities at the county rather than at the city or town level. As with grouping and prioritizing, where to draw these kinds of lines is fraught with difficulty and should not be done lightly.
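For readers who think in code, the three techniques above can be sketched as a simple pipeline. This is a minimal illustration only: the hypotheses, the "theme" tags standing in for analyst-chosen group labels, and the numeric priority scores are all invented for the example -- in real analysis these judgments are made by people, not assigned in advance.

```python
# Sketch: grouping, prioritizing, and filtering the raw output of a
# (hypothetical) divergent-thinking session. All data below is invented.
from collections import defaultdict

# Each entry: (hypothesis, analyst-assigned theme label, priority score 0-10)
hypotheses = [
    ("Unrest driven by food prices",          "economic",  8),
    ("Currency collapse triggers protests",   "economic",  6),
    ("Opposition party mobilizes rural vote", "political", 9),
    ("Election postponed by decree",          "political", 5),
    ("Border incident escalates",             "security",  3),
]

def group(items):
    """Grouping: cluster similar hypotheses under a shared, neutral label."""
    groups = defaultdict(list)
    for text, theme, score in items:
        groups[theme].append((text, score))
    return dict(groups)

def prioritize(groups):
    """Prioritizing: order each group's hypotheses by estimated importance."""
    return {theme: sorted(members, key=lambda m: m[1], reverse=True)
            for theme, members in groups.items()}

def filter_top(groups, threshold):
    """Filtering: drop low-priority hypotheses so collection stays feasible."""
    return {theme: [m for m in members if m[1] >= threshold]
            for theme, members in groups.items()}

focused = filter_top(prioritize(group(hypotheses)), threshold=5)
```

The point of the sketch is the order of operations, not the code: group first so labels can be chosen deliberately, rank within groups, and only then cut -- cutting first risks discarding an idea before its group's significance is visible.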
5 comments:
These domains are, indeed, critical to the analyst, as is one additional factor... quoted from the book by Col Pete Blaber: "Don't get treed by a chihuahua..." (npn). Basically, we become so entrapped by what we know that we ignore what we do not know. We automatically establish replacements and accept them by default, ignoring any fact to the contrary. In short, we 'know' because we 'think we know' and truly ignore that we are without any real support for this 'knowledge.' Thus, when the reality of the situation finally begins to exert itself, we forcefully reject it.
Military history is replete with such situations: the Nijmegen Bridge in WWII; Pearl Harbor; the Bay of Pigs; 9/11; Operation Anaconda (in Afghanistan, we ignored the failure of the Soviets in the same valley using the same tactics, in spite of photographic evidence obtained by a Delta Force group that went in weeks prior to the operation, contrary to orders... and was almost wiped out); and many, many more...
Very good article and great comment as well. These days, with the abundance of information flying about, it's important to practice the above techniques in order to focus and to separate the garbage from good intel that, at the immediate moment, may not seem as critical. As the comment stated, the above techniques can help bring to light other possibilities that an analyst unconsciously rejects.
Phil Tetlock has done marvelous work, but it is a trap to think scenario generation is a forecasting tool. The value of scenarios is the identification of indicators that would otherwise lack context and meaning. Opening the mind to multiple possibilities is not the same as choosing one as right, or even more likely right than the others. Scenario generation is designed to provide insight into the driving forces one can track, not to pick winners and losers.
See, e.g., http://www.usip.org/sites/default/files/resources/SRoct06_3.pdf
Related to Grouping is Abstraction, especially in relation to evidence and argument. It enables us to ID the more general point being made/addressed, see the bigger picture, and so not get lost among the trees.