Wednesday, May 25, 2011

Part 4 -- The "Traditional" Intelligence Cycle And Its History (Let's Kill The Intelligence Cycle)

Previous Posts In This Series:

Part 1 -- Let's Kill The Intelligence Cycle
Part 2 -- "We'll Return To Our Regularly Scheduled Programming In Just A Minute..."
Part 3 -- The Disconnect Between Theory And Practice

Finding descriptions of the intelligence cycle is not difficult. Virtually every organization, company or law enforcement agency that has even a modest intelligence capability has a picture, much like the one to the right (which, until recently, graced the US national security intelligence community’s main web site, Intelligence.gov). 

So pervasive is this traditional image of the intelligence process that it is treated as generally accepted theory. Indeed, many private sector practitioners have built much of their marketing around touting the benefits of the cycle.

Likewise, it is commonplace to see the cycle featured prominently in government publications, statements of doctrine, training publications and even in critiques of intelligence. The entire architecture of intelligence, across all three major sub-disciplines of intelligence, is caught up in this more or less common vision of how intelligence professionals perform the functions of intelligence.

Despite its popularity, the history of the cycle is unclear.  US Army regulations published during WWI identify the collection, collation and dissemination of military intelligence as essential duties of what was then called the Military Intelligence Division, but there was no suggestion that these three functions happened in a sequence, much less in a cycle.  (Fun fact:  According to Congressional testimony in 1919, the entire budget for military intelligence in 1913 was $10,000 -- or roughly $227,000 in 2011 dollars.)

By 1926, military intelligence officers were recommending four distinct functions for tactical combat intelligence:  requirements, collection, "utilization" (i.e. analysis), and dissemination, though, again, there was no explicit mention of an intelligence cycle.

The first direct mention of the intelligence cycle (see image to the right) I could find is from the 1948 book, Intelligence Is For Commanders (more on this book here).  I hypothesize that the intelligence cycle probably came into use during WWII as a training aid but I have not been able to find any evidence to corroborate this bit of speculation on my part.

Since that time, the cycle, as a model of how intelligence works, has become pervasive.  A simple Google image search on the term "intelligence cycle" rapidly gives one a sense of the wide variety of agencies, organizations and businesses that use some variant of the cycle.

(Note:  Experienced intelligence professionals might want to skip the next few paragraphs, which outline a more or less generic version of the intelligence cycle based on the image at the top of the post.  I include it here for readers who are not familiar with the cycle or for students of intelligence who might need a refresher.)

While the actual details vary dramatically from version to version (something we will turn to in the next post), a typical description of the intelligence cycle begins with planning and direction, or similar language. Direction usually comes from the decisionmakers that the intelligence activity supports, although it can also come from the senior leaders of the intelligence activity or even from the analysts themselves.

It is important to note here that direction and planning can be formal but are often handled informally (most variants of the cycle make no distinction). This is seen most often when there is no time for a more formal process. Less formal tasking is also common in smaller intelligence units, such as those in business environments, or in units where the intelligence professionals and decisionmakers have a long-term relationship.

After the planning and direction phase, the collection phase, in a typical version of the intelligence cycle, begins. Here the intelligence professional begins to execute the plan to collect the types of information necessary to understand and answer the requirement. 

Once the intelligence unit collects the information necessary, other intelligence professionals within the same unit might have to process and exploit it. Processing and exploitation takes on a number of forms including decrypting encrypted transmissions, turning a variety of conversations into a single cohesive report, translating source information from one language into another or identifying buildings and other features in an aerial image, among others. In short, processing and exploitation is the phase where very raw information becomes usable to the largest number of people authorized to view it within the intelligence organization.

With planning and direction, collection, and processing and exploitation complete, the focus of the traditional intelligence cycle shifts to analysis and production, or the interpretation of the collected data and the creation of a product that best meets the decisionmaker’s needs.

Analysts need the widest possible variety of information sources in order to be able to corroborate other information and to answer the requirement with which they are dealing.  The notional source of the information is much less important than its relevance to the requirement at hand and that source’s reliability.
 
Production carries its own concerns and they are often independent of the analysis. If the analyst is concerned with the content of the analysis, the intelligence production specialist is concerned with its form. Appropriate forms, in turn, depend on the needs and desires of the decisionmaker the intelligence unit supports. For example, while the traditional intelligence product within many areas of the US national security community is a written document with a smattering of pictures and graphs to explain key points, business decisionmakers tend to be much more comfortable with charts, graphs, and numerical data accompanied by a few explanatory bullets.

The final phase according to the diagram is dissemination. This is where the intelligence specialist delivers the final product to the decisionmaker. While this sounds fairly easy, it, too, has a number of pitfalls associated with it. Questions concerning exactly to whom an intelligence document should go and exactly how it should get there are fundamental to this phase. For example, classification, or the level of secrecy or confidentiality associated with the intelligence product, is one such issue.

Likewise, many people in developed countries take high-speed voice and data communications capabilities for granted. Yet, in many cases, particularly in intelligence work, such capabilities are scarce or degraded due to geographic isolation. In these cases, the bandwidth available may determine where the intelligence product is sent or even if it is sent via electronic means at all.
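For readers who like to see a process as code, the generic cycle described above can be sketched as a simple loop. This is a toy illustration only: the phase names follow this post's description, and the handler functions are hypothetical placeholders, not any real system's API.

```python
# A minimal sketch of the generic intelligence cycle described above.
# Phase names follow the post's description; the handlers are
# hypothetical placeholders, not any real system's API.

PHASES = [
    "planning and direction",
    "collection",
    "processing and exploitation",
    "analysis and production",
    "dissemination",
]

def run_cycle(requirement, handlers):
    """Run one pass through the cycle, feeding each phase's output
    into the next phase, then return the disseminated product."""
    product = requirement
    for phase in PHASES:
        product = handlers[phase](product)
    return product

# Toy handlers that simply annotate the product as it moves through.
handlers = {phase: (lambda p, ph=phase: p + [ph]) for phase in PHASES}
result = run_cycle([], handlers)
print(result)
```

In practice, of course, the phases rarely run this cleanly in strict sequence, which is precisely where the critiques of the cycle begin.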

Next:  Critiques Of The Cycle

Tuesday, May 24, 2011

Part 3 -- The Disconnect Between Theory And Practice (Let's Kill The Intelligence Cycle)

Previous Posts In This Series:

Part 1 -- Let's Kill The Intelligence Cycle
Part 2 -- "We'll Return To Our Regularly Scheduled Programming In Just A Minute..."

Intelligence is not something that appears autogenously; it is something that gets done, a process. This idea, that intelligence is a process, is one of the least controversial among intelligence professionals.

However, a general description of the process of intelligence -- that is, the best way to characterize and classify those consistent elements across intelligence sub-disciplines -- is still very much an open theoretical question.

Intelligence professionals have long known that the traditional way of describing the intelligence process, the so-called "intelligence cycle", is flawed; yet none of the alternatives proposed has yet captured the nuance of the process as practiced or, for that matter, the mind of the intelligence community. 

This disconnect between theory and practice, between the imperfections of the intelligence cycle and the way intelligence is actually done, has real-world consequences.  While I will return to this theme many times throughout this series of posts, it is useful to get a sense of the costs associated with perpetuating a faulty model of the process.

  • For example, without a consensus on the way in which intelligence "happens" that works across the various sub-disciplines of law enforcement, business and national security intelligence, it is impossible to study the process for potential improvements.
  • In addition, reforms proposed under flawed models are likely to be flawed reforms, incapable of solving systemic problems because the system itself is so poorly understood. 
  • Furthermore, training students with models of a process that falls apart when first touched by reality reduces the perceived value of training as well as the morale of those trained. 
  • Budgets built around a flawed model are likely to mis-allocate funds and require work-around solutions that consume even more scarce resources. 
  • Hiring people to fill positions created under an unsound model of the process is nearly certain to create a mismatch in terms of skills and competencies needed vs. skills and competencies acquired. 
The list goes on.

In this series of posts, I will begin by examining the intelligence cycle and some of the critiques of it. Next, I will examine the alternatives to the intelligence cycle. Finally, I will lay out my own understanding of the process. While every intelligence project is different, my own experience and the evidence I have collected over the last eight years indicates that there are patterns in this activity, whether in the national security, business or law enforcement fields, that are consistent across the entire intelligence profession.

The final goal of this exercise, then, is to outline this new description of the process as clearly as possible based on intelligence, as it is practiced, across all its sub-disciplines and regardless of the size of the intelligence activity involved.  Furthermore, I want to balance the need for both simplicity and detail such that this explanation of the process is accessible to all students of intelligence -- at whatever age or level of experience.

Next:  The Generic Cycle

Monday, May 23, 2011

"We'll Return To Our Regularly Scheduled Programming In Just A Minute..." (Let's Kill The Intelligence Cycle)

My intent today was to jump right into my series on the intelligence cycle and why we should get rid of it (put a wooden stake through its heart were the exact words I think I used...).

However, over the weekend, I received a torrent of emails and the post received a number of comments.  It occurred to me that, before I got started in earnest, it might be useful to do a little wholly unscientific sentiment analysis on this issue.

Using the Swayable tool (which many of you have already tested here and here), I intend to first test the underlying assumption behind this study and second to ask two related but independent questions about your perceptions of the intelligence cycle and its place in intelligence theory.

The Assumption Check

The first question is:  "Is the traditional intelligence cycle a perfect representation of the current intelligence process?"  By "perfect" I mean perfect -- does the intelligence cycle accurately model the intelligence process as it is currently done?  Trivial issues count here (we will deal with them later).


Something A Bit More Substantial


The second question addresses the degree to which the cycle is imperfect (assuming you thought it was imperfect in the first place): "Do the benefits derived from continuing to use the intelligence cycle as a depiction of the intelligence process outweigh the costs?" I would ask you to think carefully about both the costs and the benefits before answering.



Finally, I want to get at your beliefs: "Without reference to perfection (or imperfection), costs or benefits, do you believe that a better general description of the modern intelligence process is possible?" (Note: Extra credit for guessing why I chose pictures of Leibniz and Voltaire and double secret extra credit for knowing which is which...)



That's it. Please do not hesitate to pass this post and the series on to anyone who might be interested. In addition, please do not hesitate to join in the discussion by dropping me an email or posting a comment (comments are better as they can be seen by all but I understand if that is not possible).

Next: The Disconnect Between Theory And Practice

Friday, May 20, 2011

Let's Kill The Intelligence Cycle (Original Research)

I mean it.

The "intelligence cycle", as a depiction of how the intelligence process works, is a WWII era relic that is way past its sell-by date.  It has become toxic.  It no longer informs as much as it infects.  It is less a cycle than a cyclops -- ancient, ugly and destructive.

I want it dead and gone, crushed, eliminated.

I don't care, frankly, what we have to do.  Remove it from every training manual, delete it from every slide, erase it from every website.

Shoot it with a silver bullet, drive a wooden stake through its heart, burn the remains without ceremony and scatter the ashes.

(Geez, Kris, why don't you tell us how you really feel...)

OK, OK, so, yes, I am being intentionally provocative but I have been doing quite a bit of research on the intelligence process over the last several years and have come to the conclusion -- as have others before me -- that our current best depiction of this process, the so-called "intelligence cycle" is fatally flawed.  Moreover, I believe these flaws have become so severe, so grievous, that continued adherence to and promotion of the cycle is actually counterproductive.

My intent, beginning on Monday and over the next several weeks, is to lay out the evidence I have gathered about the cycle itself, about attempts to save it from its worst flaws, about attempts to replace it altogether and let you decide for yourselves. 

In the end, I intend to recommend (with no hubris intended and well aware of the possibility of hamartia) my own generalized version of the intelligence process; one which I think is more appropriate for the intelligence tasks of the 21st Century and which works, in both theory and practice, across all three major sub-disciplines of intelligence -- national security, business and law enforcement.

Next:  The Disconnect Between Theory And Practice

Thursday, May 19, 2011

Why Good Data Isn't Enough (British Medical Journal And The University Of Michigan)

You are briefing the boss today and you are pretty excited.  You were tasked to take a hard look at two different ways of doing the same thing -- the "old way" and the "new way".  The old way was OK but your research clearly shows that the new way is much better.

You stand up in front of the boss.  You know you are speaking a little quickly (you may not even be pausing all that much) and your voice is probably a little higher than it usually is -- but none of that matters.  Your data is rock solid.

In fact, you have even put your great data into a pie graph that clearly demonstrates the validity of your position.  This is your ace in the hole because you know the boss loves pie graphs.

All of this explains why you are stunned when the boss decides to continue to do things the old way.

Two interesting studies, one quite old and one brand new, explain why what you said mattered far less than how you said it.

http://www.bmj.com/content/318/7197/1527.full

The first study, from 1999, "Influence of data display formats on physician investigators' decisions to stop clinical trials: prospective trial with repeated measures," from the British Medical Journal (hat tip to social network analysis expert Valdis Krebs and his prolific Twittering), asked a number of physicians to look at the exact same data using one of four different visualization techniques -- bar graph, pie graph, chart or "icons".  You can see the four different charts in the picture to the right.  Note:  The test subjects only saw one of these, not all four together at once.

Now, I admit, these charts are a little dense at first.  Basically, you have two different groups: those who started the study with a good prognosis and those who started with a poor prognosis.  You also have those who received the old treatment and those who received the new treatment.

The question was, based on these results, do you continue this study or not?  The doctors involved in the study were all research physicians and used to seeing this kind of data and making these kinds of decisions. 

Despite the fact that the data was exactly the same in all four images and overwhelmingly in support of the new treatment option, there was a statistically significant difference in the accuracy of the physicians' decisions based exclusively on how the data was presented.

The least accurate?  Pie and bar graphs.  Charts did OK but the best option was the "icons". 

This kind of iconic chart is probably new to many readers.  It shows the impact of the treatments on every single patient in the study.  While this kind of display yielded the most accurate results in the study, it was also the most disliked by the test subjects.

The overwhelming preference was for the chart, while a minority preferred the bar or pie graphs.  Not only did none of the participants indicate that they preferred the icons, but a significant number of them expressed derision at the format in their after-action comments.
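For the curious, the effect of an icon display is easy to mimic even in plain text: one character per patient, so the reader can literally count outcomes. The counts below are invented for illustration and are not the BMJ study's data.

```python
# A toy text-based "icon array": one character per patient, so a
# reader can count outcomes directly instead of judging areas or
# angles. The numbers here are invented, NOT the BMJ study's data.

def icon_array(survived, died, per_row=10):
    """Render survivors as 'o' and deaths as 'x', wrapped into rows."""
    icons = "o" * survived + "x" * died
    return "\n".join(icons[i:i + per_row] for i in range(0, len(icons), per_row))

old = icon_array(survived=14, died=6)
new = icon_array(survived=18, died=2)
print("Old treatment:\n" + old)
print("New treatment:\n" + new)
```

Even in this crude form, the difference between six deaths and two is countable at a glance, which is exactly the property that made the icon format the most accurate in the study.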

This study reminds me of a series of studies conducted by Ulrich Hoffrage and Gerd Gigerenzer at the Max Planck Institute in Berlin demonstrating that expressing statistics using "natural frequencies" (e.g. 2 out of 20 instead of the more common 10%) leads to better understanding and better (i.e. more "Bayesian") reasoning.  (Jen Lee, Hema Deshmukh and I were able to replicate these results using a typical analytic problem, so I believe this effect is important in the context of intelligence as well.)
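The natural-frequency effect is easy to demonstrate with a worked example. The numbers below are a common textbook illustration (1% prevalence, an 80% detection rate, a 10% false positive rate), not the Max Planck materials; the point is that both formats yield the same Bayesian posterior, but the frequency version can be checked by simple counting.

```python
# The same Bayesian posterior computed two ways: once in probability
# format, once as natural frequencies. The numbers are a standard
# textbook illustration, not the Max Planck studies' actual materials.

# Probability format
prevalence = 0.01          # P(condition)
sensitivity = 0.80         # P(positive | condition)
false_positive = 0.10      # P(positive | no condition)

posterior = (sensitivity * prevalence) / (
    sensitivity * prevalence + false_positive * (1 - prevalence)
)

# Natural-frequency format: "out of 1,000 people..."
population = 1000
sick = round(population * prevalence)                               # 10 people
sick_and_positive = round(sick * sensitivity)                       # 8 people
healthy_and_positive = round((population - sick) * false_positive)  # 99 people

posterior_nf = sick_and_positive / (sick_and_positive + healthy_and_positive)

print(round(posterior, 3), round(posterior_nf, 3))
```

The two calculations agree (about 7.5%), but "8 out of the 107 people who test positive actually have the condition" is far easier to reason about than the chain of conditional probabilities, which is Hoffrage and Gigerenzer's central finding.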

The second piece of research is from the University of Michigan's Institute For Social Research and is still in pre-publication review.  In what appears to be a very cleverly designed study, researchers looked at 200 telephone interviewers (100 male and 100 female).

They found that interviewers who spoke moderately fast, with lower pitched voices (if male) and with 4 to 5 natural pauses per minute were the most effective at getting people to listen to them.

Combining the results of these studies, it is easy to imagine that the most powerful presentation would be one using icons combined with a proficient speaker.  The opposite (as demonstrated in the story that started this post) could reasonably be expected to perform less well -- even if the information were exactly the same.

As I have said before, like it or not, it is not enough to have good info, you have to be able to communicate it effectively as well.  The flip side of this coin is equally important for intelligence professionals -- we may well be hard-wired to be biased towards high quality forms of communication, even if the quality of the content is second rate.