Monday, August 28, 2017
RFI: Looking For Descriptions Of The Intelligence Process

I am looking for relatively recent, short descriptions of the intelligence process from as many different sources as possible. An example (from US Joint Publication 2) of the kind of thing I am looking for is in the image to the right.
I am NOT looking for images, just descriptions. My first preference would be from official (public, obviously) documents but I will accept anything that has been published.
I don't care what language it is in. In fact, I would LOVE descriptions of the process from other countries or disciplines (e.g. law enforcement or business). You can attach the sources in the comments to this post or send them to me at my university email (kwheaton at mercyhurst dot edu). Please do not hesitate to share!
Thanks!
Posted by Kristan J. Wheaton at 10:35 AM
Labels: intelligence, intelligence process, rfi
Tuesday, June 10, 2014
Thinking in Parallel (Part Three - Testing The Mercyhurst Model Against The Real World)
Part 2 -- The Mercyhurst Model
For the last 11 years, I have been using the model described in Part 2 to structure my Strategic Intelligence class at Mercyhurst University. This is a capstone class for seniors and second-year graduate students in the Intelligence Studies program at Mercyhurst. The class is centered on a real-world project for a real-world decisionmaker, often within the US national security community. To date, I have overseen 133 of these projects.
The broad parameters of the projects have remained unchanged since 2003. Students in the class are divided into teams and assigned by the instructor to one of the 4-5 projects available during that term. Each project is sponsored by a national security, business, or law enforcement organization that has a strategic intelligence question. To date, sponsors of these questions have included organizations such as the National Geospatial-Intelligence Agency, the Defense Intelligence Agency, the National Intelligence Council, the National Security Agency, the 66th Military Intelligence Group, and the Naval Criminal Investigative Service, to name just a few. To give readers a sense of the wide variety of questions intelligence studies students are expected to answer in this course, I have listed a few recent examples below:
1. What role will non-state actors (NSAs) play and what impact will NSAs have in Sub-Saharan Africa over the next five years?
- What is the likely importance of NSAs vs. State Actors, Supra-State Actors and other relevant categories of actors in sub-Saharan Africa?
- What are the roles of these actors in key countries, such as Niger?
- Are there geographic, cultural, economic or other patterns of activity along which the roles of these actors are either very different or strikingly similar?
- What analytical processes and methodologies were applied to the questions above and which proved to be effective or ineffective?
2. What are the most important and most likely impacts on, and threats to, US national interests (including but not limited to political, military, economic and social interests) resulting from infectious and chronic human disease originating outside the US over the next 10-15 years?
3. What are the likely trends in Brazil’s oil/liquid fuel market and electric power sector in the next ten years? Where will these trends likely manifest themselves?
- What energy capacity and security issues are likely to be the most significant to Brazil’s economy in the next ten years?
- How will Brazil likely address current and/or future energy security issues over the next ten years?
- Where will Brazil address these energy shortfalls?

In each case, students had only 10 weeks to conduct the research, write the analysis and present the final product to the decisionmaker. The students had no additional financial resources available to them and, other than the question itself, received no support directly from the decisionmaker. Students rarely had any subject matter expertise in the area under question and were only allowed to use open sources. Students were expected to integrate lessons learned from all previous intelligence studies classes and to manage all aspects of the project without significant supervision. Finally, all the students, in addition to this project, were taking a full academic load at the same time.
After all of the deliverables had been produced and disseminated, the decisionmakers sponsoring the projects were asked to provide objective feedback directly to the course instructor. This feedback, in turn, was evaluated on a five-point scale correlated with traditional grading practices and professional expectations. In short, a “3” on this scale is roughly equivalent to “B” work and a “4” is roughly equivalent to “A” work in a university setting. A “5”, on the other hand, represents the kind of work that would be expected from a working (albeit junior) intelligence professional. The chart below indicates how the annual averages have changed over time.
(Note: While this chart may appear to reflect grade inflation more than any other suggested effect, it should be noted that “A” is essentially “average” among Mercyhurst University Intelligence Studies seniors and second-year graduate students and has been for the entire time frame shown above. The current dropout rate from the program is approximately 50%, and much of that is due to a strict 3.0 minimum GPA required to stay in the program. As a result, seniors and second-year graduate students (the only students allowed to take the class) typically have GPAs that average 3.6 or above. For example, two years ago, 18 of the top 20 GPAs in the entire university belonged to Intelligence Studies students.)

Anecdotally, it is possible to state the exact impact of these reports within national security agencies in only a few cases. For example, the report that answered the question on global health mentioned earlier earned this praise from the National Intelligence Council:
“Although the Mercyhurst "NIE" should not be construed as an official U.S. government publication, we consider this product an invaluable contribution to the NIC's global disease project: not only in terms of content, but also for the insights it provides into methodological approaches. The Mercyhurst experience was also an important lesson in how wikis can be successfully deployed to facilitate such a multifaceted and participatory research project.”

Likewise, in David Moore’s book Sensemaking: A Structure for an Intelligence Revolution (published by the National Defense Intelligence College in 2011), the study on non-state actors in sub-Saharan Africa produced in answer to the question mentioned above was judged more rigorous than a similar study conducted by the National Intelligence Council (in cooperation with the Eurasia Group).
Beyond the national security community, however, the impact of these reports on various businesses and other organizations is often easier to determine. For example, senior managers at Composiflex, a mid-sized composites manufacturer, indicated, “We used this project as a seed for our new marketing plan in 2007 and now an industry that we had not even tapped before is 30% of our business.”
Likewise, Joel Deuterman, the CEO of Velocity.net, an Internet Service Provider, stated, “The analysts discovered that our approach was actually a cutting-edge, developing standard in our industry…What really substantiated the data for us was to see many of our existing customers on the list. Then we knew we could rely on the validity of the ones they had found for us.”
Even foreign organizations have seen the benefit of these products. Ben Rawlence, the Advisor for Foreign Affairs and Defense in the Whip’s Office of the Liberal Democrat Party in the UK, stated, “The research carried out by your students was first class, and has been of substantial use to Members of Parliament… It was comprehensive, well sourced and intelligently put together. I have had no hesitation recommending it to our MPs and Lords in the same way that I recommend briefings provided for us by professional research organisations…”
While it is possible to imagine more rigorous testing of this model of the intelligence process, the long-term success of the process in generating actionable intelligence for a wide variety of customers, on a range of difficult problems, in a very short time, and with limited resources is hard to ignore. More importantly, the process has not only proven successful, but that success has trended upward as improvements have been made over the years in the structure of the course and in teaching material consistent with this approach to the intelligence process.
Intelligence in the 21st century is best thought of as a series of sub-processes operating interactively and in parallel.
This conclusion, by itself, has significant implications for the training and education of intelligence professionals. In the first place, it suggests that it is no longer possible to specialize in one area to the exclusion of another. Intelligence professionals will have to be trained to think more broadly, to be able to jump more fluidly from modeling to collection to analysis to production and back as the process of creating intelligence moves forward over time.
Likewise, hardware and software support systems will need to be designed to facilitate this leaping back and forth between the various sub-processes. Products designed to work sequentially in a parallel world will not only frustrate their users but also slow down the process of generating intelligence, a result that runs absolutely counter to the intelligence needs of modern decisionmakers.
Finally, as dramatic as this type of change might appear to be, it is, perhaps, better thought of as merely aligning the training and education of intelligence professionals with what it is they already do.
Posted by Kristan J. Wheaton at 10:07 AM
Labels: education, intelligence, intelligence process, Mercyhurst Model, training
Friday, May 27, 2011
Part 6 -- Critiques Of The Cycle: The Intelligence Cycle Vs. Reality (Let's Kill The Intelligence Cycle)
Part 1 -- Let's Kill The Intelligence Cycle
Part 2 -- "We'll Return To Our Regularly Scheduled Programming In Just A Minute..."
Part 3 -- The Disconnect Between Theory And Practice
Part 4 -- The "Traditional" Intelligence Cycle And Its History
Part 5 -- Critiques Of The Cycle: Which Intelligence Cycle?
Were the lack of precision the only criticism of the intelligence cycle, it might be able to weather the storm. As suggested previously, there do appear to be general themes that are relevant, and the cycle’s continued existence suggests that its inconsistencies are outweighed, to some extent, by its simplicity.
Unfortunately, the second type of criticism typically leveled against the cycle is much more damning. In fact, it is fatal. Simply put, there is virtually no knowledgeable practitioner or theorist who claims that the cycle reflects, in any substantial way or in any sub-discipline, the reality of how intelligence is actually done.
Consider these quotes from some of the most authoritative voices in each of the three intelligence communities:
"When it came time to start writing about intelligence, a practice I began in my later years at the CIA, I realized that there were serious problems with the intelligence cycle. It is really not a very good description of the ways in which the intelligence process works." -- Arthur Hulnick, "What's Wrong With The Intelligence Cycle", Strategic Intelligence, Vol. 1 (Loch Johnson, ed.), 2007

Once you start looking for them, it is easy to find detailed critiques of the intelligence cycle (and, please, don't hesitate to add your own). The only argument that still seems worth debating is whether or not the cost of maintaining this flawed model of the process is worth the benefit (a question about which readers of this blog were almost evenly split).
"Although meant to be little more than a quick schematic presentation, the CIA diagram [of the intelligence cycle] misrepresents some aspects and misses many others." -- Mark Lowenthal, Intelligence: From Secrets to Policy (2nd Ed., 2003)
"We must begin by redefining the traditional linear intelligence cycle, which is more a manifestation of the bureaucratic structure of the intelligence community than a description of the intelligence exploitation process." -- Eliot Jardines, former head of the Open Source Center, in prepared testimony in front of Congress, 2005.
"The traditional intelligence cycle has been described as an "ideal-type" process that will always be subject to the real constraints of time." -- Jerry Ratcliffe, Strategic Thinking In Criminal Intelligence, 2004
"The classic intelligence cycle is neat, easily displayed, and quickly understood. The problem is that it doesn't really work that way. It's too static, too rigid, with too much distance between leaders and intelligence professionals." -- T.J. Waters, Hyperformance: Using Competitive Intelligence For Better Strategy and Execution, 2010
"Over the years, the intelligence cycle has become somewhat of a theological concept: No one questions its validity. Yet, when pressed, many intelligence officers admit that the intelligence process, 'really doesn't work that way.'" -- Robert Clark, Intelligence Analysis: A Target-centric Approach, 2010.
In addition to the quotes above, my colleague, Steve Marrin, provided me with an interesting update shortly after I started this series. According to him, the intelligence cycle was the subject of "vigorous discussion" at a 2005 RAND/ODNI conference on intelligence theory, and the topic will also be the subject of a panel at the 2012 International Studies Association Conference. For a carefully crafted and articulate dissection of the intelligence cycle, I don't think I could recommend a better article than Steve's own chapter, "Intelligence Analysis and Decision-making: Methodological Challenges," from the 2009 book Intelligence Theory: Key Questions and Debates.

Once again, themes emerge from the general discontent with the inadequacies of the intelligence cycle. Many of these themes I will touch upon as I discuss alternatives to the intelligence cycle in later posts. One theme, however, leaps off each page and tends to dominate the discussion: the intelligence cycle is linear and intelligence, as practiced, is not. In the cycle's conception, tasks move from one part of the process to another like an assembly line, where parts are bolted on in a specific order to create a consistent product.
While this approach might be appropriate for early 20th century manufacturers, it doesn’t work with intelligence, where each product, ideally, contains information that is somehow unique. Consider, for example, this hypothetical dialogue between Mary, the CEO of Acme Widgets and Joe, her chief of competitive intelligence:
Mary: I need to know everything there is to know about the Zed Widgets Company.
Joe: Sure. What’s up?
Mary: We are thinking about introducing a new widget and I want to know what the competition is up to.
Joe: Anything in particular you are interested in?
Mary: Well, I can see their marketing efforts on the TV every day, so I am not really interested in that. I guess the most important thing is their cost structure. I want to know how much it costs them to make their widgets and where those costs are.
Joe: Right. Labor, overhead, materials. Got it. Is one part of the cost structure more important than another to you?
Mary: They pay about the same amount in labor and overhead that we do so I guess I am most interested in the materials; particularly Material X. That is our most expensive material.
Joe: I just read a report that indicated that the cost of Material X is set to rise worldwide. Would you also like us to take a harder look at that and give you our estimate?
Mary: Absolutely.

While this example is simplistic, it makes the point. Intelligence, even in this one minor example within only one of the many parts of the traditional intelligence cycle is, or should be, at least, interactive, simultaneous, iterative. In the above example, the interaction between the intelligence professional and the CEO resulted in a more detailed and nuanced intelligence requirement going, as it did, from the very general, “Tell me everything…” requirement to the highly focused, “Tell me about Zed Company’s Material X costs and give me an estimate of where the price of Material X is likely to go.”
It is equally easy to imagine this kind of interaction within and between parts of the cycle as well. Collectors and analysts will inevitably go back and forth as the analysts attempt to add depth to their reporting and as the collector develops new collection capabilities. It is even likely that parts of the cycle that are not adjacent to one another will work very closely together, such as an analyst and the briefer responsible for the final dissemination of the product (in its oral form). Decisionmakers, too, may well remain involved throughout the process, seeking status reports and perhaps even modifying the requirement as new information or preliminary analysis becomes available.
The US military's Joint Staff Publication 2.0, Joint Intelligence, states the case more strongly:
"In many situations, the various intelligence operations occur nearly simultaneous with one another or may be bypassed altogether. For example, a request for imagery will require planning and direction activity but may not involve new collection, processing, or exploitation. In this example, the imagery request could go directly to a production facility where previously collected and exploited imagery is reviewed to determine if it will satisfy the request. Likewise, during processing and exploitation, relevant information may be disseminated directly to the user without first undergoing detailed all-source analysis and intelligence production. Significant unanalyzed combat information must be simultaneously available to both the commander (for time-critical decision-making) and to the intelligence analyst (for production of current intelligence assessments). Additionally, the activities within each type of intelligence operation are conducted continuously and in conjunction with activities in each of the other categories of intelligence operations. For example, intelligence planning is updated based on previous information requirements being satisfied during collection and upon new requirements being identified during analysis and production."

The situation is even more complex when you imagine an intelligence unit without teams of people working each of the discrete parts of the cycle. In situations involving small intelligence shops, where a single individual collects, processes, translates, analyzes, formats and produces the intelligence, the cycle breaks down completely.
The human mind simply does not work in this strictly linear fashion. Instead, it jumps from task to task. Imagine your own habits when researching a topic. You think a bit, search a bit, get some information, integrate that into the whole and then search some more. This approach inevitably leads to analytic dead ends, requiring more collection. At the same time, you are thinking about the form of the final report. If you are putting together an intelligence product that will use multimedia in its final form, for example, you are constantly on the lookout for relevant graphics or film footage you can use, regardless of its analytic value. To even suggest that you should collect all of your information, stop, and then go and do analysis without ever doing any further collection, is absurd.
One of the most recent and widely publicized innovations within the US national security community is the advent of “Intellipedia”, a Wikipedia-like tool for the intelligence community. Wikipedia, of course, is the online encyclopedia that is free to use and editable by anyone. It is one of the most popular sites on the web and, according to at least some research, is as accurate as other generally accepted encyclopedias. It has become, in its short lifespan, the tertiary source of first resort for both analysts and academics.
One of the things it is not is linear. There is no "Table of Contents" and researchers, authors and editors choose their own path through the resource. Some people generate full articles; others only dive in occasionally to fix a particular fact or even a grammatical or spelling error. There are even full-fledged “edit wars” where a particular version of an especially hot topic changes back and forth between competing points of view until either one side gets tired and gives up or, more likely, the sides reach a version acceptable to all. In the end, it is openness and interactivity that give Wikipedia its strength.
The US national security community acknowledged the value of such a tool, at least with respect to its descriptive products, when it launched Intellipedia. Begun in April 2006, Intellipedia, according to information from June 2010, has 250,000 registered users and is accessed over 2 million times per week. This effort, which is clearly far beyond the experimental stage, plainly shows that collaboration and interactivity -- the anti-intelligence cycle -- are core to any modern description of the intelligence process.
Next: Cycles, Cycles And More Damn Cycles!
Posted by Kristan J. Wheaton at 3:53 PM
Labels: intelligence, intelligence cycle, intelligence process, Let's Kill The Intelligence Cycle, LKTIC, original research
Thursday, May 26, 2011
Part 5 -- Critiques Of The Cycle: Which Intelligence Cycle? (Let's Kill The Intelligence Cycle)
Part 1 -- Let's Kill The Intelligence Cycle
Part 2 -- "We'll Return To Our Regularly Scheduled Programming In Just A Minute..."
Part 3 -- The Disconnect Between Theory And Practice
Part 4 -- The "Traditional" Intelligence Cycle And Its History
Which Intelligence Cycle?
[Image: FBI Version Of The Intelligence Cycle]
[Image: Recent DNI Version Of The Intelligence Cycle]
A student of intelligence, particularly a new student, might legitimately question this explanation, however. Perhaps there is a difference. Perhaps the FBI’s characterization represents a new way of thinking about intelligence as a process.
It gets worse.
On the same page that contains the graphic above, the DNI promotes not one but two additional variations of the cycle. In the first, more modest, variation (contained in the text that describes the picture), the DNI says, "The process begins with identifying the issues in which policy makers are interested and defining the answers they need to make educated decisions regarding those issues. We then lay out a plan for acquiring that information and go about collecting it." If this is true, then why doesn't the graphic also "begin" with requirements? Why does the graphic seem to begin with planning?
It gets even worse.
The third variation of the cycle (all on the same page) from the DNI comes at the very top of the page. Here one finds five links, "Management", "Data Gathering", "Interpretation", "Analysis and Reporting", and "Distribution". Clicking on the "Management" link indicates that management -- not requirements, not planning -- "is the initial stage of the intelligence cycle".
Sigh.
I wonder which version is taught in the Intel 101 courses?
I wonder how you grade a student who uses an "alternative" cycle as an answer on a test?
I wonder, if the intelligence cycle is perfect (as about 15% of the people I have polled indicate), which of these cycles is perfect-est?
Were these the only differences within the US national security intelligence community, they might be explained away easily enough, but they are not. In fact, there is very little consistency across, or even within, a number of important elements of the US national security community. These inconsistencies exist across disciplines as well.
Examine the chart below. Only one function, collection, is universally attributed to intelligence across all 10 organizations examined.
Within the DNI, CIA and FBI there are minor but important differences – not one of the three is exactly like either of the other two.
Even more baffling are the differences within the US military, however. The Defense Technical Information Center (“the premier provider of Department Of Defense technical information”) has a streamlined four-part description of the cycle, one that largely (but not completely) agrees with the cycle as taught at Fort Huachuca, the Army’s home for its military intelligence professionals. This cycle, however, is substantially different from the process defined in the US Military’s highest-level publication on intelligence doctrine, Joint Publication 2.0.
The differences evident in the US military may well be due to different publication dates or my own lack of access to the most recent revisions of some of these documents. In this regard, though, the 2007 Joint Pub is worthy of further commentary. In it, the US military seems to abandon the intelligence cycle in favor of a more generic intelligence "process". Some have suggested that this proves the military has already killed the intelligence cycle (but it just didn't get the memo...).
While it is (from my viewpoint, at least) a step in the right direction, it only exacerbates the impression that either the left hand is not speaking to the right in the US national security intelligence community or that the DNI doesn't control or doesn't care what the Joint Staff puts out with respect to the intelligence process. All of those alternatives make the US IC look sloppy and disorganized.
I also think the Joint Staff is trying to have its cake and eat it, too. Compare the two images below. The first is from the most recent public version of Joint Pub 2.0. The second is from the 1990 version of the US Army's Field Manual 34-3, Intelligence Analysis. While the words in the two publications contain many significant differences, the pictures seem to say that the military has not backed too far away from its conception of the process as a cycle.
[Image: Joint Pub 2.0 Intel Process, 2007]
[Image: FM 34-3 Intel Cycle, 1990]
These descriptions of the cycle differ, again, in significant ways from the descriptions provided by two oversight bodies commissioned to examine intelligence activities listed on the chart, the 1996 Graham Rudman Commission and the 2004 Weapons of Mass Destruction Commission. To round out the confusion, the description of the cycle offered by the International Association of Law Enforcement Intelligence Analysts and the classic competitive intelligence model (as described by longtime private sector intelligence specialist John McGonagle) also differ from each other and from the other eight examples.
This analysis, while interesting, may come across as pickier than it should be. Other processes in other disciplines lend themselves to various descriptions. Indeed, despite the differences, there are clear themes that emerge even from this analysis. Few, for example, would question whether requirements, needs, direction, and planning fall into a single, generic category.
Themes, however, are all these are. A rigid approach to intelligence, implied visually in the pictures above and in many of these organizations' descriptions of the process, seems inappropriate under these conditions, whether for teaching these concepts to new members of the intelligence profession or for explaining the process to the decisionmakers that intelligence supports. Instead, a more nuanced and less absolutist approach appears to be called for.
There is one specific area where this analysis does create cause for concern, however. Only three of the 10 organizations examined include a feedback or evaluation function within their versions of the cycle.
While some of the other organizations did include feedback as a subset of the dissemination process, subordinating this crucial evaluative step is not likely to endear the intelligence function to the decisionmakers that intelligence supports. It seems much better practice to include the role of feedback explicitly in the process, whether the decisionmaker chooses to take advantage of it or not.
Next: The Intelligence Cycle vs. Reality
Posted by Kristan J. Wheaton at 2:10 PM
Labels: intelligence, intelligence cycle, intelligence process, Let's Kill The Intelligence Cycle, LKTIC, original research