
Friday, June 5, 2009

Producing Intelligence Analysis From Patents (Original Research)

Jack Sandeen's exploration of patent analysis was a very interesting project to come out of my Advanced Analytic Techniques class this last term. Jack's case study focused on patents surrounding hybrid electric vehicles (HEVs) but the really interesting stuff he discovered had to do with the process of patent analysis itself.

His report, built using Google Sites, provides a concise, useful overview of patents and patent analysis. He also provides a good strengths, weaknesses and how-to section and some very valuable resource pages (here and here). Some of the more interesting aspects of his case-study are the different ways he was able to visualize and analyze the data he collected (See one image below).

Patent analysis clearly has its strengths and weaknesses and it certainly isn't appropriate for all types of intelligence problems. Where it is applicable, however, it seems to me that it is very powerful and can provide deep insights into technology trends and corporate capabilities. There is a pretty steep learning curve associated with its many tricks and traps, but I am glad Jack decided to tackle it. He has provided us all with a useful overview.

Related Posts:
Using Search Engine Optimization Tools To Do Intel Analysis
Introduction To Pivot Tables in Intelligence Analysis
Visual Analysis Everywhere!


Tuesday, May 19, 2009

Towards An Unruly World: Ideas Of Interest (Day 1 -- International Security Forum)

Related Posts:
Live-blogging The ISF
BBC Monitoring

The ISF Conference began yesterday at the International Conference Centre in Geneva. The Centre is a magnificent facility, the conference has drawn 640 attendees, and the speaker's list looks very interesting, but the rules of the conference are going to cramp my style.

The whole ISF is being held under the Chatham House Rule and this prohibits me from citing who said what. That said, I can report some of the ideas to emerge from the conference.

Some of these ideas were unsurprising but some of them either had surprising twists or were brand new ideas entirely. I will start with the unsurprising ones first:

The Parade of Horribles. Climate change, war, terrorism, energy, food and water shortages, the demographic time bomb, WMDs, cyberwar, economic collapse... Lawyers call a lengthy string of terrible things like this the "Parade of Horribles". It is a rhetorical device designed to engage attention and compel action and it was used in much the same way here. What was interesting was the degree of consistency -- the same list of threats came up in multiple different contexts.

China and India. There seemed to be a good bit of concern and a certain sense of inevitability that China and India would emerge as future powerhouses. Nothing new here, of course, but the degree of certainty about this shift in power was noticeable. I certainly heard more of this type of talk here in 24 hours than you hear in the States over the course of several months. In addition, it was rarely just "China" and mostly "China and India".

Solutions. There was also a remarkable consensus about solutions: Reinforce international institutions (including the UN Security Council and regional initiatives); establish the "right amount" of regulation for markets; and increase coordination between defense, diplomacy and development agencies/organizations.

Some of the surprises:

Cyberwar/crime. There was far more talk concerning this issue than I thought there would be. If you take a look at the panels in detail, you can see that cyber takes up quite a bit of real estate. Interestingly, there is not a single panel devoted exclusively to terrorism (though it permeates the discussions here) but there are several devoted exclusively to cybersecurity issues. At one point there was even talk of Alternate Reality Games!

Emphasis on intelligence analysis tools and methods. No one called it that, of course. No, they used code words such as "security foresight" but they couldn't fool me -- they were talking about intelligence. Collection of information did not seem to be an issue; figuring out what the info we have actually means did.

"Resiliency". I think I witnessed the birth of a buzzword. The word "resiliency" kept coming up. The need to build resilient societies, have resilient systems. I think that the use here is similar to the way environmentalists use the word "sustainable" and the way network theorists use the word "robust". There was a sense that the runaway process of globalization had sacrificed too much of the world's resiliency for the sake of efficiency and that at least part of the current set of problems was due to this imbalance (Note: John Robb has been talking about the "resilient community" for quite some time but it was interesting to see the same term crop up here).


Tuesday, May 5, 2009

Using Search Engine Optimization (SEO) Tools To Do Intel Analysis (Original Research)

In my Advanced Analytic Techniques class this term, we have been taking a quick look at a number of different intel analysis methods (with the results posted on our ADVAT blog) but we have also been examining some methods in a bit more detail.

One of the more interesting experiments was Jeff Welgan's attempt to do competitive intelligence analysis using free, online search engine optimization tools. Search Engine Optimization (SEO), for those of you new to the term, is basically about trying to make your website, blog, etc. easier to find.

Jeff noticed that there are many, many tools to help people do this on the cheap. His thought was that you could use these tools not to aid your own efforts but rather to gather data about a competitor company, organization or even a terrorist group through their web presence. This, in turn, might allow an analyst to gain insight into their strategies and, possibly, their next moves.
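To make the idea concrete, here is a minimal sketch of this kind of collection (my own illustration in Python, not one of the tools on Jeff's list; the URL is a placeholder). Even a competitor's basic on-page SEO choices -- the page title and the meta description and keywords they are optimizing -- can be harvested programmatically:

```python
# A minimal sketch: pull basic on-page SEO signals from a competitor's
# site using the requests and beautifulsoup4 libraries. The URL is a
# placeholder -- substitute whatever organization you are studying.
import requests
from bs4 import BeautifulSoup

url = "https://www.example.com"
soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

# The title and meta tags hint at the search terms the competitor is
# trying to rank for -- and thus at where they want customers to look.
print("title:", soup.title.string if soup.title else None)
for name in ("description", "keywords"):
    tag = soup.find("meta", attrs={"name": name})
    print(f"{name}:", tag["content"] if tag and tag.has_attr("content") else None)
```

Run over a competitor's whole site, or over time, a few lines like this start to look less like a one-off lookup and more like a structured collection plan.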

For his case study, Jeff focused on two competitors, Starbucks and Caribou Coffee. His site, however, contains a good bit more than the case study alone. He has included basic background material on SEO, a list of operational definitions, a fairly comprehensive list of online tools, and a concise section on how to use these tools as an intelligence method.

I think Jeff would be the first to admit that relying on SEO analysis exclusively is kind of like relying on HUMINT exclusively when you have SIGINT and IMINT as well. That said, this approach certainly has the potential to add a rich, structured source of data to bounce off the anecdotal and unstructured stuff that makes up most of what is available to the intel analyst.

Monday, April 27, 2009

Want To Learn ACH? Watch More House! (Fox.com/house)

I really like the television show House. This surprises me as I am not a big fan of medical mysteries generally and do not find the House character particularly likable. It occurred to me recently, however, that I might like the show more for its method than its characters or story.

The main method used in each of the shows -- differential diagnosis -- bears a remarkable resemblance to one of my favorite intelligence analysis techniques, Analysis of Competing Hypotheses (ACH).

OK, OK, I know this sounds weird, but bear with me for a moment. In the first place, making the connection between medicine and intel is old hat. My colleague Steve Marrin does it all the time.

In the second place, these two methods even sound similar. Differential diagnosis "involves first making a list of possible diagnoses, then attempting to remove diagnoses from the list until at most one diagnosis remains." ACH "requires an analyst to explicitly identify all the reasonable alternatives and have them compete against each other for the analyst's favor, rather than evaluating their plausibility one at a time." Sort of sounds the same...

The true test, I thought, would be to use an ACH matrix to map a differential diagnosis. The medical lingo is beyond me, though, so I figured I would search out someone who had summarized a particularly good House episode and use their abstract.

Enter Scott.

Scott (FNU) is a former Air Force doctor now in general practice in Illinois who summarizes and critiques every episode of House over at his blog, Polite Dissent. Very cool, slightly obsessive and enormously useful for this exercise.

I chose the "Let Them Eat Cake" episode from Season 5 as Scott said the medicine was pretty good (he gave it an "A").

Scott starts his summary like this: "Emmy, a thirty year-old fitness instructor, is filming an infomercial when she experiences sudden difficulty breathing and collapses, breaking her ankle in the fall. She is admitted to House’s service for evaluation and all the initial tests were normal. Taub suspects her of steroid use, Kutner mentions environmental allergies, and Cuddy suspects exercise induced asthma." (Note: Taub and Kutner are both members of Dr. House's team. Cuddy is House's boss.)

Mapping this to an ACH matrix (I am using the PARC ACH 2.0.3 software here) looks something like this:



I am making some assumptions here, of course. The team considers the asthma hypothesis to be the best but Scott is quiet as to why. I am just guessing that the fact that the initial tests were all normal was inconsistent with the other two hypotheses.
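For those who want to see the mechanics, here is a minimal sketch of the scoring logic behind a matrix like the one above. This is my own illustration in Python, not the PARC software, and the consistency ratings are my guesses from Scott's summary:

```python
# ACH in miniature: rate each piece of evidence against each hypothesis
# ("C" = consistent, "I" = inconsistent) and count the inconsistencies.
# ACH seeks to refute hypotheses, so the one with the FEWEST "I" ratings
# survives; evidence consistent with everything has little diagnostic value.

hypotheses = ["Steroid use", "Environmental allergies", "Exercise-induced asthma"]

evidence = {  # one rating per hypothesis, in order (my guesses, not Scott's)
    "Sudden collapse while filming": ["C", "C", "C"],
    "Initial tests all normal":      ["I", "I", "C"],
}

def inconsistency_scores(hypotheses, evidence):
    """Count the 'I' ratings each hypothesis has accumulated."""
    scores = {h: 0 for h in hypotheses}
    for ratings in evidence.values():
        for h, rating in zip(hypotheses, ratings):
            if rating == "I":
                scores[h] += 1
    return scores

for h, score in sorted(inconsistency_scores(hypotheses, evidence).items(),
                       key=lambda kv: kv[1]):
    print(f"{h}: {score} inconsistencies")
# Exercise-induced asthma scores 0 -- the team's "most likely" diagnosis.
```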

Anyway, Scott continues: "The last (i.e. exercise induced asthma) seems the most likely, so the team sets about to recreate Emmy’s episode, the best they can with her broken ankle. Sure enough, while in the middle of exercising, she once again collapses and is found to be pulseless." Scott doesn't say this explicitly but I have to assume that this extreme reaction is not what they expected and is inconsistent with any of the current hypotheses.

This means it is back to the drawing board for House and the team. Interestingly, an intel analyst in a similar position would do exactly the same thing. Scott continues his narration, "Kutner suggests she may have Carcinoid syndrome." None of the existing evidence rules this out so, Scott continues, "A CT is obtained which shows no carcinoid tumor..." Intel analysts in a similar position would also task additional collection in an attempt to disprove the unclear hypothesis.

The IMINT -- sorry, I mean CT Scan -- reveals something new, though. Emmy has had gastric bypass surgery! This has, in Scott's words, "them rethinking their differential diagnosis: now diabetic neuropathy (nerve damage caused by diabetes) and sleep apnea are added."

Reviewing the bidding after the most recent round shows a matrix that looks something like this:


As the show progresses both the diabetic neuropathy and the sleep apnea diagnoses/hypotheses are eliminated and several more (well, LOTS more) hypotheses are added to the mix. If you are interested in how all this plays itself out, you can read the rest on Polite Dissent along with Scott's deconstruction and evaluation of the episode.

As I thought about other episodes of House, I realized that many of the common problems ACH is designed to deal with come up repeatedly. For example, it is not unusual for the doctors in House to suspect deception (e.g. patients that lie) when it comes to some of the information they collect. Likewise, the story often turns on an assumption concerning a patient that turns out to be false. Deliberately deceptive information is daily fare for the intel analyst and we came up with Structured ACH as a way to get at underlying assumptions.

Once you start to think about this idea, it really starts to grow on you. Next time you watch House, think about ACH. If you come up with anything neat, post it in the comments!

Friday, April 24, 2009

Amazing Resource On Intel Analysis Methods (FOR-LEARN)

One of the sites my students came across in the course of our studies into advanced analytic techniques is the very good FOR-LEARN online guide to analytic methods. It is a site that ought to be bookmarked by every analyst -- business, law enforcement and national security types included.

Put together under the auspices of the European Commission's Joint Research Center, FOR-LEARN seeks to guide "users throughout the critical steps of design, implementation and follow-up of a Foresight project and gives a description of the main methods that can be used." Simply replace the word "Foresight" in the previous sentence with "intelligence", and this resource becomes an invaluable (and free) starting point for research into all sorts of analytic methods.

There is a ton of good info here but the first golden nugget is the "methods table" the authors have put together (See static image below. Click on the image to go to the interactive version).

Each of the methods listed, in turn, has a broad, structured overview of the method, along with a brief how-to, its strengths and weaknesses, a case study and some links for additional research. The outline is remarkably similar to the one we used in The Analyst's Cookbook and continue to use in our Advanced Analytic Techniques course (clearly a case, by the way, of great minds thinking alike...).

The site contains much more than just methodologies, however. There are sections on how to scope and run an analytic project as well as extensive additional resources included on almost every page of the guide.

The guide does not contain everything, of course. There is a good bit more to many of these approaches than what the authors have chosen to cover here. There are also a number of intelligence methods that are not mentioned at all. Despite these quibbles, it is an excellent product overall and deserves some attention from any serious analyst.

Tuesday, February 24, 2009

Free Intel Studies Book! (MCIIS Press)

We have just published a new book, Walking Through The Halls Of Intelligence, and, in keeping with our philosophy, we have made a PDF version free to download.

The book summarizes the results of 12 recent graduate theses and covers topics such as:

  • A Predictive Model For Clandestine Nuclear Programs

  • Is China Stable?

  • Accountability: Process And Outcome Systems For Intelligence Analysts

  • Personality Types Of Intelligence Analysts

You can also buy a copy of the book (it is a great coffee table book -- with hard cover and dust jacket). All of the profits from the sale of the hard copy go to support graduate level research here at Mercyhurst.

The book was originally a class project in my graduate level class in Intelligence Communications last year. I wanted students to become familiar with the thesis process and the thesis results of previous graduate students (much of which covers topics not previously or, at least, not widely covered in the intelligence studies literature). I also wanted the students to produce an integrated document and to make that document readable.

As a result, I had them write a book...

Their guidance for the individual chapters was to mimic the feature writing style from good news and science magazines (such as Newsweek and New Scientist) and to compile all of the articles into a book length document.

In addition to reading the theses, contacting the original authors for more info, and speaking with the thesis advisors and others regarding the importance of the work, they also had to make a variety of group decisions (such as settling on a common format) and numerous production decisions (such as cover artwork and layout).

In the end, the book was good enough that we decided to publish it using the same model that we used with the Analyst's Style Manual (still available for both free download and purchase).

It is licensed under a Creative Commons license which means you can download it, share it, use it in classes -- whatever.

While you are there, you might also want to check out the new Mercyhurst College Institute Of Intelligence Studies website. We are still adding content but the look and feel are entirely different.

We have also done a better job of integrating the MCIIS website with the college's main site (for example, if you know a high school student that you think should be thinking about a career in intelligence, there is a link directly from the MCIIS website to a "request for information" page).

Finally, the last time we did this, you crushed our servers with requests, so, if you can't get to the download page immediately, I suggest you wait ten minutes and try again.

Friday, December 19, 2008

Top 5 Intelligence Analysis Methods: Analysis Of Competing Hypotheses (#1)

(Note: I apologize for how long it has taken me to get here. Conferences, classes and life in general conspired to get in the way this week. For the patient, here is the last in this series of posts...)

Part 1: Introduction
Part 2: What Makes A Good Method?
Part 3: Bayesian Analysis (#5)
Part 4: Intelligence Preparation Of The Battlefield/Environment (#4)
Part 5: Social Network Analysis (#3)
Part 6: Multi-Criteria Decision Making Matrices/Multi-Criteria Intelligence Matrices (#2)

Analysis Of Competing Hypotheses (ACH) is probably the best-known intelligence analysis method today. Invented by Richards Heuer over 30 years ago and made famous in his intelligence classic, The Psychology of Intelligence Analysis, ACH is widely taught and conceptually easy even for entry-level analysts.

In addition, it was specifically designed to work in all kinds of situations with any kind and quality of data. What is less clear is whether the method produces unequivocally better estimative results. While the method is rooted fundamentally in the scientific method, studies testing the value of the method as a way to improve forecasting have been few and the results have been mixed (in no study have the results been worse than without the method but some studies have shown that the method only helps certain subsets of analysts. For a good list of these studies see the Notes List at the end of the ACH article on Wikipedia).

I am not sure why this is so. My own impression is that a well done ACH provides a better estimate in less time and with more nuance than virtually any other method available.

We teach ACH here in our freshman classes. I see many, many students struggle not with the basic concepts of ACH but with the details. I see countless examples each year of student projects in which the method has been improperly executed (in much the same way a student gets their first attempts at a calculus or chemistry problem wrong).

In most cases, it is fairly easy to correct the mistakes and the students rarely have a problem seeing what they did wrong or in making the appropriate adjustments. It is less clear to me that, at this early stage in their education, they are able to transfer this knowledge from one type of problem to another, however. We try to reinforce all our methods in upper level classes but the opportunities for reinforcement in the real world are slim (we rarely find, for example, that students are required to use structured methods in their internships).

My own instincts tell me that ACH (and many of the experiments involving it -- including our own) is a powerful method but won't get a fair test until such a test is done with analysts who have worked with the method on multiple problems and in multiple circumstances. To be honest, I suspect that this is true with all of the methods I have discussed in this series. Deliberate practice seems to be a key component of expertise in multiple other fields and I imagine this is true when it comes to intelligence analysis methods as well.

Improving the quality of the final estimate is only one (albeit an important) way that a method should contribute to a quality intelligence product, however. ACH brings much more to the table in my estimation and it does this immediately, in even the earliest projects.

ACH can help the analyst at every stage of the problem, including modelling, collection and collection planning, and preparing a document for dissemination. It is a wholly transparent method and can very easily be used collaboratively. Its transparency is crucial in helping instructors or managers identify problems in the analysis of the data and is of enormous benefit in understanding and improving the analytic process after the fact. It integrates extremely well with various data resources and is very suitable for automation. We find that it is actually faster to use, particularly in a group setting, than most other methods (including intuitive analysis).

The way ahead is a little different here than with the other methods. We think we have a pretty good handle on how to teach ACH. The key, in my estimation, is to create opportunities to reinforce that teaching in and outside the confines of the classroom.

Friday, December 12, 2008

Top 5 Intelligence Analysis Methods: Multi-Criteria Decision Making Matrices/Multi-Criteria Intelligence Matrices (#2)

Part 1 -- Introduction
Part 2 -- What Makes A Good Method?
Part 3 -- Bayesian Analysis (#5)
Part 4 -- Intelligence Preparation Of The Battlefield/Environment (#4)
Part 5 -- Social Network Analysis (#3)

Multi-Criteria Decision Making (MCDM) is a well-known and widely studied decision support method within the business and military communities. Some of the most popular variants of this method include the analytic hierarchy process, multi-attribute utility analysis and, in the US Army at least, the staff study (see Annex D). There is even an International Society on Multiple Criteria Decision Making.

At its most basic level, MCDM is used to evaluate a variety of courses of action against a set of established criteria (see the image below for what a simple matrix might look like). One can imagine, for example, considering three different types of car and evaluating them for criteria such as speed, cost and fuel efficiency. MCDM would suggest that the car that has the highest total rating across those three categories would be the best car to buy. In fact, Consumer Reports uses exactly this type of method in its famous "circle charts" of everything from cars to hair care products.
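To show how mechanical the basic version is, here is a minimal sketch of such a matrix in Python. The cars, criteria weights and 1-to-5 ratings are all invented for the illustration:

```python
# A toy weighted decision matrix: rate each course of action (car) against
# each criterion, multiply by the criterion's weight, and sum.
criteria = {"speed": 0.2, "cost": 0.5, "fuel efficiency": 0.3}  # weights sum to 1

cars = {  # invented 1-5 ratings (5 = best)
    "Car A": {"speed": 5, "cost": 2, "fuel efficiency": 3},
    "Car B": {"speed": 3, "cost": 4, "fuel efficiency": 4},
    "Car C": {"speed": 2, "cost": 5, "fuel efficiency": 2},
}

def weighted_score(ratings, weights):
    """Sum of rating * weight across all criteria."""
    return sum(ratings[c] * w for c, w in weights.items())

for name, ratings in sorted(cars.items(),
                            key=lambda kv: weighted_score(kv[1], criteria),
                            reverse=True):
    print(f"{name}: {weighted_score(ratings, criteria):.1f}")
# Car B: 3.8 / Car C: 3.5 / Car A: 2.9
```

Everything interesting (and contentious) in MCDM lives in those weights and ratings, which is exactly where the devils in the details show up.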


MCDM is flexible and works with a wide variety of data but there are numerous devils in the details of its implementation. The simple example above gets increasingly complicated when we start to examine the many other criteria that someone might use to evaluate a car. Likewise, there are problems with rating each of these criteria (Is a car that gets 27 miles to the gallon really worse than a car that gets 27.1 miles to the gallon?). Even worse is when the analyst starts to think about abstract evaluation criteria such as which of the three cars is "coolest"? You start thinking like this and you begin to understand why they have an international society dedicated to this method...

MCDM is, at its heart, an operational methodology, not an intelligence method. That said, we have had very good luck “translating” it into an intelligence analysis method (i.e. MCIM) in our strategic intelligence projects. These projects have run the gamut from large-scale national security studies to small-scale business studies. The matrices can be simple (I actually used such a matrix to evaluate the 5 methods mentioned in this series) or enormously complex (the MCIM matrix on the likely impact of chronic and infectious disease on US national security interests clocks in at 15 feet long when printed).

The key difference between the operational variant and the intelligence variant is perspective. In the operational variant, the analyst is trying to figure out his or her organization's best course of action. In the intelligence variant, the analyst puts him or herself in the shoes of the adversary and attempts to envision how the other side might see both the courses of action available and the criteria with which the adversary will evaluate them. The intelligence variant has not, to the best of my knowledge, been validated but we have a grad student working on it.

The research agenda for this method (as with many of the other methods discussed so far) is straightforward. First, it has to be validated as an intelligence-specific methodology. The anecdotal evidence and the evidence in the operational literature are good but further testing needs to be done. Second, analysts need to figure out which variants of MCIM work best in which types of intelligence situations. Finally, we need to get the method out of the school house and into the field.

Next Week: #1!

Thursday, December 11, 2008

Top 5 Intelligence Analysis Methods: Social Network Analysis (#3)

Part 1 -- Introduction
Part 2 -- What Makes A Good Method?
Part 3 -- Bayesian Analysis (#5)
Part 4 -- Intelligence Preparation Of The Battlefield/Environment (#4)

Social Network Analysis (SNA) is fundamentally about entities and the relationships between them. As a result, this method has a number of variations within the intelligence community ranging from techniques such as association matrices through link analysis charts right up to the validated mathematical models. It is most commonly used as a way to picture a network, however, and is rarely used in the more formal way envisioned by the sociologists who created the method. In other words, while SNA is a very powerful method, intelligence professionals rarely take advantage of its full potential.

Because it is primarily a visual method, most analysts (and the decisionmakers they support) immediately grasp the value of the method. Likewise, some variation of SNA will likely work with any data set where the entities and the relationships among those entities are important (in other words, almost every problem). Parsing all the relevant attributes can be difficult, however, and there are few automated solutions that work well with unstructured data sets. Likewise, at the higher levels of analysis, where the analyst is trying to do more than merely visualize a network, a good bit of special knowledge is required to understand the results.


Talking about SNA doesn't make a lot of sense though. It is much easier to grasp its value as an intel method by looking at some examples. I2's Analyst Notebook is widely available and examining their case studies is helpful in seeing what that tool can do. Another widely used tool, particularly for more formal analyses, is UCINET. For example, some of our students used this tool last year to examine the "social network" of government and non-governmental organizations engaged in security sector reform in sub-Saharan Africa (the image above comes from their study). Recently, my students and I have been playing around with a new tool from Carnegie Mellon University called ORA. It is very easy to use and very powerful.
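For a taste of the more formal end of the method, here is a minimal sketch using the open-source networkx library (the organizations and ties are invented; this is not the students' data or any of the tools above):

```python
# Build a small, made-up network and compute betweenness centrality --
# one of the formal SNA measures that goes beyond the "pretty picture".
import networkx as nx

G = nx.Graph()
G.add_edges_from([
    ("NGO A", "Ministry"), ("NGO A", "NGO B"), ("NGO B", "Ministry"),
    ("Ministry", "Donor"), ("Donor", "NGO C"), ("NGO C", "NGO B"),
])

# Betweenness centrality: how often a node sits on the shortest paths
# between other nodes -- a formal way to spot brokers and gatekeepers.
for node, score in sorted(nx.betweenness_centrality(G).items(),
                          key=lambda kv: kv[1], reverse=True):
    print(f"{node}: {score:.2f}")
```

Measures like this flag the brokers that a purely visual inspection of a large chart can easily miss.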

Of the 5 methods I intend to discuss, SNA is the one with the widest visibility in all three major intelligence communities (national security, business and law enforcement). As such, there is already a good bit of research activity into how to better use this method in the intelligence analysis process. The big challenge, as I see it, is to design educational programs and tools that help analysts move away from the "pretty picture" possibilities presented by this method and toward the more rigorous results generated by the more formal application of the method.

Tomorrow: #2...

Tuesday, December 9, 2008

Top 5 Intelligence Analysis Methods: Intelligence Preparation Of The Battlefield/Environment (#4)

Part 1 -- Introduction
Part 2 -- What Makes A Good Method?
Part 3 -- Bayesian Analysis (#5)

Intelligence Preparation of the Battlefield (IPB) is a time- and battle-tested method used by military intelligence professionals. Since its development over 30 years ago by the US Army, it has evolved into an increasingly useful and sophisticated analytic method. (Note: If you are interested in the Army's Field Manual on IPB, it is available through the FAS and many other places. We teach a very simplified version of IPB to our freshmen and you can download examples of their work on Algeria and Ethiopia. They are .kmz files, so you will need Google Earth to view them.)

IPB is noteworthy for its flexibility. Its success in the field led to a variety of modifications and extensions of its basic concepts. The Air Force, for example, expanded IPB to what it has called Intelligence Preparation of the Battlespace and I know that, several years ago, the National Drug Intelligence Center developed a similar method for use in counter-narcotic operations. Today, the broadest variation on the IPB theme seems to be what NGA calls Intelligence Preparation of the Environment (IPE or sometimes Intelligence Preparation of the Operational Environment -- IPOE).

IPB/IPE/IPOE reminds me a bit of Sherlock Holmes in The Sign of the Four: "Eliminate all other factors and the one which remains must be the truth." The fundamental concept of IPB – using overlapping templates to divide a physical or conceptual space into go, slow-go and no-go areas – can clearly be applied to a variety of situations. It is obvious that this method is particularly useful in situations where geography is important. Mountains, rivers, etc. restrict movement options while roads and the like facilitate movement. Overlaying weather effects and opposing forces' doctrine on top of this geography (combined with several other factors) can give a commander a good idea of what is possible, impossible and likely.
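As a toy illustration of the overlapping-template idea (my own sketch with invented grids, not the Army's actual process), each factor can be treated as a layer of mobility penalties and the layers stacked:

```python
# Each grid cell holds a mobility penalty: 0 = no restriction,
# 1 = slows movement, 2 = blocks movement. All data is invented.
import numpy as np

rivers  = np.array([[0, 2, 0], [0, 2, 0], [0, 0, 0]])
slopes  = np.array([[0, 0, 1], [1, 0, 1], [2, 0, 0]])
weather = np.array([[0, 0, 0], [1, 1, 0], [0, 0, 0]])

# Overlay the templates: the worst factor in each cell governs.
combined = np.maximum.reduce([rivers, slopes, weather])

labels = np.array(["go", "slow-go", "no-go"])
print(labels[combined])  # the combined mobility map, cell by cell
```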

Simplify the concept even more (and remove it from its traditional military environment) and it begins to look like a Venn diagram with intersecting circles useful in any situation that can be thought of, either concretely or abstractly, as a landscape. Imagine, for example, a business competitor in which we are interested. We suspect that it is preparing to launch a new product. How would we translate IPB into this environment? Perhaps we could see the various product lines where our competitor operates as "avenues of approach". The competitor's capabilities could be defined by its patent portfolio and financial situation. The competitor's "doctrine" could be extrapolated, perhaps, from its historical approach to new product launches. The validity of this and other similar approaches in other fields is, however, largely untested.

By defining, in advance, the relevant ways to group the data available for analysis, IPB is able to deal effectively with large quantities of both structured and unstructured data. While these groupings are typically quite general, they are finite, and it is possible for relevant data to fall through the cracks between the groups. Likewise, as the relevant groupings of data begin to proliferate, the method quickly moves from one that is simple in concept to one that is complex in application (the US Army's IPB manual is 270+ pages...).

For me, the research challenges here are straightforward. The military has a clear lock on developing this method within its environment; there is little value added for academia here. Beyond the military confines, however, the research possibilities are wide open. Does this method or some variation of it work in business? How best to define it in law enforcement situations? Could it work against gangs? In hostage situations? Crime mappers, in particular, might be able to utilize some of these concepts to further refine their art.

Tomorrow: Method #3...

Monday, December 8, 2008

Top 5 Intelligence Analysis Methods: Bayesian Analysis (#5)

Part 1 -- Introduction
Part 2 -- What Makes A Good Method?

(Note: Bayesian statistical analysis is virtually unknown to most intelligence analysts. This is unfortunate but true. At its core, Bayes is simply a way to rationally update previous beliefs when new information becomes available. That sounds like what intelligence analysts do all the time but it has that word "statistics" associated with it, so, even analysts who have heard of Bayes often decide to give it a miss. If you are interested in finding out more about Bayes, you can always check out Wikipedia but I find even that article a bit dense. I prefer Bayes For Beginners -- which is what I am.)

Bayes is the “Gold Standard” for analytic conclusions under conditions of uncertainty and probably ought to be closer to -- if not at the -- top of this list. It provides a rigorous, logical and thoroughly validated process for generating a very specific estimative judgment. It is also enormously flexible and can, theoretically, be applied to virtually any type of problem.

Theoretically. Ahhh... There, of course, is the rub. The problem with Bayes lies in its perceived complexity and, to a lesser degree, the difficulty in using Bayes with large sets of unstructured, dynamic data.

  • Bayes, for many people, is difficult to learn. While the equation is relatively simple, its results are often counterintuitive. This is true, unfortunately, for both the analysts and the decisionmakers that intelligence analysts support. It doesn't really matter how good the intelligence analyst is at using Bayes if the decisionmaker will not trust the results at the end of the process because they come across as a lot of statistical hocus-pocus or, even worse, simply "seem" wrong.

    • While Sedlmeier and Gigerenzer have had some luck teaching Bayes using so-called natural frequencies (and we, at Mercyhurst, have had some luck replicating their experiments using intelligence analysts instead of doctors), the seeming complexity of Bayes is one of the major hurdles to overcome in using this method effectively.

  • In addition to the complexities of Bayes, it appears that this method, which works well with small, well-defined sets of any kind of data, does not handle large volumes of dynamic, unstructured data very well.

    • Bayes seems to me to work best as an intelligence analysis method when an analyst is confronted with a situation where a new piece of information appears to significantly alter the perceived probabilities of an event occurring. For example, an analyst thinks that the odds of a war breaking out between two rival countries are quite low. Suddenly, a piece of information comes in that suggests that, contrary to the analyst’s belief, the countries are, in fact, at the brink of war. A Bayesian mindset helps to ratchet back those fears (which are actually best described by the recency and vividness cognitive biases). A minimal numerical sketch of this kind of update follows this list.

    • The real world doesn't present its data in ones, however, and not all data should be weighted the same. When analysts try to go beyond having a "Bayesian mindset" and apply Bayes literally to real world problems (as we have on several occasions), they run into problems. Think about the recent terrorist attacks in Mumbai. Arguably, the odds of war between India and Pakistan were quite low before the attacks. As each new piece of data rolled in, how did it change the odds? More importantly, how much weight should the analyst give that piece of data (particularly given that the analyst does not know how much data, ultimately, will come in before the event is "over")? Bayes is easier to apply if we treat "Mumbai Attack" as a single data point but does that make sense given that new data on the attack continues to come in even now?

    • Bayes, in essence, is digital but life is analog. Figuring out how to "bin" or group the data rationally with respect to real-world intelligence problems is one of the biggest hurdles to overcome, in my estimation, with using Bayes routinely in intelligence analysis.
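Here is a minimal numerical sketch of the kind of update described in the war-warning example above. The prior and likelihoods are invented for the illustration:

```python
# Bayes' rule: P(H|E) = P(E|H) * P(H) / P(E), with made-up numbers for
# the "two rivals on the brink of war" example.
prior_war = 0.05             # the analyst's prior: war is quite unlikely
p_report_given_war = 0.80    # an alarming report is likely if war is coming
p_report_given_peace = 0.15  # ...but such reports also show up in peacetime

def update(prior, p_e_given_h, p_e_given_not_h):
    """Return the posterior P(H|E) via Bayes' rule."""
    p_e = p_e_given_h * prior + p_e_given_not_h * (1 - prior)
    return p_e_given_h * prior / p_e

posterior = update(prior_war, p_report_given_war, p_report_given_peace)
print(f"P(war | alarming report) = {posterior:.2f}")  # about 0.22
```

The alarming report roughly quadruples the odds of war but still leaves it well short of likely -- exactly the "ratcheting back" of vivid fears described above.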

Bayesian statistical analysis has enormous potential for the intelligence community. Twenty years from now we will all likely have Bayesian widgets on our desktops that help us figure out the odds -- the real odds, not some subjective gobbledy-gook -- of specific outcomes in complex problems (in much the same way that Bayes powers some of the most effective and powerful spam filters today). The research agenda to get closer to this "Golden Age Of Bayesian Reasoning" is straightforward (but difficult):

  • Figure out how to effectively and efficiently teach the basics of Bayes to a non-technical audience.
  • Actually teach those basics to both analysts and decisionmakers so that both will have an appropriate "comfort level" with the fundamental concepts.
  • Develop Bayesian-based tools (that are reasonably simple to use and in which analysts can have confidence) that deal with large amounts of unstructured information in a dynamic environment.

Anyone got any extra grant money lying around?

Tomorrow -- Method #4...

Friday, December 5, 2008

Top 5 Intelligence Analysis Methods: What Makes A Good Method? (List, Part 2)

Part 1 -- Introduction

There are a number of good analytic methods available. If you are ever at a loss for a method (or just want to see a really good taxonomy of methods) check out the Principles of Forecasting site. Specifically, look at the methodology tree and the selection tree (You can see a screen shot below but you really owe it to yourself to look at the interactive site or, at least, download the PDF).


While I strongly support the International Institute of Forecasters and all of their good work, I have rarely had the kind of data in the real-world intelligence problems on which I have worked that would allow me to be comfortable using many of the methods that they have listed. I'll be honest; these guys have spent a lifetime thinking about forecasting and deriving a taxonomy of methods, so I am probably the one who is wrong, but the methods I find most useful -- over and over again -- are simply not on their list.

What makes for a useful intelligence analysis method? Based primarily on my experience with real-world intelligence problems and with teaching entry-level analysts a wide variety of methods, I think there are four primary factors: validity, simplicity, flexibility and the method's ability to work with unstructured data.

  • Validity. There needs to be at least some evidence to suggest that the method actually improves the intelligence estimate and there should not be strong evidence suggesting that the method does not work. Many of today's "generally accepted" methods and multipliers fail to meet this test. Developing and analyzing scenarios and devil's advocacy are two examples. Tetlock took a hard look at one kind of common scenario development method and found it wanting, yet this research is almost universally unknown to intelligence analysts. As Steve Rieber has pointed out, there is no real research to support the use of Devil's Advocacy despite its support by the Senate Select Committee on Intelligence. It is surprising to find that many of today's commonly used intelligence analysis "methods" are, in reality, little more than tribal lore passed down from one generation to another.
  • Simplicity. All successful intelligence analysts are smart but, even when they have PhDs, you find a reluctance to use complex and, more importantly, time consuming methods. Due to the error inherent in the data available to most intelligence professionals, the benefit derived from using these methods simply doesn't appear to most analysts to outweigh their costs. To be "simple" by my definition, a method should be able to be taught in a reasonable amount of time and the analyst should be able to see themselves using the method in real-world situations. Analytic methods that actually help communicate the analysis to the decisionmaker or that help evaluate the intelligence process after the fact get extra credit.
  • Flexibility. Analysts consistently find themselves in a wide variety of situations. Sometimes these situations are tactical and sometimes they are strategic; sometimes the analyst is a subject matter expert and sometimes they are not. In this post-Cold War world, it seems to me that national security analysts are getting dragged from one portfolio to another at an accelerating pace. I remember, for example, when all sorts of Russian analysts were re-branded as newly minted Balkans analysts in the '90s and I suspect that several months ago a number of African or Korean analysts suddenly found themselves on a Georgia-Russia Analytic Team trying to figure out what was likely to happen next in South Ossetia. A really good method should work in all these types of situations and across all the disciplines of intelligence as well.
  • Works With Unstructured Data. One of the things that distinguishes, in my mind, intelligence work from other analytic work is that intelligence deals primarily in unstructured data. Intelligence data does not come in neat columns and rows on Excel spreadsheets. It comes in a variety of forms and is often wrong, incomplete or even deliberately deceptive. An intelligence method that fails to acknowledge this, that needs "clean" data to get good results, is a less useful method in my mind.

I am sure that there are other factors that one should consider when selecting an analytic method (and, please, put yours in the comments!) but these are the ones that seem most important to me.

Monday: Method #5...

Thursday, December 4, 2008

Top 5 Intelligence Analysis Methods (List)

(Note: I was recently asked to name and describe my top 5 intelligence analysis methods. As I began to think about it, what seemed like a fairly straightforward question morphed into what I could only think of as a series of blog posts. So, here they are...)

Considerable emphasis has been put on improving the methods of intelligence analysis over the last six years. The 9/11 Report alluded to the need for it, the WMD Commission addressed it more directly and the DNI recently highlighted the continued requirement for advanced analytic techniques in its Vision: 2015 document.

Still, the intuitive method (also known as "read a bunch of stuff, think about it for a bit and then write something") remains the most popular method for producing intelligence analysis despite this method's well-known tendency to permit a wide range of cognitive and systemic biases to corrupt the analytic product (see Heuer and Tetlock for excellent overviews of these problems).

Beyond the intuitive method (and the interesting defenses of it offered by books such as Blink and Gut Feelings), what, then, are the best methods for conducting intelligence analysis? Given the wide range of intelligence analysis problems (tactical, operational, strategic) and the large number of disciplines using intelligence analysis to support decisionmaking (national security, law enforcement and business), is there any chance that I can identify the five best methods?

My answer is, obviously, "Yes!" but before the fighting begins (and there will be fighting...), I intend to give myself a chance of convincing you by defining not only what I mean when I say "method", but also what makes for a good one.

What Is An Intelligence Analysis Method?

The word "method" is often used casually by analysts. When used this way, processes as different as brainstorming and Analysis Of Competing Hypotheses can both be seen as "methods" or ways to improve thinking. While such an informal definition might work at a cocktail party, it is not very helpful for professional purposes. "Method", in my opinion, should be reserved for processes that produce or substantially help the analyst produce estimative results.

Why?

It is simple, really. Estimative results are what decisionmakers want most from intelligence. It is nice to have a good description of an item of interest or a decent explanation of why something did or did not happen. Both provide useful context for the decisionmaker, but nothing beats a good, solid estimate of what the enemy or competitor or criminal is likely to do next. Defining method as something that produces estimative results means that I am connecting the most common term with the most desired result.

All the processes that help the analyst think but do not, by themselves, produce estimative results (such as brainstorming) I call "analytic multipliers". I get this from my military background, I suppose, where there are elements of combat power, such as armor or artillery, and combat multipliers, such as morale.

Analytic tools, then, are particular pieces of software, etc. that operationalize the method or the multiplier (or in some cases multiple methods and multipliers) in a particular way. For example, ACH is a method but the PARC ACH 2.0.3 software is a tool that allows the analyst to more easily do ACH.

I find these distinctions very useful in discussing the analytic process with students. If everything is a method -- if free association exercises are treated, linguistically, the same as multi-attribute utility analysis, for example -- then nothing, in the mind of the student, is a method. Clearly, not every process falls neatly into the method or multiplier camp (what is SWOT, for example, under these definitions?) but some generally agreed upon set of words to capture the large and easily recognizable differences between things such as ACH and brainstorming seems useful.

Tomorrow: What makes a good method?