
Wednesday, December 9, 2020

The BPRT Heuristic: Or How To Think About Tech Trends

A number of years ago, one of my teams was working on a series of technology trend projects.  As we looked deeply at each of the trends, we noticed that there was a pattern in the factors that seemed to be influencing the direction a particular tech trend would take.  We gave that pattern a name:  the BPRT Heuristic.

Tech trends are always interesting to examine, so I wanted to share this insight to help you get started thinking about any developing or emerging techs you may be following.  

Caveat:  We called it a heuristic for a reason.  It isn't a law or even a model of tech trend analysis.  It is just a rule of thumb--not always true but true enough to be helpful.
  • B=the Business Case for the tech.  This is how someone can make money off the tech.  Most R&D is funded by companies these days (this was not always the case), and these companies are much more likely to fund techs that can contribute to a revenue stream.  This doesn't mean that a tech without an obvious business case can't get developed and funded; it just makes it harder.
  • P=Political/Cultural/Social issues with a tech.  A tech might be really cool and have an excellent business case, but because it crosses some political or social line, it either goes nowhere or accelerates much more quickly than it might normally.  Three examples:  
    • We were looking at 3G adoption in a country early in the 2000s.  There were lots of good reasons to suspect that it was going to happen, until we learned that the President's brother owned the 2G network already in existence in the country.  He was able to use his family connections to keep competition out.
    • A social factor that delayed adoption of a tech is the story of Google Glass in 2013.  Privacy concerns driven by the possibility of videos taken without consent led to users being called "Glassholes."  Coupled with other performance issues, this led to the discontinuation of the original product (though it lives on in Google's attempts to enter the augmented reality market).  
    • Likewise, these social or cultural issues can positively impact tech trends as well.  For example, we have all had to become experts at virtual communication almost overnight due to the COVID crisis--whether we wanted to or not.
  • R=Regulatory/Legal issues with the tech.  The best example I can think of here is electromagnetic spectrum management.  Certain parts of the electromagnetic spectrum have been allocated to certain uses.  If your tech can only work in a part of the spectrum owned by someone else, you're out of luck.  Some of this "regulation" is not government sponsored either.  The Institute of Electrical and Electronics Engineers (IEEE), for example, establishes common standards for most devices in the world:  your wifi router can connect to any wifi-enabled device because they all use the IEEE's 802.11 standard for wifi.  Other regulations come from the Federal Communications Commission and the International Telecommunication Union.
  • T=The tech itself.  This is where most people spend most of their time when they study tech trends.  It IS important to understand the strengths and weaknesses of a particular technology, but as discussed above, it might not be as important as other environmental factors in the eventual adoption (or non-adoption...) of a tech.  That said, there are a couple of good sources of info that can allow you to quickly triangulate on the strengths and weaknesses of a particular tech:
    • Wikipedia.  Articles are typically written from a neutral point of view and often contain numerous links to other, more authoritative sources.  It is not a bad place to start your research on a tech.  
    • Another good place is Gartner, particularly the Gartner Hype Cycle.  I'll let you read the article at the link but "Gartner Hype Cycle 'insert name of tech here'" is almost always a useful search string  (Here's what you get for AI for example...).  
    • Likewise, you should keep your eye out for articles about "grand challenges" in a particular tech (Here is one about grand challenges in robotics as an example).  Grand Challenges outline the 5-15 big things the community of interest surrounding the tech has to figure out to take the next steps forward.
    • Likewise, keep your eyes out for "roadmaps."  These can be either informal or formal (like this one from NASA on Robotics and autonomous systems).  The roadmaps and the lists of grand challenges should have some overlap, but they are often presented in slightly different ways.
Obviously, the BPRT Heuristic is not the answer to all your tech trend questions.  In providing a quick, holistic approach to tech trend analysis, however, it does allow you to avoid many of the problems associated with too much hype.
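
If it helps to make the heuristic concrete, here is a minimal sketch of how you might record a quick BPRT assessment in Python.  The 1-5 scales, the example tech and the scores are all invented for illustration; they are not part of the heuristic itself:

from dataclasses import dataclass

@dataclass
class BPRTAssessment:
    """Quick, subjective 1-5 scores for each BPRT factor.
    Higher = more favorable to adoption of the tech."""
    tech: str
    business_case: int  # B: how clear is the path to revenue?
    political: int      # P: political/cultural/social tailwinds (or headwinds)?
    regulatory: int     # R: how manageable are the legal/spectrum/standards hurdles?
    technical: int      # T: maturity and capability of the tech itself

    def summary(self) -> str:
        total = (self.business_case + self.political
                 + self.regulatory + self.technical)
        return f"{self.tech}: {total}/20 across B/P/R/T"

# A purely notional assessment:
print(BPRTAssessment("home delivery drones", business_case=4,
                     political=2, regulatory=2, technical=3).summary())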

Wednesday, September 24, 2014

Advanced Analytic Techniques (The Blog) Is Back!

Check out www.advat.blogspot.com!
Each year, I teach a class called Advanced Analytic Techniques (AAT) here at Mercyhurst.  It is a seminar-style class designed to allow grad students to dig into a variety of analytic techniques and (hopefully) master one or two.   

The students get to pick both the topic and the technique on which they wish to focus so you wind up with some pretty interesting studies at the end.  For example, we have applied the traditional business methodology of "best practices" to western European terrorist groups and the traditional military technique of Intelligence Preparation of The Battlefield to the casino industry.

As you can imagine, some of these projects gain a bit of notoriety for their unique insights.  One of my former students, Jeff Welgan, even had his AAT project written up in the book Hyperformance.

Beyond this deep dive that each student is required to do, the class is also designed to teach students how to evaluate analytic techniques for things such as validity and flexibility.  To help with this process, each week we take a quick look at an analytic technique that no one in the class is using in their projects.  

We start this process with a tour d'horizon of the available literature on the method with a particular focus on the literature that is higher up the evidence pyramid and relevant to intelligence analysis.  At the end of the week, one member of the class runs an abbreviated demo of the technique using the rest of the class as guinea pigs.  Once we are done, we all sit down and write up our thoughts about the method.  Last week, for example, we took a (quick) look at SWOT.  This week we will be examining various forms of Red Teaming.

All of this - the summaries and critiques of the articles we have found, and our overall "evaluation" of the technique - gets posted onto the Advanced Analytic Techniques blog each week.  Over the years, the blog has become increasingly popular and I certainly encourage everyone to take a look and, if you have a comment, join in!

Tuesday, February 11, 2014

Borrowing Analytic Techniques: Populations, Predictions And What Physics Tells Us About The Movement Of Alawites (Part 3 Of A Multi-Part Series)

Note: This is the third part in a multi-part series covering an extensive project on the human geography of Turkey and network analysis applications within the field of Intelligence Studies. Be sure to read the first two posts, My Conversation With The Free Syrian Army and Turkey Redrawn.


Map:  World Migration into Turkey (Source: International Organization for Migration)
In the past, SAM has covered many different Structured Analytic Techniques (SATs), the zooming technique and Structured Role Playing among them. This post covers an analytic technique somewhat unfamiliar to the intelligence discipline, but extremely useful, especially within the realm of human geography and population movement. 

Using this technique, I was able to make predictions that read something like this:

"The ethnolinguistic populations within Turkey that will likely expand within the next 12 to 24 months are Bulgarians, Serbs, Iraqi Kurds, Iranian Arabs and Azerbaijanis."
and...
"The groups that are highly likely to impact Turkey within the next 12 to 24 months are Syrians and Kurds (from Syria, Iraq and Iran)."
Where did these predictions come from?

Gravity models.


I know what you're thinking, and yes, gravity models do have to do with gravity and yes, gravity models do happily reside within the domain of physics. 


Now, before you close the page, I am not going to spend the next four hundred or so words discussing Newton's Law of Universal Gravitation. Instead, on a one-stop flight from physics to intel, we will have a brief layover in the discipline of economics.  


I said, don't close the page!

Back in the 1950s, economists began to import gravity models into economics to predict trade flows and it looked something like this:



F_ij = G × (M_i × M_j) / D_ij

Where F_ij is the flow of bilateral trade between country i and country j, M_i and M_j are the economic sizes of the two countries, D_ij is the squared distance between them (in kilometers squared), and G is a constant.  If it seems familiar, it should.  It is Newton's Law with the names of the variables changed.
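
As a quick illustration, here is a minimal sketch of that calculation in Python (the function name and all the figures are invented for the example):

def gravity_flow(G, m_i, m_j, d_ij_sq):
    """Predicted bilateral trade flow: F_ij = G * (M_i * M_j) / D_ij,
    where d_ij_sq is the squared distance between the two countries."""
    return G * (m_i * m_j) / d_ij_sq

# Notional example: two economies (sizes in arbitrary GDP units)
# 1,000 km apart, with the constant G set to 1 for simplicity.
print(gravity_flow(G=1.0, m_i=500, m_j=200, d_ij_sq=1_000**2))
# Larger economies and shorter distances -> more predicted trade.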

In the 1970s, a man by the name of Peter Trudgill, a well-respected British sociolinguist, brought gravity models to dialect geography.  Trudgill, whose most recent book was published in 2011, holds many firsts within the field of linguistics, of which gravity models are one.  He made a series of alterations to the gravity model in order to account for linguistic factors, the end goal being to create gradient zones of gradual shift from one phonetic marker to the next within geographic space (like geospatially representing the line in and around Boston that marks where people stop saying cah and start saying car).  Trudgill's gravity model looked like this:

I_ij = s × (P_i × P_j / d_ij²) × (P_i / (P_i + P_j))

Where P_i and P_j are the populations of the two centers and d_ij is the distance between them.
Trudgill's modifications to the original gravity model accomplish two goals:
  1. By adding the s variable, measured on a scale of one to four, he factors in the well-known assumption in sociolinguistics that contact occurs more readily between groups that have "prior-existing linguistic similarity."  This means that if the languages spoken by Population i and Population j are mutually intelligible (such as American English and British English), the s variable would be higher than for, say, American English and Italian.
  2. By adding the second half of the equation, he gives the model directionality.  The original gravity model was intended to predict bilateral trade, meaning it would predict the economic contact between one country and another.  Trudgill's model predicts the influence country P_i will have on country P_j, as opposed to the bilateral contact country P_i and country P_j will have.
Now, what does all this have to do with intel?

Within the context of my most recent project, mapping the Syrian refugee population in Turkey, gravity models provided a way to analyze the ethnolinguistic landscape in Turkey through a predictive lens.  Instead of saying, "Here is where Armenians live in Turkey," I can say, "Here is where Armenians live and this is how likely it is that more Armenians will travel to this region from Armenia in the next 24 months."

In other words, it took the ethnolinguistic mapping project from descriptive to predictive.

In order to achieve this, I added one more variable to Trudgill's altered gravity model: the c variable.


The c variable is a variable assessed on a scale of one to four that takes into account the trend line of migratory patterns from target ethnolinguistic groups over the past 10 years (data taken from the International Organization for Migration). In other words, to the degree that future migratory patterns follow historical trends for each ethnolinguistic group, the c variable takes these trends into consideration. 
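
Since the equations are described only in words above, here is a minimal sketch of the full model in Python.  It assumes, as the description suggests, that s and c both enter as simple multiplicative factors; the function name and all the input numbers are invented for illustration:

def influence(p_i, p_j, d_ij, s, c):
    """Directional influence of population i on population j:
    the gravity core P_i*P_j/d^2, scaled by linguistic similarity
    s (1-4) and the migration-trend factor c (1-4), with the
    P_i/(P_i+P_j) term supplying directionality (i -> j)."""
    gravity = (p_i * p_j) / d_ij**2
    directionality = p_i / (p_i + p_j)
    return s * c * gravity * directionality

# Notional example: a large source population (in thousands) acting
# on a smaller one 500 km away, with moderately similar languages
# (s=3) and a rising historical migration trend (c=4).
score = influence(p_i=2_000, p_j=250, d_ij=500, s=3, c=4)
print(round(score, 1))  # an index score; higher = contact/influence more likely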

What results from these models is an index score (usually a decimal) that indicates how likely or unlikely contact is to occur (or, in the case of some of the most advanced equations, how likely one region is to influence another region).  This has the potential to make a compelling predictive argument when translated onto a map (See Figure 1).
Figure 1. Results of gravity model analysis represented geospatially.
Green areas: Ethnolinguistic areas that are highly likely to expand in the next 12 to 24 months
Black areas: Ethnolinguistic areas that are likely to expand in the next 12 to 24 months
Click here for a map of all the ethnolinguistic areas of Turkey
These models have widespread application for predicting contact between any kinds of populations, but arguably the circumstance in which gravity models work best is when a generally homogeneous geographic region hosts pockets of ethnic diversity.  Such is the case with Turkey, so gravity models provided a solid predictive approach.

Gravity models were the backbone of my ethnolinguistic predictions regarding Turkey.  It is the output of these models that got me thinking about the incoming Syrian refugee population which, at the time, numbered close to 700,000 and today likely exceeds 1 million.

With that in mind, don't miss the next post (by far the most interesting): Syrian Refugee Population Simulation: From *ORA to Istanbul.

Monday, September 9, 2013

Orienting The Intelligence Requirement: The Zooming Technique

In a world where infographics present everything from the world's biggest data breaches to the history of music media to beer varietals, it's clear that data visualization has become kind of a big deal.

But that's not what I want to talk about.

How about an interactive map presenting the age of every building in The Netherlands... or a flavor connection graphic for the more culinarily-inclined... what about a language map of New York City based on recent Tweets...

Nope.  

What about Chaomei Chen, a leading authority on information visualization, and her countless contributions to the field of functional infosthetics (see her paper on the top 10 unsolved problems of information visualization)?

Not even that.

For me, even more interesting than how to present information visually is how to visualize information mentally.

All too often, analysts are given an intelligence requirement which they attempt to answer by searching for a specific answer or number or outcome, what I call working downward. What they forget to do is to take a more macro approach, to zoom out, if you will, and put the question into a broader context - to orient the intelligence requirement in its problem space. 

Figure 1: Working downward

Interesting.  How does this work? 

For this, I turn to Enrico Fermi... 

Fermi was a physicist who worked on the Manhattan Project.  He loved to ask his students to try to solve seemingly intractable problems using only thought experiments.  For example, Fermi would ask, "What is your best estimate for the number of piano tuners in Chicago?" Don't run to the Bureau of Labor Statistics website... as a matter of fact, don't even use a computer. 

These broad questions, known as Fermi Questions, are difficult or tedious to answer by only working downward (quick! pull out your phone book, flip to 'musical instruments' and start counting!), and almost impossible to answer without access to the internet (See Figure 1). But by zooming out, orienting yourself within the problem's scope and starting more broadly, you can actually drill down to a reasonable estimate of the number of piano tuners in Chicago in a matter of minutes. Here's how:
  • Start with the approximate population of Chicago, 3 million. 
  • On average, there are four people in a family, meaning that there are approximately 750,000 families in Chicago (3 million / 4) (See how general we are being?)
  • Not every family owns a piano, though. A high estimate could be 1 in 5, which means that there are approximately 150,000 pianos in Chicago that need tuning while a low estimate might be 1 in 50 which would mean there are only 15,000 pianos in Chicago.   
  • If each piano tuner works on 4 pianos per weekday, the average piano tuner will tune approximately 1,000 pianos per year (4 pianos × roughly 250 working days). 
  • Pianos need to be tuned about once a year, so there should be approximately 15 to 150 piano tuners in Chicago (15,000 to 150,000 pianos ÷ 1,000 tunings per tuner per year). 
Now, I have no idea how many piano tuners there are in Chicago (Won't sleep tonight unless you know?  Go to Wolfram Alpha and find out.  Once you realize that not all instrument repair people are also piano tuners, you will see that the Fermi estimate is pretty darn good!), but you can see how taking a broader approach produced at least a viable estimate.
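
If you'd rather see the arithmetic laid out explicitly, here is the same estimate as a minimal Python sketch (every input is just the rough guess from the list above):

# Fermi estimate: piano tuners in Chicago.
population = 3_000_000
people_per_family = 4
families = population / people_per_family            # ~750,000

pianos_high = families / 5                           # 1 in 5 families -> ~150,000
pianos_low = families / 50                           # 1 in 50 families -> ~15,000

tunings_per_tuner_per_year = 4 * 250                 # 4 pianos/weekday, ~250 weekdays

# Each piano is tuned about once a year:
tuners_low = pianos_low / tunings_per_tuner_per_year     # ~15
tuners_high = pianos_high / tunings_per_tuner_per_year   # ~150

print(f"Estimate: {tuners_low:.0f} to {tuners_high:.0f} piano tuners")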

This rough order of magnitude estimate is actually very useful.  First, if that is all we need, then we are done.  No money, little time, and just a bit of thought and we have an answer that fits our needs.  

Second, it helps us identify potentially wrong answers more easily.  Say we really need to know, so we send someone out to collect this information and they come back and say 10,000 piano tuners work in Chicago.  Our Fermi estimate should cause us to question that number and the methods used to derive it.

Finally, it allows us to know where we can get the biggest bang for the buck in terms of collecting additional information.  For example, we can get a more precise estimate of the number of people who live in Chicago but, unless we are off by millions, it probably won't make much of a difference.  Getting more information on the number of pianos sold, and to whom, might, on the other hand, really help our estimate.


Figure 2: Working on a sliding scale
This technique may seem trivial when we are talking about piano tuners but it comes in real handy when we are trying to get a rough estimate on "unknowable" numbers such as how many Taliban there are in Paktia Province or how many spies there are in the US government.

Figure 2 presents the mental image of this problem. Think of this mental image as the order of magnitude scale for any intelligence requirement (or any problem, for that matter). The red line is the starting point, or where the problem is oriented within the problem space. See how much you would miss by only working downward?

The first (and most important) step is visualizing where your problem is in the problem space, thereby determining how much you are able to zoom in or zoom out. This oftentimes helps analysts put the problem into both perspective and context. (If you have an MBA, this technique might seem similar to PESTLE or STEEP  - analytic methods designed to analyze the macro environment to put a smaller intelligence requirement into context). 

Second, it is important to zoom out and think to yourself what is "above" your problem, what is the next step up, conceptually from your problem?  For example, if you are analyzing a specific company, you will want to step back and look at the industry as well.  This zooming out exercise is also very useful at helping you spot assumptions you are making about your target.  For example, it is very easy to make a number of assumptions about the dictatorship in North Korea.  Stepping back and looking at China's interests in the region, however, adds a whole new level of nuance.

Finally, Fermi estimates are most helpful at the beginning of an analytic process, when you don't have lots of information, or when gathering the detailed information is expensive and time consuming.  But be careful!  Fermi estimates are just that - rough order of magnitude estimates that help you orient yourself and focus your collection and analysis activities.  If the situation warrants, more detailed estimates based on additional information may be required.

Independent of the intelligence requirement, the zooming technique is a beneficial way to visualize a problem space, identify information gaps, contextualize information, recognize assumptions and, above all, approximate and approximate quickly, a skill highly relevant to an intel analyst in any field.

Monday, July 29, 2013

Your New Favorite Analytic Methodology: Structured Role Playing

If you use only one analytic methodology this week, Structured Role Playing (SRP) should be it. 

SRP as a way to improve analysis boasts supportive literature from the fields of law, military, pedagogy and politics. It is also frequently compared - and compared favorably - to other approaches such as Devil's Advocacy, expert opinion judgments and game theory within the relevant academic literature (check out the list of references at the bottom for more info). 

Note: The 2011 study is the "role-thinking" study which, although yielding better than chance results, was not as effective as SRP.

Basically, SRP is putting yourself in another person's shoes and looking at situations from different vantage points. 

Sounds pretty straightforward. 

But it's more than just thinking through what another person might be experiencing (a process cleverly named "role-thinking" by Kesten Green and Scott Armstrong in 2011, which has been shown to be less effective than SRP).

It's sitting down with a group of analysts, assigning roles of various actors and playing out situations as if you were those people, acting with the cultural, political and personal motivations of all parties involved. It is spontaneous. And, if done properly, can be extremely revealing. 

In order to implement SRP correctly, there are certain criteria that have to be met.

The two main criteria for effective SRP are that it has to be as realistic as possible and that the scenarios be objective (not decisively biased towards one outcome or another). 

Beyond those structural tenets, other recommendations for effective SRP include:
  • Assign roles before presenting the situation (according to Linda Babcock, this makes a big difference).
  • Ensure a high level of involvement by participants (also called "active role playing" as opposed to "passive role playing" or "role thinking"). This involves personal and spontaneous interaction between participants.
  • Maintain a low degree of response specificity. It is better to let participants improvise responses than to have them select responses from a predetermined list, for example, in a multiple choice format. 
As an aside, it is relevant to mention that the 2011 role-thinking study, which yielded the lowest of the SRP results listed in the chart above, violated the last two of these implementation recommendations, possibly leading to its poor comparative performance.

As is the case with all methodologies, however, there are critics of the technique. 

The main criticism of SRP is its relative ineffectiveness when not correctly implemented. As mentioned above, "role-thinking" does not work very well at all. This point, however, is easily addressed:  Follow the simple guidelines to ensure correct implementation of SRP.

The second, less investigated criticism of SRP is overconfidence resulting from having employed the technique in the first place.  This phenomenon, however, if it replicates, is likely true of any analytic methodology.

Ultimately, SRP is an analytic technique that allows analysts the opportunity to interpret situations from distinct vantage points.  It's collaborative, spontaneous and effective. 

Extended reading list:

Monday, March 18, 2013

Advanced Analytic Techniques Is Back Up And Posting Again!

www.advat.blogspot.com
Each spring, I teach a class called Advanced Analytic Techniques (AAT) here at Mercyhurst.  It is a seminar-style class designed to allow grad students to dig into a variety of analytic techniques and (hopefully) master one or two.   

The students get to pick both the topic and the technique on which they wish to focus so you wind up with some pretty interesting studies at the end.  For example, we have applied the traditional business methodology of "best practices" to western European terrorist groups and the traditional military technique of Intelligence Preparation of The Battlefield to the casino industry.

As you can imagine, some of these projects gain a bit of notoriety for their unique insights.  One of my former students, Jeff Welgan, even had his AAT project written up in the book Hyperformance.

Beyond this deep dive that each student is required to do, the class is also designed to teach students how to evaluate analytic techniques for things such as validity and flexibility.  To help with this process, each week we take a quick look at an analytic technique that no one in the class is using in their projects.  

We start this process with a tour d'horizon of the available literature on the method with a particular focus on the literature that is higher up the evidence pyramid and relevant to intelligence analysis.  At the end of the week, half of the class runs an abbreviated demo of the technique using the other half of the class as guinea pigs.  Once we are done, we all sit down and write up our thoughts about the method.  Last week, for example, we took a (quick) look at Decision Trees.  This week we will be examining various forms of crime mapping.

All of this - the summaries and critiques of the articles we have found, and our overall "evaluation" of the technique - gets posted onto the Advanced Analytic Techniques blog each week.  Over the years, the blog has become increasingly popular and I certainly encourage everyone to take a look and, if you have a comment, join in!

Tuesday, September 20, 2011

Analyst's Cookbook, Volume 2, Now Available!

The Analyst's Cookbook, Volume 2, is out right now(!) and can be downloaded from Amazon.com to your Kindle or, if you don't have a Kindle, to one of the free Kindle readers for PC, iPhone, etc.

We went with a Kindle edition of the Cookbook this time around for all the reasons anyone goes to digital publishing -- it is less expensive for you to buy (only $4.99) and easier for us to manage than paper books.

For those of you familiar with the first Cookbook, thanks for your support ... and for waiting so long!  Your loyalty has made The Analyst's Cookbook the best selling book in MCIIS' inventory (it is now in its third printing!).

For those of you not familiar with the first volume of The Analyst's Cookbook (still available in hardcopy here), it is a series of short articles that outline the basics of a variety of different analytic techniques.  Each chapter was written by a different analyst and addresses one specific method or technique, provides a short description, a how-to, and a sense of the pros and cons of the method.  The second volume follows the same pattern.

What really makes the chapters interesting, though, is the experience each individual analyst had when they tried to apply the method to a particular problem.  In the past, these method/problem match-ups have made for some fascinating reading (like when one analyst applied the business methodology of benchmarking to European terrorist groups).  The current collection is no exception in this regard.

The real exception in this volume is that, while in the past the Cookbook was a venue to show off graduate student writing, this volume shows off graduate student editing as well.  It was put together almost entirely by the editor for the MCIIS Press, Nicole Pillar.

Finally, while we had many good suggestions for improving the format of the Cookbook over the years since Volume 1 was published, in the end we decided to stick with the less formal "cookbook" approach of Volume 1.  The goal for us is to capture the experience of using a particular analytic method on a real problem, to give the reader a sense of how these methods work.  The purpose is not to provide a definitive evaluation of one approach vs. another.  It is a starting place for thinking about analytic methods, not the end point.

I hope you enjoy the new Cookbook!

To purchase The Analyst's Cookbook, Volume 2:  Go here!
To download free Kindle Reader software:  Go here!

Friday, October 23, 2009

ONI's New "Hoo-ahh!" Video, Deconstructing Analysis Techniques, The Geography Of Job Loss And The Future Of Shopping (Link List)

Lots of interesting stuff crossing my desk this week:

  • The Office of Naval Intelligence has a new promotional (i.e. "hoo-ahh") video out.  It gives a brief overview of the ONI's new organizational structure and mission.  Many people don't think about ONI as an intel career option but they actually do some pretty cool stuff.  The video is certainly worth 5 minutes of your time (Note: It takes a few minutes to get started; I don't understand why these guys don't just upload these videos to YouTube...).  Also, if you are interested, see it quickly, as Matchbox Twenty's lawyers may slap a take-down notice on the ONI for unauthorized use of copyrighted material (not even a music credit, ONI?  That was cold...).
  • Visualizing information is a powerful way to communicate analysis.  A good example of this is Tip Strategies' infographic showing job loss and gain in the US from 2004-2009.  It is both stunning and depressing but clearly shows the value of a good visual (Sorry, no embed.  You will have to go to the site to see it).



Wednesday, June 10, 2009

Attention!! (Psyblog.org)

Made you look, eh?

Attention and the way we focus it are incredibly important aspects of intelligence analysis.

In the first place, there is some pretty good research to suggest that we learn what we attend to. Understanding where certain analytic methods focus our attention, then, allows us to determine what we might (or might not) learn from them. This, in turn, allows us to decide more rationally which method to choose for which situation.

Psyblog (via Elearnspace) has an interesting series of short articles that discuss these effects in much more detail. Specifically, these authors delve into seven additional aspects of attention:

  • The Cocktail Party Effect
  • The Attentional Spotlight
  • Learning To Multitask
  • Can Visual Attention Be Truly Divided?
  • 18 Ways Attention Goes Wrong
  • Attentional Blink And The Stream Of Consciousness
  • How Meditation Improves Attention
I am currently working on a research project designed to develop the hardware and software for a human-computer interface that allows analysts to "optimize" their attention while doing their job on the move. The centrality of attention emerged early in this project and I suspect that the more I look at it, the more important it is going to become in a wide variety of analytic tasks. This short series of posts is a very good introduction to the topic.

Friday, April 24, 2009

Amazing Resource On Intel Analysis Methods (FOR-LEARN)

One of the sites my students came across in the course of our studies into advanced analytic techniques is the very good FOR-LEARN online guide to analytic methods. It is a site that ought to be bookmarked by every analyst -- business, law enforcement and national security types included.

Put together under the auspices of the European Commission's Joint Research Center, FOR-LEARN seeks to guide "users throughout the critical steps of design, implementation and follow-up of a Foresight project and gives a description of the main methods that can be used." Simply replace the word "Foresight" in the previous sentence with "intelligence", and this resource becomes an invaluable (and free) starting point for research into all sorts of analytic methods.

There is a ton of good info here but the first golden nugget is the "methods table" the authors have put together (See static image below. Click on the image to go to the interactive version).

Each of the methods listed, in turn, has a broad, structured overview of the method, along with a brief how-to, its strengths and weaknesses, a case study and some links for additional research. The outline is remarkably similar to the one we used in The Analyst's Cookbook and continue to use in our Advanced Analytic Techniques course (clearly a case, by the way, of great minds thinking alike...).

The site contains much more than just methodologies, however. There are sections on how to scope and run an analytic project as well as extensive additional resources included on almost every page of the guide.

The guide does not contain everything, of course. There is a good bit more to many of these approaches than what the authors have chosen to cover here. There are also a number of intelligence methods that are not mentioned here. Despite these quibbles, it is an excellent product overall and deserves some attention from any serious analyst.

Wednesday, March 25, 2009

Advanced Analytic Techniques -- The Blog! (ADVAT.blogspot.com)

Yes, Advanced Analytic Techniques now has a blog. And you can all join in!

I am teaching a graduate seminar in Advanced Analytic Techniques this term. The core of the course is a series of student projects that hyperfocus on the application of a particular analytic technique (such as patent analysis or social network analysis) to a discrete topic (such as the political situation in Turkey or the future of oil and gas exploration in the Caspian Sea). The best of these projects wind up in The Analyst's Cookbook.

Each week, however, in addition to diving deep into these individual techniques and topics, we also work as a group to come to some conclusions about a number of other techniques. In preparation, each of the students selects, reads and summarizes two articles on whichever technique is under the microscope for the week.

They then post these summaries and links to the full text of the articles on our Advanced Analytic Techniques blog. Each Wednesday, we sit down and have a discussion about the readings. We also run a short exercise using the technique. From the combination of discussion and exercise, we try to answer four questions:

  • How do we define this technique?
  • What are the strengths and weaknesses of this technique?
  • How do you do this technique (Step by step)?
  • What was our experience like when we tried to apply this technique?
Once we think we have pretty good answers to these questions, we post what we have developed to the blog in order to capture our collective thinking on the technique in question.

Obviously, this only serves to familiarize the students with the technique under consideration. The blog format, however, permits us to open this series of exercises up to practitioners, academics and intel studies students at other institutions for comment and additional insights -- which is what I am doing with this post.

Last week we took a look at SWOT and this week we are examining Dialectics/The Socratic Method (summaries of the articles are posted now but the synthesis post will be up late this afternoon).

Don't hesitate to jump in!

Tuesday, May 6, 2008

"New" Version Of Psychology Of Intelligence Analysis Released (Pherson.org)

I don't know how I missed it but Dick Heuer and Randy Pherson (of Pherson Associates) apparently teamed up some time ago to come out with a re-print of Dick's classic, The Psychology Of Intelligence Analysis. We use it (along with Clark's Intelligence Analysis: A Target Centric Approach and Lowenthal's Intelligence: From Secrets To Policy) in our freshman classes as a textbook and have had to rely on the web version of the text on the CIA's site since the original version is out of print.

If you are not familiar with this book, then you owe it to yourself to read it immediately. Our students really enjoy it as it is easy to read, interesting and informative. Most of them are only vaguely aware of how their cognitive biases can impact their analysis and the book is a real eye-opener. It also contains the most lucid description of the Analysis Of Competing Hypotheses method available.

I have heard a rumor that Dick is working on an updated version of the book but until that one comes out you can order individual copies from Amazon or Pherson Associates and bulk orders from Pherson (You'll need to get in line behind me, though...).

Related Posts:
What Do Words Of Estimative Probability Mean?

Wednesday, April 16, 2008

Improving Intelligence Analysis: Two Methods Are Better Than One (USIP)

In what has to be one of the most lucid papers I have read in a long while, Jack Goldstone, writing for the US Institute Of Peace, clearly and concisely explains the value of using multiple independent analytic methods in forecasting instability.  His paper, titled "Using Quantitative And Qualitative Models to Forecast Instability" (download full text here), is a real gem that will be of use to academics, students and professionals.  Goldstone is a Professor at George Mason and also the Director of the Center For Global Policy (The Center appears to do some extremely interesting research but some of it is hidden, unfortunately, behind a paywall).

Highlights from the Summary include (Boldface, hyperlinks and italics are mine):

  • For most of the post–World War II period, policymakers and intelligence agencies have relied on experts to make qualitative judgments regarding the risk of instability or violent changes in their areas of study. Yet the inability of such experts to adequately predict major events has led to efforts to use social and analytical tools to create more “scientific” forecasts of political crises. (Note: Goldstone also cites the work of Phillip Tetlock favorably in this regard).
  • Because certain models have a demonstrated accuracy of over 80 percent in early identification of political crises, some have questioned whether such models should replace traditional qualitative analysis.
  • While these quantitative forecasting methods should move to the foreground and play a key role in developing early warning tools, this does not mean that traditional qualitative analysis is dispensable.
  • The best results for early warning are most likely obtained by the judicious combination of quantitative analysis based on forecasting models with qualitative analysis that rests on explicit causal relationships and precise forecasts of its own. (Note: Such an approach was explicitly used by my students in their project on the role of non-state actors in sub-Saharan Africa).
  • Policymakers and analysts should insist on a multiple-method approach, which has greater forecasting power than either the quantitative or qualitative method alone. In this way, political instability forecasting is likely to make its largest advance over earlier practices.
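
Goldstone's paper does not prescribe a single formula for the combination, but the simplest version of a multiple-method forecast is a weighted pool of the two judgments.  Here is a minimal sketch (the function, weights and probabilities are all invented for illustration):

def pooled_forecast(p_model, p_expert, w_model=0.5):
    """Linear opinion pool: a weighted average of a quantitative
    model's probability and a qualitative expert judgment.
    The weight is a policy choice, not taken from Goldstone."""
    return w_model * p_model + (1 - w_model) * p_expert

# Notional example: the model says 70% chance of instability,
# the country expert says 40%, and we lean toward the model.
print(round(pooled_forecast(p_model=0.70, p_expert=0.40, w_model=0.6), 2))  # 0.58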

Tuesday, December 11, 2007

New, Simple Prediction Market Tool (Predictify.com)

Prediction markets have been around for a long time and I have mentioned them here briefly before. Fundamentally, they operate like futures markets (some would say gambling establishments). In the simplest version of these systems, people essentially place bets on what the future will bring. The person who gets closest wins. In more complex systems, people can actually buy and sell the "bets".

The idea is that, if you have enough people involved, the going price will converge, over time, on the correct price. Of course, anything can be "valued" this way and probably the most famous predictive markets are the Iowa Electronic Markets. They have run a market on who will win the presidential election (among others) for a number of years and have been very successful at predicting the results. Currently you can buy futures -- i.e. contracts that will pay a dollar the day after the presidential election -- in the eventual Democratic candidate for about 60 cents and contracts for the eventual Republican candidate for about 40 cents. These prices predict, at this point, a Democratic victory because people are willing to pay more for a Democratic candidate than a Republican candidate while still only getting a dollar after the election.
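
Those prices translate directly into implied probabilities: a winner-take-all contract that pays $1, trading at 60 cents, implies roughly a 60 percent chance of that outcome.  A minimal sketch (function name invented for illustration):

def implied_probability(price_cents, payout_cents=100):
    """A winner-take-all contract's price, as a fraction of its
    payout, is the market's implied probability of that outcome."""
    return price_cents / payout_cents

print(implied_probability(60))  # Democratic contract at $0.60 -> 0.6
print(implied_probability(40))  # Republican contract at $0.40 -> 0.4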

The US government has played around with this idea (FutureMap was a predictive market idea that was linked to the ill-fated Total Information Awareness program which is a whole other story...) and probably still is in one form or another.

The goal of all these markets is to tap into the collective wisdom of many people to help make accurate predictions concerning the future or at least the odds that certain futures will occur. There are a number of books and papers that touch on this topic right now. The Wisdom of Crowds is one but my favorite is Gut Feelings.

That is a long preface to get to a new predictive market tool available at Predictify.com.  I have used it to set up a market in oil prices that you can see here.  It is an easy way to get input on discrete questions.  The team at Predictify helps you mold your question and make it more specific, in addition to helping you identify the exact source you will use (in my case, Bloomberg.com) to determine the winners and losers.  In all, it was painless and I already have over 50 "answers" to my question.

If you are interested in exploring the power of another predictive market, particularly one with a national security focus, see Strategypage.com. You can search for other predictive markets here. For general information about the prediction market industry (yes, it is an industry) follow the newly formed Prediction Market Industry Association.