Monday, January 9, 2012

What Makes An Easy Question Easy? (DAGGRE.org)

For most of last year I have had the privilege of working with the DAGGRE Team on the Intelligence Advanced Research Projects Activity's (IARPA's) Aggregative Contingent Estimation (ACE) Project.  While all the real scientists have been busy exploring research questions involving Bayesian networks and combinatorial markets, this old soldier has been focusing on more mundane things like "What makes an easy question easy?"

(Note:  If you have not had a chance to check out the DAGGRE.org site and its mind-numbingly cool companion blog, Future Perfect, you should.  Three reasons:  First, it is pretty interesting research that could impact the future of the intel community.  Second, you can actually participate in it.  Third (and maybe most important to many of the readers of this blog),  the DAGGRE team has gone the extra mile to make sure your personal data, etc is secure while participating (check out the FAQ page for all the details)).
As I explained in this post, having some way to evaluate and even rank intelligence requirements according to difficulty is important.  Analysts are supposed to be accurate, but if you aren't also evaluating the difficulty of the underlying question, two equally accurate analysts could be miles apart in terms of overall quality.  It would be kind of like saying a little leaguer who hits .400 is as good as a major leaguer hitting .400.

While that distinction is easy to see in baseball, it is much more difficult in intelligence analysis.  Questions come in all shapes and sizes and vary in an enormous number of ways.  There is also a psychological, subjective aspect to it:  Questions that seem tough to some analysts may seem very easy to others.  On its face, it appears difficult if not impossible to come up with a system that can reliably evaluate and categorize questions by difficulty level.

Which is why I want to try.

And I may be making some progress.  I think I have figured out how to spot an "easy" question.  DAGGRE, you see, is a predictive market.  This means that people assign probabilities to the outcomes of questions.  Imagine, for example, I asked if Sarkozy would still be the president of France on 1 JUN 2012 (He is running for re-election in April and May).  Now imagine that you thought the odds of Sarkozy's re-election were 80%.  You could establish your position in the market at that "price" and others would be free to do the same (The Iowa Electronic Markets do this for the US election, by the way).

The market would reward people who were right on 1 JUN and would heavily reward those who were right when lots of others were wrong.  Studies have shown that, on average, these types of markets are pretty good at making these kinds of estimates.

Now, imagine if I asked you to estimate the chances that Sarkozy would still be president of France by 1700 tomorrow?  Sarkozy is not sick (at least I hope he is not) and there are no direct, immediate threats to his presidency.  There is no reason to expect that he would not still be president tomorrow. Likewise, successfully predicting that he will still be in office tomorrow is no sign of great analytic ability.  The question is too easy.

Generalizing this pattern, I think it is worth exploring the idea that "easy" questions are those that start and end their run on a predictive market close to either 0% or 100% probability, do not vary much during the course of that run and, finally, resolve in accordance with their probabilities (i.e. they happen if close to 100% and don't happen if close to 0%.  See the picture that accompanies this post for an idea of what such patterns might look like).  Furthermore, I think that these kinds of questions will see much less trading activity than other ("not easy") questions.
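The pattern described above can be sketched in code.  This is a minimal, hypothetical illustration of my working definition, not anything DAGGRE actually runs; the thresholds (how close to 0% or 100% counts as "close," how much variation is "not much") are assumptions I picked for the example.

```python
# Hypothetical sketch of the "easy question" pattern described above.
# A question's market history is a list of probabilities in [0, 1];
# `occurred` records whether the event actually happened at resolution.
# The `edge` and `max_swing` thresholds are illustrative assumptions.

def is_easy_question(prices, occurred, edge=0.1, max_swing=0.15):
    """True if the history matches the 'easy' pattern: starts and ends
    near 0% or 100%, does not vary much during its run, and resolves
    in accordance with its probabilities."""
    start, end = prices[0], prices[-1]
    near_one = start >= 1 - edge and end >= 1 - edge
    near_zero = start <= edge and end <= edge
    if not (near_one or near_zero):
        return False
    # "Do not vary much during the course of that run"
    if max(prices) - min(prices) > max_swing:
        return False
    # "Resolve in accordance with their probabilities"
    return occurred if near_one else not occurred

# A Sarkozy-still-president-tomorrow style question: stays near 100% and happens
print(is_easy_question([0.97, 0.98, 0.99, 0.99], occurred=True))  # True
# A contested question that swings widely is not "easy"
print(is_easy_question([0.55, 0.30, 0.80, 0.95], occurred=True))  # False
```

Note that this check can only be applied after the question resolves, which is exactly the after-the-fact limitation discussed below.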

Of course, the problem with this definition is that it only identifies easy questions after the fact, after the question has been resolved.  My hope, however, is that by examining the set of questions that we already know are easy (at least under this definition), that we might be able to see other patterns that will allow us to identify easy questions when they are asked rather than only after they are answered.

Our first attempt to get at these patterns (I say "our" because I am working on this with one of our superb grad students, Brian Manning) will be a simple one -- question length.  We hypothesize that, on average, questions that match the "easy" pattern I described above will be shorter than other questions.  When you think about it, it makes some sense.  After all, "What time is it?" seems like an easier question to answer than "What time is it in Nigeria?"
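The length hypothesis itself is simple enough to sketch.  The question texts and groupings below are invented for illustration (they are not DAGGRE data); the point is just that the test reduces to comparing mean word counts across the two groups.

```python
# Illustrative sketch of the length hypothesis: "easy" questions should be
# shorter, on average, than "not easy" ones.  These example questions and
# their labels are invented, not real DAGGRE market data.

def word_count(question):
    return len(question.split())

easy_questions = [
    "Will Sarkozy be president of France tomorrow?",
]
not_easy_questions = [
    "Will Sarkozy win re-election as president of France "
    "in the second round of voting in May 2012?",
]

mean_easy = sum(map(word_count, easy_questions)) / len(easy_questions)
mean_hard = sum(map(word_count, not_easy_questions)) / len(not_easy_questions)

# The hypothesis predicts the "easy" group has the smaller mean length
print(mean_easy < mean_hard)  # True for this toy data
```

With real market data, the next step would be to test whether that difference in means is statistically meaningful rather than just eyeballing it.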

Brian has found some research that says that, subjectively, people don't perceive longer questions as necessarily more difficult.  The difference, of course, is that we have a definition of "easy" that is based on objective criteria.  Still, I think it best to start with the easiest possible measurement and then go from there.  Not sure where I will end up or if this will be a dead end but I will keep you posted...

Wednesday, December 14, 2011

Is ISI Really The Best Intelligence Agency In The World?

According to the National Post, Canada's conservative newspaper, it is.

That is just one of the interesting tidbits reported in this graphic titled, The State of The Global Spy Game (Download the PDF here).

Following Pakistan's ISI is Mossad in the number 2 slot, with MI6 taking third and the CIA in fourth place.

In addition to the Top Ten list, most of the graphic outlines a series of assassinations, explosions, spying, cyber spying and "convenient accidents" that the Post ties to various intelligence organizations over the last ten years.

Finally, there are some charts which claim to be based on some of Richards Heuer's work regarding the demographics of spies:  where they come from in government, how much education they have, etc.  The graphic provides no comparative data to show whether any of the categories identified are larger or smaller than they are in the relevant population from which they are drawn, so it is difficult to draw conclusions, but it is intriguing nonetheless.

Given the nature of the article and difficulty associated with making these kinds of judgements, I am not surprised at the results but it is still an interesting question to ask:  Who has the world's best intel service? 

(Hat tip to Christophe Deschamps at Outils Froid and his must follow Twitter feed!)

Friday, December 9, 2011

1st Annual Entry-level Intel Analyst Jobs Report Out Now!

How good is the job market for entry-level intelligence analysts over the next 12 months? 

Good question, right?  If you are a recent graduate from college or you are graduating in 2012, and you are interested in working as an analyst in the US national security intelligence community, it is probably one of the questions you are asking yourself.

The answer is contained in this document.

We tasked one of our outstanding grad students, Whitney Bergendahl, with examining this question back in the early fall.  He put together a survey (which some of you may remember) and conducted some fairly extensive secondary research to put together this report.  It is obviously a tough nut to crack, but Whitney has done yeoman's work on this first-ever job market report for entry-level intel analysts.

Whitney and I are both interested in your feedback, of course.  After you have had a chance to read the report, please leave a comment!

But wait, there's more!

(I know that sounds cheesy but there is, in fact, more...)

This report only covers the job market for entry-level analysts in the national security intelligence community.  Between now and the end of February, we hope to publish two other reports on the job markets for entry-level law enforcement intelligence analysts and for entry-level intelligence analysts in the business community.

But wait!  There's even more!

(Had to do it...)

Benjamin Wittorf, who publishes occasionally on the blog, Netzwerk-Organisatorische Formen, but is probably best followed via Twitter and makes a living as a researcher for eVenture Capital Partners, has turned my series of blog posts, How To Get A Job In Intelligence, into an epub for easy (and free!) download. (Note:  I am sure there is something clever I could say here about "the kindness of strangers" but I can't think of it so I will just say, Thanks, Ben!)

While this series is a little old, I think much of the general guidance ought to still be good.  If you want to read it, you will, of course, need to have an epub reader to access it.  If you don't, there is also a pdf version or, of course, you can still access the original series on this blog.

Tuesday, December 6, 2011

If You Think You Understand The Role Of Social Media In The Arab Spring Uprisings (And Particularly If You Don't), Watch This Video...


This video is an hour long and, frankly, I didn't think I would have the time this morning.  I started watching nonetheless and became riveted by one of the most cogent explanations of the role of social media in activism I have heard.  Even if you disagree (and this is not my area of expertise, so I hope those who do disagree will do so in the comments so we can all learn), it is well worth the hour it takes to watch.

(Many thanks to my friends at Sharp for this!)

Thursday, December 1, 2011

Testing A New Mindmapping Tool...And You Can Join In! (Popplet.com)

One of our amazing alums (Thanks, Justin!) sent me a link to a beta version of a new, web-based, collaborative mindmapping tool called Popplet.

I have been playing around with it for the last half hour or so and found it easy to use and potentially very useful.

What I don't know is how well it works as a collaborative tool (which is where my real interest lies -- there is plenty of good stand-alone mindmapping software), so I thought I would throw it out there for anyone to examine (Popplet makes this simple with an embed code.  Yeah!). 

If you are interested in trying it out, however, you will need to drop me an email (kwheaton at Mercyhurst dot edu) and I will send you an invite.

These kinds of open, collaborative tools that are easy to set up and quick to learn are great for classroom exercises; they are interactive and engaging.  In my experience, students love them (If you are looking for another example, try Willyou.typewith.me)

It is also a great way to build a mental model of an intelligence problem. This app is in beta though and has no way (that I could find) to safely share or export the data. There is an offline reader application called Popplet Presenter which would allow a single individual to show his/her work securely (-ish) to others, I suppose.

I suspect these features are coming (offering these features for a modest price is the way most of these kinds of apps, like Mindmeister or Webspiration, make their money) but until then, this is probably best confined to the classroom.