Wednesday, July 17, 2013

Do Intel Analysts Believe Info Is Good Just Because It's Secret?

(Note:  This post introduces a new author to Sources and Methods, Melonie Richey.  Mel and I will be working on a number of projects over the next year focused on the intersection of game-based learning, cognitive bias and intelligence analysis.  In her first post for SAM, she introduces us to a new form of bias...)

Heuristics (or "rules of thumb") and biases influence both decisions and the analysis on which they are based.

This is an inescapable fact.

Anchoring bias sometimes causes us to hyperfocus on irrelevant information; confirmation bias leads us to marry ourselves to the first probable conclusion we reach; and the bias blind spot allows us to do all of this while operating under the assumption that we, as trained analysts and generally educated people with inquisitive minds, are not biased at all.


These biases are extensively studied and appear in interdisciplinary literature ranging from Wizards Under Uncertainty: Cognitive Biases, Threat Assessment, and Misjudgments in Policy Making (cognitive biases in terms of Harry Potter) to The Big Idea (Harvard Business Review) to The Mind’s Lie.

A recent paper entitled The Secrecy Heuristic, by Mark Travers, Leaf Van Boven, and Charles Judd of the University of Colorado, presents research substantiating a new heuristic that likely affects both intelligence professionals and decision makers: the idea that we “infer quality from secrecy” when it comes to intelligence analysis.  In other words, we give the same information more value just because it is secret.

The paper presents three reasons we fall victim to the Secrecy Heuristic and outlines the experimental evidence that validates the presence of this heuristic in evaluations of information quality.
  • “First, secret information is sometimes genuinely better information than public information, particularly in strategic contexts.” As the article elaborates, secret information is sometimes valuable precisely because it is secret (think of financial investors and stock prices: the information is valuable only because few people are privy to it, which is where the whole concept of insider trading originates). The fact that secret information is sometimes better leads us to associate that quality with all secret information.
  • “Second, people may view secret information as being of higher quality than public information because of personal experience with their own and others’ secrets.” Think about gossip. In a social context, what is a secret? Usually, it is something personal or embarrassing that an individual doesn’t want everyone else to know, but in which everyone else (here meaning the immediate social network) would likely take interest.  Again, this reinforces our bias in favor of secret information.
  • “Finally, governments often behave, in foreign policy contexts, as though secret information is valuable and of high quality.” After all, we have multiple intelligence agencies with hundreds of thousands of employees dedicated to secrecy, both maintaining it at home and uncovering it abroad. Uncovering secrets led, as the article references, to tracking down those responsible for the Pan Am Flight 103 bombing, the 1993 World Trade Center bombing and Osama bin Laden. It’s not hard to see why secrets are so… well, secret.  This, in turn, reinforces our bias in favor of secret information even when non-secret information is just as valid.
The argument in The Secrecy Heuristic is that “these factors may lead people to use informational secrecy as a cue to infer informational quality” and, lo and behold, they found just that.

In the first of three experiments, the researchers tested whether people weighed secret information more heavily than public information in hypothetical foreign policy recommendations. In the second and third experiments, they tested whether secret information is perceived as higher quality than public information; the third experiment also examined how secrecy affected how favorably a foreign policy recommendation was rated.

In the aggregate, “secrecy led to higher judgments of information quality.” For example, on a scale of 1 to 11, the mean judgment of secret information quality was 7.46, while the mean judgment of quality for public information was only 6.93.
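For the statistically curious, here is a minimal sketch of how a difference of means like this one is typically tested. The two means come from the paper; the standard deviations and sample sizes below are assumptions for illustration only, since they are not reported in this post.

```python
# Minimal sketch of a two-sample difference-of-means test.
# Means (7.46 vs. 6.93) are from the paper; the standard deviations
# and group sizes are ASSUMED here purely for illustration.
from scipy.stats import ttest_ind_from_stats

t_stat, p_value = ttest_ind_from_stats(
    mean1=7.46, std1=1.2, nobs1=50,  # "secret" condition (std/n assumed)
    mean2=6.93, std2=1.2, nobs2=50,  # "public" condition (std/n assumed)
)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")  # with these assumptions: t ~ 2.2, p ~ 0.03
```

Whether the observed 0.53-point gap is statistically meaningful depends entirely on those spreads and sample sizes, which is exactly the question raised in the comments below.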

In short, secrecy does not necessarily equate to importance, relevance or reliability - we just think it does.

(H/T to Tammy G for pointing us towards this article!)

10 comments:

Leslie G said...

I didn't know there was a book about this. Thanks for highlighting it. I'd love for these experiments to also be conducted for leaders in intel agencies and for politicians. Then, as a follow up, an explanation of how, when and why to label items secret can be covered. My impression is that it's an overused tool. I'd be interested in knowing what other analysts believe.

Dick Heuer said...

I question the accuracy of calling this a heuristic. A heuristic or bias is something that causes you to be wrong considerably more frequently than being right. Where is the evidence that placing more confidence in secret information than in open information is wrong? What is your basis for saying that a classified report on event xyz is less likely to be accurate than a newspaper report on the same situation?

Kristan J. Wheaton said...

Dick,

I am not sure the authors are saying that secret info is more or less likely to be correct. I think their experiments showed that when people are presented with the exact same info, told one time that it is "secret" and another time that it is public, they consider the secret info inherently more reliable. There is, in short, a bias in favor of secret information in humans, regardless of the objective reliability of the info.

Kris

Dick Heuer said...

Kris,
As I understand it, there is a little difference between heuristic and bias. A heuristic is a quick and easy way of making a decision based on past experience. It doesn't necessarily have to be right or wrong. A bias is a systematic error, something that is usually wrong. It's a pejorative term. You shouldn't make a biased judgment. To describe a preference for classified over unclassified information as bias, I believe you have to show that the unclassified information is more likely to be correct than the classified information.

Dick Heuer

ctwardy said...

DH,

I'd rather identify "heuristic" or "bias" with potential for error, not actual error rate. It may be that they help more than they hurt.

Kahneman & Tversky noted that the heuristics and biases they identified usually work well, and Gigerenzer goes so far as to call them adaptive: "simple heuristics that make us smart". Kevin Korb has shown formally that what we usually teach as "fallacies" are often very good Bayesian arguments.

This doesn't reduce the need for caution -- we might expect the worst biases to be the ones that normally lead us to the right answer, because we'll be so much more surprised when they fail us.

Maybe in an analytical setting the traditional biases normally mislead. But then maybe we can take a cue from Mercier & Sperber and adapt the setting to use the bias to our advantage.

In many settings there may be good reason to favor secret info -- but I'd wager that if so, it's because secrecy has typically been a good proxy for reliability or timeliness. So favoring it is another useful heuristic that can go wrong if those connections are weakened -- and they will be weakened if "secret" is overused, e.g. by "erring on the side of caution" when labeling. A toy sketch below makes the point concrete.
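To see it in numbers, here is a toy Bayesian sketch; every probability in it is assumed purely for illustration:

```python
# Toy Bayesian model of "secret" as a cue for quality.
# ALL probabilities below are assumed for illustration only.
def p_reliable_given_secret(p_reliable, p_secret_if_reliable, p_secret_if_not):
    """Bayes' rule: P(reliable | labeled secret)."""
    p_secret = (p_secret_if_reliable * p_reliable
                + p_secret_if_not * (1 - p_reliable))
    return p_secret_if_reliable * p_reliable / p_secret

prior = 0.5  # assumed base rate of reliable reports

# Disciplined labeling: "secret" tracks quality, so the cue is informative.
print(p_reliable_given_secret(prior, 0.8, 0.2))   # -> 0.80

# Overused labeling: nearly everything gets stamped secret; the cue tells us almost nothing.
print(p_reliable_given_secret(prior, 0.95, 0.90)) # -> ~0.51
```

The heuristic is only as good as the gap between those two conditional probabilities.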

Kel McClanahan said...

So what you're saying is that all the overclassification is actually a clever Jedi mind trick to make it so our enemies can't figure out what is actually important because we classify everything. Pretty sneaky...

Anonymous said...

The "egomaniac bias" probably affects everyone at some point. Some research has discovered that the natural condition for people is to share information - as much as possible, as soon as possible (it's likely that it has some utility when it comes to group coordination and survival) - and particularly intense concentration has to be employed in order not to do so. The pathologies arising from this have been well documented - they are aptly called "intelligence failures." One may well be wrongly tempted to conclude that one who is able to successfully maintain the ruse is a cognitive superstar, for this arrogance produces blind spots. But these cognitive resources are not unlimited, and eventually humility is delivered through some Higher Power. Generally, as principles for analytic success, we hope that there's 1) sufficient redundancy to overcome the cognitive fatigue concern, and 2) sufficient diversity to overcome the arrogance concern. Secrecy generally operates against these principles.

A possible corollary to Sun Tzu's maxim that "no nation in history ever benefited from protracted conflict" might be "no nation in history ever benefited from prolonged secrecy" - in other words, operations security, or "the probability of operational success over time," will probably look something like e^-x as a function of the duration of the operation. Minimizing the possibilities of failure automatically increases the chances for success (this should be intuitive). Although secrecy is naturally unsustainable, there are two possible ways to optimize operations security: 1) reduce the size of the pool of those who possess high-level security clearances, or 2) classify less and declassify sooner. If you opt for the former, you will run afoul of the two analytic success principles above. This framework should restore some humility to intelligence and establish some natural limits on what intelligence can do and do successfully.
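One toy model (with all parameters assumed) that produces exactly that e^-x shape: treat each cleared person as an independent, small leak risk per unit time.

```python
# Toy model of secrecy decay. ALL parameters are assumed for illustration.
# If each of n cleared people independently leaks with small probability p
# per month, secrecy survives t months with probability roughly
# (1 - p)**(n * t) ~= exp(-n * p * t): exponential decay whose rate
# scales with the size of the cleared pool.
import math

def p_secrecy_intact(n_cleared, p_leak_per_month, months):
    return math.exp(-n_cleared * p_leak_per_month * months)

for n in (10, 100, 1000):  # shrinking the pool flattens the decay curve
    print(n, round(p_secrecy_intact(n, 0.001, 12), 3))  # -> 0.887, 0.301, 0.0
```

Note how the model also captures the first solution above: cutting the cleared pool by a factor of ten stretches the expected secrecy half-life by the same factor.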

Emmett Fitz-Hume said...

I can't get to a copy of the article, but I wonder how meaningful the difference between 7.46 and 6.93 is. I assume the result passes a difference-of-means test since it was published in a journal, but I wonder how close the result came to missing statistical significance. Any insights into that someone can share?

Melonie Richey said...

The article actually encompassed three studies. The 7.46 and 6.93 numbers came from the second study and were reported to be statistically significant at the 0.05 level (p = .024), so the result only narrowly missed being statistically significant at the 0.01 level.

Studies 1 and 3 also reported statistically significant results at the 0.05 level, with the majority significant at the 0.01 level.

The article also reported a statistically significant correlation between "decision quality" and "information quality" (r = .41, p = 0.003).
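For anyone who wants to check the arithmetic, here is a minimal sketch of how a correlation's p-value is computed from r and the sample size. The sample size below is an assumption (it is not reported here); n = 50 happens to land near the reported p = 0.003.

```python
# Minimal sketch: two-tailed p-value for a Pearson correlation.
# The sample size n is ASSUMED; only r = .41 comes from the article.
import math
from scipy.stats import t as t_dist

def correlation_p_value(r, n):
    """Two-tailed p-value for Pearson r with n observations."""
    t_stat = r * math.sqrt(n - 2) / math.sqrt(1 - r**2)
    return 2 * t_dist.sf(abs(t_stat), df=n - 2)

print(correlation_p_value(0.41, 50))  # ~0.003 with the assumed n
```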

Unknown said...

With all due respect to Mr. Heuer, not only are heuristics and biases entirely different, but neither is inherently good or bad. Only our thoughtless application of them is wrong. A heuristic is a guide to exploration, discovery, or problem-solving, consciously chosen for application in a given context. A bias is a slant or inclination in our thinking. It is impossible for us not to have biases. They are essential for human functioning and even survival. Through critical thinking, we choose heuristics wisely, and check our biases to determine how we are thinking, and the appropriateness of our specific thoughts to the task at hand. We take into account our assumptions and point of view. Rather than rejecting our biases, we should embrace them, for they are the basis of insight. Blindly following biases is wrong, but acknowledging them and actively choosing whether to go where they lead us is critical thinking.