
Thursday, March 31, 2011
How To Negotiate A Ceasefire (HDCentre.org)

Posted by Kristan J. Wheaton at 2:06 PM | 0 comments
Labels: ceasefires, Centre for Humanitarian Dialogue, document summary, intelligence, mediation
Friday, December 10, 2010
Intelligence Issues For Congress (OpenCRS.org)
One of my favorite sites for finding interesting new documents, Docuticker, recently highlighted an October 2010 Congressional Research Service (CRS) report, Intelligence Issues For Congress.
With a new Congress about to be sworn in and the balance of power shifting in the House, this list of issues is worth examining by anyone interested in the direction of the US national security intelligence community.
For those of you unfamiliar with the CRS, it is one of the most reputable sources of information and informed analysis currently available on the planet.
Unfortunately, it is exclusively for the use of members of Congress and, unless a member of Congress releases a report, the CRS's analysis does not see the light of day. Organizations like the Federation of American Scientists and OpenCRS.org pick up any reports that are made public and host them (which is how Docuticker got this one).
Below is my edited version of the summary (you can download the full PDF here). I cut out the stuff that I thought would be familiar to SAM's readers and have highlighted those parts (in bold) that I thought would be most interesting. Other than that, the words below are direct quotes:
- Making cooperation effective presents substantial leadership and managerial challenges. The needs of intelligence “consumers”—ranging from the White House to Cabinet agencies to military commanders—must all be met, using the same systems and personnel. Intelligence collection systems are expensive and some critics suggest there have been elements of waste and unneeded duplication of effort while some intelligence “targets” have been neglected.
- The DNI has substantial statutory authorities to address these issues, but the organizational relationships remain complex, especially for Defense Department agencies. Members of Congress will be seeking to observe the extent to which effective coordination is accomplished.
- International terrorism, a major threat facing the United States in the 21st century, presents a difficult analytical challenge, vividly demonstrated by the attempted bombing of a commercial aircraft approaching Detroit on December 25, 2009. Counterterrorism requires the close coordination of intelligence and law enforcement agencies, but there remain many institutional and procedural issues that complicate cooperation between the two sets of agencies.
- Techniques for acquiring and analyzing information on small groups of plotters differ significantly from those used to evaluate the military capabilities of other countries. U.S. intelligence efforts are complicated by unfilled requirements for foreign language expertise. Whether all terrorist surveillance efforts have been consistent with the Foreign Intelligence Surveillance Act of 1978 (FISA) has been a matter of controversy.
- Intelligence on Iraqi weapons of mass destruction was inaccurate and Members have criticized the performance of the intelligence community in regard to current conditions in Iraq, Iran, and other areas. Improved analysis, while difficult to mandate, remains a key goal. Better human intelligence, it is widely agreed, is also essential.
- Intelligence support to military operations continues to be a major responsibility of intelligence agencies. The use of precision guided munitions depends on accurate, real-time targeting data; integrating intelligence data into military operations challenges traditional organizational relationships and requires innovative technological approaches. Stability operations now underway in Afghanistan may require very different sets of intelligence skills.
Posted by Kristan J. Wheaton at 12:39 PM | 0 comments
Labels: Congress, Congressional Research Service, document summary, intelligence
Wednesday, November 17, 2010
Does The Future Belong To Robots? (Institute For the Future)
Posted by Kristan J. Wheaton at 2:09 PM | 0 comments
Labels: document summary, future, Institute For The Future, intelligence, intelligence analysis, robot, Robotics
Wednesday, October 27, 2010
US Slightly Less Corrupt Than Uruguay, Slightly More Corrupt Than Chile (Transparency International)
[Image: http://www.transparency.org]
Posted by Kristan J. Wheaton at 2:34 PM | 1 comment
Labels: document summary, Index, Political corruption, Resource, Transparency International
Friday, October 22, 2010
"Effective Intelligence Analysis Is A Concept-driven Activity Rather Than A Data-driven One" (DTIC)
"The most telling result of the research is the clear implication that intelligence analysis is conceptually driven as opposed to data driven. What is critical is not just the data collected, but also what is added to those data in interpreting them via conceptual models in the analyst's store of knowledge."
The report also sketches a profile of the successful analyst as one who:
- Is a technologist
- Is either a specialist or a generalist but not both
- Is an "information entrepreneur"
- Is comfortable with changing roles
- Can communicate (oral and written)
- Is a detective
- Is imaginative
- Is self-starting
- Has a profession (Intelligence analysis)
Posted by Kristan J. Wheaton at 1:47 PM | 1 comment
Labels: Cognition, Defense Technical Information Center, document summary, intelligence analysis, psychology, Scribd, United States Army Intelligence and Security Command
Monday, January 4, 2010
Heuer: How To Fix Intelligence Analysis With Structured Methods (NationalAcademies.org)
Richards Heuer (of Psychology Of Intelligence Analysis fame...) spoke last month at the National Academy of Sciences regarding his thoughts on how to improve intelligence analysis through the increased use of structured methods.
In the aftermath of the attempted bombing of Flight 253 on Christmas Day, it is worth reading Dick's words on how to improve the analytic side of the intelligence equation. I don't agree with everything he says (and say so in italicized parenthetical comments below) but he has been thinking clearly about these kinds of things for far longer than most of us. If you are concerned at all with reforming the way we do analysis, then this is a much better place to start than with all the noise being generated by the talking heads on TV.
I have embedded the whole document below this post or you can go to the National Academies site and listen to Dick's speech yourself. For those of you with too much to do and too little time, I have tried to pull out some of the highlights from Dick's paper below. I am not going to do it justice, though, so if you have the time, please read the entire document.
- "If there is one thing you take away from my presentation today, please let it be that structured analytic techniques are enablers of collaboration. They are the process by which effective collaboration occurs. Structured techniques and collaboration fit together like hand in glove, and they need to be promoted and developed together." (This is very consistent with what we see with our students here at Mercyhurst. Dick reports some anecdotal evidence to support his claim and it is exactly the same kind of thing we see with our young analysts).
- "Unfortunately, the DNI leadership has not recognized this. For example, the DNI’s National Intelligence Strategy, Enterprise Objective 4 on improving integration and sharing, makes no mention of improving analytic methods."
- "CIA, not the DNI, is the agency that has been pushing structured analysis. One important innovation at CIA is the development in various analytic offices of what are called tradecraft cells. These are small groups of analysts whose job it is to help other analysts decide which techniques are most appropriate, help guide the use of such techniques by inexperienced analysts, and often serve as facilitators of group processes. These tradecraft cells are a very helpful innovation that should spread to the other agencies." (Interesting. We called these "analytic coaches" and tried to get funding for them in our contract work for the government in 2005 -- and failed).
- "I understand you are all concerned about evaluating whether these structured techniques actually work. So am I. I’d love to see our methods tested, especially the structured analytic techniques Randy and I have written about. The only testing the Intelligence Community has done is through the experience of using them, and I think we all agree that’s not adequate." (I suppose this is the comment that bothers me at the deepest level. It implies, to me at least, that the IC doesn't know if any of its analytic methods work. What other 75-billion-dollar-a-year enterprise can say that? What other 75-billion-dollar-a-year enterprise wants to say that?)
- "Some of you have emphasized the need to test the accuracy of these techniques. That would certainly be the ideal, but ideals are not always achievable." (Here I have to disagree with Dick. Philip Tetlock and Bruce Bueno de Mesquita have both made progress in this area and there is every reason to think that, with proper funding, such an effort would ultimately be successful. Tetlock recommended as much in a recent article. The key is to get started. The amount of money necessary to conduct this research is trivial compared to the amount spent on intel overall. Likewise, the payoff is enormous. As an investment it is a no-brainer, but until you try, you will not know.)
- "Unfortunately, there are major difficulties in testing structured techniques for accuracy (for an outline of some of these, see my series of posts on evaluating intelligence), and the chances of such an approach having a significant favorable impact on how analysis is done are not very good. I see four reasons for this."
- "1. Testing for accuracy is difficult because it assumes that the accuracy of intelligence judgments can be measured."
- "2. There is a subset of analytic problems such as elections, when a definitive answer will be known in 6 or 12 months. Even in these cases there is a problem in measuring accuracy, because intelligence judgments are almost always probabilistic."
- "3. A third reason why a major effort to evaluate the accuracy of structured analytic techniques may not be feasible stems from our experience that these techniques are most effective when used as part of a group process."
- "4. If you are trying to change analysts’ behavior, which has to be the goal of such research, you are starting with at least one strike against you, as much of your target audience already has a firm opinion, based on their personal experience that they believe is more trustworthy than your research." (Here I have to disagree with Dick again. I think the goal of this research has to be to improve forecasting accuracy. If you can show analysts a method that has been demonstrated to improve forecasting accuracy -- to improve the analyst's "batting average" -- in real world conditions, I don't think you will have any problem changing their behavior.)
- "As with the other examples, however, the Intel Community has no organizational unit that is funded and qualified to do that sort of testing."
- "It (a referenced Wall St. Journal article) suggested that instead of estimating the likelihood that their plans will work, financial analysts should estimate the probability they might fail. That’s a good idea that could also be applied to intelligence analysis." (I am not sure why we can't do both. We currently teach at Mercyhurst that a "complete" estimate consists of both a statement of probability (i.e. the likelihood that X will or will not happen) and a statement of analytic confidence (i.e. how likely it is that you, the analyst, are wrong in your estimate).)
- "The kind of research I just talked about can and should be done in-house with the assistance of those who are directly responsible for implementing the findings." (I think that Dick is correct, that, at some point, it has to be done in-house. I do think, however, that the preliminary testing could be effectively done by colleges, universities and other research institutions. This has three big benefits. First, it means that many methods could be tested quickly and that only the most promising would move forward. Second, it would likely be less expensive to do the first stage testing in the open community than in the IC. Third, it allows the IC to extend its partnering and engagement activities with colleges, universities and research institutions.)
- "Our forthcoming book has two major recommendations for DNI actions that we believe are needed to achieve the analytic transformation we would all like to see."
- "1. The DNI needs to require that the National Intelligence Council set an example about the importance of analytic tradecraft. NIC projects are exactly the kind of projects for which structured techniques should always be used, and this is not happening now."
- "2. The second recommendation is that the DNI should create what might be called a center for analytic tradecraft."
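Several of the points above turn on whether probabilistic judgments can be scored at all. One standard approach from the forecasting literature (the one Tetlock's work relies on) is the Brier score: the mean squared error between the probabilities an analyst assigned and what actually happened. The sketch below is purely illustrative; the forecasts and outcomes are invented numbers, not data from any real study.

```python
# Illustrative only: scoring an analyst's probabilistic forecasts
# with the Brier score. All numbers below are invented.

def brier_score(forecasts, outcomes):
    """Mean squared error between predicted probabilities (0..1) and
    observed outcomes (0 or 1). Lower is better; 0.0 is a perfect score."""
    assert len(forecasts) == len(outcomes)
    return sum((f - o) ** 2 for f, o in zip(forecasts, outcomes)) / len(forecasts)

# Hypothetical probability judgments on five yes/no questions
# (e.g., "Will candidate X win the election?") ...
forecasts = [0.9, 0.7, 0.2, 0.6, 0.1]
# ... and what actually happened (1 = event occurred, 0 = it did not).
outcomes = [1, 1, 0, 0, 0]

print(round(brier_score(forecasts, outcomes), 3))  # 0.102
```

A scheme like this sidesteps the "judgments are probabilistic" objection in point 2: no single forecast is ever simply right or wrong, but an analyst's batting average over many forecasts is still measurable.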
Complete text below:
The Evolution of Structured Analytic Techniques -- Richards Heuer -- 8 DEC 2009
Posted by Kristan J. Wheaton at 8:01 AM | 0 comments
Labels: document summary, intelligence, intelligence analysis, Intelligence Community, psychology, Richards Heuer, Social Sciences