Showing posts with label document summary. Show all posts

Thursday, March 31, 2011

How To Negotiate A Ceasefire (HDCentre.org)

The Centre For Humanitarian Dialogue, located in Geneva, Switzerland, has done a really good job of pulling together a concise monograph called "Negotiating Ceasefires".  

Only 44 pages from start to finish (including endnotes and a comprehensive list of suggested additional readings), this guidebook is filled with practical advice, concise case studies and quotes from practitioners about the risks and rewards inherent in negotiating a ceasefire.

The author, Luc Chounet-Cambas (who has worked on these issues in Afghanistan, Indonesia and the Sudan), does not see the ceasefire process through rose-colored glasses:  "Negotiating ceasefires does not imply," he states up front, "that armed groups no longer see their military capability as a core source of leverage with the state."  This, instead, is a practical volume, written for someone who needs actionable advice.

Of most interest to me personally were the insights on when to negotiate a ceasefire (when the political situation suggests the ceasefire might be sustainable) and the checklist of things to think about putting into a ceasefire agreement (such as de-escalation measures and the extent to which the ceasefire will extend to non-military activities).  Finally, there is a must-read section on options for those thrust into the position of mediator.

I don't know much about the Centre for Humanitarian Dialogue but I intend to continue to watch their activities closely.  (For those interested, they have two other volumes in this series, Engaging With Armed Groups and A Guide To Mediation:  Enabling Peace Processes In Violent Conflicts.)

Negotiating Ceasefires is a good, solid volume filled with practical advice from someone who has had to do it, not just talk about it.   Recommended reading.

Friday, December 10, 2010

Intelligence Issues For Congress (OpenCRS.org)

One of my favorite sites for finding interesting new documents, Docuticker, recently highlighted an October 2010 Congressional Research Service (CRS) report, Intelligence Issues For Congress.

With a new Congress about to be sworn in and the balance of power shifting in the House, this list of issues is worth examining by anyone interested in the direction of the US national security intelligence community.

For those of you unfamiliar with the CRS, it is one of the most reputable sources of information and informed analysis currently available on the planet.

Unfortunately, it is exclusively for the use of members of Congress and, unless a member of Congress releases a report, the CRS's analysis does not see the light of day.  Organizations like the Federation Of American Scientists and OpenCRS.org pick up on any reports that are made public and host them (which is how Docuticker got this one).

Below is my edited version of the summary (you can download the full PDF here).  I cut out the stuff that I thought would be familiar to SAM's readers and have highlighted those parts (in bold) that I thought would be most interesting.  Other than that, the words below are direct quotes:

  • Making cooperation effective presents substantial leadership and managerial challenges. The needs of intelligence “consumers”—ranging from the White House to Cabinet agencies to military commanders—must all be met, using the same systems and personnel. Intelligence collection systems are expensive and some critics suggest there have been elements of waste and unneeded duplication of effort while some intelligence “targets” have been neglected.
  • The DNI has substantial statutory authorities to address these issues, but the organizational relationships remain complex, especially for Defense Department agencies. Members of Congress will be seeking to observe the extent to which effective coordination is accomplished.
  • International terrorism, a major threat facing the United States in the 21st century, presents a difficult analytical challenge, vividly demonstrated by the attempted bombing of a commercial aircraft approaching Detroit on December 25, 2009. Counterterrorism requires the close coordination of intelligence and law enforcement agencies, but there remain many institutional and procedural issues that complicate cooperation between the two sets of agencies.
  • Techniques for acquiring and analyzing information on small groups of plotters differ significantly from those used to evaluate the military capabilities of other countries. U.S. intelligence efforts are complicated by unfilled requirements for foreign language expertise. Whether all terrorist surveillance efforts have been consistent with the Foreign Intelligence Surveillance Act of 1978 (FISA) has been a matter of controversy.
  • Intelligence on Iraqi weapons of mass destruction was inaccurate and Members have criticized the performance of the intelligence community in regard to current conditions in Iraq, Iran, and other areas. Improved analysis, while difficult to mandate, remains a key goal. Better human intelligence, it is widely agreed, is also essential.
  • Intelligence support to military operations continues to be a major responsibility of intelligence agencies. The use of precision guided munitions depends on accurate, real-time targeting data; integrating intelligence data into military operations challenges traditional organizational relationships and requires innovative technological approaches. Stability operations now underway in Afghanistan may require very different sets of intelligence skills.

Wednesday, November 17, 2010

Does The Future Belong To Robots? (Institute For the Future)

The Institute For The Future does some interesting long-range analysis. In the past they have focused on a variety of issues, including health and food, but recently they decided to take on robots.

The method in this most recent effort seems to be a more or less straight-line extrapolation based on existing trends but, as with all deep-future work, one of the real benefits of the analysis is the mental model it provides of the question.

The Institute sees robots participating in our lives at three levels (see the embedded graphic below for more details or download the PDF). The first, automation, appears to be where we are now, with robots automating processes that were formerly done by humans. The second level is augmentation, where robots add to our existing capabilities, such as driving the car for us. The final level is understanding, where robots begin to interact with us in ways that are indistinguishable from the ways we interact with other humans.

The Institute is also very good at visualizing their data and this chart is no exception. I think visualizing the results of analysis is a pretty important skill for all analysts, so I always take a look at their stuff for new ideas. The small embed below may be difficult to read or navigate, so I strongly suggest downloading the PDF file so you can examine the style of the report more easily and in more detail.

Another thing that might be interesting to readers who don't track this technology very closely is how many examples of each of these levels (and in how many areas) the analysts at the Institute were able to find. It seems that the robot future may be closer than we think. It is thought-provoking analysis on many levels.



Wednesday, October 27, 2010

US Slightly Less Corrupt Than Uruguay, Slightly More Corrupt Than Chile (Transparency International)

http://www.transparency.org
Transparency International (TI), the anti-corruption watchdog, has just issued their 2010 findings regarding perceptions of government corruption worldwide.  The map to the right gives you a feel for the findings, but to get the full story (as well as maps you can actually read), you have to go to TI's website.

The US dropped to 24th (of 178) for its worst showing ever on the Corruption Perceptions Index (CPI). Previously, the worst performance was in 2006-07 when the US came in 20th. The best the US has ever placed was in 2001-02 when it was 16th.

Since the CPI has grown in terms of number of countries, perhaps a better way to look at a country is by way of the absolute score. Even here, though, the US has seen some degradation over the years. In 2001, the score was 7.6 (out of 10 - high numbers are better) while the 2010 score was just 7.1.

Still, the US is in the top 13% of the world. While there are a few seemingly counter-intuitive results (as my headline suggests), most of the least corrupt countries are in the OECD. The most corrupt places on the planet continue to be concentrated in Africa (Sudan, Chad, Burundi, Somalia) and Central Asia (Turkmenistan, Uzbekistan). Both Afghanistan (with a 1.4 out of 10) and Iraq (with a 1.5) are listed among the most corrupt countries on earth.
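The rank-versus-score distinction above is easy to see with a few lines of arithmetic. This is just a sketch using the figures quoted in this post (US scores of 7.6 in 2001 and 7.1 in 2010, and a 2010 rank of 24th out of 178); the variable names are mine, not TI's:

```python
# CPI figures quoted above (scores are out of 10; higher = less corrupt).
us_scores = {2001: 7.6, 2010: 7.1}
us_rank_2010, countries_2010 = 24, 178

# Rank shifts as the index adds countries; the absolute score is a
# steadier way to track one country's change over time.
score_change = us_scores[2010] - us_scores[2001]
percentile = us_rank_2010 / countries_2010

print(f"US score change 2001->2010: {score_change:+.1f}")
print(f"US 2010 rank: {us_rank_2010} of {countries_2010} "
      f"(top {percentile:.0%} of countries)")
```

Running this reproduces the "top 13%" figure and makes the half-point drop in the absolute score explicit.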

According to TI, "the surveys and assessments used to compile the index include questions relating to bribery of public officials, kickbacks in public procurement, embezzlement of public funds, and questions that probe the strength and effectiveness of public sector anti-corruption efforts."  A very complete run-down of their methods and sources is available on their website.

Friday, October 22, 2010

"Effective Intelligence Analysis Is A Concept-driven Activity Rather Than A Data-driven One" (DTIC)

"The most telling result of the research is the clear implication that intelligence analysis is conceptually driven as opposed to data driven. What is critical is not just the data collected, but also what is added to those data in interpreting them via conceptual models in the analyst's store of knowledge."
In other words, how you think is more important than what you know.  This is one of the big take-aways from Philip Tetlock's wonderful Expert Political Judgment and you would be forgiven if you thought I was just touting his 2005 book again.

No, the quote above is from a 1979(!) INSCOM-sponsored study into cognitive processes in intelligence analysis (called, amazingly enough, Cognitive Processes In Intelligence Analysis:  A Descriptive Model And Review Of The Literature).  It, and its companion piece, Human Processes In Intelligence Analysis:  Phase 1 Overview, are available through DTIC or you can download them here and here from Scribd.  I wish I could say that I found them on my own, but they come to me courtesy of Dalene Duvenage, who teaches intel analysis in South Africa, and the always useful IAFIE mailing list.

While much of the content in these two papers is probably more easily accessed by way of Dick Heuer's Psychology of Intelligence Analysis, there are some new gems here. One of my favorites is the reason for the studies.  

According to Dr. Joseph Zeidner, Chief Psychologist for the US Army in 1979, and MG William Rolya, the Commander of INSCOM at that time, "Intelligence collection systems have proliferated over the past several years, increasing in complexity and in volume of output.  However, there has been no corresponding improvement in the ability of intelligence personnel to analyze this flood of data."  Sound familiar?

Another interesting tidbit comes from the Human Processes paper which lays out the personality attributes of the ideal analyst.  These include:
  • Is a technologist
  • Is either a specialist or a generalist but not both
  • Is an "information entrepreneur"
  • Is comfortable with changing roles
  • Can communicate (oral and written)
  • Is a detective
  • Is imaginative
  • Is self-starting
  • Has a profession (Intelligence analysis)
These criteria seem to strike a chord as well.  All in all, both papers are worth a look, if only because they seem to prove that the more things change, the more they stay the same...

Monday, January 4, 2010

Heuer: How To Fix Intelligence Analysis With Structured Methods (NationalAcademies.org)

Richards Heuer (of Psychology Of Intelligence Analysis fame...) spoke last month at the National Academy of Sciences regarding his thoughts on how to improve intelligence analysis through the increased use of structured methods.

In the aftermath of the attempted bombing of Flight 253 on Christmas Day, it is worth reading Dick's words on how to improve the analytic side of the intelligence equation. I don't agree with everything he says (and say so in italicized parenthetical comments below) but he has been thinking clearly about these kinds of things for far longer than most of us. If you are concerned at all with reforming the way we do analysis, then this is a much better place to start than with all the noise being generated by the talking heads on TV.

I have embedded the whole document below this post or you can go to the National Academies site and listen to Dick's speech yourself. For those of you with too much to do and too little time, I have tried to pull out some of the highlights from Dick's paper below. I am not going to do it justice though, so, if you have the time, please read the entire document.

  • "If there is one thing you take away from my presentation today, please let it be that structured analytic techniques are enablers of collaboration. They are the process by which effective collaboration occurs. Structured techniques and collaboration fit together like hand in glove, and they need to be promoted and developed together." (This is very consistent with what we see with our students here at Mercyhurst. Dick reports some anecdotal evidence to support his claim and it is exactly the kind of thing we see with our young analysts).
  • "Unfortunately, the DNI leadership has not recognized this. For example, the DNI’s National Intelligence Strategy, Enterprise Objective 4 on improving integration and sharing, makes no mention of improving analytic methods."
  • "CIA, not the DNI, is the agency that has been pushing structured analysis. One important innovation at CIA is the development in various analytic offices of what are called tradecraft cells. These are small groups of analysts whose job it is to help other analysts decide which techniques are most appropriate, help guide the use of such techniques by inexperienced analysts, and often serve as facilitators of group processes. These tradecraft cells are a very helpful innovation that should spread to the other agencies." (Interesting. We called these "analytic coaches" and tried to get funding for them in our contract work for the government in 2005 -- and failed).
  • "I understand you are all concerned about evaluating whether these structured techniques actually work. So am I. I’d love to see our methods tested, especially the structured analytic techniques Randy and I have written about. The only testing the Intelligence Community has done is through the experience of using them, and I think we all agree that’s not adequate." (I suppose this is the comment that bothers me at the deepest level. It implies, to me, at least, that the IC doesn't know if any of its analytic methods work. What other 75 billion dollar a year enterprise can say that? What other 75 billion dollar a year enterprise wants to say that?)
  • "Some of you have emphasized the need to test the accuracy of these techniques. That would certainly be the ideal, but ideals are not always achievable." (Here I have to disagree with Dick. Philip Tetlock and Bruce Bueno De Mesquita have both made progress in this area and there is every reason to think that, with proper funding, such an effort would ultimately be successful. Tetlock recommended as much in a recent article. The key is to get started. The amount of money necessary to conduct this research is trivial compared to the amount spent on intel overall. Likewise, the payoff is enormous. As an investment it is a no-brainer, but until you try, you will not know.)
  • "Unfortunately, there are major difficulties in testing structured techniques for accuracy, (for an outline of some of these, see my series of posts on evaluating intelligence) and the chances of such an approach having a significant favorable impact on how analysis is done are not very good. I see four reasons for this."
  • "1. Testing for accuracy is difficult because it assumes that the accuracy of intelligence judgments can be measured."
  • "2. There is a subset of analytic problems such as elections, when a definitive answer will be known in 6 or 12 months. Even in these cases there is a problem in measuring accuracy, because intelligence judgments are almost always probabilistic."
  • "3. A third reason why a major effort to evaluate the accuracy of structured analytic techniques may not be feasible stems from our experience that these techniques are most effective when used as part of a group process."

  • "4. If you are trying to change analysts’ behavior, which has to be the goal of such research, you are starting with at least one strike against you, as much of your target audience already has a firm opinion, based on their personal experience that they believe is more trustworthy than your research." (Here I have to disagree with Dick again. I think the goal of this research has to be to improve forecasting accuracy. If you can show analysts a method that has been demonstrated to improve forecasting accuracy -- to improve the analyst's "batting average" -- in real world conditions, I don't think you will have any problem changing their behavior.)
  • "As with the other examples, however, the Intel Community has no organizational unit that is funded and qualified to do that sort of testing."
  • "It (a referenced Wall St. Journal article) suggested that instead of estimating the likelihood that their plans will work, financial analysts should estimate the probability they might fail. That’s a good idea that could also be applied to intelligence analysis." (I am not sure why we can't do both. We currently teach at Mercyhurst that a "complete" estimate consists of both a statement of probability (i.e. the likelihood that X will or will not happen) and a statement of analytic confidence (i.e. how likely it is that you, the analyst, are wrong in your estimate).)
  • "The kind of research I just talked about can and should be done in-house with the assistance of those who are directly responsible for implementing the findings." (I think that Dick is correct, that, at some point, it has to be done in-house. I do think, however, that the preliminary testing could be effectively done by colleges, universities and other research institutions. This has three big benefits. First, it means that many methods could be tested quickly and that only the most promising would move forward. Second, it would likely be less expensive to do the first stage testing in the open community than in the IC. Third, it allows the IC to extend its partnering and engagement activities with colleges, universities and research institutions.)
  • "Our forthcoming book has two major recommendations for DNI actions that we believe are needed to achieve the analytic transformation we would all like to see."
  • "1. The DNI needs to require that the National Intelligence Council set an example about the importance of analytic tradecraft. NIC projects are exactly the kind of projects for which structured techniques should always be used, and this is not happening now."
  • "2. The second recommendation is that the DNI should create what might be called a center for analytic tradecraft."


Complete text below:

The Evolution of Structured Analytic Techniques -- Richards Heuer -- 8 DEC 2009