A number of great sites (mostly library sites) offer good guidance on how to evaluate internet-based sources. In this post, I intend to list some of those sites, identify some up-and-coming tools, and, finally, highlight an important contribution to this literature that pertains specifically to intelligence analysis.
Virtually every good research library has a page dedicated to evaluating internet-based sources. Some good examples include:
- The Library of Congress
- Purdue's Library and Online Writing Lab
- U Cal Berkeley's Library
- and checklists from the New Mexico State University Library, the Milner Library at Illinois State, and the University of Wisconsin-Eau Claire's 10 C's
Beyond these resources, a couple of new automated tools are now available for checking the accuracy and reliability of some internet sites. Setting aside those that look for malware (such as McAfee's SiteAdvisor), there are two products I have found particularly interesting because they are primarily designed to examine content.
The first is SpinSpotter (a Firefox extension). SpinSpotter (which is "very" beta right now) allows you to annotate websites for "spin" and to view other people's evaluations of sites that have already been annotated. At some point (although it is unclear when), a computer algorithm will kick in, once it has learned enough about how to spot spin from thousands of readers' inputs, and begin to mark up pages automatically. That is when the tool will get really interesting...
The second project, WikiTrust, developed by the University of California, Santa Cruz WikiLab, is designed to take data from any MediaWiki-based product (such as Wikipedia or Intellipedia) and, in turn, automatically indicate how "trustworthy" the content of that wiki is. You can actually see a demo of it here, based on 2007 data. You can also download the software today and apply the trust algorithm to any MediaWiki-based wiki. The catch, of course, is that the person applying the code also has to control the wiki (Hmmm...I wonder if Intellipedia uses this...and why Wikipedia doesn't use it now...).
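To make the idea concrete, here is a minimal sketch of a content-survival trust heuristic in the spirit of WikiTrust. To be clear, this is my own simplification, not the WikiLab's actual algorithm: the function names, reputation values, and constants are all illustrative assumptions. The core intuition is that every later revision acts as an implicit review, so text that survives edits by reputable authors accumulates trust, while freshly inserted text starts at only a fraction of its author's reputation.

```python
# Minimal sketch of a WikiTrust-style, content-survival trust heuristic.
# Hypothetical simplification of the UCSC WikiLab approach: the gain and
# discount constants and the reputation values are illustrative only.

def update_trust(trust, author_reputation, gain=0.2):
    """One revision acting as an implicit review: surviving text moves a
    fraction of the way toward the reviewing author's reputation (trust
    never exceeds that reputation, and never decreases here)."""
    return [t + gain * max(0.0, author_reputation - t) for t in trust]

def trust_of_new_text(author_reputation, discount=0.5):
    """Freshly inserted text starts below its author's reputation."""
    return discount * author_reputation

# Toy usage: text inserted by a low-reputation (0.4) author survives
# review by three successively more reputable editors.
trust = [trust_of_new_text(0.4)] * 5        # five words, trust 0.2 each
for reputation in (0.5, 0.7, 0.9):          # later editors leave the text intact
    trust = update_trust(trust, reputation)
print([round(t, 2) for t in trust])         # trust rises toward ~0.46
```

In WikiTrust itself, author reputation is in turn derived from how well an author's past edits have survived, which is what closes the loop and makes the measure hard to game.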
None of these solutions, however, was designed specifically with the intelligence professional in mind. That changed recently with the online publication of Dax Norman's 2001 Joint Military Intelligence College/National Defense Intelligence College thesis, usefully titled How To Identify Credible Sources On The Web. Dax is the curriculum manager at the National Cryptologic School and one of the most intelligent and insightful people I know. Possessed of deep experience and a darn good mind, he has spent a good bit of time reflecting on how best to improve the analytic process. As a result, he is always worth listening to.
His thesis is well worth reading for anyone interested in the subject. While much has been done in the area of assessing internet sources (see above), his take-away -- a research-based checklist of key variables for assessing source reliability -- is as good today as it was in 2001.
If you are interested in the details of this scoring system, and how it was derived and validated, I will have to refer you to the thesis. Using the checklist, however, is dead easy. Just check the blocks, add up the total and compare it to the scale on top. While I am virtually certain that Dax would not claim that this checklist should replace analytic judgment, I do think that it is far better than a guesstimate or, even worse, no assessment of source reliability at all.
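To show just how mechanical that scoring step is, here is a toy sketch. Note that the criteria and cut-offs below are placeholders of my own invention; for the real checklist items and the validated scale, you will, again, have to go to the thesis.

```python
# Hypothetical sketch of checklist-style scoring: the criteria and the
# reliability scale below are placeholders, NOT the actual items or
# cut-offs from the thesis -- see the thesis itself for those.

CHECKLIST = [
    "author is identified",
    "author's credentials are verifiable",
    "site is regularly updated",
    "sources are cited",
    "content is free of obvious spin",
]

# Placeholder scale: minimum total score mapped to a reliability label.
SCALE = [(4, "high reliability"), (2, "moderate reliability"), (0, "low reliability")]

def score_source(checked):
    """Check the blocks, add up the total, compare it to the scale."""
    total = sum(1 for item in CHECKLIST if item in checked)
    for threshold, label in SCALE:
        if total >= threshold:
            return f"{total}/{len(CHECKLIST)}: {label}"

# Toy usage: a source meeting three of the five placeholder criteria.
print(score_source({
    "author is identified",
    "sources are cited",
    "site is regularly updated",
}))  # -> 3/5: moderate reliability
```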