Thursday, December 10, 2009
According to Henley-Putnam: "Author and Henley-Putnam adjunct professor Thomas B. Hunter will provide an introduction to careers in intelligence analysis, including a discussion of counterterrorism, human factors in terrorism, weapons systems, detainee support and Homeland Security. He will also offer a breakdown of the different agencies and their missions. Prior to joining Henley-Putnam, Mr. Hunter served as an intelligence officer with the Defense Intelligence Agency (DIA), where he specialized in a variety of analytical areas, including Homeland Security, Detainee Support, and South American narcoterrorism."
Sunday, December 6, 2009
Nope, we didn't win the DARPA Balloon Challenge. These guys did and congratulations to them. It was clearly a well-thought-out and well-executed effort on their part.
While, for the participants, the contest was clearly about winning, it was also about learning. Here, in more or less random order, are some of the things I took away from the experience. (Bear in mind, though, that I only watched about two hours of the students' work on Saturday and observed some of the planning through emails, etc. It is my hope that some of the participants will add their own insights in the comments section to this post -- hint, hint!)
- This contest was incredibly motivating. DARPA is to be applauded for its willingness to sponsor something like this. I am not sure if they got what they hoped to get (and I hope they make their findings public), but it really fired up many of our students.
- The Mercyhurst strategy was good... I was very impressed with how the students modeled the problem and self-organized to handle the various tasks. Their fundamental strategy was to identify lesser-known networks that would likely be in a position to see the balloons and then motivate people in those networks to report. For example, the students identified, early on, local law enforcement intelligence analysts and crime analysts as an excellent potential network to tap into. Furthermore, they thought that this might not be a network that other participants would think of or have access to. Other examples of networks they tried to tap into included interstate truckers and Mercyhurst alumni. Their secondary strategy was to monitor a wide variety of online sources for leads on the day and then work to confirm or deny them.
- ... And clearly had some success... The Facebook group the students set up grew from 0 to 447 members in the 24 hours before the contest began. Likewise, some bloggers, such as Deborah Osborne, picked up on the effort and a number of Twitter users re-tweeted the call for help.
- ... But we got started too late. We did not hear about the contest until 3 DEC, and the team did not form until 4 DEC. This gave the team about 24 hours to organize and get the word out. Clearly it was not enough. Looking at the non-traditional social networks was a very good idea, but I think it was simply going to take more lead time to energize them.
- The Game Day execution was brilliant (and fun to watch). With the social networking strategy not panning out the way they had hoped, the students fell back to monitoring online sources and running down leads. They had done a simple IPB (Intelligence Preparation of the Battlefield) before the contest and had already made some judgments about where the balloons were likely to be deployed (for example, they had mapped all the DARPA-funded sites across the US, figuring that the DARPA teams would stick close to home for deployment purposes). That way, as reports started to come in, they were in a position to rank-order the leads for viability. It was impressive to watch one of the team members examine a new target and, in a matter of seconds, rattle off four or five reasons why it was or was not worth exploring.
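The triage step described above -- checking each incoming report against pre-mapped likely deployment sites -- can be sketched in a few lines. This is a minimal illustration, not the students' actual method; the site list, coordinates, and the 50-mile credibility cutoff are all hypothetical.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in miles."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 3956 * 2 * asin(sqrt(a))

# Hypothetical "likely deployment" sites produced by the IPB step
# (e.g., locations near DARPA-funded institutions).
LIKELY_SITES = [
    ("Arlington, VA", 38.88, -77.11),
    ("Cambridge, MA", 42.37, -71.11),
    ("Menlo Park, CA", 37.45, -122.18),
]

def score_lead(lat, lon, max_credible_miles=50):
    """Score a reported sighting by proximity to the nearest likely site.
    Closer to a pre-mapped site => higher priority (1.0 max, 0.0 floor)."""
    nearest = min(haversine_miles(lat, lon, s_lat, s_lon)
                  for _, s_lat, s_lon in LIKELY_SITES)
    return max(0.0, 1.0 - nearest / max_credible_miles)

# Rank incoming reports, most viable first.
leads = [("report near DC", 38.9, -77.0), ("rural Montana", 47.0, -110.0)]
ranked = sorted(leads, key=lambda l: score_lead(l[1], l[2]), reverse=True)
```

A report a few miles from a pre-mapped site scores near 1.0 and jumps the queue; a report far from every candidate site scores 0.0 and can be set aside.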
- HUMINT was particularly important. One of the most impressive aspects of the operation was the students' ability to use human sources to track a target. Once a potential target had passed the IPB test, the students used Google Earth to identify restaurants or stores close to the target site and then simply called them. Listening to the conversations was always interesting: "Uh, this is going to sound weird, but can you look out your window and see if you see a red balloon?" It was also surprising how often people were willing to call back or go out of their way to confirm or deny the information (one woman got in her car and drove 20 minutes only to call back and report that there was no balloon at the reported location). The big exception seemed to be areas where the weather was bad.
- A custom collection management tool would have been useful. I don't think anyone expected the number of leads that poured in. Each had to be sorted out and tasked for follow-up. A system emerged over the course of the several hours I watched the team at work, but having something custom-designed to deal with the problem would have been nice. Here again, I think the late start was a critical factor.
- We could have gotten more out of the IPB. The OPFOR (opposing force) at the National Training Center (NTC) used to do something interesting before an attack. They would figure out where the defenders were likely to put scouts and then just shell the stuffing out of all of those locations. They knew that they would likely be blowing up a lot of dirt and trees and only occasionally hitting the defenders' scout units, but they had plenty of artillery, and the strategy was almost guaranteed to blind the defenders. Given the IPB work done in advance of the contest and the generous submission guidelines set by DARPA (each entrant could submit up to 25 entries), it would have been possible to make "good guesses" and enter each of those submissions as soon as the contest opened up.
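Applied to the balloon hunt, the OPFOR play amounts to ranking the IPB candidate sites by rough likelihood and spending the entire 25-entry allowance on the top of the list at the opening bell. A sketch, with entirely hypothetical sites and likelihood values:

```python
# Hypothetical candidate sites scored during the IPB step; each tuple is
# (name, lat, lon, prior), where prior is a rough deployment likelihood.
candidates = [
    ("Arlington, VA", 38.88, -77.11, 0.9),
    ("Menlo Park, CA", 37.45, -122.18, 0.6),
    ("Cambridge, MA", 42.37, -71.11, 0.7),
]

MAX_SUBMISSIONS = 25  # DARPA's per-entrant limit

def preemptive_guesses(cands, limit=MAX_SUBMISSIONS):
    """Spend the whole submission allowance up front: best priors first,
    the way the OPFOR shelled every likely scout position."""
    return sorted(cands, key=lambda c: c[3], reverse=True)[:limit]

guesses = preemptive_guesses(candidates)
```

Like the OPFOR's artillery, most of the guesses would hit dirt and trees, but with 25 free shots there was little reason not to take them.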