Wednesday, July 18, 2012

Myth #2: Games Work Because They Capture Attention (The 5 Myths Of Game-based Learning)

Part 1:  Introduction
Part 2:  Myth #1:  Game-based Learning Is New

Eyes wide and focused.  Body oriented directly towards the screen.  An apparent inability to hear, even when being shouted at by mom.  If you have ever seen a person play a game that they really enjoy, you know that games have the ability to command complete attention.

Scientists tend to say things like this about the connection between attention and learning:

The assumption that attended stimuli are encoded more effectively into memory than less attended ones is straightforward and supported by substantial evidence (Sarter and Lustig).
or, more abstrusely:
Neural models of perception and cognition have predicted that top-down attention is a key mechanism for solving the stability-plasticity dilemma, which concerns the fact that brains can rapidly learn enormous amounts of information throughout life without just as rapidly forgetting what they already know (Grossberg).
What all this means is what any teacher already knows -- attention is the key to learning.  Without a student's attention, it is impossible for them to learn.

Games, in particular, are noted not only for their ability to attract attention but for their ability to hold it, often for very long periods of time.  That a player's attention does not waver, despite the difficulty of the challenge and the fact that players often fail, makes this apparent superpower that games have over other media even more extraordinary.

Psychologists have a name for this phenomenon -- Flow.  Mihaly Csikszentmihalyi, the professor of psychology at Claremont Graduate University who first described it, defined flow as "being completely involved in an activity for its own sake. The ego falls away. Time flies...Your whole being is involved, and you're using your skills to the utmost."

Flow was first linked to games in 2000 and the concept has gained widespread popularity among game designers since then.  Jenova Chen, a game designer who actually made a game called Flow, describes the relationship between games and this ultimate psychological experience as something which has evolved over time:
"As the result of more than three decades of commercial competition, most of today’s video games deliberately include and leverage...Flow. They deliver instantaneous, accessible sensory feedback and offer clear goals the player accomplishes through the mastery of specific gameplay skills."
Flow derives from a balance of challenge and ability.  Too little challenge and the game (or other situation) is boring.  Too much challenge and the game (or other situation) creates anxiety.  The chart to the right (taken from a 2007 article by Chen) graphically shows this relationship and how game designers seek to use this knowledge to design a better game.
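
To make the balance concrete, here is a minimal sketch (my own illustration in Python, not anything taken from Chen's article) of how a designer might classify a player's state from the challenge/skill gap and nudge difficulty back toward the flow channel.  The 0-1 scale and the width of the channel are arbitrary assumptions:

def flow_state(challenge, skill, channel_width=0.2):
    """Classify the player's state from the challenge/skill balance.
    Both inputs are on the same arbitrary 0-1 scale; channel_width is how far
    the two can drift apart before flow is assumed to break down."""
    gap = challenge - skill
    if gap > channel_width:
        return "anxiety"   # too much challenge for the player's current ability
    if gap < -channel_width:
        return "boredom"   # too little challenge
    return "flow"          # challenge and ability roughly in balance

def adjust_challenge(challenge, skill, step=0.05):
    """A crude form of dynamic difficulty adjustment: move the challenge
    back toward the flow channel whenever the player drifts out of it."""
    state = flow_state(challenge, skill)
    if state == "anxiety":
        return challenge - step
    if state == "boredom":
        return challenge + step
    return challenge

print(flow_state(0.9, 0.4))   # anxiety
print(flow_state(0.3, 0.8))   # boredom
print(flow_state(0.6, 0.55))  # flow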

Certainly other activities besides gaming routinely create a Flow-like learning experience.  Bailey White, an author and first grade teacher, claims the story of the Titanic can create much the same effect in the minds of her students:
"When children get the idea that written words can tell them something horrible, then half the battle of teaching reading is won.  
And that's when I turn to the Titanic.  The children sit on the rug at my feet, and I tell them the story.  It's almost scary to have the absolute, complete attention of that many young minds...
(The book the children use) is written on the fourth grade reading level - lots of hard words - so I tipped in pages with the story rewritten on an easier reading level.  But by the end of the second week the children are clawing up my pages to get at the original text underneath."
It is, however, gaming's ability to create this experience at large scales, for an extended period of time and (even) across generations that has created what I have come to call the "Magic Formula" of game-based learning:  Game = flow (or more commonly, "fun") = increased attention = increased learning.

If you look across much of the academic literature on game-based learning (and in virtually all of the popular literature on the subject), you will likely find some variant on this magic formula.  Moreover, given everything I have written so far, this formula seems to make a certain amount of sense.

But it is wrong.

It is missing an important element, one that everyone recognizes just as soon as I mention it but one that very few people include in any discussion of game-based learning.  This missing element goes back to the very definition of "game".

You need to look no farther than Wikipedia to determine that (much like the word "intelligence"...) there is still a good bit of debate as to what defines a game.  So you don't have to click, here is a sample:
"A game is a system in which players engage in an artificial conflict, defined by rules, that results in a quantifiable outcome." (Katie Salen and Eric Zimmerman)
"A game is an activity among two or more independent decision-makers seeking to achieve their objectives in some limiting context." (Clark C. Abt)
"A game is a form of play with goals and structure." (Kevin J. Maroney)
My favorite definition, however, is by philosopher Bernard Suits and comes from his 1978 book, The Grasshopper:  Games, Life and Utopia.  According to Suits, a game is a "voluntary attempt to overcome unnecessary obstacles."  While the definition is undoubtedly glib, Suits has a point.  Jane McGonigal, who has been mentioned previously in this series, points to the game of golf as a perfect example of this definition in action.  If the true intent of the game were merely to get the ball in the hole, there would be many easier ways of doing so than making people hit the ball with a stick.  As if this weren't hard enough, we actually strive to make the game harder by adding unnecessary obstacles such as sand traps and water hazards.  Surely it would be easier to simply walk over and drop the ball in!

The most important word in this definition and the missing component to the Magic Formula of game-based learning is, for me, "voluntary".  We volunteer to play a game and because we volunteer, we have an expectation that it will be enjoyable from the outset.

Expectations are powerful things.  We know, for example, that the subjective experience of pain can be manipulated simply by changing the expectations regarding that pain.  We also know that teacher expectations about an individual's ability to learn can drastically alter learning outcomes.

You can test this yourself.  Imagine being forced to play a game you know you hate.  How much attention are you paying to the game?  How much learning do you think you might do if that game were associated with an instructional objective?  Ian Schreiber, game designer and professor at Columbus State Community College, has a wonderful term for this kind of learning experience -- "chocolate-covered broccoli".

In short, games don't work because they capture attention; games work as teaching tools because they are voluntary activities that capture attention.

The good news is that "voluntary" is an analog condition, not a binary one.  In other words, voluntary is not something that either exists or doesn't but, in fact, has degrees.  People will love certain games and hate others but, in general, will have a wide range of responses to the games they choose to (or have to) play.

I have seen this repeatedly in my own classes.  Every student inevitably has a favorite game and, equally inevitably, it is the lesson associated with that game that they most clearly remember.  Dealing effectively with this problem leads directly to -- 

Myth #3:  I need a game that teaches...

Tuesday, July 17, 2012

Myth #1: Game-based Learning Is New (The 5 Myths Of Game-based Learning)

Part 1:  Introduction

You would be hard pressed to find an explicit reference to game-based learning anywhere prior to 2000.  Google Trends (see chart to the right) only begins to register the term in the news in mid-2009.

Since 2009, however, game-based learning has started to crop up everywhere.  Mentions of game-based learning in academic literature have risen an average of 18% per year since 2008 and the New Media Consortium's 2012 Horizon Report on tech trends in higher education states that, within 2-3 years:
"...we will begin to see widespread adoptions of two technologies that are experiencing growing interest within higher education: game-based learning and learning analytics. Educational gaming brings an increasingly credible promise to make learning experiences more engaging for students, while at the same time improving important skills, such as collaboration, creativity, and critical thinking..." 
It certainly seems new so why do I call this a myth?

Game-based learning, whether you call it that or not, has been with all of us (and with the intelligence community in particular) for quite some time.  In the first place, there is hardly a teacher, alive or dead, who has not used a game in the classroom to help teach.  Remember playing Monopoly to learn about money?

If one can see the parallels between Sun Tzu's admonition 2500 years ago to "know the enemy and know yourself" and modern notions of intelligence and operations, then I think it is possible to argue that the first game with intelligence implications is the ancient Chinese game of Go.  In fact, Chinese strategic thinking is probably still being influenced by Go.

It is possible to argue the same about Chess, and Benjamin Franklin actually made this case (indirectly) in his famous essay on Chess:
"...Life is a kind of Chess, in which we have often points to gain, and competitors or adversaries to contend with, and in which there is a vast variety of good and ill events...  By playing at Chess then, we may learn: 1st, Foresight... 2nd, Circumspection (and) 3rd, Caution..."
What good intelligence professional would not want to have better foresight, be a bit more circumspect and exercise appropriate levels of caution?

(Image source: http://en.wikipedia.org/wiki/Tafl_games)
My favorite example along these lines, however, is the ancient Norse game of Hnefatafl.  It is an extraordinary game (see image to the left).  In the first place, it is asymmetric.  This means that the two sides are not evenly matched and, in fact, have entirely different victory objectives.  One player typically (there are a number of versions of the game) begins surrounded and outnumbered by about 2 to 1.  This player's goal is merely to escape the board (not with everyone - just the "king" needs to escape).  The other player's goal is to capture the king.  It is interesting to speculate what young Viking warriors were implicitly learning as they played these games night after night...

Learning through games for intelligence professionals took a massive leap forward in the 1800's.  While Clausewitz recognized that war was a game "both objectively and subjectively", it was left to another German, Baron Georg Leopold Von Reisswitz, to take the game, so to speak, to the next level -- Kriegspiel.

Kriegspiel, literally "war game" in German, was invented by Von Reisswitz in 1812 and modified and improved by his son.  It was not, however, until Helmuth Von Moltke became Chief of the Prussian General Staff in the 1850's that the game began to be used seriously as a training aid for officers.  It is noteworthy that one of the most influential books on Kriegspiel was written by Von Moltke's staff officer for intelligence, Julius von Verdy du Vernois.

Based on Prussian success with wargaming, many militaries adopted the system or made up their own.  Today, all militaries use war games of one sort or another (though they are often referred to as "conflict simulations") and they have grown beyond traditional force-on-force simulations and now include political, economic and unconventional warfare factors as well (my thesis when I was in the army, for example, was based on a political game I had designed).

Paper-and-pencil war games even had a brief surge of commercial popularity in the 1980's.  Today the industry is much reduced from its heyday but it is still possible to find lots of people playing these types of games at events like Origins and Historicon and talking about them at sites like Board Game Geek and Consim World News.

No, game-based learning is not new and certainly not new to the intelligence community.  What is new, however, is the advent of the video game.

By any measure, video game sales have skyrocketed since the early 90's (see chart at right).  Not only is revenue largely up since the end of the recession, but the market for electronic games has drastically expanded.  Anita Frazier, an analyst for the NPD Group, which, among other things, examines the gaming industry in detail, outlines some of these new trends in the video below:



Jane McGonigal, game designer and researcher, claims that nearly half a billion people worldwide spend approximately 3 billion hours per week playing online games.  Anyone with a teenager knows that they game a lot but few people know that one of the fastest growing segments of gamers is actually older women.  So-called "casual games", like Farmville and Words With Friends, as well as smartphone-enabled games, such as Angry Birds, have taken gaming out of the basement and put it front and center in popular culture.

The goal, then, has become to tap into this rapidly growing medium for educational or "serious" purposes; to augment the entertainment experience with a learning experience - and this is precisely where we find the second myth. 

Next:  Myth #2:  Games Work Because They Capture Attention

Monday, July 16, 2012

The 5 Myths Of Game-based Learning (A Report From The Classroom)

Let me start this series of posts by saying - unequivocally - that I am a strong advocate of game-based learning.  It has worked for me personally, I have seen it work in the classroom, and I have read the research that, in general, suggests that game-based approaches can provide powerful new ways to learn.

But...

As someone who has spent the last three years applying at least some of the theory of game-based learning in the classroom, I can tell you that it is...well...tricky.

Don't get me wrong.  My intent is not to lead you on and then ultimately come to the conclusion that it can't be done or that it doesn't work or, even, that it is hard to do.  It is just trickier than I expected due, I think, to the "myths" that have sprung up about games and learning.  My hope is that this series of posts will help other teachers (particularly other university professors teaching intelligence studies...) to have a more realistic view of both the difficulties and the rewards of incorporating games into their classes.

Where did these myths come from?  I believe that they are a natural consequence of the inevitable distance between theory and practice.  Any practitioner will tell you that theory only works well...in theory.  Actually applying a pedagogical approach to a real world classroom with real world constraints and challenges is another thing entirely.

The broader conversation on game-based learning largely reflects this divide.  At one end of the spectrum there are the big picture thinkers, the evangelists, if you will, like Jane McGonigal.  McGonigal, a researcher and game designer (and one of my personal favorite experts on games and gaming), makes a strong case for games and game-based learning in her book, Reality is Broken.  If you don't have time to read her book, I highly recommend McGonigal's 2010 TED talk:



At the other end of the spectrum are the things that have actually been tried in class and have been shown to work at meaningful scales.  Here the pragmatists rule and the best statement of that position I have heard comes from Assistant Deputy Secretary of Education, James Shelton, at last year's Games For Change Conference (Shelton's comments begin around minute 6 and some key takeaways are at minute 11 and 13):



Games for Change Festival 2011: James Shelton, U.S. Department of Education from Games for Change on Vimeo.


(Note:  While education has been kicked around like a political soccer ball for what seems like forever, Shelton's entire speech and comments are worth listening to by anyone interested in solving the difficult problem of innovation in education.  You get the sense that this is a guy in the trenches, who understands the reality of the problem, has no political axe to grind and is willing to listen to anyone who has a good idea that can work on a large scale.)

Shelton's speech was not much discussed during or after the conference but it is, for me, a good representation of the practitioner's plea:  "I'll try anything; just show me that it really works."

In the gap between these two extremes, between the heady optimism of McGonigal and the blunt practicality of Shelton, live the 5 myths I intend to talk about in this series of posts. 

Next:  Myth #1 -- Game-based Learning Is New

Monday, July 2, 2012

Top 5 Things Only Spies Used To Do (But Everyone Does Now)

There has been a good bit of recent evidence that the gap between what spies do and what we all do is narrowing -- and the spies are clearly worried about it.

GEN David Petraeus, Director of the CIA, started the most recent round of hand-wringing back in March when he gave a speech at the In-Q-Tel CEO Summit:

"First, given the digital transparency I just mentioned, we have to rethink our notions of identity and secrecy...We must, for example, figure out how to protect the identity of our officers who increasingly have a digital footprint from birth, given that proud parents document the arrival and growth of their future CIA officer in all forms of social media that the world can access for decades to come."
Richard Fadden, the Director of the Canadian Security Intelligence Service (CSIS), added his own thoughts in a speech only recently made public:
"In today's information universe of WikiLeaks, the Internet and social media, there are fewer and fewer meaningful secrets for the James Bonds of the world to steal," Fadden told a conference of the Canadian Association of Professional Intelligence Analysts in November 2011. "Suddenly the ability to make sense of information is as valued a skill as collecting it."
Next, I ran across a speech by Robert Grenier, a former case officer, chief of station and 27-year veteran of the clandestine service, given at a conference at the University of Delaware.  In it, he describes the moment he realized that the paradigm was shifting (and not in his favor):
"Grenier said he came to realize the practice of espionage would have to change when he received a standard form letter at a hotel overseas, while undercover, thanking him for visiting again.  When he realized electronic records now tracked where he had been for certain date ranges, he said he knew the practice of espionage was going to have to change.  “It was like the future in a flash that opened up before my eyes,” Grenier said."
(Note:  While I could not embed the video here, the entire one hour speech is well worth watching.  The part of particular relevance to this post begins around minute 8 in the video.   This is, by the way, fantastic stuff for use in an intelligence studies class).

Finally (and what really got me thinking), one of my students made an offhand comment regarding his own security practices.  I needed to send him a large attachment and I asked for his Gmail account.  In response, he gave me his "good" address, explaining that he only used his other Gmail address as a "spam account", i.e. when he had to give a valid email address to a website he suspected was going to fill his inbox with spam.

That's when it hit me.  Not only is it getting harder to be a traditional spy, it is getting easier (far easier) to do the kinds of things that only spies used to do.  The gap is clearly closing from both ends.

With all this exposition in mind, here is my list of the Top 5 Things Only Spies Used To Do (But Everyone Does Now) -- Don't hesitate to leave your own additions in the comments:

#5 -- Have a cover story.  That is precisely what my student was doing with his spam account.  In fact, most people I know have multiple email accounts for various aspects of their lives.  This is just the beginning, though.  How many of us use different social media platforms for different purposes?  Take a look at someone you are friends with on Facebook and are connected to on LinkedIn and I'll bet you can spot all the essential elements of a cover story.  Need more proof?  Watch the video below:


The only reason we think this ad is funny is that we intuitively understand the idea of "cover" and we understand the consequences of having that cover blown.

#4 -- Shake a tail.  It used to be that spies had to be in their Aston Martins running from burly East Germans to qualify as someone in the process of "shaking a tail."  Today we are mostly busy running from government and corporate algorithms that are trying to understand our every action and divine our every need, but the concept is the same.  Whether it is simple stuff like using a search engine such as DuckDuckGo that doesn't track you or engaging private browsing ("porn mode") in Firefox or Chrome, more sophisticated stuff like installing the popular script-blocking extension NoScript, or even more sophisticated stuff like using Tor or some other proxy service to mask your internet habits, we are using increasingly sophisticated tools to help us navigate the internet without being followed.
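
For the Tor example in particular, here is a minimal sketch (my own illustration, not a recommendation of any particular setup) of how a Python script might route its web traffic through Tor rather than connecting directly.  It assumes a Tor client is already running on its default SOCKS port (9050) and that the requests library is installed with SOCKS support (pip install requests[socks]):

import requests

# 'socks5h' (rather than 'socks5') asks the proxy to do the DNS lookups too,
# so name resolution does not happen outside the Tor circuit.
TOR_PROXY = "socks5h://127.0.0.1:9050"

def fetch_via_tor(url):
    """Fetch a URL with both the connection and DNS resolution routed through Tor."""
    return requests.get(url, proxies={"http": TOR_PROXY, "https": TOR_PROXY}, timeout=30)

# check.torproject.org reports whether the request actually arrived via Tor.
response = fetch_via_tor("https://check.torproject.org/")
print("Congratulations" in response.text)  # True if the exit node is recognized as Tor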

#3 -- Use passwords and encrypt data.  Did you buy anything over the internet in the last week or so?  Chances are good you used a password and encrypted your data (or, if you didn't, don't be surprised when you wind up buying a dining room set for someone in Minsk).  Passwords used to be reserved for sturdy doors in dingy alleyways, for safe houses or for entering friendly lines.  Now they are so common that we need password management software to keep up with them all.  Need more examples? Ever use an HTTPS site?  Your business make you use a Virtual Private Network?  The list is endless.

#2 -- Have an agent network.  Sure, that's not what we call them, but that is what they are:  LinkedIn, Yelp, Foursquare and the best agent network of all -- Twitter.  An agent network is a group of humans who we have vetted and recruited to help us get the information we want.   How is that truly different from making a connection on LinkedIn or following someone on Twitter?  We "target" (identify people who might be useful to us in some way), "vet" their credentials (look at their profiles, websites, Google them), "recruit" them (Easy-peasy!  Just hit "follow"...), and then, once the trust relationship has been established, "task" them as assets ("Please RT!" or "Can you introduce me?" or "Contact me via DM").  Feel like a spy now (or just a little bit dirtier)?

#1 -- Use satellites.  Back in 2000, I went to work at the US Embassy in The Hague.  I worked on a daily basis with the prosecutors at the International Criminal Tribunal For the Former Yugoslavia.  That collaboration, while not always easy, bore results like the ones that led US Judge Patricia Wald to say, "I found most astounding in the Srebrenica case the satellite aerial image photography furnished by the U.S. military intelligence  (Ed. Note:  See example) which pinpointed to the minute movements on the ground of men and transports in remote Eastern Bosnian locations. These photographs not only assisted the prosecution in locating the mass grave sites over hundreds of miles of terrain, they were also introduced to validate its witnesses’ accounts of where thousands of civilians were detained and eventually killed."  It is hard to believe that only 12 years ago this was state of the art stuff.

Today, from Google Earth to the Satellite Sentinel Project, overhead imagery combined with hyper-detailed maps are everywhere.  And that is just the start.  We use satellites to make our phone calls, to get our television, and to guide our cars, boats and trucks.  We use satellites to track our progress when we work out and to track our packages in transit.  Most of us carry capabilities in our cell phones, enabled by satellites, that were not even dreamed of by the most sophisticated of international spies a mere decade ago.

-----------------------

If this is today, what will the future bring?  Will we all be writing our own versions of Stuxnet and Flame?  Or, more likely, will we be using drones to scout the perfect campsite?  Feel free to speculate in the comments!

Monday, June 25, 2012

How To Replace "Class Participation" With "Professionalism" (Brilliant Idea!)

Sometimes you go to a conference and come away energized - full of new thoughts and ideas.  And sometimes you go to a conference and are lucky to come away with just one new idea.  The American Association of University Professors' Annual Conference in DC a few weeks ago was more of the latter than the former for me -- but that one idea was a real doozy!

I had been paired serendipitously with Dr. Alice Armstrong, a computer science professor from Shippensburg University, on a pedagogy panel.  While I gave my presentation on "The 5 Myths Of Game-based Learning" (more on this later this week) to the, shall we say, "modest" crowd that had assembled for the 0830 start time, Alice really engaged both me and the crowd with her approach to instilling professional standards in her students (it all revolves around the chart below -- but more on that in a moment).

Alice and her co-author, Dr. Carol Wellington, were facing a serious problem.  Their capstone class, like many capstone classes, asked students to pursue an independent, long-term project.  Despite their best efforts to keep these students in the game, they were still facing combined failure and incomplete rates hovering around 33%.

Their analysis of this problem indicated that the difficulties were not technical or knowledge-based - the students had the skills to do the projects.  Instead, the problems were behavioral: sticking with a schedule, staying in touch with their technical mentors, meeting intermediate deadlines, etc.  Solving this problem seemed difficult if not impossible.

Now comes the brilliant part...

Their solution was to separate the course grade into two components.  The first part is the standard grading system based on how well the students did on the various assignments.  Like any standard course grade, you start at zero and then, as tests and assignments accumulate, your grade emerges as the average of how well you did on them.

The second part is where they did something different and very, very cool.  The second half of the grade is a "professionalism" grade.  Students start with 100% and can only lose points for being "unprofessional." 

I will define "unprofessional" in a second but think for a moment just how clever this is.  In the first place, it does away with the nebulous "class participation" grade.  In the second place, it emphasizes something that faculty in applied professions, like law, medicine, engineering, architecture, computer science and intelligence, value dearly -- professionalism.  Third (and I love this one), it mimics the employer's opinion of an employee.

Think about it:  You just got hired by a company or agency.  You were selected to fill a slot over a bunch of other qualified applicants.  The assumption is that you are the best available candidate for that job.  Because you are new, you are going to be watched and because you are being watched you are going to have chances to disappoint - to make the boss wonder if he really did hire the best candidate - not just with respect to your knowledge but also with respect to your behavior.  It may not be particularly fair, but it is true.

Talking about it is all well and good, but how do you make it real in the classroom?  If you are like me, you have probably seen a number of potential flaws in this approach.  Here, once again, is where our friends at Shippensburg impress.  Look at the list below.  I am hard pressed to find much in it that is objectionable.  Lateness, missing appointments, etc.  Those are things we are probably counting off for anyway.
To get the final grade for each student, Alice and her colleagues multiply the content and professionalism scores.  Look at the chart near the top of this post again:  This effectively means that the best you can ever do is the lower of the two grades and that, quite often, your overall grade will be less than the average of your two grades.  Alice and her colleagues believe that this is the key innovation in this approach.
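
A quick hypothetical calculation (the numbers and the Python sketch below are mine, not Shippensburg's) shows why the multiplication bites:

def final_grade(content_pct, professionalism_pct):
    """Combine the two course components by multiplication, as described above.
    Both inputs are percentages (0-100); the result is also a percentage."""
    return content_pct * professionalism_pct / 100.0

# A student with a 90% content grade who lost 20 professionalism points:
print(final_grade(90, 80))   # 72.0 -- below both grades, and below their 85 average

# A perfect professionalism score leaves the content grade untouched:
print(final_grade(90, 100))  # 90.0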

I have to disagree with that a bit.  I think the key innovation is not in the specifics but in the approach itself.  By focusing on professionalism as an attribute that you are expected to have and can only lose, Alice and her colleagues have changed its psychological value.  Loss aversion is a well understood effect and Alice reported that her students responded as any good psychologist would expect:  They hated to lose professionalism points!

Regardless of why it works, it certainly seems to be working.  They cut their failure and incomplete rates in half.  They are so happy with the system that they are pushing it down their curriculum to their freshman classes (there, though, they do give students the opportunity to earn points back).  One of the most important endorsements of the process actually came from outside the university:  The computer science department's industry advisory council loves it.

I am thinking about implementing some aspects of this system in my own classes.  While I think the Shippensburg system is pretty harsh, I can understand their reasoning.  I would not want half or more of my points tied up with issues like lateness and missing deadlines, though.  Frankly, those kinds of things have  rarely been a problem in any of my classes here at Mercyhurst. 

More important to me is the point of such a system:  To send a signal that professionalism matters and that it is something you are expected to have and can only lose.  Getting away from the more wishy-washy "class participation" grade and moving towards something that is both important and helps the students is a strong step in the right direction, in my opinion.