Skeletons paved the cobblestone streets. Thousands had succumbed to the blood plague quickly, but others lingered—only to infect everyone they met. No one was safe. Warriors, mages, and healers all fell. Word spread, urging everyone to flee, but still the plague ripped through the world, leaving devastation in its wake.
A hulking, serpentine blood god, Hakkar the Soulflayer, had sparked the epidemic. Attacked in his dungeon, the monster unleashed his final defense—a curse called corrupted blood. The curse infected the attackers and quickly spread to their companions like an ultra-virulent airborne virus. As adventurers fled the dungeon, they carried the illness back to their towns. Soon the plague even crossed into animals. Within days, the World of Warcraft—a hugely popular online adventure game—was devastated.
Although the death of a character in the World of Warcraft is a mere annoyance—the character disappears for a minute or two and then rematerializes—the plague proved unstoppable. Eric Lofgren was playing the game during the virtual outbreak in September 2005. “It was a big deal,” says Lofgren, who at the time was an epidemiology student at Tufts University in Boston. “Early on, it wasn’t clear how it spread or what was going on. Players attempted to heal other players … not knowing that they were taking damage and indeed spreading the plague. There was a lot of confusion. A lot of people abandoned [the game] until it got sorted out.”
It took Blizzard Entertainment, the Irvine, Calif., company behind World of Warcraft, nearly a week to stop the virtual plague. At that time the online Tolkienesque world of swords and sorcery boasted 4 million subscribers (it now has 9 million). To enrich the game, the company’s programmers had created Hakkar and made the monster so strong that players would have to band together to kill it. The programmers placed Hakkar in a remote dungeon and expected his blood curse to remain localized there. But they hadn’t accounted for human behavior.
Instead of staying in the cave, infected players teleported to the towns. Soon, their virtual pets became infected—and contagious. Both man and beast spread the disease to densely populated areas, where weaker characters who contracted it died instantly. Computer-controlled characters such as shopkeepers also became infected, but didn’t die. Along with the pets, these characters acted as silent carriers, virtual Typhoid Marys.
It turns out that Lofgren’s adviser at Tufts, Nina Fefferman, specializes in computer modeling of infectious diseases. When Lofgren told her about the virtual chaos, she called Blizzard. Enticed by parallels between the virtual and actual outbreaks, Fefferman asked the company to preserve the plague data. “Their initial reaction was confusion,” she says. “They said, ‘This is a bug, we’re worried about fixing it, we’re not worried about logging data for you.’”
Even without Blizzard’s help, Fefferman and Lofgren learned enough from observing the outbreak, reading accounts on game-related Web sites, and interviewing players to publish a paper in Lancet Infectious Diseases this August. In it, they outline the potential for garnering valuable lessons from virtual outbreaks.
With that publication, the pair joined a growing cohort of behavioral scientists who are mining virtual worlds for real data on human behavior.
Computer programs that model how infectious diseases spread aren’t new. Government and university researchers have been developing them for decades. But, say Fefferman and her colleagues, studying the actions of the millions of real people invested in World of Warcraft and other online worlds could substantially boost the reality quotient of disease simulators.
“The [computer] models we have are incredibly good at figuring out what the disease will do once we know what the behavior of the person is,” Fefferman says. But the models make broad assumptions about how people will behave, and “we’re pretty bad at knowing what those assumptions should be.”
Lofgren, now an epidemiology graduate student at the University of North Carolina, Chapel Hill, says that “it is extremely hard mathematically to model risk aversion, or panic, or altruistic behavior, or noncompliance with quarantines.” World of Warcraft players exhibited all of these behaviors during the outbreak.
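Lofgren’s point can be made concrete with a toy simulation. The sketch below is purely illustrative (every parameter, and the crude split into “fleers,” “healers,” and everyone else, is an assumption for demonstration, not part of any published model): it runs a basic susceptible-infected-recovered loop in which panicked agents slash their contact rate while altruistic healers raise theirs, which is exactly the kind of behavioral knob that is hard to set correctly in real models.

```python
import random

def simulate(n=1000, seed=42, days=60, beta=0.3, recovery=0.1,
             p_flee=0.3, p_heal=0.1):
    """Toy agent-based SIR model with two behavioral types.

    All values are illustrative assumptions:
      - 'fleers' cut their contact exposure by 90% (panic / self-quarantine),
      - 'healers' double it (altruistic contact with the sick),
      - everyone else mixes normally.
    """
    random.seed(seed)
    state = ["S"] * n
    for i in random.sample(range(n), 5):       # seed 5 initial infections
        state[i] = "I"
    behavior = [
        "flee" if random.random() < p_flee
        else "heal" if random.random() < p_heal
        else "normal"
        for _ in range(n)
    ]
    contact_mult = {"flee": 0.1, "heal": 2.0, "normal": 1.0}

    for _ in range(days):
        infected = [i for i in range(n) if state[i] == "I"]
        for i in infected:
            # Each infected agent meets 10 random others; the susceptible
            # partner's behavioral type scales that partner's exposure.
            for j in random.sample(range(n), 10):
                if state[j] == "S" and \
                        random.random() < beta * contact_mult[behavior[j]]:
                    state[j] = "I"
        for i in infected:
            if random.random() < recovery:
                state[i] = "R"

    return {s: state.count(s) for s in ("S", "I", "R")}

counts = simulate()
print(counts)
```

Re-running with different shares of fleers and healers shows how strongly the final toll depends on those behavioral assumptions, even when the disease parameters never change.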
In March, Ran Balicer, an epidemiologist at the Ben-Gurion University of the Negev in Be’er-Sheva, Israel, published a paper in Epidemiology outlining two particularly striking parallels between Hakkar’s curse and real epidemics. First, virtual teleporting is like air travel, spreading bugs across the world in a flash. Severe acute respiratory syndrome (SARS), for instance, originated in China, but quickly dispersed as infected patients traveled in airplanes. Second, animals often act as reservoirs of human disease. With avian influenza, some fowl, especially ducks, “catch the disease in a mild way and then they transmit it onward, much like the animals in the game did,” Balicer says.
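Balicer’s first parallel, teleporting as air travel, can be sketched with a toy two-patch model (all numbers here are illustrative assumptions): two populations mix only internally, except that each infected agent has a small per-day chance of jumping to the other patch. With that chance set to zero, the second patch is never touched; even a 1 percent chance is enough for the outbreak to cross.

```python
import random

def two_patch_outbreak(travel_prob, seed=1, n=500, days=120,
                       beta=0.25, recovery=0.08):
    """Two populations that mix only internally, linked by rare jumps.

    travel_prob is the per-day chance that an infected agent jumps to
    the other patch (teleportation standing in for air travel). Every
    number here is an illustrative assumption.
    """
    random.seed(seed)
    patch = [0] * n + [1] * n                   # agent i lives in patch[i]
    state = ["I"] * 3 + ["S"] * (2 * n - 3)     # 3 seed infections, all in patch 0

    for _ in range(days):
        # Membership lists are rebuilt once per day (an approximation:
        # same-day travelers mix with the day-start roster of their new patch).
        members = {0: [], 1: []}
        for j, p in enumerate(patch):
            members[p].append(j)
        infected = [i for i, s in enumerate(state) if s == "I"]
        for i in infected:
            if random.random() < travel_prob:
                patch[i] = 1 - patch[i]         # the rare long-range jump
            pool = members[patch[i]]
            # Mix with 8 random agents from the carrier's current patch.
            for j in random.sample(pool, min(8, len(pool))):
                if state[j] == "S" and random.random() < beta:
                    state[j] = "I"
        for i in infected:
            if random.random() < recovery:
                state[i] = "R"

    # How many agents now sitting in patch 1 were ever infected?
    return sum(1 for i in range(2 * n) if patch[i] == 1 and state[i] != "S")

print(two_patch_outbreak(travel_prob=0.0))     # isolated patches: stays at zero
print(two_patch_outbreak(travel_prob=0.01))    # rare jumps: the outbreak crosses
```

The same mechanism is why SARS, seeded in one region, appeared on several continents within weeks: a low per-person travel rate multiplied across many infected people makes long-range seeding nearly certain.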
More interesting to Fefferman is the “complete diversity” of player behavior reported. Some players logged out—a panic response with obvious parallels in the real world. Others deliberately spread the corrupted blood. These “griefers,” so called because they rejoice in virtual destruction, propagated what Balicer calls “the first act of virtual bioterrorism.” Still others put themselves at risk to heal the infected, not unlike health care workers who expose themselves to disease during real outbreaks.
One of the more interesting group dynamics, says Fefferman, was the influx of characters to disease epicenters. Many came not to heal or sow chaos, but just to be near the action. In a game where the cost of virtual death is small, such thrill-seeking makes sense. But Fefferman and others say it’s conceivable that similar behavior would emerge during a real epidemic.
“I tend to think it’s more realistic than we acknowledge, that there would be motivations for people to go to the disaster,” says William Sims Bainbridge, director of the Human-Centered Computing Cluster at the National Science Foundation (NSF) in Arlington, Va. During a smallpox outbreak, for instance, he says “if you believe, like I do, that the federal government can’t succeed in containing it, you would rush to the place where they were giving immunizations, knowing that the smallpox was going to get everyplace pretty soon. It goes well beyond curiosity seeking.”
Drooling for data
Social scientists are invading online worlds in droves to study human behavior. For instance, Dimitri Williams, an assistant professor at the Annenberg School for Communication at the University of Southern California in Los Angeles, is working to figure out how he can divine social dynamics from four terabytes of server logs that preserve the actions of the 400,000 players of the online fantasy game EverQuest 2. He plans to look at how groups form in the game and to highlight how the important players—the social hubs—behave. He also plans to study in-game economics. Williams landed a $200,000 NSF grant for the project.
In September, NSF awarded $360,000 to a team headed by Robert Kraut at Carnegie Mellon University in Pittsburgh to study interactions in World of Warcraft and other cyber locales such as the user-written Wikipedia.
At the Palo Alto Research Center in California, Nicolas Ducheneaut and colleagues also study the dynamics of player groups, called guilds, in World of Warcraft. With “robots” programmed to survey the online population of World of Warcraft, Ducheneaut and his team collected data on half a million characters over 3 years. They then identified characteristics of the most successful guilds, which are groups of up to 40 players that adventure together. “People tend to be dismissive of games … but I could see right away that the kind of organization and interpersonal dynamics you see in guilds are very, very close to what you see in work groups in corporations,” he says.
Many in the field credit Edward Castronova of Indiana University with legitimizing such pursuits via his examinations of the economics of virtual worlds. Dan Hunter, an associate professor at the Wharton School of the University of Pennsylvania, Philadelphia, calls the burgeoning field “computational social science.” The new field is driven by computers, but more importantly by the millions of people who use them to log in to online worlds.
Says Kraut: “It used to be that to study [group dynamics], you had to do very detailed ethnographic-style interviews and observations. But … researchers recognize the enormous value of automatically collected, long-term, large-scale data. It lets us see [social] structure that’s otherwise invisible.”
After gleaning early lessons from the accidental World of Warcraft outbreak, Fefferman now wants to insert a planned epidemic into a game world. “It may not have to look like a realistic disease, it may just have to [be] a perceived risk to something that’s emotionally valued,” she says.
In the World of Warcraft, death may be fleeting, but if a socially spread threat instead disabled some of a character’s abilities, depleted his or her coffers, or destroyed a powerful weapon, risk perception might be more in line with that of a real outbreak, says Williams. “You need a realistic incentive structure. Maybe if you lose treasure or an item that you worked hard for, that might be getting closer to a real-world model—there’s a risk the player wants to avoid.”
Fefferman’s pitch: A planned outbreak could enrich game worlds, offering players a chance to band together to quest for a cure or to build a hospital. “It has to feel less like an experiment and more like another challenge the character lives through,” she says.
Support for planned virtual outbreaks is building in academic quarters. Most recently, two computer scientists from Sweden explored how to design a virtual epidemic. In a paper presented at the Digital Games Research Association conference in Tokyo this September, Magnus Boman of the Swedish Institute of Computer Science and Stefan Johansson of Blekinge Institute of Technology write that carefully designed outbreaks could enhance online worlds, but only if the epidemics “are neither too devastating nor too easy to fend off.”
Or, as Williams puts it, “it’s not that good epidemiology can’t be done, but you really have to know the rules of the game world.”
To date, though, game companies have taken a pass on partnering with Fefferman, and, with the exception of Williams, researchers have had no luck convincing game producers to hand over their hard drives. Researchers “can’t just say, ‘Please help me.’ What are [the game companies] getting out of data sharing?” Williams says. His project, for instance, will help Sony pinpoint why players stay in the game.
In the face of reticence from the industry, some academics are instead designing their own worlds, with limited success. Online games cost millions of dollars, require the talents of dozens of programmers and artists, and devour huge advertising budgets to attract their minions. In October, Castronova shelved a nascent Shakespeare-themed online world geared toward social science research after burning through a $240,000 grant from the John D. and Catherine T. MacArthur Foundation. On Terra Nova, a Web forum for social scientists who study online worlds, he announced that “nothing worth noting is going to happen for a long time.”
The programmers of Whyville have had more success. Launched in 1999 as an education-and-research tool, the free world for 8- to 15-year-olds now boasts 1.7 million users, according to Numedeon, Inc., of Pasadena, Calif. Several times programmers have introduced outbreaks of Whypox, which stippled the faces of the users’ avatars with red bumps and interrupted their text chats with “achoo, achoo.” Yasmin Kafai, a University of California, Los Angeles, education professor who studies Whyville and its users, says that offline, kids vigorously debated the cause of the outbreak and how it spread. “It turned out to be a really good learning tool for the kids,” she says.
Social scientists hope that online worlds turn out to be good learning tools for them too. Bainbridge says that “there’s a transition in terms of the magnitude of the social phenomenon” of online worlds that will continue drawing researchers. How much useful information the worlds ultimately provide, and their applicability to the offline world, remain open questions.