Web edition: March 12, 2013
Depending on your age, the word troll might evoke a nasty creature who lives under a bridge — or a nasty creature who posts inflammatory comments online. The former, found mostly in Scandinavian folktales, is typically a dim-witted beast, not inclined to help humans. The latter (judgment on wits aside) is also rarely considered helpful. But new research suggests a more nefarious role for these postmodern trolls: Their uncivil, rancorous remarks can influence how readers perceive science.
Social scientists have long studied how and whether argumentative, obnoxious talk influences people's perceptions. A growing body of research suggests that cantankerous rhetoric pushes some deep primal buttons that may override the more reasonable, conscious parts of our brains. One study demonstrated this phenomenon by experimentally manipulating the tone of an imaginary blogger, “Curt,” who opined about a climate change policy story. Though Curt's reasoning stayed the same, experimenters altered his language to make one post civil and the other rude, denigrating those who didn't agree. Readers of insulting Curt came away from his blog less open-minded about the policy than readers of polite Curt.
Now scientists are exploring how the comments posted at the bottom of an online story may shape readers’ perceptions. For a test case, researchers chose an article about nanotechnology, a field whose fruits are already prevalent in consumer goods (hundreds of sunscreens, for example, contain titanium oxide or zinc oxide nanoparticles) but that remains largely unfamiliar to the general public.
More than 1,000 study participants read a neutral online news story that discussed silver nanoparticles, comparing risks (such as water contamination) and benefits (such as antibacterial properties). Some readers then read civil-toned comments on the article: “Well I think the risks of this technology are just too high for the fish and other plants and animals in water tainted with silver.” Or: “Think of all the clean clothes we’ll have and the germs that we’ll keep our kids from.” Other participants read uncivil versions: “You’re stupid if you’re not thinking of the risks for the fish and other plants and animals in water tainted with silver.” Or: “F*&# off! Think of all the clean clothes we’ll have and the germs that we’ll keep our kids from.”
The uncivil comments had a polarizing effect on readers, Dominique Brossard of the University of Wisconsin–Madison reported in February in Boston at the annual meeting of the American Association for the Advancement of Science. Among people who had already identified themselves as wary of nanotechnology’s risks, that wariness deepened when the online comments were uncivil. The rude comments also affected participants who self-identified as religious; those people perceived nanotechnology as riskier than did readers of the civil comments. And people who considered themselves familiar with and supportive of nanotechnology became surer of their opinions after reading the uncivil remarks.
That incivility makes people less open-minded is troubling, because it aggravates an already difficult problem. Despite our big brains, conscious thought and ability to reason, we are often unreasonable creatures. Many studies have demonstrated that humans tend to seek out and believe that which reinforces their own views. We’re even resistant to the opinions of people with recognized expertise on a subject; a recent study found that expert testimony presented in Congress or in courtrooms rarely changes listeners’ beliefs or attitudes.
Scientists can be the worst offenders when it comes to these known antirational tendencies. They often think that if people just knew more about science they would more strongly support all sorts of research, from climate change to nanotechnology.
A myth that perpetuates this thinking is that the space race era was a golden age of scientific literacy. But several surveys reveal that Americans’ scientific knowledge was pretty scant even during that era of widespread support for science, as the nation pulled together to beat the Soviets to the moon, Brossard’s Wisconsin colleague Dietram Scheufele and Matthew Nisbet of American University in Washington, D.C., point out in a recent paper. One survey from the time found that only 38 percent of Americans knew that the moon is smaller than the Earth.
Yet science was still held in high regard during that era: about 90 percent of people agreed that science was making life healthier and easier and contributing to social progress.
So what gives? It’s all about framing, the researchers argue. During the 1960s, public opinion about science fit within strong existing frames of social progress and patriotism.
Internet trolls, it seems, negatively frame the science-based debates we see online. Their rancor turns what ought to be open-minded considerations of the facts into ad hominem shouting matches among antisocial dwellers beneath bridges.
A.A. Anderson et al. Crude comments and concern: Online incivility’s effect on risk perceptions of emerging technologies. Journal of Computer-Mediated Communication. Published online February 19, 2013. doi: 10.1111/jcc4.12009
M.C. Nisbet and D.A. Scheufele. What’s next for science communication? Promising directions and lingering distractions. American Journal of Botany. Vol. 96, October 2009, p. 1767–1778. doi: 10.3732/ajb.0900041