Attempt to shame journalists with chocolate study is shameful

A study suggesting that dark chocolate could aid weight loss turned out to be a contrived attempt to bait the media into covering a poorly done nutrition study. 

Boz Bros/Flickr (CC BY-NC-SA 2.0)

“I Fooled Millions Into Thinking Chocolate Helps Weight Loss. Here’s How.”

That’s the headline on a May 27 article by science journalist John Bohannon that revealed the backstory of a sting operation he conducted earlier this year. Bohannon and a German television reporter teamed up to “demonstrate just how easy it is to turn bad science into the big headlines behind diet fads.” So they recruited subjects and ran a small clinical trial purporting to test whether eating dark chocolate helped people lose weight. The team used real data from their real trial and published their results in a real (non-peer-reviewed) journal, the International Archives of Medicine. The study did find an effect, but that was likely due to a statistical sleight of hand; it was impossible to say whether chocolate really helped people lose weight.

To lure reporters into covering the flimsy research, Bohannon and his colleagues ginned up a press release describing their chocolate results. The team sent the release out via Newswise, a site that aggregates press releases and distributes them to some 5,000 journalists. And some of those journalists took the bait.

From the big headline on Bohannon’s piece describing the con, you might think that it was a roaring success. But the “millions” that Bohannon and his partners-in-crime fooled weren’t millions of reporters, but millions of regular people, the consumers of journalism, who believed the reporters covering his study. Not only was Bohannon’s con ethically reprehensible — he lied to the public, undermining their trust in both journalism and science — but he is also guilty of the very practices he claims to have exposed.

Let’s start with the ethics. Deception in the name of journalism has a long history. Typically it’s used to uncover serious wrongs, for example, exposing discrimination in housing practices by sending a white couple and a black couple to apply as renters for the same apartment. But generally, deception is considered a last-resort tactic when there are no other ways to expose fraud or injustice.

“Deception is occasionally appropriate, but should be used very sparingly,” Rick Edmonds, a faculty member at the Poynter Institute, told me. “In this particular case, the point could have been made in other ways.”

The main mark in Bohannon’s sting wasn’t shoddy scientific journals that publish shoddy studies that use shoddy statistics. It was the reporters who cover those studies.

The study showed accelerated weight loss in the chocolate-eating group, but “you might as well read tea leaves as try to interpret our results,” wrote Bohannon. That was part of his point. The study was small and had so many measurements that the odds of getting a “statistically significant” result were good, even if chocolate wasn’t helping people lose weight. I agree with Bohannon’s point that reporters shouldn’t have covered the study.
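The multiple-comparisons problem Bohannon exploited can be sketched in a few lines of code. Under the null hypothesis (chocolate does nothing), each measured outcome’s p-value is uniformly distributed, so tracking many outcomes makes a chance “significant” hit likely. The figure of 18 measures below is illustrative, not taken from this article:

```python
import random

random.seed(42)
ALPHA = 0.05        # conventional significance threshold
N_MEASURES = 18     # illustrative number of outcomes tracked per trial
TRIALS = 100_000    # simulated studies

# Under the null, each measure's p-value is uniform on [0, 1].
# Count how many simulated studies get at least one p-value below ALPHA.
hits = sum(
    any(random.random() < ALPHA for _ in range(N_MEASURES))
    for _ in range(TRIALS)
)
print(f"P(at least one 'significant' result by chance) ≈ {hits / TRIALS:.2f}")
# Analytic value: 1 - (1 - 0.05)**18 ≈ 0.60
```

With 18 measures, a study of pure noise “finds” something significant roughly 60 percent of the time, which is why the result read more like tea leaves than evidence.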

“People who are on the health science beat need to treat it like science, and that has to come from the editors. And if you’re reporting on a scientific study, you need to actually look at the paper. You need to talk to a source who has real scientific expertise,” Bohannon told the Washington Post.

This point that journalists should take care when covering health and statistics has been made over and over again. This was not an instance of last resort that required undercover tactics. There are numerous resources for journalists to help them interpret statistics: At its annual meeting, the National Association of Science Writers, for example, has hosted many sessions dedicated to this topic. (Speakers at last year’s session included Science News managing editor Tom Siegfried, who has written extensively on this topic, and statistician/reporter Regina Nuzzo, the author of the piece on statistical errors that Bohannon links to in his write-up). Another way to highlight bogus science is to call it out by exposing quacks and telling readers how to be skeptical.

Bohannon and his colleagues decided to create a wrong to prove that wrongs exist. They lied to the public to make their point. Granted, it’s unlikely that anyone will be harmed by eating more dark chocolate. But not only does the caper do a disservice to people who are desperate for meaningful information about health and nutrition, it also undermines all of science and all of journalism. There’s real wrongdoing in both science and journalism (most infamously, see Stephen Glass, Jayson Blair, Janet Cooke, Jonah Lehrer, Brian Williams). But intentionally creating wrong to make a point is both bizarre and potentially very damaging.

“Our key resource as journalists is credibility,” Edmonds told me. “And a deceptive ploy like this could damage that.”

“Good faith with the reader is the foundation of good journalism,” is a key canon of the American Society of News Editors’ statement of principles. “Minimize harm,” says the Society of Professional Journalists code of ethics. This includes making sure to “consider the long-term implications of the extended reach and permanence of publication.” Did Bohannon and his colleagues consider how many readers — journalism’s “man on the street” — will conclude from their prank that all of journalism and all of science is not to be trusted?

Putting the ethics aside, let’s look at Bohannon’s evidence and conclusions. “If a study doesn’t even list how many people took part in it, or makes a bold diet claim that’s ‘statistically significant’ but doesn’t say how big the effect size is, you should wonder why,” he notes. Fair enough. But his own claims that his scam exposed “laziness” and the generally “lackadaisical” approach of reporters are not supported by any data on how many journalists read the press release, contacted him or wrote up the story.

Bohannon names 12 news outlets that covered the study (that’s counting HuffPo twice) and a 13th reporter whom he spoke with but who hasn’t yet published anything on the study. He also refers to it making news in “20 countries.” (A Google News Archive search suggests that 17 outlets picked up the story.) Consider that more than 5,000 reporters subscribe to the Newswise press release feed — and that figure doesn’t count the recipients of the German-language version sent to reporters in Austria and Germany. Even if only a tiny fraction of those reporters actually read his bogus study, just 13 deemed it worthy of coverage. Here’s some math: 13 out of potentially hundreds or thousands of reporters who read the release covered it. And from that he concludes that reporters are lazy, treating science “like gossip, echoing whatever they find in press releases.” I don’t need statistics to call BS on that.
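As a back-of-envelope check, even under the most generous assumption — that all 5,000 Newswise subscribers read the release — the coverage rate is a fraction of a percent:

```python
subscribers = 5_000  # Newswise reporters who received the release (figure from the article)
covered = 13         # outlets/reporters Bohannon identifies as covering the study

# Upper bound on the coverage rate: assumes every subscriber read the release.
rate = covered / subscribers
print(f"{rate:.2%}")  # → 0.26%
```

If fewer subscribers actually read the release, the true rate of reporters fooled per reporter reached only gets harder to call “millions.”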

In Bohannon’s write-up he doesn’t mention the reporters who were skeptical. But one reporter from Ohio, according to the Washington Post, questioned Bohannon about the institute Bohannon invented for the scam and the sample size of the study. “Afraid that he was about to lose his cover, Bohannon put the reporter off. Eventually, he stopped calling,” the Post article notes. As Bohannon correctly notes in his own write-up, that’s another statistical no-no, dropping “outlier” data points. And although I’ve done my job in trying to contact Bohannon to discuss this whole operation, he has not returned my e-mails.

Bohannon’s “I fooled millions …” headline is actually, to use journalistic parlance, the kicker here. He didn’t fool millions of reporters, the target of his con, but millions of innocent readers. He and his colleagues set out to “demonstrate just how easy it is to turn bad science into the big headlines.” Well, they certainly succeeded at that. Just not the way they intended.
