New work shows that the inflation of scientific results happens at many stages of the news pipeline
Many scientists greet the prospect of media coverage with a combination of excitement and trepidation. The attention can be heady and can signal the importance of a researcher’s findings. But there’s also the chance that the news media will hype those findings. A lab study in rats that ends up in the paper under the claim that more bacon jerky could prevent cancer is embarrassing at best. Worse, an overhyped finding, especially in areas related to human health, could mislead readers and patients, some of whom may change their behavior based on unfounded information.
It’s easy to point fingers and blame journalists, media-hungry fame seekers or the endlessly hungry news cycle. But a new analysis shows that some exaggerations slip in at the institutional level, in the press releases that universities and organizations send to journalists about new scientific findings. The results reinforce that hype creeps in at every level of the news chain, beginning with the scientists themselves, continuing through the organizations promoting the work and ending with the media outlets that report on it. But that doesn’t mean the media can pass the blame along. No one gets a pass.
“The anecdotes pile up about this contaminated food chain, as it were, of spin or imbalanced incomplete messages in papers, which then leads to spin and imbalanced incomplete messages in news releases, and then it leads to really inadequate news stories,” says veteran health care journalist Gary Schwitzer, publisher of healthnewsreview.org, a watchdog site that reviews health news. “Lost in all of this is the consumer, the reader, the patient at the end of this food chain who doesn’t know how the sausage is made, and if he or she could see it would see what an ugly mess it is.”
Every day, journalists (and bloggers) have to sift through mountains of scientific research to find the best work to report on. Press releases from scientific journals, institutions, companies and elsewhere can help point reporters to the highlights. Press releases are short summaries of scientific studies, designed to convey the impact and potential of the findings and to explain in plain language the technical details found in the journal article itself. But press releases are also promotional material for the journals involved, the funding agencies that paid for the work, the universities where the results were produced and the scientists who did the experiments in the first place.
Ideally, journalists will see a press release and then read the scientific paper it’s based on and maybe even other related journal articles. They will then interview the authors of the paper and call up other experts in the field for perspective on the work’s findings and claims. The journalist’s story will then go through at least one editor, usually more, before it’s finally published.
That’s in an ideal world. But media outlets are competing for readers, and there is pressure to grab eyeballs with big statements and flashy findings. “There’s a lot of competition and time pressure,” says Petroc Sumner of Cardiff University in Wales, coauthor of the new study. “These pressures can mean exaggeration occurs and doesn’t get corrected or changed.”
Sumner and his colleagues at Cardiff were especially interested in the role of press releases in that hype. They conducted a survey of 462 press releases based on biomedical studies performed at academic institutions in the United Kingdom and published in 2011. They also examined news articles in 11 U.K. news outlets written about those studies. They combed the press releases and news stories for three types of exaggeration: advice (say, to stop eating eggs or to drink more coffee) that was not supported by the scientific study, claims stronger than the journal article warranted (for example, saying that stress causes hemorrhoids when a study showed only that stress was associated with hemorrhoids) and findings related directly to humans when the study was in fact performed in rats, mice or cells.
The results, published December 9 in the British Medical Journal, showed that 40 percent of press releases contained explicit advice not indicated in the journal article. Another 33 percent of claims in press releases used stronger language than in the journal article. Finally, 36 percent of press releases implied that a finding was related to human health when the study was not actually performed in humans.
Exaggeration in the press releases went along with exaggeration in accompanying news articles. When unwarranted advice appeared in the press release, 58 percent of news stories also gave that advice. News stories exaggerated associations 81 percent of the time if claims were also exaggerated in the press release. And when press releases improperly extrapolated animal findings to humans, the news extrapolated too — 86 percent of the time. When press releases were not exaggerated, news outlets did sometimes hype the results, but at much lower rates, between 10 and 18 percent of cases.
The results suggest that some of the exaggeration that slips into news stories seeps in from press releases about scientists’ work. Sharon Dunwoody, who studies media science messages at the University of Wisconsin–Madison, says that the tone of academic journals is intentionally extremely conservative, but that this caution “can evaporate at the press release stage, where a scientist and her communication unit feel more willing to speak in certain terms.”
Exaggeration at this stage may simply come from researchers’ excitement about work finally getting published, notes Paul Raeburn, author and journalist at the Knight Science Journalism Tracker. “They publish a big study they’ve been working on for 10 years, and they’re excited,” he says. “Then all of a sudden they get press attention and they are so excited, they say it will change treatment. This is a normal human thing.”
That puts a lot of pressure on the press officer to distill that excitement into a compelling press release. But press officers are usually not experts in the scientific studies they are promoting, notes Matt Shipman, a press officer at North Carolina State University in Raleigh. “The people writing the news releases for the most part do not have a lot of technical expertise,” he says. “They rely on researchers to get it right.” A researcher who takes little interest in crafting an accurate press release, or who gets overly excited about the prospects of the work, combined with a press officer who is relying too much on the researcher for information, could result in a hyped press release.
And at the end of the day, an institution is trying to attract coverage for its work — and that success is often measured in headlines. “I think it’s really important to actually look at how press offices are managed and run and how the outputs of media relations are measured,” says Mark Henderson, head of communications at the Wellcome Trust in London. “If all you’re doing is counting the number of media hits you get, then a super hyped press release may score better. If people feel that’s what they’re going to be measured on, that’s a problem.”
But in Sumner’s study, while exaggeration in the press release was associated with hype in the media, it didn’t mean that the piece got more media attention. In fact, Sumner and colleagues showed that there was no significant difference in the amount of news coverage of studies with hyped or non-hyped press releases. “This surprised us quite a lot,” Sumner says. He hypothesizes that some journalists, especially those who specialize in specific areas of human health, can see through the exaggeration, and so hype may make little difference in whether or not a writer reports on a study. “Another possibility is that the press releases being exaggerated are not equal in all other ways to press releases not being exaggerated,” he notes. “Things that are intrinsically newsworthy don’t need to be exaggerated. Exaggeration then might have an effect.”
The finding that hype doesn’t necessarily mean more press coverage is encouraging, says Ivan Oransky, vice president and global editorial director of MedPage Today, and one of the founders of Retraction Watch, a watchdog site that often covers cases where overhyped science results in retraction. He hopes that finding provides “something to guide people and say it doesn’t have to be hyped.”
The study’s authors note that if exaggeration is occurring within academic institutions, then the communities within have the power to address it. Study coauthor Andy Williams, who investigates media sociology at Cardiff University, says there is “probably a little bit more scope for change in the universities than in the news media. It doesn’t look good for universities to be hyping their research this way, and I’m sure once they see the evidence there will be willingness to improve the situation.”
Ben Goldacre, a researcher at the London School of Hygiene and Tropical Medicine in England who speaks frequently on science reporting, wrote an editorial accompanying Sumner’s study. In it, he notes that academics should take press releases more seriously, as seriously as they would the scientific studies they are based on. He also argues for transparency, for press releases to be permanently linked to the studies they are associated with and made publicly available.
Exaggeration may begin in the release, or even in a study, but that doesn’t mean there’s an excuse for it ending up in the news. “What I worry about is that [journalists will] react to this paper by going ‘it’s not our fault,’ and absolve ourselves of responsibility,” says Ed Yong, freelance journalist and blogger at Not Exactly Rocket Science at National Geographic’s Phenomena Blogs. “Our job is to act as arbitrators of the information we see and report on. We analyze and filter. If we don’t there’s no point to us.” Even here, at Science News, it is the responsibility of the writers and the editors to get the story right and not hype findings. All journalists and editors are people. And people make mistakes. But we have to take responsibility when we do.
The most important thing to remember is that there are people, and sometimes patients, waiting on the end of the news cycle. “I wouldn’t be doing [healthnewsreviews.org] if I didn’t think that these kinds of messages and misleading conclusions and observations and statements made in many news releases have the potential — and indeed I think that potential is realized — of hurting people at the end of the food chain,” Schwitzer says. Everyone in the news cycle bears responsibility, in the end, for getting it right.
*Ed Yong, Ivan Oransky, Matt Shipman and Mark Henderson are colleagues and friends of mine. Also note that Yong and Shipman have contributed chapters to an upcoming book I’m editing on science blogging.
Editor's note: Updated on December 19, 2014, to correct the location of Cardiff.