One of the tricky things about science writing is that news cycles move quickly while research moves slowly. Research in the biomedical and health fields rarely involves one big “Eureka!” moment. Usually it starts with “hey, that’s an interesting correlation we found,” followed by a proof-of-concept experiment, followed by one study in one kind of animal showing promise, and later more tests in more animals showing promise (or setbacks). Finally, when researchers are fairly sure they have found something, they have to verify that it holds in humans if they have been working with animal models. All of this takes time, and the process can make for some boring headlines.

Science writers usually have to study the research well enough to explain it to a non-technical audience. But reading the original research article and following up on sources also takes time. And if, after all of that time and with a deadline pressing, you find that the research really only shows a loose correlation in mouse studies, your story has lost its luster.

This leads to two temptations: 1) exaggerating scientific findings to make them sound more interesting, and 2) relying on press releases or other news articles, without fact-checking, to get the story out faster. A BMJ study shows that both of these temptations are problems in reporting on biomedical and health-related research. The study examined the content of press releases from Russell Group (UK) universities to see whether they exaggerated the claims made in the original, peer-reviewed research paper and whether exaggerated claims showed up in subsequent news stories. The researchers also looked at whether exaggerating research findings led to more news sources picking up the story.

The study found that of the 462 university press releases evaluated from biomedical or health-related fields, 40% contained exaggerated advice, 33% contained exaggerated causal claims (i.e., claiming causation when the research only showed a correlation), and 36% contained exaggerated inferences to humans from animal research. The researchers used the original peer-reviewed paper as a baseline, with the caveat that some peer-reviewed papers may themselves exaggerate claims, and they used conservative metrics to evaluate the press releases, so their numbers are likely underestimates.

Additionally, the BMJ study found that when a university press release contained exaggerated advice, 58% of the news stories based on it did too. Eighty-one percent of news stories contained exaggerated causal claims if the press release did, and 86% made exaggerated inferences to humans from animal research if the press release did. When the press release did not exaggerate the research findings, only 17% of news stories offered exaggerated advice, 18% exaggerated causal claims, and 10% exaggerated the inference to humans from animal research. In other words, journalists are relying on press releases for their information, and if the press release is inaccurate, the news reports are too.

A Nature commentary on the research points out that university press writers are not the only ones responsible here. Most universities have the principal investigator look over the press release before it is published, which means he or she either approved it without reading it or approved the exaggerations. Additionally, journalists are responsible for what they report, including fact-checking their sources.

This study is reminiscent of an informal experiment conducted in 2009 by Shane Fitzgerald, a sociology student at University College Dublin. He was taking a class on communications and wanted to see how a globalized, internet-dependent media handled accuracy in news. While he was taking the class, the renowned composer Maurice Jarre died. Fitzgerald posted a fabricated quote, attributed to the composer, on Wikipedia. He did not cite a source for the quote, and Wikipedia’s editors caught and removed it quickly. Unfortunately, they did not remove it fast enough: several major media outlets, including The Guardian, published obituaries containing the fabricated quote. Fitzgerald pointed out that Wikipedia did its job by removing the quote several times; the problem was that journalists used the quote even though it had no reference or link.

Perpetuating a false quote is one thing; perpetuating bad biomedical and health information is another. Many people turn to these news sources assuming that they provide reliable information. When it comes to health-related research, people may make decisions based on what they think the latest research shows or on the recommendations of the university that conducted it. Even in a fast-paced news cycle, there is a responsibility to be truthful about health-related research, even if it makes for a less interesting story.