
Ignoring previous research is rampant and unethical, says Johns Hopkins study

The vast majority of already published and relevant clinical trials of a given drug, device or procedure are routinely ignored by scientists conducting new research on the same topic, a new Johns Hopkins study has suggested. Johns Hopkins researchers further say failure to consider existing evidence is both unscientific and unethical.

The authors of the findings, reported in a recent issue of the journal Annals of Internal Medicine, argue that these omissions potentially skew scientific results, waste taxpayer money on redundant studies and involve patients in unnecessary research.

Conducting an analysis of published studies, the Johns Hopkins team concludes that researchers, on average, cited less than 21 percent of previously published, relevant studies in their papers. For papers with at least five prior publications available for citation, one-quarter cited only one previous trial, while another quarter cited no other previous trials on the topic. Those statistics stayed roughly the same even as the number of papers available for citation increased. Larger studies were no more likely to be cited than smaller ones.

Dr. Karen Robinson, an assistant professor of medicine at the Johns Hopkins University School of Medicine and co-author of the research with Dr. Steven N. Goodman, searched the Web of Science, an online citation database, for meta-analyses done in 2004 on groups of randomised, controlled trials on such common topics as a cancer treatment or a heart procedure. A meta-analysis is a systematic procedure for statistically combining the results of many different studies on a similar topic to determine the effectiveness of medical interventions.
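The article does not spell out how the statistical combination works. A minimal sketch of one common approach, fixed-effect (inverse-variance) pooling, is below; the trial numbers are made up for illustration and are not from the Hopkins study:

```python
import math

def pooled_effect(effects, variances):
    """Fixed-effect (inverse-variance) meta-analysis: each trial's
    effect estimate is weighted by 1/variance, so larger, more
    precise trials contribute more to the pooled result."""
    weights = [1.0 / v for v in variances]
    total = sum(weights)
    estimate = sum(w * e for w, e in zip(weights, effects)) / total
    se = math.sqrt(1.0 / total)  # standard error of the pooled estimate
    return estimate, se

# Three hypothetical trials measuring the same treatment effect
effects = [0.30, 0.10, 0.25]    # e.g. log odds ratios
variances = [0.05, 0.02, 0.10]  # squared standard errors
est, se = pooled_effect(effects, variances)
ci = (est - 1.96 * se, est + 1.96 * se)  # approximate 95% confidence interval
```

The weighting is why, in principle, larger trials should dominate a meta-analysis; the study's finding that larger trials were no more likely to be cited than smaller ones is therefore notable.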

The researchers ultimately looked at 227 meta-analyses comprising 1,523 separate clinical trials in 19 different fields, including oncology, neurology and pediatrics. Of 1,101 peer-reviewed publications for which there had been at least five previous relevant papers, 46 percent acknowledged the existence of no more than one previous trial.

The Hopkins researchers could not, however, say why prior trials went uncited, or whether uncited trials were nevertheless taken into account at the trial design and proposal stages, such as in grant requests to the National Institutes of Health.

In her publications study, Robinson found several papers that claimed to be the first of their kind even though many trials on the subject preceded them. There are no barriers to funding, conducting or publishing a clinical trial without proof that the prior literature has been adequately searched and evaluated, she says. Such requirements have, however, been instituted by some European funding agencies, the medical journal The Lancet, and the US Centers for Medicare & Medicaid Services, which requires that a covered trial not “unjustifiably duplicate existing studies,” Robinson writes.

According to Robinson, funders, institutional review boards and journals need to take steps to ensure that prior research is considered. To do otherwise, she says, encourages this 'unethical' behaviour to continue.
