Millions of animals may be missing from scientific studies
Most animals used in
biomedical experiments are not accounted for in published papers, a
first-of-its-kind study suggests. The analysis found that only one-quarter of
more than 5500 lab animals used over a 2-year period at one university in the
Netherlands ended up being mentioned in a scientific paper afterward. The
researchers believe the pattern could be similar at institutions around the
world, resulting in potentially millions of animals disappearing from
scientific studies.
“I think it’s just
outrageous that we have such a low rate of results published for the number of
animals used,” says Michael Schlüssel, a medical statistician at the University
of Oxford who was not involved in the study. “If we only look for
groundbreaking research, the evidence base won’t be solid,” he adds. And that
could impact studies that may confirm or refute the benefits of certain drugs
or medical interventions.
Scientists have long
suspected that a considerable share of animal studies doesn’t get published.
That could be because the results aren’t deemed interesting enough, or the
study didn’t find anything noteworthy. But many academics argue that such “negative” results are important and worth
publishing, and that failing to do so constitutes publication bias.
Yet getting a handle on
this problem has been hard because it’s difficult to track how many animals
scientists use—and what happens with them. Researchers usually list such
details in applications for ethical approval, but those often remain
confidential.
For the new study,
researchers asked scientists at three University Medical Center Utrecht (UMCU)
departments for permission to review the study protocols they had filed with an
animal ethics committee in 2008 and 2009. (They picked those years in part to
ensure the scientists had had ample time to conduct and report
the studies.) Then the team—led by Mira van der Naald, a doctoral student at
UMCU—searched the medical literature for papers resulting from the work.
Of the approved studies,
46% were published as a full-text paper; if conference abstracts—short
summaries of a talk or poster presented at a scientific meeting—were counted as
well, 60% ended up being published. Yet out of the 5590 animals used in the studies, only 1471 were acknowledged in
published papers and abstracts, the team reports in BMJ Open
Science. Small animals, including mice, rats, and rabbits—which made up 90%
of the total—were most often missing in action: Only 23% of them showed up in
publications, versus 52% of sheep, dogs, and pigs.
The researchers also
surveyed the scientists involved to find out why so many animals were missing.
The most common reasons they gave were that the studies didn’t achieve
statistical significance, a controversial but commonly used threshold for
publication; that the data were part of a pilot project; and that there were
technical issues with the animal models.
But none of these is a valid
reason not to publish findings in the scientific record, says study
co-author Kimberley Wever, a metascientist at Radboud University Medical
Center. “All animal studies should be published, and all studies are valuable
for the research community.” Not publishing all research means other scientists
may waste time, effort, and money redoing studies that have previously failed,
Wever says. She adds that the trend likely holds up at institutions around the
world and hopes other researchers will conduct similar studies.
“It’s a very big issue,”
agrees Anita Bandrowski, an information scientist at the University of
California, San Diego, who has created software that automatically
scans published papers for details such as the sex of animals used in studies.
Van der Naald and her
colleagues launched a potential remedy for the problem in 2018: preclinicaltrials.eu, the first online
registry dedicated to animal research. (A similar registry, animalstudyregistry.org, was recently set up by
the German Centre for the Protection of Laboratory Animals.) In these
databases, researchers can share methodologies, protocols, and hypotheses
before carrying out their experiments—a process called preregistration that has
been gaining traction in the academic community in recent years.
The Dutch government has
said it is sympathetic to the idea. But despite a 2018 motion in support of
registration passed by the Dutch House of Representatives,
it has not yet made registration compulsory.