A call for better reporting of study results in research papers

NEWS | 2015-03-05

Systematic reviews and meta-analyses are important tools for summarising large bodies of evidence on a topic. However, they are often hampered by inconsistent and limited reporting of study results. In a recent paper in the journal Conservation Biology, EviEM Project Leader Neal Haddaway calls on researchers to improve their reporting to maximise the impact of their findings.

Nearly half of the studies included in one systematic review currently underway in agriculture and soil science reported their data in a way that made the findings difficult or impossible to use. Photo: Claes Bernes.

Many researchers fail to report their findings properly

Scientific research is typically published in academic journals, which form a major resource for systematic reviewers searching for evidence. Reviewers read and appraise large numbers of research papers, extract findings from those studies that are relevant and reliable, and synthesise them in a systematic review.

To accurately assess the reliability and applicability of individual primary studies, reviewers must be able to extract information on study design, experimental procedure and the studies’ findings. If study findings are to be included in a meta-analysis, data must be reported either as a standard effect size or as means, in either case accompanied by sample sizes and measures of variability.
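To illustrate why these particular statistics matter, here is a minimal Python sketch (not from the paper itself) of how a standardised effect size such as Hedges’ g, together with its sampling variance, is computed. Everything the calculation needs is exactly what studies are asked to report: group means, standard deviations and sample sizes.

```python
import math

def hedges_g(mean_t, sd_t, n_t, mean_c, sd_c, n_c):
    """Standardised mean difference (Hedges' g) from the summary
    statistics a study should report: means, standard deviations
    and sample sizes for treatment and control groups."""
    # Pooled standard deviation across the two groups
    pooled_sd = math.sqrt(((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2)
                          / (n_t + n_c - 2))
    d = (mean_t - mean_c) / pooled_sd        # Cohen's d
    j = 1 - 3 / (4 * (n_t + n_c) - 9)        # small-sample correction
    g = j * d
    # Approximate sampling variance of g, needed to weight the
    # study in a meta-analysis
    var_g = j**2 * ((n_t + n_c) / (n_t * n_c) + d**2 / (2 * (n_t + n_c)))
    return g, var_g

# Hypothetical study reporting complete summary statistics
g, var_g = hedges_g(5.2, 1.1, 12, 4.4, 1.3, 12)
print(f"Hedges' g = {g:.2f}, variance = {var_g:.3f}")
```

Omit any one of the six inputs and neither the effect size nor its weight in the synthesis can be calculated.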

Our experience of systematic reviews in conservation biology and environmental management shows that a large proportion of published research fails to provide details on experimental design, or fails to report means, variability and sample size. In one systematic review currently underway in agriculture and soil science, 46% of the included studies failed to report any measure of variability. Without this information it is impossible to tell whether observed differences within the studies reflect real effects or merely chance variation in sampling.
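As a hypothetical illustration (the figures below are invented, not drawn from the review), the same observed difference in means can be strong or weak evidence depending entirely on the variability that so many studies leave unreported:

```python
import math

def welch_t(mean1, sd1, n1, mean2, sd2, n2):
    """Welch's t statistic computed purely from reported summary
    statistics; it cannot be computed if a study omits the
    standard deviations or sample sizes."""
    se = math.sqrt(sd1**2 / n1 + sd2**2 / n2)
    return (mean1 - mean2) / se

# Same mean difference of 1.0 in both cases:
print(welch_t(10.0, 1.0, 20, 9.0, 1.0, 20))  # t ~ 3.16: convincing
print(welch_t(10.0, 4.0, 20, 9.0, 4.0, 20))  # t ~ 0.79: could be chance
```

A study that reports only the two means leaves a reviewer unable to distinguish between these two situations.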

A recipe for maximising research impacts

In his recent paper in Conservation Biology, Neal Haddaway calls on researchers, editors and peer reviewers to ensure that study results are presented in a way that facilitates systematic review and meta-analysis.

Increasingly, people who need to make decisions in policy and practice are turning to systematic reviews of existing evidence to help answer their questions. As this happens, researchers in all disciplines should be thinking about how to maximise the impact of their findings. Thinking carefully about the legacy and future use of data is not just sensible; it should be an obligation.