Software engineering principles address current problems in the systematic review ecosystem

Research output: Contribution to journal › Article › Research › peer-review

Abstract

Systematic reviewers are simultaneously unable to produce systematic reviews fast enough to keep up with the availability of new trial evidence and overproducing systematic reviews that are unlikely to change practice because they are redundant or biased. Although the transparency and completeness of trial reporting have improved with changes in policy and new technologies, systematic reviews have not yet benefited from the same level of effort. We found that new methods and tools used to automate aspects of systematic review processes have focused on improving the efficiency of individual systematic reviews rather than the efficiency of the entire ecosystem of systematic review production. We use software engineering principles to review challenges and opportunities for improving the interoperability, integrity, efficiency, and maintainability of the systematic review ecosystem. We conclude by recommending ways to improve access to structured systematic review results. Major opportunities for improving systematic reviews will come from new tools and changes in policy focused on doing the right systematic reviews rather than just doing more of them faster.
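The abstract's closing recommendation concerns access to structured systematic review results. As a purely illustrative sketch, and not a format proposed in the article, the snippet below shows one way a synthesized outcome and its included trials could be represented as a machine-readable, interoperable record; all class names, fields, and values here are invented for this example.

from dataclasses import dataclass, field, asdict
from typing import List
import json


@dataclass
class TrialResult:
    """One included trial's contribution to a pooled outcome (illustrative fields only)."""
    registry_id: str        # e.g., a trial registry identifier such as an NCT number
    sample_size: int
    effect_estimate: float  # e.g., a log odds ratio for the outcome
    standard_error: float


@dataclass
class ReviewOutcome:
    """One synthesized outcome of a systematic review in a structured, machine-readable form."""
    review_doi: str
    outcome_name: str
    pooled_estimate: float
    ci_lower: float
    ci_upper: float
    included_trials: List[TrialResult] = field(default_factory=list)

    def to_json(self) -> str:
        """Serialize the record so that other tools (for example, review-updating pipelines) could consume it."""
        return json.dumps(asdict(self), indent=2)


# Invented example values, for illustration only.
outcome = ReviewOutcome(
    review_doi="10.1000/example-review",
    outcome_name="example dichotomous outcome",
    pooled_estimate=-0.21,
    ci_lower=-0.35,
    ci_upper=-0.07,
    included_trials=[TrialResult("NCT00000000", 250, -0.18, 0.09)],
)
print(outcome.to_json())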

Language: English
Pages: 136-141
Number of pages: 6
Journal: Journal of Clinical Epidemiology
Volume: 109
DOI: 10.1016/j.jclinepi.2018.12.014
Publication status: Published - 1 May 2019

Fingerprint

  • Ecosystem
  • Software
  • Technology

Keywords

  • Evidence synthesis
  • Machine learning
  • Software engineering
  • Systematic reviews as topic
  • Trial registration
  • Updating systematic reviews

Cite this

@article{cb6f1d6285f147f29a28ee7e626f2b81,
title = "Software engineering principles address current problems in the systematic review ecosystem",
abstract = "Systematic reviewers are simultaneously unable to produce systematic reviews fast enough to keep up with the availability of new trial evidence while overproducing systematic reviews that are unlikely to change practice because they are redundant or biased. Although the transparency and completeness of trial reporting has improved with changes in policy and new technologies, systematic reviews have not yet benefited from the same level of effort. We found that new methods and tools used to automate aspects of systematic review processes have focused on improving the efficiency of individual systematic reviews rather than the efficiency of the entire ecosystem of systematic review production. We use software engineering principles to review challenges and opportunities for improving the interoperability, integrity, efficiency, and maintainability. We conclude by recommending ways to improve access to structured systematic review results. Major opportunities for improving systematic reviews will come from new tools and changes in policy focused on doing the right systematic reviews rather than just doing more of them faster.",
keywords = "Evidence synthesis, Machine learning, Software engineering, Systematic reviews as topic, Trial registration, Updating systematic reviews",
author = "Rabia Bashir and Dunn, {Adam G.}",
year = "2019",
month = "5",
day = "1",
doi = "10.1016/j.jclinepi.2018.12.014",
language = "English",
volume = "109",
pages = "136--141",
journal = "Journal of Clinical Epidemiology",
issn = "0895-4356",
publisher = "Elsevier",

}

Software engineering principles address current problems in the systematic review ecosystem. / Bashir, Rabia; Dunn, Adam G.

In: Journal of Clinical Epidemiology, Vol. 109, 01.05.2019, p. 136-141.

Research output: Contribution to journal › Article › Research › peer-review

TY - JOUR

T1 - Software engineering principles address current problems in the systematic review ecosystem

AU - Bashir, Rabia

AU - Dunn, Adam G.

PY - 2019/5/1

Y1 - 2019/5/1

N2 - Systematic reviewers are simultaneously unable to produce systematic reviews fast enough to keep up with the availability of new trial evidence and overproducing systematic reviews that are unlikely to change practice because they are redundant or biased. Although the transparency and completeness of trial reporting have improved with changes in policy and new technologies, systematic reviews have not yet benefited from the same level of effort. We found that new methods and tools used to automate aspects of systematic review processes have focused on improving the efficiency of individual systematic reviews rather than the efficiency of the entire ecosystem of systematic review production. We use software engineering principles to review challenges and opportunities for improving the interoperability, integrity, efficiency, and maintainability of the systematic review ecosystem. We conclude by recommending ways to improve access to structured systematic review results. Major opportunities for improving systematic reviews will come from new tools and changes in policy focused on doing the right systematic reviews rather than just doing more of them faster.

AB - Systematic reviewers are simultaneously unable to produce systematic reviews fast enough to keep up with the availability of new trial evidence and overproducing systematic reviews that are unlikely to change practice because they are redundant or biased. Although the transparency and completeness of trial reporting have improved with changes in policy and new technologies, systematic reviews have not yet benefited from the same level of effort. We found that new methods and tools used to automate aspects of systematic review processes have focused on improving the efficiency of individual systematic reviews rather than the efficiency of the entire ecosystem of systematic review production. We use software engineering principles to review challenges and opportunities for improving the interoperability, integrity, efficiency, and maintainability of the systematic review ecosystem. We conclude by recommending ways to improve access to structured systematic review results. Major opportunities for improving systematic reviews will come from new tools and changes in policy focused on doing the right systematic reviews rather than just doing more of them faster.

KW - Evidence synthesis

KW - Machine learning

KW - Software engineering

KW - Systematic reviews as topic

KW - Trial registration

KW - Updating systematic reviews

UR - http://www.scopus.com/inward/record.url?scp=85060113089&partnerID=8YFLogxK

U2 - 10.1016/j.jclinepi.2018.12.014

DO - 10.1016/j.jclinepi.2018.12.014

M3 - Article

VL - 109

SP - 136

EP - 141

JO - Journal of Clinical Epidemiology

T2 - Journal of Clinical Epidemiology

JF - Journal of Clinical Epidemiology

SN - 0895-4356

ER -