Contribution and quality of mathematical modeling evidence in World Health Organization guidelines: A systematic review

Abstract

Mathematical modeling studies are frequently conducted to guide policy in global health. However, the contribution of mathematical modeling studies to World Health Organization (WHO) guideline recommendations, and the quality of evidence these studies contribute, remain unknown. We conducted a systematic review of the WHO Guidelines Review Committee database to identify guideline recommendations that included evidence from mathematical modeling studies since the inception of the Guidelines Review Committee on 1 December 2007. We included WHO guideline recommendations citing a mathematical modeling study in the primary evidence base. We defined a mathematical model as a framework that predicted the epidemiologic, health, or economic impact of an intervention or decision in a clinical or public health context. The primary outcome was the inclusion of evidence from mathematical modeling studies in a guideline recommendation. We evaluated each unique modeling study across multiple domains of quality. Between 1 December 2007 and 1 April 2019, the WHO Guidelines Review Committee approved 154 guidelines providing 1619 guideline recommendations. Mathematical modeling studies informed 46 WHO guidelines (29.9%) and 101 unique guideline recommendations (6.2%). Modeling evidence addressed topics related to infectious diseases in 38 guidelines (82.6%) and 81 recommendations (80.2%), most commonly for HIV and tuberculosis. Evidence from modeling studies was assessed in the GRADE evidence profile for 12 recommendations (12.9%) and in the GRADE evidence-to-decision framework for 45 recommendations (44.6%). Modeling-informed recommendations were more likely than other recommendations within the same guidelines to be issued with a "conditional" rather than a "strong" strength of recommendation (53.5% versus 37.8%), and the evidence underlying modeling-informed recommendations was more likely to be assessed as very low quality (41.6% versus 24.1%). On review of the individual modeling studies, we estimated that 33.8% of models performed a calibration, 29.4% performed a validation of results, and 20.6% reported a change in the study conclusion in the sensitivity analysis. Although policy recommendations in WHO guidelines are informed by evidence from modeling studies, the validity of the modeling studies included in guideline development is heterogeneous. Quality assessment is needed to support the evaluation and incorporation of evidence from mathematical modeling studies in guideline development.
