Table 1 A summary of impact evaluation study designs and methodologies

From: Evaluating malaria programmes in moderate- and low-transmission settings: practical ways to generate robust evidence

| Methodology/study design | When is it useful? | What types of data can be used? | How robust is the design? |
|---|---|---|---|
| Interrupted time series | Policy change or other intervention introduced on a known date. Useful when there is no contemporaneous control group, but can be adapted to include one | Time-series data (retrospective or prospective), ideally RHIS | Good. Accounts for secular trends and confounding factors; a counterfactual can be estimated |
| Dose–response | When there are no clear intervention and comparison areas, but the intervention is delivered at varying levels of intensity by district | Sub-national data (e.g., district-level) describing the intervention, the impact indicator, and potential confounders, ideally RHIS. Requires data on processes and activities to define 'intensity' | Moderate, if spatial and temporal resolution is high and confounders are included. Can estimate counterfactuals for alternative programme coverage levels. Prone to confounding, because the intensity of the intervention applied may be related to the impact outcome |
| Constructed controls (matching or discontinuity designs, instrumental variables) | When there are no clear intervention and comparison areas, but individuals differ in their use of and access to interventions, or eligibility criteria determine whether an individual or area received the intervention. Useful for inference at the individual level | Individual-level data from cross-sectional surveys with a large sample size and all possible confounders measured | Moderate. Limited by the availability of data from which to construct controls. Often relies on a single cross-sectional survey, and the evaluation may have low power to detect changes when cross-sectional RDT positivity is the primary impact indicator |
| Stepped-wedge | Phased introduction of a programme, with or without randomization | RHIS or repeat cross-sectional surveys | Moderate. Important to account for other programmes or contextual changes occurring during the phased roll-out of the programme being evaluated |

RHIS: routine health information system; RDT: rapid diagnostic test.
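
For the interrupted time-series row above, the core analysis is a segmented regression fitted to routine surveillance counts, with terms for the pre-intervention trend and for the level and slope change at the known intervention date. The sketch below is a minimal illustration, not the authors' method: the column names, the 60-month series, the intervention month, and the simulated counts are all hypothetical, and seasonality is handled with simple sine/cosine terms.

```python
# Minimal sketch of a segmented (interrupted time-series) Poisson regression,
# assuming monthly RHIS case counts; all names and the simulated data are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_months, change_point = 60, 36  # hypothetical: policy introduced in month 36

df = pd.DataFrame({"t": np.arange(n_months)})
df["post"] = (df["t"] >= change_point).astype(int)      # level change at the known date
df["t_post"] = np.maximum(0, df["t"] - change_point)    # trend change after the date
df["sin12"] = np.sin(2 * np.pi * df["t"] / 12)          # simple seasonal terms
df["cos12"] = np.cos(2 * np.pi * df["t"] / 12)
# Simulated counts with a secular decline, seasonality, and an intervention effect
df["cases"] = rng.poisson(np.exp(6.0 - 0.005 * df["t"] + 0.3 * df["sin12"]
                                 - 0.3 * df["post"] - 0.02 * df["t_post"]))

# Secular trend + seasonality + level and slope change; projecting the fitted
# pre-intervention trend forward gives the estimated counterfactual.
fit = smf.glm("cases ~ t + post + t_post + sin12 + cos12",
              data=df, family=sm.families.Poisson()).fit()
print(fit.summary())
```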
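For the dose–response row, the analysis typically regresses a district-level impact indicator on a measure of programme intensity while adjusting for measured confounders. The sketch below is illustrative only; the intensity measure, the confounders, and the simulated data are assumptions, not variables from the source article.

```python
# Minimal sketch of a district-level dose-response regression; all variable
# names (intensity, rainfall, baseline_incidence) are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n_districts = 80
df = pd.DataFrame({
    "intensity": rng.uniform(0, 1, n_districts),          # programme 'dose' per district
    "rainfall": rng.normal(0, 1, n_districts),             # potential confounder
    "baseline_incidence": rng.normal(50, 10, n_districts), # potential confounder
})
df["incidence_change"] = (-20 * df["intensity"] + 5 * df["rainfall"]
                          + 0.2 * df["baseline_incidence"]
                          + rng.normal(0, 5, n_districts))

# The coefficient on `intensity` estimates the change in the impact indicator
# per unit of programme intensity, adjusted for the measured confounders.
fit = smf.ols("incidence_change ~ intensity + rainfall + baseline_incidence",
              data=df).fit()
print(fit.params)
```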
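For the constructed-controls row, one common option is propensity-score matching on individual-level survey data. The sketch below is a hedged illustration with hypothetical variable names (an exposure indicator, RDT positivity as the outcome, age and wealth as confounders); discontinuity and instrumental-variable designs would use different estimators.

```python
# Minimal sketch of constructed controls via propensity-score matching on
# individual-level cross-sectional survey data; names and data are hypothetical.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(2)
n = 5000
df = pd.DataFrame({"age": rng.uniform(0, 60, n), "wealth": rng.normal(0, 1, n)})
p_exposed = 1 / (1 + np.exp(-(0.5 * df["wealth"] - 0.01 * df["age"])))
df["exposed"] = rng.binomial(1, p_exposed)
p_pos = 1 / (1 + np.exp(-(-1.0 - 0.6 * df["exposed"] - 0.3 * df["wealth"])))
df["rdt_positive"] = rng.binomial(1, p_pos)

# 1. Estimate propensity scores from measured confounders
X = df[["age", "wealth"]]
df["ps"] = LogisticRegression().fit(X, df["exposed"]).predict_proba(X)[:, 1]

# 2. Match each exposed individual to the nearest unexposed individual on the score
treated, control = df[df["exposed"] == 1], df[df["exposed"] == 0]
nn = NearestNeighbors(n_neighbors=1).fit(control[["ps"]])
_, idx = nn.kneighbors(treated[["ps"]])
matched_control = control.iloc[idx.ravel()]

# 3. Compare outcomes in the matched sample (crude risk difference)
effect = treated["rdt_positive"].mean() - matched_control["rdt_positive"].mean()
print(f"Matched risk difference in RDT positivity: {effect:.3f}")
```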
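For the stepped-wedge row, a standard analysis is a mixed model with fixed effects for calendar period (to absorb secular and contextual change during roll-out), the treatment indicator, and a random intercept for cluster. The sketch below is a minimal example; the number of clusters, the step assignment, and the simulated outcome are assumptions.

```python
# Minimal sketch of a stepped-wedge analysis with a cluster random effect;
# cluster counts, cross-over steps, and the outcome are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n_clusters, n_periods = 12, 6
cluster_effect = rng.normal(0, 1, n_clusters)    # between-cluster variation

rows = []
for c in range(n_clusters):
    step = c % (n_periods - 1) + 1               # period at which cluster c crosses over
    for p in range(n_periods):
        treated = int(p >= step)
        y = 10 - 0.5 * p - 2.0 * treated + cluster_effect[c] + rng.normal(0, 1)
        rows.append({"cluster": c, "period": p, "treated": treated, "incidence": y})
df = pd.DataFrame(rows)

# Fixed effects for calendar period absorb contextual change during roll-out;
# the `treated` coefficient is the estimated programme effect.
fit = smf.mixedlm("incidence ~ treated + C(period)", data=df,
                  groups=df["cluster"]).fit()
print(fit.summary())
```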