The strategic use of data is central to the science of delivery, and the performance of Results-Based Financing programs is systematically monitored and evaluated using a combination of quantitative and qualitative data. This data is then used to make informed decisions about program design and implementation, or as a springboard for further inquiry. Quantitative operational data is collected for many RBF programs using “performance dashboards”. Operational data highlights trends in program performance based on quality scores and indicator achievement.
Collected on a monthly, quarterly, or semi-annual basis, operational data allows for “real-time” program analysis at the facility, regional, and national levels. The quantity achieved against each indicator is logged in the dashboard for each reporting period. Once entered in the system, quantitative data can be viewed for individual facilities or aggregated within an administrative unit (e.g. district, council, county, or province). Similarly, quality scores can be viewed per facility or averaged within an administrative unit.
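The aggregation described above can be sketched in a few lines of code. This is a minimal illustration only, assuming a simplified record layout: the field names, facility names, and values are hypothetical and do not reflect any actual dashboard schema.

```python
from collections import defaultdict

# Hypothetical facility-level records for one reporting period: each entry
# carries an administrative unit, an indicator quantity, and a quality score.
records = [
    {"facility": "Clinic A", "district": "North", "deliveries": 42, "quality": 78.0},
    {"facility": "Clinic B", "district": "North", "deliveries": 35, "quality": 84.0},
    {"facility": "Clinic C", "district": "South", "deliveries": 51, "quality": 69.0},
]

def aggregate_by_district(rows):
    """Sum indicator quantities and average quality scores per district."""
    totals = defaultdict(lambda: {"deliveries": 0, "quality_sum": 0.0, "n": 0})
    for r in rows:
        d = totals[r["district"]]
        d["deliveries"] += r["deliveries"]   # indicator quantities are summed
        d["quality_sum"] += r["quality"]     # quality scores are averaged below
        d["n"] += 1
    return {
        district: {
            "deliveries": d["deliveries"],
            "avg_quality": d["quality_sum"] / d["n"],
        }
        for district, d in totals.items()
    }

summary = aggregate_by_district(records)
# North: 77 deliveries, average quality 81.0; South: 51 deliveries, average quality 69.0
```

The same pattern extends to any administrative level (council, county, province) by grouping on a different key.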
While providing a powerful management tool for health system administrators, performance dashboards also enhance project transparency and accountability: PBF dashboards are publicly available in many countries where PBF is or has been implemented.
COMPLETED IMPACT EVALUATIONS AND EMERGING LESSONS
The learning portfolio of the Health Results Innovation Trust Fund (HRITF) comprises 29 rigorous impact evaluations and 10 assessments of Results-Based Financing (RBF). While evaluations are at different stages, from early design to end-line analysis and dissemination, a growing number of evaluations of HRITF-funded pilot projects are reaching completion and providing evidence on various aspects of RBF programs in maternal and child health. These studies and learning activities provide an excellent opportunity both to shine a light on the “black box” of RBF and to drive the future RBF learning agenda.
As the HRITF operational and evaluation portfolios mature, with country programming and evaluations now under way, the HRITF has refreshed its learning strategy to reflect the in-country and cross-country learning that the portfolio provides. The aim is to leverage these component evaluations into composite learning. The strategy highlights what has been learned so far under the HRITF and what remains to be learned, noting where the HRITF can fill those gaps. The Learning Strategy also includes a results framework for tracking progress toward its learning objectives.
MID-TERM REVIEW AND MANAGEMENT RESPONSE
In 2017, the external consulting firm IOD PARC conducted an independent mid-term review of the Health Results Innovation Trust Fund. The report, finalized in 2018, aimed to accomplish three objectives:
- To assess the performance of the HRITF against the program's stated goals and outputs (as described in the results framework), identifying strengths, weaknesses, and lessons learned;
- To determine what progress has been made in addressing the recommendations of the previous 2012 evaluation; and
- To make recommendations to inform ongoing and future programming, specifically aimed at (a) improving the performance of the current HRITF program from donor, implementer, and country perspectives and (b) supporting the design and implementation of future RBF approaches under consideration.
In the Management Response to the mid-term review, World Bank management found the review both fair in highlighting the strengths and accomplishments of the HRITF and useful in its recommendations for continued programmatic and evaluation work.