Assessment of the Roll Back Malaria Monitoring and Evaluation System


wp-02-55.pdf (PDF, 142 kB)

Author(s): Macintyre K, Eckert E, Robinson A

Year: 2002

Abstract:
Introduction

The Roll Back Malaria (RBM) partnership is currently undergoing an evaluation of its progress after three years of implementation. One objective of the RBM Partnership is to develop an effective monitoring and evaluation (M&E) system to assess RBM progress towards its objectives and determine whether its goals have been met at the country, regional, and international levels. USAID, as a primary funder of this monitoring and evaluation system, particularly for the Africa region, has requested a specific assessment of the monitoring and evaluation system at the regional and global levels. The results of this assessment will feed into the larger external evaluation and will provide recommendations to improve the capacity of RBM to monitor its effectiveness.

The methods used consisted of document reviews, database reviews, summary analysis of indicators and methodology, and key informant interviews conducted in Harare, Geneva, and Atlanta, and by phone with nearly all other partners. The consultancy took place between November 2001 and January 2002. Key informants included:

WHO/AFRO: 3 staff from RBM and 1 person from integrated disease surveillance
WHO/HQ: 5 staff from RBM, 2 from integrated disease surveillance, and 2 from TB

Interviews and general discussions were also held with most of the individuals involved in or with a close interest in M&E of RBM, with members of the Partnership, and with several malaria experts both within RBM and externally.

Framework

The framework for M&E of RBM is comprehensive in its coverage of all areas relevant to Roll Back Malaria. It emphasizes local control over data collection efforts by developing standardized approaches and encouraging countries to pick indicators appropriate to their epidemiologic profiles. The framework calls for minimal new data collection, instead relying on existing mechanisms and tapping into larger survey efforts, such as the DHS, where appropriate. This reliance on ongoing data collection efforts while improving existing systems is laudable in its aims, but it has the potential to create problems in acquiring the desired data in a timely fashion. The conceptual framework spells out the elements of a malaria program but does not clarify the processes, outputs, and outcomes within each element. In addition, there is no guidance on the appropriate selection of indicators at different levels, except to urge countries to choose one process and one outcome indicator for each element. The 'evaluation' aspect of M&E is not evident in the framework documents either, which could limit efforts to empirically demonstrate the merits and cost-effectiveness of various programs.

Databases and Platforms

Monitoring and evaluation depends on high-quality, valid, and reliable data on the target program. Several databases are in use or being created. However, many challenges remain if these databases are to play a solid role in M&E. In many cases the databases are not complete, and some of the data are of questionable quality. It is of particular concern that the baseline surveys are still not complete. At the country level, various sources of data exist, including national health information systems and national surveys such as the Demographic and Health Survey (DHS) or the UNICEF Multiple Indicator Cluster Survey (MICS). These sources provide information for program monitoring and impact assessment on a regular basis. WHO/AFRO has also developed a methodology for collecting country baseline data, which is currently being implemented in Africa.
In addition, RBM has contracted with the INDEPTH network of demographic surveillance sites to collect specific indicators on malaria morbidity and mortality to inform the program on disease trends.

Indicators and Sources of Data

There is a lack of consistency in the indicators and definitions reported across countries and regions within RBM. The biggest issue is a lack of clarity on the definition of the indicators and the target population covered. This lack of consistent guidelines and practices is a minor problem within a given country but can create more serious problems when data are aggregated at the regional or international level and compared with data from other countries that use different definitions or data sources. The guidelines require countries to report on the 5 'global' indicators and suggest selecting indicators to cover outcome and process levels as well. However, many countries have difficulty in distinguishing the process/outcome/impact hierarchy. RBM (either regional or international) could greatly assist in this effort by providing technical assistance to individual countries to develop their M&E plans. The RBM M&E framework suggests many different sources of data for most of the key indicators, including 4 of the 5 global indicators, which leads to confusion as to the most appropriate mechanism to obtain the needed data. A large number of the proposed indicators are population-based, yet the bulk of the data used are derived from routine health information systems or facility-based information and do not use the most accurate denominator estimates. The RBM guidelines currently provide no guidance on the appropriate selection of data sources. Indeed, in the AFRO Region, community surveys are being implemented without the rigorous sampling methodologies necessary to be representative. This can create confusion and controversy when an indicator derived from one source does not match one calculated from another. Finally, there is an inconsistency in the definitions of the suggested indicators, particularly the 'global' or 'core' indicators. These inconsistencies lead to confusion and ultimately jeopardize attempts to aggregate data at the regional or international level, as the sketch below illustrates.
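To make the aggregation problem concrete, the following minimal Python sketch uses hypothetical figures (not drawn from the report) to show how mixing a survey-based estimate with a facility-based estimate that rests on an undercounted denominator can distort a regional figure for a coverage indicator.

    # Illustrative sketch with hypothetical figures: how inconsistent
    # denominators distort aggregation of a coverage indicator such as
    # the proportion of under-fives sleeping under an ITN.

    # Country A: population-based survey; denominator is the estimated
    # number of all children under five.
    a_covered, a_denominator = 120_000, 400_000

    # Country B: facility-based reporting; denominator counts only
    # children attending clinics, undercounting the target population.
    b_covered, b_reported_denominator = 90_000, 150_000

    coverage_a = a_covered / a_denominator           # 0.30
    coverage_b = b_covered / b_reported_denominator  # 0.60 (inflated)

    # A naive regional average treats the two rates as comparable.
    naive_regional = (coverage_a + coverage_b) / 2   # 0.45

    # If country B's true under-five population is ~300,000, its real
    # coverage is 0.30, and the pooled regional rate looks very different.
    b_true_denominator = 300_000
    pooled_regional = (a_covered + b_covered) / (a_denominator + b_true_denominator)

    print(f"naive average: {naive_regional:.2f}")       # 0.45
    print(f"pooled, corrected: {pooled_regional:.2f}")  # 0.30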
Organizational Capacity

Many of the shortcomings of the M&E system of RBM are due to organizational or structural issues within the RBM offices. The M&E team at HQ is tasked with: a) coordinating an internal M&E working group; b) developing and implementing a work plan to track progress of RBM at all levels; c) developing a geographical information system for RBM; d) developing and testing tools for malaria M&E; and e) coordinating reporting on RBM and related activities. In addition to the M&E team at WHO/HQ, individuals within the programmatic components of RBM have M&E responsibilities. Several individuals working in other units, such as Stop TB and Communicable Disease Surveillance, are also collaborating on aspects of RBM M&E; however, the organizational structure of RBM does not clearly define the roles and responsibilities of these individuals vis-à-vis the M&E team. Likewise, budget allocations for M&E activities are not clearly divided among the groups. This confusion leads to redundancies in some activities and gaps in others. At WHO/AFRO, the M&E team is understaffed, consisting of one epidemiologist and one data manager, both of whom are frequently drawn onto other activities within RBM and the larger WHO office. Other regional offices do not have dedicated M&E staff. This is a serious shortcoming given that all the data for international monitoring must come through the regional offices first. There is no clear delineation of responsibilities between the regional bureaus and WHO/HQ for monitoring and evaluation activities, nor is there any formalized chain for reporting or deadlines.

RBM is caught between the stated goal of helping countries develop their monitoring systems and the demand to produce accurate, timely tracking for the overall initiative. Given the constraints mentioned above, this review suggests that technical assistance for the development of monitoring systems should be viewed as a separate, but equally important, activity from the monitoring of international efforts, at least in the early years of the initiative.

Recommendations

1. Recommendations for establishing systematic evaluation of RBM

1.1 Establish a strong M&E team at the RBM Secretariat and in the Regional Offices. We see this as requiring at least three separate initiatives:
- Increase the number of qualified M&E staff both at HQ and in the Regional Offices, especially AFRO.
- Streamline the management structure so that there is more authority to drive evaluation decisions.
- Establish a reference group to provide periodic consultation on specific technical issues related to monitoring and evaluation.

1.2 Establish and maintain a plan and timeline for RBM M&E reports at the regional and global levels. Reports that are essential in the near future include:
- A baseline report of measures (dating from approximately 1998-1999) of impact, outcome, and process indicators from settings where these data exist.
- Progress reports describing specific issues, such as evaluation of priority interventions or monitoring the effect of a major policy change (e.g., a change in first-line drug policy).
- A format for annual reporting on progress with specific indicators, with an established timeframe for reporting.
- A global report on malaria, produced every few years, like the TB Global Report, which would be very helpful at the international level.

1.3 Establish a transparent system for assessing data quality and standardization across countries, especially for the core indicators. The current M&E framework allows for local adaptation of many indicators, thus potentially rendering some indicators incomparable. Certain indicators, once established as "global" or at least as "regionally critical," must be exempt from country modification.

1.4 Establish methods for documenting the sources of data within the specific databases used for M&E purposes, and the extent to which they are representative of a country's situation. Currently, data sources for country indicators are not documented when the data are aggregated to the national or regional level, which confuses interpretation. (A minimal sketch of such source documentation appears after these recommendations.)

1.5 Establish clear guidelines for data collection protocols and sampling strategies used to collect malaria-focused data in countries. Those indicators which can be obtained through standard survey methodologies should be collected that way. For other indicators, RBM needs to provide clear and consistent recommendations on how to collect the necessary data, and technical assistance in data collection when necessary.

1.6 Establish a complete malaria database at the global level. Currently, no complete database for malaria exists at the global level (although the AFRO Regional Office is compiling one for that region). RBM must be proactive in collecting data and holding countries to reporting requirements and deadlines.
1.7 Develop clear terms of reference for the HQ M&E unit as a whole. Management needs to clarify how cross-cutting programs like M&E should interact with the vertical teams. Current collaboration is based more on personal relationships than on a defined structure.
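As a minimal sketch of the source documentation that recommendations 1.3 and 1.4 imply, the Python fragment below shows one possible record layout; all field names and values are hypothetical illustrations, not an actual RBM schema.

    # Illustrative sketch of the source metadata recommendation 1.4 calls
    # for; the field names and records are hypothetical, not an RBM schema.
    from dataclasses import dataclass

    @dataclass
    class IndicatorRecord:
        country: str
        indicator: str                   # e.g., one of the 5 global indicators
        value: float
        year: int
        source: str                      # "DHS", "MICS", "HIS", "baseline survey", ...
        denominator_basis: str           # "census projection", "facility catchment", ...
        nationally_representative: bool  # reflects the sampling concern in 1.5

    records = [
        IndicatorRecord("Country A", "ITN coverage, under fives", 0.30, 1999,
                        "DHS", "census projection", True),
        IndicatorRecord("Country B", "ITN coverage, under fives", 0.60, 2000,
                        "HIS", "facility catchment", False),
    ]

    # With sources documented, regional aggregation can be restricted to
    # comparable records instead of silently mixing survey-based and
    # facility-based estimates.
    comparable = [r for r in records if r.nationally_representative]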