01 December 2020

Monitoring and Review

1. Brief Overview

Monitoring and review is a key component of an effective integrated national financing framework (INFF).FN It brings together the information required by policy makers and ensures necessary systems are in place to facilitate transparency and accountability in the implementation of the INFF and related activities and reforms.FN In so doing, it can facilitate learning, improve the effectiveness of financing policies, and enable policy makers to adjust course when conditions change. For example, monitoring systems can be an important enabler for rapid response and recovery planning during a shock such as COVID-19. Regular reviews provide lessons to improve implementation and design of the INFF, including governance and coordination mechanisms and processes.

Monitoring and review can be conducted at all levels of government – by central agencies such as Ministries of Planning and Ministries of Finance, by line ministries with sector responsibilities, and at the sub-national level by decentralized institutions. It can also be conducted by non-state actors to hold government accountable (see guidance on Building Block 4 Governance and Coordination for the institutions and processes that can facilitate access to information by all stakeholders).

In the context of an INFF, monitoring and review brings together existing data and tracking systems from across different types of finance, and links them to results (see Box 1 in Section 2). It does not aim to replace or duplicate these systems. Rather, it acts as an ‘integrator’ by streamlining efforts, providing access to policy-relevant information across multiple financing policy areas, and feeding that information back into integrated policy making processes.

Sections 2 and 3 of this guidance outline the value and role of INFF monitoring and review. Section 4 lays out relevant stakeholders, as well as processes that could be used as entry points. Section 5 focuses on steps countries can take to strengthen relevant monitoring and review systems and ensure that the data and information required to guide overall INFF implementation are gathered and used by policy makers (Annex 2 provides an illustrative INFF monitoring and review frameworkFN that may be used to bring together data and information from different systems). It puts forward key elements of a monitoring and review system in the context of an INFF and proposes two steps to strengthen (or establish) such a system, depending on a country’s existing processes, capacities, priorities and needs. The first is to establish a baseline, including levels of buy-in, roles and responsibilities, and data systems and capacity (a checklist for this step is included in Annex 1). The second step is to build on the baseline and fill any gaps, drawing on established good practice in the field and country examples, across four areas: i) institutionalise INFF monitoring and review; ii) integrate existing systems; iii) link the process to ongoing or planned data/ statistical reform processes, and make use of needs-based IT solutions; and iv) leverage insight and lessons from peers and regional/global knowledge-sharing platforms.

The primary audience is national government officials, especially members of the INFF Oversight Committee, where one is in place, and those with roles and responsibilities related to the monitoring and review of financing flows and policies. The guidance also provides a common reference point for development partners that support countries in these efforts.

2. The value of INFF monitoring and review

INFF monitoring and review can help governments to:

  • Streamline monitoring efforts by reducing gaps (common in areas of private finance and investment, and in relation to impact of financing), redundancies, overlaps (such as between government and donor systems) and misalignments in existing monitoring systems and processes related to different types of finance and financing policy areas (INFF monitoring and review as an ‘integrator’, see Box 1);
  • Build the necessary evidence base (i.e., with enhanced data) to improve efficiency and effectiveness of financing policies and reforms, encourage broad-based participation in policy processes by all relevant stakeholders, and enhance buy-in and understanding of the value of the INFF;
  • Regularly review the value added of the INFF financing strategy, including whether it is succeeding in enhancing mobilization and increasing coherence and alignment of financing vis-à-vis national sustainable development priorities; 
  • Support dynamic policy making by facilitating learning on what works and what does not work, and by enabling timely course adjustments in response to changes in conditions, such as in the financing and risk landscapes;
  • Create positive feedback loops, encouraging learning and innovations from the implementation level to be fed back to policy design;
  • Strengthen partnerships, dialogue and trust among stakeholders;
  • Improve transparency and accountability.

Box 1. INFF monitoring and review as an integrator

INFF monitoring and review can act as an integrator (Figure 1), as it builds on, and brings together, existing planning, budgeting and tracking systems. Countries tend to have a variety of monitoring and review systems in place (see Table 1 in Section 4.2), which are often not well aligned and/or duplicative. INFF monitoring and review facilitates tracking of both volumes and impact of all types of finance (public, private, international, domestic) as well as the implementation of relevant financing policies and strategies, including their impact vis-à-vis identified national sustainable development objectives. Thus, it not only integrates existing tracking systems across types of finance, but also provides a framework to link them to planning and results frameworks related to national development plans and/or SDG strategies.

INFF monitoring and review also provides the opportunity for governments to bring together initiatives and strategies related to data and statistical capacity, which are often developed in silos. It can help to better link these initiatives and strategies to the concrete data needs of national policymakers (see Section 5.2, Action Area 3).

Figure 1. INFF monitoring and review as ‘integrator system’

3. The role of monitoring and review within an INFF

3.1. Monitoring and review within an INFF

Similar to Building Block 4 on Governance and Coordination, Building Block 3 on Monitoring and Review cuts across the various INFF phases, from inception to design and ongoing implementation. Lessons from early implementers underline the importance of agreeing on, and clearly articulating, the value added of the INFF from the inception phase. Monitoring and review tools, such as the Theory of Change, can help with this, while also providing an integrated perspective on activities related to the entire INFF process (see Box 2). Information on volumes and impact of financing, as well as on the workings of related governance arrangements, can feed into ongoing or updated financing needs and landscape assessments, risk assessments and binding constraints diagnostics (Building Block 1), to better understand the allocation and use of current financing sources, track mobilization and alignment efforts, and identify emerging risks and challenges. Information related to progress in the implementation of the financing strategy and to the effectiveness of INFF design can inform adjustments to specific policies (Building Block 2), and further strengthen governance and coordination arrangements (Building Block 4).

Figure 2. Linkages between building block 3 and other INFF building blocks

As illustrated in Figure 2, INFF monitoring and review can help gather data and insight on three main areas: i) volumes and impact of financing; ii) progress in implementation of the financing strategy; and iii) what works/ what does not work in INFF design and implementation. More specifically, it can help answer the following questions:

Volumes and impact of financing

  • How much public and private financing is spent/ invested in the country?
  • How is financing currently allocated? How does it contribute to the achievement of national sustainable development priorities?
  • To what extent do different types of finance work synergistically and in an integrated manner toward identified goals (versus undermining each other’s impact)?

Progress in implementation of the financing strategy

  • Is the financing strategy (and related policy reforms) succeeding in increasing mobilization of additional financing in line with set targets and from the required sources (e.g., public/ private/ domestic/ international)?
  • Is the financing strategy (and related policy reforms) succeeding in increasing overall coherence and alignment between financing and national sustainable development priorities?

What works/ what does not work in INFF design and implementation

  • What lessons can be learned to further support adjustments and improvements in financing policies? For example, how effective are the underlying governance and coordination mechanisms? Are there areas of the financing strategy that may be working better than others?

Box 2. An INFF theory of change

The inception phase and the INFF roadmap will contain a description of the value added and rationale for implementing an INFF in a national context. An INFF “theory of change” can link this value added to inputs, activities, and concrete and measurable outputs. Such a “theory of change” would spell out inputs and activities, and related outputs, outcomes, and the ultimate goal of achieving national sustainable development priorities, along with key assumptions/ risk factors (see Figure 3 for an example of an INFF theory of change). It would thus articulate activities across building blocks (from necessary assessments and diagnostics, financing policy formulation and review, to setting up of adequate governance structures), and tie them to concrete outputs, outcomes and the ultimate impact of contributing to the achievement of national sustainable development objectives. In so doing, it can provide the basis for an integrated and comprehensive monitoring and review framework (see Annex 2 for an example), while allowing for the flexibility that may be required if national and/or global contexts change. For example, assumptions and risk factors may be revisited regularly, as additional or more up-to-date information becomes available via assessment and diagnostic exercises. Similarly, the initial logic linking inputs to the desired impact can be adjusted if necessary, as information from regular monitoring of progress in INFF implementation becomes available. Thus, while developing a theory of change may visually appear as a linear process, it is not; it will inevitably involve regular validation and adjustments and support continuous learning around INFF implementation (a schematic sketch of such a results chain is provided after Figure 3 below).

Figure 3. An INFF theory of change

Pre-requisites, assumptions and risk factors:

Strong commitment at the highest political and technical levels; broad-based buy-in across relevant stakeholders; willingness of non-government national and international stakeholders and partners to support the INFF; conducive global context; minimum availability of data on financial flows and their allocation/ use; political stability; functional public sector.
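For illustration only, the sketch below (Python) shows one way the elements of such a theory of change could be captured as a simple data structure, so that the results chain and its assumptions can be revisited as new information becomes available. All field names and example entries are hypothetical and not part of the INFF guidance itself.

from dataclasses import dataclass, field
from typing import List

@dataclass
class TheoryOfChange:
    # Minimal, illustrative representation of an INFF theory of change.
    # Each level of the results chain is a list of short statements; assumptions
    # and risk factors are kept alongside so they can be revisited regularly.
    inputs: List[str] = field(default_factory=list)
    activities: List[str] = field(default_factory=list)
    outputs: List[str] = field(default_factory=list)
    outcomes: List[str] = field(default_factory=list)
    impact: str = ""
    assumptions_and_risks: List[str] = field(default_factory=list)

# Hypothetical entries, loosely following the elements described in Box 2
toc = TheoryOfChange(
    inputs=["Political commitment", "Financing landscape assessment"],
    activities=["Formulate financing strategy", "Set up oversight committee"],
    outputs=["Financing strategy adopted", "Monitoring framework in place"],
    outcomes=["Increased mobilization and alignment of financing"],
    impact="Contribution to national sustainable development priorities",
    assumptions_and_risks=["Broad-based buy-in", "Political stability"],
)

# Assumptions can be appended or revised as assessments are updated
toc.assumptions_and_risks.append("Conducive global context")
print(toc.impact)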

3.2 Key elements

Annex 2 provides an illustration of a full INFF monitoring and review framework. It shows how information from different monitoring and review systems (e.g., those related to government finance, international development cooperation and private finance, where they exist) may be brought together to guide overall INFF implementation and related financing policy making.

The following elements form the basis of the framework:

  • A theory of change (TOC) - or similar logical framework - to ensure common understanding of the rationale, effects, barriers and enablers of the INFF, while keeping in mind the complexity of the change process, and with an emphasis on enabling feedback loops and learning. The scope of the TOC, which will in turn define the scope and depth of required monitoring and review, will depend on the scope of the INFF (e.g., in countries where INFFs are focused on specific sectors or financing policy areas, the TOC will be narrower in scope than that of an INFF covering an entire national development plan, as illustrated in Box 2 in Section 3.1 and in Annex 2) and on the specific activities and objectives articulated in the financing strategy (e.g., changes in, or introduction of new, policies, regulations and instruments).
  • Indicators to identify which data should be regularly collected and reported on to monitor progress against the TOC. Where possible, they should be drawn from existing monitoring frameworks for different types of finance (such as those related to the national budget, and development cooperation results frameworks). Relevant globally agreed SDG indicators may also be applicable, depending on the scope of the country’s INFF. At the impact level, indicators from national development plans or sectoral plans may be used. At the activity and output levels, the choice of indicators will be informed by the INFF action plan articulated as part of the financing strategy building block (see for example Table 5 in Building Block 2 guidance on the financing strategy). Indicators can be quantitative or qualitative, and should be clear (precise and unambiguous), relevant (appropriate to the subject at hand), economic (available at a reasonable cost), adequate (providing a sufficient basis to assess performance), and monitorable (amenable to independent validation) (CREAM). Figure 4 provides some examples. Additional examples are included in Annex 2, which also lays out key information that is typically required for each indicator (an illustrative sketch of such an indicator record follows this list).

Figure 4. Examples of output and outcome indicators

  • Targets to establish common objectives for what needs to be completed by when, in order to achieve the outcomes identified in the TOC. Targets should be specific, measurable, achievable, relevant, and time-bound (SMART). Such SMART targets should be informed by timeframes and sequencing considerations in the INFF roadmap and financing strategy, and use, to the extent possible, already identified targets in existing financing policies and strategies. Annex 2 provides some examples.
  • Data systems and capacity to generate ‘good enough’ data to enable regular reporting on identified indicators, as well as access to and use of such data by decision-makers and those holding them to account. Table 2 in Building Block 1.2 Financing landscape assessment provides an overview of national data sources that would likely form the basis of relevant data systems, such as central bank statistical publications, national accounts, surveys by national statistics offices, budget publications, economic bulletins, etc. (International data sources can complement national ones where there are gaps; e.g., monitoring data from Development Cooperation Forum surveys and the Global Partnership for Effective Development Cooperation may shed light on alignment of ODA with national priorities; World Bank data on Private Participation in Infrastructure can help monitor the scale and alignment of public-private partnerships with national priorities).
  • Adequate resources, including human resources within national government to ensure the system is effectively implemented and remains functional over time, and incentives to focus on results and impact. Responsibility for INFF implementation falls primarily on national governments, and thus requires relevant resources and capacity to be available among government officials. However, it also relies on the collaboration and participation of other actors such as development partners, the private sector and civil society. Adequate resourcing of, and skills in, monitoring and review among all these stakeholders is thus also an important element of success.
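As a purely illustrative complement to the elements above, the sketch below (Python) captures indicator and target metadata of the kind typically required for each indicator (baseline, target, data source, frequency, responsible entity; see Annex 2). The field names and example values are hypothetical assumptions, not a prescribed schema.

from dataclasses import dataclass
from typing import Optional

@dataclass
class Indicator:
    # Illustrative indicator record; fields mirror the information typically
    # required for each indicator in a monitoring and review framework.
    name: str
    toc_level: str            # e.g. "output", "outcome", "impact"
    unit: str
    baseline: Optional[float]
    target: Optional[float]
    target_year: Optional[int]
    data_source: str          # e.g. budget reports, aid information system
    frequency: str            # e.g. "annual", "quarterly"
    responsible: str          # entity responsible for reporting

# Hypothetical example: an outcome-level indicator fed by the budget system
example = Indicator(
    name="Share of national budget aligned with NDP priorities",
    toc_level="outcome",
    unit="percent",
    baseline=45.0,
    target=60.0,
    target_year=2025,
    data_source="Annual budget execution reports",
    frequency="annual",
    responsible="Ministry of Finance",
)

def has_smart_basics(ind: Indicator) -> bool:
    # Rough check that a target is specific, measurable and time-bound;
    # achievability and relevance still require expert judgement.
    return ind.target is not None and ind.target_year is not None and bool(ind.unit)

print(has_smart_basics(example))  # True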

Box 3. The role of evaluations within an INFF

Evaluation is the systematic and objective assessment of an ongoing or completed project, programme or policy, its design, implementation and results. Even though the terms review and evaluation are sometimes used as synonyms, an evaluation is usually a more robust, comprehensive and in-depth assessment. Evaluation is used across sectors and topics to determine the relevance and fulfillment of objectives, coherence of the intervention with other interventions in a country, sector or institution, development efficiency, effectiveness, impact and sustainability. Like other review systems, evaluation serves a dual purpose: providing evidence for accountability and learning to inform and enable policy adjustments and quality improvement, as well as supporting transparency and oversight of expenditure.

Evaluation is a vast field of study, with extensive literature and tailored methodologies and approaches for specific sectors. Internationally accepted evaluation norms and principles include: the quality standards for development evaluation of the OECD Development Assistance Committee; the good practice standards of the Evaluation Cooperation Group; and the norms and standards of the United Nations Evaluation Group. In the area of financing, evaluation guidelines and methodologies exist in relation to specific types of finance, especially public finance (for example, related to the national budget, blended finance and private sector support, trade-related aid, and budget support). Evaluations can inform the spending of public money and efficient and effective policymaking, which is critical as governments are under pressure to provide more and better services in tight fiscal environments. Examples of evaluations in the area of financing for sustainable development include evaluations focused on specific types of finance, such as budget support, and on individual government agencies or institutions, such as the CDC’s Financial Institutions portfolio evaluation.

Evaluation plays a key role in understanding progress toward achieving the Addis Ababa Action Agenda and the SDGs. The 2030 Agenda states that review of the SDGs will be “rigorous and based on evidence, informed by country-led evaluations” and calls for “strengthening of national data systems and evaluation programs” (paragraph 74). The focus has mostly been on evaluating the progress towards achieving the SDGs, for example through Voluntary National Reviews, Government Annual Reports, performance audit reports and regular evaluations of the effectiveness and coherence of sustainable development policy. Scope remains to strengthen evaluation of financing for sustainable development. INFFs provide an opportunity for both national governments and international development partners to do so. Note: For an overview of the role of evaluation in results-based management systems, see Chapter 7 of World Bank (2004) A Handbook for Development Practitioners: Ten Steps to a Results-Based Monitoring and Evaluation System, which covers the different uses, types, timing, and characteristics of evaluations.

4. Relevant stakeholders and processes

4.1 Key roles and responsibilities

As with all INFF components, government plays the central role when it comes to monitoring and review. Leadership from both the political and senior technical levels is key to ensure sustainability and effectiveness of monitoring and review efforts (see success factors listed in Section 5.2). Typically, this leadership stems from the Ministry of Finance and/ or Planning, or the Office of the President or Prime Minister. Depending on the focus of the INFF, it may also be within specialized agencies or line ministries. 

The INFF Oversight Committee, when one is present, is the likely body with overall responsibility to oversee the design and implementation of an adequate monitoring and review system, and to report on progress in implementation of the INFF financing strategy both internally (such as to senior government leadership) and publicly (e.g., around annual budget statements). 

Various stakeholders, both within and outside government, will be involved in monitoring financing efforts at different levels and in different sectors. Their experience and expertise should be sought when considering INFF monitoring and review systems. 

Figure 5 illustrates typical producers and users of data. Depending on the scope of the INFF, monitoring and review roles may differ. For example, if a country’s INFF is focused on a specific sector or a specific financing policy area, the role of specialized agencies may become more central. If the INFF is focused on the sub-national level (such as in Ghana) then the role of sub-national agencies would be more prominent. 

Figure 5. Typical roles and responsibilities in INFF monitoring and review

4.2 Entry points

Countries do not need to start from scratch when it comes to INFF monitoring and review. Existing monitoring systems, processes and frameworks should be the starting point. Such systems can be strengthened or expanded, better aligned and made more coherent within an INFF as necessary. Similar to Building Block 4 Governance and Coordination, the overarching aim should be to streamline efforts, not to replace or duplicate existing systems nor to establish new systems, unless there are gaps that need to be filled.

Table 1 illustrates potential entry points for INFF monitoring and review, as well as relevant stakeholders typically involved in establishing and maintaining adequate systems. It also includes links to useful resources and tools that can shed light on existing systems and processes. (Table 2 in Building Block 1.2 Financing Landscape Assessment provides an overview of data sources, which can inform the tracking of volumes of financing).

Overall, if a well-established system for monitoring implementation of the national development plan is in place, it could serve as a starting point for INFF monitoring and review. Similarly, if established processes around Voluntary National Reviews (VNR) exist, these should also be considered (see Box 4). Data and statistical strategies and reform processes may constitute another entry point (see more in Section 5.2, Action Area 3). Existing monitoring systems for different types of finance (such as national budget tracking systems and review processes, or private finance reporting initiatives) can act as entry points to build a more comprehensive system. As further articulated in Section 5, the comprehensiveness of such a system will differ depending on country contexts, reflecting the nature of INFFs as a long-term and gradual approach to guide better planning for, and implementation of, financing policy reforms.

With regard to public finance, monitoring systems relate to government finance (revenue, spending and investment), as well as development cooperation. Public financial management information systems (PFMIS) are typical starting points from the government’s perspective. A growing number of countries have gender, climate, or SDG budget tagging or coding systems in place, or are developing them as part of their INFF. Monitoring systems for government finance are usually developed around the annual budget process, with both volumes and data on performance indicators reported in relation to objectives that programmes within each budget agency are expected to achieve on an annual or multi-year basis. In the budget planning and formulation stage, and as part of their submissions to the Ministry of Finance, budget agencies may be required to articulate a narrative around how they will contribute to identified national priorities or the SDGs, and/or to link their programmes to specific goals. Data on government revenue may be collected to various degrees of disaggregation and may be subject to review with regard to questions of progressivity or inequality. There are also varying practices in relation to monitoring of tax expenditures, used to shed light on revenue foregone through tax incentives. Specific monitoring frameworks may also be in place for major investment projects (e.g., articulated by ministries and other entities involved in major infrastructure projects) and for state-owned enterprises, which may have dedicated systems to track their investments and contributions toward national development.
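To make the idea of budget tagging concrete, the following sketch (Python) shows how budget lines tagged against the SDGs could be aggregated to report allocations by goal. The tags, amounts and field names are hypothetical; actual tagging or coding structures follow each country’s chart of accounts and budget classification.

from collections import defaultdict

# Hypothetical budget lines tagged against SDGs (allocations in millions, local currency)
budget_lines = [
    {"agency": "Ministry of Health",    "programme": "Primary care",          "sdg": "SDG 3", "allocation": 120.0},
    {"agency": "Ministry of Education", "programme": "Basic education",       "sdg": "SDG 4", "allocation": 200.0},
    {"agency": "Ministry of Energy",    "programme": "Rural electrification", "sdg": "SDG 7", "allocation": 80.0},
    {"agency": "Ministry of Health",    "programme": "Immunization",          "sdg": "SDG 3", "allocation": 45.0},
]

# Aggregate tagged allocations by SDG
totals = defaultdict(float)
for line in budget_lines:
    totals[line["sdg"]] += line["allocation"]

total_tagged = sum(line["allocation"] for line in budget_lines)
for sdg, amount in sorted(totals.items()):
    share = 100 * amount / total_tagged
    print(f"{sdg}: {amount:.1f} ({share:.1f}% of tagged allocations)")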

Monitoring systems for development cooperation and finance are based on national aid information systems and country results frameworks. According to 2018 GPEDC monitoring data, 96% of developing countries have one or more information management systems in place to collect information on development cooperation. While the quality of such information varies, it typically includes data on financial commitments, scheduled and actual disbursements, and in some cases on intended and achieved results. Data from the 2020 DCF survey shows, however, that less than half of development cooperation information systems tracked results, off-budget flows, funding gaps and conditionalities. A results framework to review the performance and results of international development cooperation was in place in just over half of respondent countries (56%). Critically, in the context of an INFF, in only 36% of cases do countries and development partners use the same, or mostly overlapping, results frameworks, meaning that, in most cases, multiple parallel systems exist. Furthermore, only half of results indicators from development partners’ projects are monitored using national statistics and monitoring systems, according to GPEDC data.

With regard to private finance, the monitoring and reporting landscape is even more fragmented. On financing flow volumes, national accounts and related central bank reporting are common systems. In addition, relevant line ministries (e.g., ministries of business or local development) may have systems in place to collect and report data on investments in the country, including in relation to the role of SMEs at the sub-national level. Beyond volumes, policymakers need data and information on the impact of private business and investment on economic, environmental, and social issues to assess the private sector’s contribution to sustainable development objectives. Meaningful data and information on this remains scarce, though a number of initiatives and innovations are ongoing, with INFFs providing a platform for enhancing coherence of this increasing wealth of information at the country level and feeding it into financing policy making processes.

For example, private sector- or government-led systems for consolidating data on the contributions of business to sustainable development priorities exist in some countries (e.g., in the Philippines and in Colombia) and there is a growing number of companies publishing a sustainability report (mainly large, listed companies). However, the information published is often not comparable across companies or over time, and tends to focus on qualitative indicators rather than on quantitative data. Companies select the issues they choose to communicate, as sustainability reporting remains largely voluntary. In addition, sustainability-related information is generally behind paywalls rather than in the public domain; policymakers could change this by creating an open repository for company sustainability data to increase transparency. Companies also need to adjust their internal systems to track and report data on environmental and social issues, which might be particularly challenging for smaller companies with limited resources. Nonetheless, voluntary sustainability reports can be a starting point. Governments can increase their relevance by agreeing (including at the global level) on harmonized metrics and indicators to be used for company disclosure. Countries across a range of development contexts (e.g., EU, China, Mexico, South Africa, Mongolia, Bangladesh) have developed or are in the process of developing taxonomies that could enhance reporting on both volumes and impact of private business and finance. The G20 Sustainable Finance Working Group (SFWG), established in 2021, is also working on improving corporate sustainability disclosure and on facilitating the compatibility and consistency of national approaches to sustainable taxonomies. The INFF process can help connect these national and international efforts to public finance classification systems.

Table 1. Typical entry points for INFF monitoring and review

Box 4. Linking INFF monitoring and review with Voluntary National Reviews

Voluntary National Reviews (VNRs) are regular reviews of progress in relation to SDG implementation at the national and/or sub-national level. They provide the opportunity for governments to reflect on the alignment of national priorities with the SDGs and to prioritise goals and targets accordingly. They aim to facilitate the sharing of experiences, including success factors, challenges and lessons learned, to strengthen relevant policies and institutions, and to enhance multi-stakeholder support and partnerships. In countries where INFFs are linked to SDG action plans and strategies, the VNR process is of particular relevance to INFF monitoring and review.

On the one hand, as noted in section 4.2, processes and governance arrangements that may be in place in relation to VNR preparation and delivery will provide entry points for INFF monitoring and review (e.g., participatory fora for sharing of information among different stakeholders, data collection systems, etc.). On the other hand, the data and information collected through INFF monitoring and review can help countries report on means of implementation in their VNRs. For example, data on volumes and allocation of different types of finance can shed light on areas where gaps in financing may be hindering progress; information on what is working or not in INFF implementation can help highlight specific governance challenges and capacity constraints. (The updated VNR guidelines elaborate on how other aspects of INFFs, such as the financing strategy, can also be used to inform SDG means of implementation reporting). 

Box 5. The role of Supreme Audit Institutions in monitoring and review

Supreme Audit Institutions (SAIs) play an important role in contributing evidence for more informed policymaking. Their traditional role of overseeing public expenditure is evolving towards taking a broader, more comprehensive view of the reliability, effectiveness, efficiency and economy of policies and programmes. The International Organisation of Supreme Audit Institutions (INTOSAI) has launched several initiatives to support SAIs in this new role, and at INTOSAI XXII in Abu Dhabi in 2016 the following four approaches for SAIs to auditing and reviewing SDG issues were identified: i) assessing the preparedness of national governments to implement the SDGs; ii) undertaking performance audits in the context of the SDGs; iii) contributing to the implementation of SDG 16, which envisages effective, accountable and transparent institutions; and iv) acting as models of transparency and accountability in their own operations. Regional working groups support SAIs as well. For example, AFROSAI published a guideline on Sustainable SAIs in 2019. The role of SAIs is also touched on in the World Public Sector Report 2019, and material is available from the OECD on SAIs and good governance. Guidance on Building Block 4 Governance and Coordination discusses the role that SAIs can play in the context of an INFF.

5. ‘How To’ – Monitoring and review in practice

Building Block 4 on Governance and Coordination provides an overview of the institutions and processes that underpin transparency and accountability, and that can support adequate availability and access to relevant knowledge and information by stakeholders both within and outside government. The focus in this section is on the systems that such institutions and processes would need to supply necessary data and information, and to enable their use.

Suggested approach

There is no one-size-fits-all model for establishing an effective monitoring and review system for INFFs. Different countries will start from different baselines and may have identified different priority areas as part of their financing strategies. Consequently, their focus and level of ambition with regard to INFF monitoring and review will also differ. Figure 6 sets out two common steps that may be used to guide efforts to establish and/or strengthen monitoring and review functions in all contexts. The first focuses on establishing the baseline, looking across existing systems and the ‘enabling environment’ for monitoring and review (including underpinning elements, such as level of buy-in, roles and responsibilities, and relevant data systems and capacity). An illustration of basic, intermediate and advanced levels of integrated monitoring and review is presented to guide countries in establishing their level of ambition with regard to strengthening existing systems in the context of their INFF. The second step lays out possible actions to do so, drawing on good practice in the field complemented by country examples, and focusing on: i) institutionalizing INFF monitoring and review, and forging alliances among stakeholders; ii) enhancing integration of existing systems and developing pilots to fill gaps; iii) linking to ongoing or planned data/ statistical reform processes and making use of needs-based IT solutions; iv) leveraging insight and lessons learned from peers and regional/global knowledge-sharing platforms.

Figure 6. Step-by-step guidance

5.1 Step 1: Establish the baseline

The first step is to identify all relevant systems used in the country by government and non-government stakeholders to monitor and review financing flows and their impact, as well as the implementation of financing policies and strategies, and the extent to which the key elements listed in Section 3.2 are incorporated. Section 4.2 provides an overview of relevant systems and processes that countries may have in place. The following questions (also included in Annex 1 in the form of a checklist) can help to further guide the identification exercise:

  • What systems are in place to monitor and review execution of the government budget, including the extent to which spending allocations are in line with identified sustainable development objectives?
  • What systems are in place to monitor and review government revenue, including the extent to which it is aligned with sustainable development objectives (e.g., progressivity of taxes)?
  • How is the sustainability of public borrowing monitored?
  • What systems are in place to monitor and review allocation and impact of public investment? Are findings from such systems considered alongside those related to government spending?
  • What systems are in place to monitor allocation, use, and impact of international development cooperation, including from public development banks? Are these separate or integrated into other monitoring and review processes related to government finance?
  • What systems are in place to monitor volumes, allocation, and impact of private investment in the country? Are there different systems for domestic and foreign investment (including from MDBs)?
  • Are there reporting requirements for companies and other financial institutions active in the country? If so, do these include requirements to report data on contributions to sustainable development outcomes? Are these mandatory or voluntary?  If mandatory reporting requirements are not in place, are there established voluntary norms?
  • What systems are in place to monitor volumes and allocation of non-profit/ philanthropic finance in the country? Are there different requirements for domestic and foreign foundations? Is data and information from these systems considered alongside data related to other sources of finance (public and private)?
  • What systems are in place to monitor risks to the country’s capacity to finance sustainable development? Which types of risks are monitored (e.g., economic/ non-economic risks)?
  • How often is data collected through the above-mentioned systems? How is it used by government to inform policy making? How is it published for wider accessibility?
  • How often are existing monitoring and review systems related to different types of finance reviewed to ensure their continued relevance?
  • How compatible are they (e.g., do they rely on common definitions of key terms and/or on comparable data sources)?
  • How is the implementation of specific financing policies (e.g., medium-term revenue strategies/ debt management strategies/ public investment strategies/ development cooperation strategies/ investment promotion policies/ financial inclusion strategies/ etc.) monitored? How regularly are they reviewed? How are the findings from such reviews used?

For illustrative purposes, Table 2FN presents three stylized levels of development of monitoring and review systems related to the areas listed above: basic, intermediate, advanced. Countries may of course be at different stages with regard to different elements of the system; the purpose of this illustration is to help countries establish the appropriate level of ambition in terms of moving along the levels in the context of their INFF. Existing legal, political, and organizational factors will shape the ‘enabling environment’ for progress and whether it can be sustained over time.  They should be assessed as part of establishing the baseline. The following questions can help guide this exercise:FN

 

Buy-in:

  • Who are the champions behind the INFF in the country? Do they have the required authority and political clout to drive efforts to strengthen monitoring and review systems and capacities across all relevant ministries/ stakeholders?FN
  • Is there broad-based buy-in for INFF monitoring and review, including a common understanding of the rationale behind it and its potential value?
  • Is there buy-in from development partners to support INFF monitoring and review, including through capacity building, where needed?


Roles and responsibilities:

  • What are the roles and responsibilities in relation to monitoring and review of financing issues in government, and among non-state actors? What is the role of Parliament and of the Supreme Audit Institution?
  • Who in the country produces data (both at the national and sub-national levels)? E.g., central ministries, line ministries, specialized units, provincial ministries, local government, NGOs, development partners, etc.
  • Who uses data? E.g., for budget preparation, resource allocation, programme policymaking, legislation and accountability to parliament, planning, fiscal management, evaluation and oversight. 


Data systems and available capacity:      

  • What is the quality of data produced and used in the country? Are needs-based IT processes in place to facilitate collection and processing of required data? (See Box 6).
  • What are the skills of civil servants in the national government in programme management, data analysis, budget management, performance auditing? Are these functions adequately resourced (including financially)?
  • Are any statistical/ data reform processes ongoing or planned (e.g., development of a national statistics development strategy, or related efforts in specific sectors or financing policy areas)?
  • Is any technical assistance, capacity building, or training in monitoring and review underway or recently completed? Who provided this support?
  • Are there any institutes, research centres, private organisations, or universities in the country that have capacity to provide technical assistance and training in monitoring and review to civil servants and other relevant stakeholders?

Box 6. Assessing the quality of data systems through a data quality review

Data quality reviews (DQR) can be used to perform an independent review of the quality of the data that is available to inform INFF monitoring and review. Specific objectives of DQRs typically include: a) verifying baseline and historical data for key indicators based on information available from different sources; b) recommending changes to indicators, data collection mechanisms and protocols as necessary; c) identifying the data sources that have been used and confirming their accuracy on the ground and/or across data sources or reports; d) suggesting appropriate methods of data collection and sources of data where new data may be required; e) identifying capacity needs for data collection and making recommendations on the most appropriate monitoring and review institutional mechanisms and technical tools, as well as training needs for major stakeholders. Typically, a DQR can include an assessment of identified indicators, and an assessment of the data used to calculate such indicators (e.g., validity, reliability, timeliness, precision, integrity). It may also include a more general assessment of existing monitoring and review systems, e.g., structure, functions and capacities, including for data collection, processing, analysis, reporting and use.

Source: Martin, F. P. and A-M. Fernandez (2021). Background Orientation Paper prepared for FSDO/UNDESA on INFF building block 3: Monitoring and Review (M&R) Analytical Framework and Country Case Studies, IDEA, Canada.
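As a purely illustrative complement to Box 6, the sketch below (Python) shows the kind of simple automated checks (completeness, plausible ranges, timeliness, documented sources) that might support a data quality review. The record fields, thresholds and example values are hypothetical, and such checks would supplement, not replace, expert review.

from datetime import date

# Hypothetical indicator records to be checked
records = [
    {"indicator": "ODA disbursements", "value": 350.0, "unit": "USD million",
     "reference_year": 2023, "reported_on": date(2024, 6, 30), "source": "Aid information system"},
    {"indicator": "Tax revenue as a share of GDP", "value": None, "unit": "percent",
     "reference_year": 2023, "reported_on": date(2024, 3, 15), "source": "Ministry of Finance"},
]

def quality_flags(record, max_lag_days=365, value_range=(0.0, 1e6)):
    # Return a list of basic data quality issues for one record.
    flags = []
    if record["value"] is None:
        flags.append("missing value (completeness)")
    elif not (value_range[0] <= record["value"] <= value_range[1]):
        flags.append("value outside plausible range (validity)")
    reporting_lag = (record["reported_on"] - date(record["reference_year"], 12, 31)).days
    if reporting_lag > max_lag_days:
        flags.append("reported more than a year after the reference period (timeliness)")
    if not record["source"]:
        flags.append("no documented source (integrity)")
    return flags

for r in records:
    print(r["indicator"], quality_flags(r))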

Table 2. Financing monitoring and review systems: illustrative basic, intermediate, and advanced levels

5.2. Step 2: Strengthen existing systems, close gaps if needed

Building on the established baseline and targeted level of ambition, the second step is about taking action to strengthen existing systems, and to close any gaps, thus supporting the move along the ‘levels’ depicted in Table 2. Depending on the country baseline, capacity and needs, such action may be taken in one or more of the following areas:

  • Action area 1: Institutionalise INFF monitoring and review by progressively raising the level of ambition, putting in place or reinforcing the right incentives, establishing an effective monitoring and review function within government, and ensuring participatory approaches to shift the culture around monitoring and review from seeing it as a compliance exercise to focusing on accountability and learning;
  • Action area 2: Enhance integration of existing systems by ensuring they are compatible and able to feed the necessary information into key INFF indicators of performance, reviewing data and information currently being collected, and implementing pilots where changes or new systems may be required;
  • Action area 3: Link to ongoing or planned data/ statistical reform processes and make use of needs-based IT solutions by reviewing ongoing statistical capacity development efforts, articulating a data development plan (if needed), and considering the potential role of business intelligence software to facilitate collection, processing, use and storage of data in ways that can serve country needs;
  • Action area 4: Leverage insight and lessons learned from peers and regional/ global knowledge-sharing platforms by making use of existing knowledge on what may or may not work, with a focus on INFF-specific initiatives and platforms.

Overall, these actions involve strengthening both institutional arrangements and technical systems, which can help countries overcome common challenges in establishing effective monitoring and review systems (see Box 7). While every country is different, a pragmatic approach, taking advantage of monitoring and review ‘champions’ where they exist and of existing knowledge and current practices as a starting point, can help ensure sustainable improvements over time. Success factors in implementing these actions will depend on the specific country context, including institutional and cultural specificities; however, the following have proven applicable in most situations:

  • A common understanding of the purpose and value of INFF monitoring and review. Monitoring and review requires additional time and effort from government officials and professionals who may already be overwhelmed by existing tasks. Unless everyone involved can clearly see the value of INFF monitoring and review, the success and sustainability of the exercise will be at risk.
  • Buy-in from key stakeholders. Buy-in at both the political level and at senior technical levels within government can ensure that adequate resources (financial, time, human) are dedicated to monitoring and review activities from the inception and on an ongoing basis thereafter to support effective implementation and positive policy feedback loops. In addition, buy-in from non-state actors, including the private sector, is needed to ensure effective collaboration and sharing of relevant data and information.
  • Realism on what constitutes ‘good enough’ data. It is easy to be discouraged by the inadequate coverage and reliability of available data. However, once it is understood and internalised that data quality is a relative concept, a more realistic approach of obtaining ‘good enough’ data can be taken. This can form the basis for gradual improvements over time, while providing the evidence needed to begin adequate monitoring and review.
  • Dissemination of intermediate results. Producing and sharing presentations, videos, infographics, and other similar materials that showcase results related to different aspects of the INFF can help demonstrate its value added and broaden buy-in. For example, disseminating success stories on what improvements in data and monitoring systems helped accomplish can help expand monitoring and review efforts beyond pilots where champions already exist. 

Box 7. Common challenges in monitoring and review

Common challenges related to establishing and maintaining monitoring and review systems span both political and technical issues. Those most relevant to INFF monitoring and review include: 

  • Weak political will and institutional capacity, including for example, frequent staff turnover and different ministries/ agencies being at different stages in terms of capacity;
  • Resistance to enhancing transparency and accountability;
  • Difficulties in inter-ministerial cooperation and coordination, compounding issues related to being able to monitor and review cross-sectoral impacts;
  • Fragmentation of monitoring and review due to lack of coherent policy framework and supporting systems, and proliferation of indicators and reporting requirements;
  • Limited respect for principles of harmonization and use of national systems by donors;
  • Limited data availability and access, and limited resources to effectively use existing data;
  • Poor statistical systems and capacity;
  • Risk of over-reliance on digital (often remote) data collection tools, which can lead to over-collection of data with little capacity for analysis, and the loss of contextual understanding obtained from physical visits and face-to-face interviews;
  • Excessive focus on accountability and control and too little on learning;
  • Lack of follow up on findings from reviews and evaluations, reducing usefulness of the exercises.

 

Source: Martin, F. P. and A-M. Fernandez (2021). Background Orientation Paper prepared for FSDO/UNDESA on INFF building block 3: Monitoring and Review (M&R) Analytical Framework and Country Case Studies, IDEA, Canada; CEPA strategy guidance note on Monitoring and evaluation systems, February 2021; World Bank (2004) A Handbook for Development Practitioners: Ten Steps to a Results-Based Monitoring and Evaluation System

Action Area 1: Institutionalise INFF monitoring and review

Building and sustaining a monitoring and review system in the context of an INFF is no small endeavour. Practicality dictates that the objectives and scope of any such system need to be realistic and take into account a diversity of factors, such as: the slowness of bureaucratic processes; the importance of incentives to overcome vested interests and resistance to change; the need to understand the legal, political, institutional and cultural dimensions of monitoring and evaluation; and the complexity of building successful partnerships. An INFF monitoring and review system should be designed and managed as a medium-term process with a progressive expansion of scope, starting from existing monitoring platforms and any related reform processes (ongoing or planned).

This involves building on existing institutional mechanisms (see examples in Building Block 4 guidance on Governance and Coordination) and defining clear and complementary roles and responsibilities with regard to producing, collecting, analysing, reporting and using data and information related to financing (both public and private). It involves putting in place or reinforcing the right incentives (see Box 8), e.g. the opportunity to better align and consolidate existing systems, and to reduce, or at least not add to, administrative burdens. It also involves sensitizing policy makers and senior technical officials to the need for, and value of, adequate INFF monitoring and review, to create and maintain buy-in (for example, based on the experience of pilots as discussed under Action Area 2 below). In addition, institutionalising INFF monitoring and review may require capacity building activities (see Box 9).

Box 8. Incentives and disincentives for effective, learning-oriented monitoring and review

Incentives that can encourage quality monitoring and review:

  • Clarity of roles and responsibilities
  • Financial and other rewards: appropriate salaries and other rewards
  • Activity support: support, such as financial and other resources, for carrying out project, programme or policy activities
  • Personnel and partner strategy: hiring staff who have an open attitude to learning, and signing on partners who are willing to try out more participatory forms of monitoring and review
  • Project, programme, or policy culture: compliments and encouragement for those who ask questions and innovate, giving relatively high status to monitoring and review among staff
  • Performance appraisal processes: equal focus on staff capacity to learn and innovate, rather than just on reaching quantitative targets
  • Showing the use of monitoring and review data: making the data explicit and interesting by displaying them
  • Feedback: telling data collectors, information providers, and others involved in the process how these data were used (analysed), and what they contributed toward.

 

Disincentives that can hinder quality monitoring and review:

  • Using the monitoring and review unit as the place to park demoted or unqualified staff
  • Not making clear how data will be or were used
  • Chastising those who innovate within the project boundaries or those who make mistakes
  • Focusing performance appraisals only on activities undertaken (outputs)
  • Frequent rotation of staff to different posts
  • Staff feeling isolated or helpless in terms of their contribution being recognised toward achieving identified objectives (‘line of sight’ issue)
  • Unconstructive attitudes toward what constitutes participation or toward primary stakeholder groups.

 

Source: Boxes 10.3 and 10.4 in World Bank (2004) A Handbook for Development Practitioners: Ten Steps to a Results-Based Monitoring and Evaluation System

A balance is needed between making monitoring and review ‘everyone’s job’ (integrating it in all planning and implementation processes) and putting in place a dedicated monitoring and review unit with the required power, capacities and independence to exercise its mandate and bring together relevant information from different stakeholders. This balance will be dictated by country contexts, the level of ambition, and the focus and needs related to the country’s INFF.

Examples of good practices that can inform decision making around this include:

  • The elaboration and official adoption by the highest executive and legislative branches of policies or laws regarding monitoring and review (e.g., evaluation policies developed in Uganda by the Office of the Prime Minister and then officially vetted by Parliament, and in the Philippines by the Department of Budget and Management and the National Economic and Development Authority);
  • The creation of a monitoring and review unit at the highest level of the executive branch for coordination purposes. This could be linked to, or established by, the INFF Oversight Committee (where this is in place) and liaise with existing monitoring and review units of line ministries/ agencies (e.g., the Department for Planning, Monitoring and Evaluation at the Presidency of South Africa, the Center for Excellence in Evaluation at the Treasury Board Secretariat of the Federal Government of Canada from 2001 to 2016, and the Consejeria Presidencial para la Gestion y CumplimientoFN in Colombia);
  • The elaboration of working-level monitoring and review guidelines (e.g., Monitoring for Results Handbook by the Department of Budget and Management of the Philippines);
  • The adoption of an institutional capacity building approach (e.g., the support to the “Ecole Nationale d’Administration” of Niger to develop and improve its training programmes in results-based management) rather than an individual capacity building approach (e.g., the training of thousands of civil servants in public investment management who may then be dismissed due to changes in administration).

Forging alliances at all levels of government (e.g., between the Ministries of Finance and/or Planning, and line ministries, as well as between central and subnational governments) and with, and among, non-state actors, can be a powerful way of creating the necessary buy-in to institutionalise monitoring and review and enhance related processes and practices.

In many contexts, monitoring and review is seen as a compliance exercise, without sufficient focus on accountability and learning. Ensuring participatory, inclusive approaches to budgeting and broader financing policy making can help shift the culture and attitude. Guidance on Building Block 4 Governance and Coordination (Section 5.2) lays out ways in which participation of all relevant stakeholders can be enhanced to strengthen accountability of public and private finance providers in the country (such as through dedicated agencies or units that ensure systematic dialogue between stakeholders, consultative committees or fora, networks, open government initiatives and citizen budgets). Country experiences with such participatory approaches highlight the importance of: i) significant technical skills and overall capacity of intermediary civic groups who analyse, track and evaluate different stages of the budget process (or other financing policy areas being considered); ii) a conducive political environment in the form of free and able media, information disclosure laws, and political will to make government systems more open; iii) institutionalized processes for participation at all stages of the policy cycle, and for disclosure of key data and information by all relevant stakeholders (including private finance providers).

Box 9. Institutionalising monitoring and review in Colombia

Colombia has made significant progress in terms of the design and implementation of an institutional framework oriented towards performance-based management. Its experience highlights the importance of adequate capacity building to successfully institutionalise such a process.

Since the early 1990s, the Constitution has assigned to the national planning authority the responsibility to design and organize a system for the evaluation of public policies and programmes. SINERGIA (Sistema Nacional de Evaluación de Gestión y Resultados) was established in response to this mandate, as the system for monitoring targets identified in the National Development Plan (NDP), including evaluating related policies, public investment projects and other expenditure programmes. SINERGIA is managed by the National Planning Department (DNP) and the Presidency, with participation from line ministries and the Ministry of Finance (MOF).

The DNP, in coordination with the MOF and line ministries, is responsible for preparing the NDP and for allocating budget capital expenditures to identified investment projects (budget resources can only be allocated to investment projects listed in the NDP). The MOF is responsible for preparing the Medium-Term Expenditure Framework, which includes current and capital expenditures at the national level. The DNP plays an advisory role in the budgeting process, focused on supplying performance information to be taken into account in the allocation of resources. Results-based programmes currently cover 30% of the total budget, and the target for 2022 is to allocate 50% of the budget through results-based programmes.

Across successive governments, Colombia has established a balance of key roles from a supply and demand perspective, focusing on improving the quality of data, creating the capacities for information analysis, and creating the mechanisms to use such data and information for decision making. While significant effort has been made at the national level, establishing such mechanisms at the subnational level and making them operational remains a challenge. Decentralisation is under way, with key sector responsibilities in the delivery of health and education services, but small and medium-sized subnational governments do not have the human and financial capacities to be as rigorous as the national level. The strategy adopted to address this challenge has been to design performance statements that establish common development goals and investment efforts between the national and subnational governments, with agreed performance indicators and targets and actual values reported on the SINERGIA REGIONAL website. This is a useful institutional mechanism, though its results will depend to a large extent on capacity building at the subnational level.

 

Source: Martin, F. P. and A-M. Fernandez (2021). Background Orientation Paper prepared for FSDO/UNDESA on INFF building block 3: Monitoring and Review (M&R) Analytical Framework and Country Case Studies, IDEA, Canada.

Action Area 2: Enhance integration of existing systems

Moving along the levels illustrated in Table 2 does not imply replacing existing monitoring and review systems. Rather, it is about enhancing compatibility among them to ensure they are aligned with, and can feed into, the overarching INFF monitoring and review framework (illustrated in Annex 2). In this sense, and as illustrated in Figure 1 in Section 2, INFF monitoring and review is an integrator system, taking data and information from a variety of existing systems related to different types of finance and policies, and bringing it together into key indicators of performance.

Not all the information collected via the specialised monitoring systems (e.g., on the government budget, on development cooperation, or on private investment in the country) and via the reviews of different financing policies will be reflected in the overarching INFF monitoring framework. Conversely, there may be gaps in the data and information currently collected via existing systems that are required for overall INFF monitoring and review. The INFF process is an opportunity to review whether the data and information being collected via existing systems are relevant, adequate, and being used, and to adjust related indicators and data sources accordingly. The overall guiding principle for INFF monitoring and review should be to select the smallest possible number of indicators that, combined, can provide a comprehensive enough picture of progress on the implementation of the country’s financing strategy and the functioning of underpinning governance structures and mechanisms.
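To illustrate the integrator idea in concrete terms, the sketch below shows one minimal way a country team might record the mapping between a small set of headline indicators and the existing systems that already supply the underlying data. It is a hypothetical sketch: all indicator names, source systems, frequencies and responsible institutions are invented examples, not a prescribed set.

# Illustrative sketch only: a minimal registry mapping headline INFF indicators
# to the existing systems that supply the data. All entries are hypothetical.
from dataclasses import dataclass

@dataclass
class IndicatorMapping:
    indicator: str      # headline indicator in the INFF monitoring framework
    source_system: str  # existing system that already collects the data
    frequency: str      # how often the source system reports
    responsible: str    # institution accountable for supplying the data

INFF_INDICATOR_REGISTRY = [
    IndicatorMapping("Budget execution rate for priority programmes",
                     "Integrated financial management information system",
                     "quarterly", "Ministry of Finance"),
    IndicatorMapping("Share of development cooperation recorded on budget",
                     "Aid information management system",
                     "annual", "Ministry of Planning"),
    IndicatorMapping("Private investment in priority sectors",
                     "Investment promotion agency reporting",
                     "annual", "Investment promotion agency"),
]

def coverage_gaps(registry):
    """List indicators for which no source system has yet been identified."""
    return [m.indicator for m in registry if not m.source_system]

Keeping such a registry small and explicit makes it easier to see which headline indicators lack an existing source system, which is where integration or new data collection efforts may be needed.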

Public finance plays the foundational role in financing sustainable development, and it will be the starting point in many countries. Countries will typically have better developed systems to monitor domestic public finance (government expenditure and investment) and development cooperation than private finance. Government officials may consider enhancing the compatibility and integration of public finance-related systems, and learning from this exercise, before embarking on the creation of additional systems for other types of finance, such as private finance. In line with globally agreed principles, development partners should ensure that preference is given to national processes for monitoring the allocation and impact of aid resources,FN to avoid hindering such integration (see for example Box 10 on health sector financing).

Better integration of national and donor monitoring systems can be prioritised across a range of areas, including capital investment projects. These projects play a critical role in achieving national development priorities and often receive substantial support from donors active in the country. Experience in numerous countries has shown the value of a structured and stepwise approach. Lessons learned include the need for a flexible investment monitoring system able to accommodate various levels of project complexity and size, sector specificities, data availability, and project analysis capacity within line ministries, along with explicit consideration of political priorities in the ranking of investment projects (rather than imposing a straitjacket that looks good on paper but is unrealistic). The system should be easily customisable to reflect different situations and national business processes. (See the guidance on Building Block 4 Governance and Coordination for ways in which access to the perspectives and input of various stakeholders – government, development partners and, where relevant, the private sector – may be facilitated, to ensure resulting systems can serve the needs of all relevant stakeholders.)

Box 10. Integrating tracking of different sources of finance in the health sector

Tracking health-related funding is challenging due to the range of public and private sources and the variety of services and programs that fall under the health sector. At the country level, national health accounting exercises have not always realised their full potential, partly due to difficulties in obtaining timely and consistent data on all relevant sources of finance (including private finance and development cooperation).

In 2004, the Center for Global Development convened the Global Health Resource Tracking Working Group to examine both donor- and country-level financing flows from public and private sources. The Working Group outlined how a coordinated approach could produce a coherent system to track important financial flows in global health over time, which would require new resources and political commitment.

At the global level, health resource tracking has focused on how much health-related aid flows from various donors to low- and middle-income countries (LMICs) and on analysing its composition. One major source is the Creditor Reporting System (CRS) database managed by the OECD Development Assistance Committee (DAC), which tracks bilateral and multilateral aid and other resource flows from member countries, several multilateral agencies, and some large private foundations to developing countries, at both the aggregate and programme levels. Another source is health spending data reported by individual countries using the internationally standardised System of Health Accounts, which includes domestic expenditures.

There has been little effort made to compare data on donor assistance tracked by the CRS with information captured by country-based systems tracking donor assistance. If the CRS could provide a basis for consistent information on donor flows needed by countries to undertake the elaboration of health accounts, then additional data collection exercises could be avoided, thereby making health accounts exercises cheaper to conduct and reducing the reporting burden on all health sector stakeholders. 

The Global Health Resource Tracking Working Group suggested committing donors to rely increasingly on country PFM systems to monitor and report on their aid flows (as per the Paris Declaration on Aid Effectiveness) and to facilitate buy-in from national decision-makers on those estimates. Donors have to ‘fight the temptation’ to duplicate systems, and should instead build on existing systems and enable national governments to manage aid through national processes. As presented in the INFF approach, the essence is to support improvements in the ability of national governments to develop sound financing policies and budgets, and to report on their execution.


Source: Martin, F. P. and A-M. Fernandez (2021). Background Orientation Paper prepared for FSDO/UNDESA on INFF building block 3: Monitoring and Review (M&R) Analytical Framework and Country Case Studies, IDEA, Canada.

Where there are gaps in existing systems and/or additional systems need to be established, government officials could start with pilots in sectors and/or financing policy areas where champions can be identified (and interest in testing exists), as a way of learning what may work and what may not, and of demonstrating the feasibility of enhanced systems.FN Box 11 illustrates the case of Guyana, where efforts to enhance monitoring and review of government finance started from specific ministries before being rolled out across central government.

Following a demand-led approach is important to ensure sustainability and a higher likelihood of success and learning. Progressive expansion of scope and depth can then be more easily pursued as successes from pilots are demonstrated and buy-in broadens as a result. A balance between perfection, practicality and sustainability should always be pursued, both when selecting pilots and when expanding efforts beyond them. For example, relying on consultants to conduct pilots and set up systems for data collection and analysis may yield results that cannot be extended or adopted in other sectors/areas, given common issues related to data availability and human resource capacities in the public sector. Relying on available human resources and data systems, investing in capacity building, and adopting a stepwise and pragmatic perspective, though a longer-term route, can ensure more sustainable improvements and expansions in monitoring and review practices.

Box 11. Implementing pilots to enhance monitoring and review of government finance in Guyana

Guyana is currently classified as an upper-middle-income economy, and in 2019 it became one of the world’s newest petroleum-producing countries, which changed its development perspectives. This presents both an opportunity and a challenge for the public sector: to expand public services in a context of inequality (a Gini coefficient of 0.451) and to ensure that the new fiscal space is used to benefit the entire population, including the poorest and most vulnerable groups. Over the past five years, the Ministry of Finance (MoF) has moved toward more evidence-based budget analysis, considering alternative budget-results scenarios and shifting from a yearly budget to a medium-term horizon to better reflect strategic priorities.

The entry point chosen by the MoF for its reform process toward results-based management was strengthening strategic planning and monitoring and evaluation (M&E) capacities across central government. For the past ten years, the MoF has organised and financed repeated trainings for the staff of all line ministries. It has also supported several ministries in drafting their Performance Measurement Frameworks (indicators and targets) and conducted clinical sessions on improving existing results matrices. The trainings and direct support were initially piloted with two ministries (Health and Education) before being rolled out to all line ministries.

Following this, the concept of budgeting for results was introduced in the MoF’s Budget Process Manual, requiring line ministries to include Programme Performance Statements in their budget requests; these statements are also included in the yearly, publicly available Budget Estimates Volumes.

However, the challenge remained to connect budget allocations with output targets. The MoF moved on to work on output unit costing and the elaboration of budget scenarios with a pilot programme at the Ministry of Health (MoH). This was supported by the customisation and parametrisation of business intelligence software for results-based budgeting. The biggest challenge for output unit costing was the lack of analytical accounting data within the sector for the provision of medical services. This meant that lump sums for packages of medical services sometimes had to be used during the first year of implementation of the pilot. During the second and third years of implementation, the pilot programme was able to iterate and refine the costing data, using a variety of methods, including past-average, normative, and activity-based costing. The core of the innovation was to determine the costs of key actions, activities, sub-programmes and programmes by budget line of the Chart of Accounts.

Beyond output unit costing, the MoF combines two budgeting approaches to determine budget scenarios: on the one hand, it uses a bottom-up approach whereby it asks the line ministry, in this case the MoH, to estimate budget needs based on results targets and output unit costs; on the other hand, it uses a top-down approach to derive an ‘a priori’ estimate of budget envelopes based on the Fiscal Framework and past budget allocations. The two approaches are then reconciled to formulate alternative budget-result scenarios that are discussed with the line ministry during budget negotiations (an illustrative sketch of this reconciliation follows this box). The most desirable budget-results scenarios are then presented to top government authorities for final arbitration.

This pilot represents the first milestone for the country, and highlights the importance of adopting a stepwise and pragmatic approach: starting with programmes where leaders are willing to act as champions of change and innovation and where data and capacities are of good enough quality, and then progressively expanding the system in terms of scope and depth.

 

Source: Martin, F. P. and A-M. Fernandez (2021). Background Orientation Paper prepared for FSDO/UNDESA on INFF building block 3: Monitoring and Review (M&R) Analytical Framework and Country Case Studies, IDEA, Canada.
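Purely as an illustration of the reconciliation step described in Box 11, the sketch below uses invented programme names, output targets, unit costs and a hypothetical budget envelope to show how a bottom-up estimate (results targets multiplied by output unit costs) might be compared against a top-down ceiling to frame an alternative budget-results scenario. It is a simplified sketch under these assumptions, not a depiction of Guyana’s actual system or data.

# Hypothetical illustration of reconciling bottom-up and top-down budget estimates.
# All programme names, targets, unit costs and the envelope are invented.
programmes = {
    # programme: (output target, unit cost per output)
    "Maternal health services": (12_000, 85.0),
    "Immunisation campaigns": (40_000, 12.5),
}

top_down_envelope = 1_400_000  # hypothetical 'a priori' ceiling from the fiscal framework

# Bottom-up estimate: results targets multiplied by output unit costs
bottom_up = {p: target * unit_cost for p, (target, unit_cost) in programmes.items()}
bottom_up_total = sum(bottom_up.values())

# Reconciliation: if the bottom-up request exceeds the envelope, scale targets and
# allocations proportionally to frame an alternative budget-results scenario.
scaling = min(1.0, top_down_envelope / bottom_up_total)
scenario = {p: (programmes[p][0] * scaling, cost * scaling) for p, cost in bottom_up.items()}

for p, (target, allocation) in scenario.items():
    print(f"{p}: target {target:,.0f} outputs, allocation {allocation:,.0f}")
print(f"Total allocation: {sum(a for _, a in scenario.values()):,.0f} "
      f"(envelope {top_down_envelope:,.0f})")

Proportional scaling is only one way of framing a scenario; in practice, trade-offs between programmes would be discussed with the line ministry during budget negotiations, as described in Box 11.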

Action Area 3: Link to ongoing or planned data/ statistical reform processes and make use of needs-based IT solutions

Data systems and capacity underpin effective monitoring and review. INFF monitoring and review should take into account any ongoing government efforts to improve existing data and statistical capacity in different areas. Often, statistical capacity development efforts happen in silos, in response to specific donor interests and requirements, and are not necessarily linked to overarching national development priorities. The INFF process, and the monitoring and review building block in particular, offers an opportunity to review all ongoing initiatives and to establish whether a process already exists within which INFF monitoring and review can be considered. If needed, an overarching data development plan based on nationally identified needs and priorities may be articulated. The INFF Oversight Committee, where one is present, should lead the process and review the plan to ensure a systemic approach. The findings from the assessment of the quality of data systems undertaken as part of step 1 (see Box 6) can be used to inform such a plan. Box 12 illustrates Niger’s experience.

A data development plan can identify critical data improvement needs using a participatory approach and then set out a stepwise workplan to improve the business processes related to information flows and address key methodological issues. It should focus not only on the production and collection of data but also on data use. Enhancing the reporting of relevant data in a way that facilitates access and use by target users should be considered as part of the plan.

Box 12. Elaborating a data development plan with a medium-term perspective in Niger

Funded by the Millennium Challenge Corporation (MCC), the Millennium Challenge Account in Niger (MCA-Niger) is an aid programme aimed at fostering food security and climate resilience in four rural regions of Niger over the period 2018-2023. As part of this programme, a Data Quality Review (DQR) was commissioned with the objective of assessing the quality of existing data systems, including the data used to inform identified performance indicators.

The review concluded that there is a trade-off between the benefits and costs of better information, which underlines the importance of determining the level of completeness, precision and timeliness desired by the users of the information. Quality information comes at a price. On the demand side, it was important to determine acceptable levels of coverage, accuracy and frequency for each key performance indicator, depending on the priority needs of the main users of this information for decision-making, accountability and learning. On the supply side, it was critical to ascertain the capacities, in terms of human, material and financial resources, required by data collection and analysis systems to deliver the desired level of quality, check whether they match current capacities and, if not, revise expected quality levels.

A data quality improvement plan was elaborated as a result of the review, framed as a stepwise capacity-building process over the medium term. It included proposals to: revise data collection and analysis methods for the indicators exhibiting data quality issues; undertake data collection and analysis capacity-building activities through training on indicator selection and data quality; sensitise stakeholders to the importance of data quality for their own performance; and conduct problem-solving evaluations. The plan also highlighted the importance of documentation and of progressively developing a knowledge management system.

 

Source: Martin, F. P. and A-M. Fernandez (2021). Background Orientation Paper prepared for FSDO/UNDESA on INFF building block 3: Monitoring and Review (M&R) Analytical Framework and Country Case Studies, IDEA, Canada.

Information technology (IT), and more specifically the use of business intelligence software, can help implement the improvements identified in the data development plan and enhance processes related to INFF monitoring and review (e.g., see Box 13). Where feasible, such software should: i) be built or customised according to specific country business processes; ii) include interfaces with existing relevant information systems; and iii) support the analysis of data through dashboards and reporting features that respond directly to users’ needs. What is usually required is not a complex, costly solution that is good in theory but incompatible with existing data limitations, staff capacities and technological capacities in the country (including, for example, access to reliable broadband internet). Rather, the solution should use an Extract, Transform, Load (ETL) approach that extracts relevant data from a variety of databases on different platforms (e.g., related to different types of finance), and transforms and stores them in a secure way. The value added of the IT solution will depend on the extent to which it can be customised in terms of programme architecture, data disaggregation, reporting dashboards, alert systems, and report generation. Automation and systematisation of reports can save time at critical moments, for example during budget preparation. However, the adoption of IT solutions should always be mindful of their potential shortcomings, such as the loss of contextual information surrounding data findings, which tends to be difficult to obtain without in-person data collection and validation methods.
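The sketch below illustrates the ETL pattern described above in its simplest form, assuming hypothetical source files, field names and indicator structure (it does not represent any specific country system or commercial product): data from separate finance-related sources are extracted, transformed into a common shape, and loaded into a single local store that reporting dashboards can query.

# Minimal ETL sketch (hypothetical file names, fields and indicator structure):
# extract data from separate finance-related sources, transform them into a common
# shape, and load them into a single local store used by reporting dashboards.
# A real deployment would add access control, validation and secure storage.
import csv
import json
import sqlite3

def extract(budget_csv, aid_json):
    """Read raw records from a budget execution CSV and an aid information JSON export."""
    records = []
    with open(budget_csv, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            records.append({"source": "budget", **row})
    with open(aid_json, encoding="utf-8") as f:
        for row in json.load(f):
            records.append({"source": "aid", **row})
    return records

def transform(records):
    """Map heterogeneous records onto a common (source, indicator, period, value) shape."""
    return [(r["source"], r.get("indicator", "unclassified"), r.get("period", ""),
             float(r.get("value", 0)))
            for r in records]

def load(rows, db_path="inff_monitoring.db"):
    """Store the harmonised rows in a local SQLite database queried by dashboards."""
    with sqlite3.connect(db_path) as conn:
        conn.execute("CREATE TABLE IF NOT EXISTS indicators "
                     "(source TEXT, indicator TEXT, period TEXT, value REAL)")
        conn.executemany("INSERT INTO indicators VALUES (?, ?, ?, ?)", rows)

# Example usage (assumes the two hypothetical export files exist):
# load(transform(extract("budget_execution.csv", "aid_flows.json")))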

Box 13. Making use of IT solutions to move toward performance-informed budgeting in the Dominican Republic

Since 2008, the Government of the Dominican Republic has been strengthening its public management with a focus on performance and accountability. As part of these efforts, the government decided to move to programme budgeting. The Ministry of Economy, Planning and Development designed and implemented an information system to capture information on public production and incorporate it into the budget, and linked the categories of the National Development Strategy and the Multi-annual Plan for the Public Sector to the programme categories of the budget.

In 2019, the New Dominican Budget System (NSPD), prepared by the Office of the Budget (Dirección General de Presupuesto, DIGEPRES) with the support of development partners such as the European Union, UNDP and the World Bank, represented another key step toward preparing multi-year budgets. The new system involved the introduction of: i) a Medium-Term Fiscal Framework; ii) a multi-year budgetary policy; iii) cost estimates for the outputs of each public entity; iv) a commitment by the Council of Ministers to introduce results-based budgeting; and v) an information system to facilitate physical and financial programming, articulated through a clear theory of change.

As the process began and budget agencies were expected to follow the new standards, estimating output unit costs and connecting outputs to higher levels of results remained challenging. In 2020, DIGEPRES started to pilot the use of software to guide the process, using business intelligence tools. Currently, two pilot agencies have re-organised their planning architecture according to a programme budgeting perspective with its related costing elements. They are testing two new software modules (Budgeting for Results, B4R®, and Monitoring for Results, M4R®) as add-ons to be interfaced with the existing Integrated Financial Management System (Sistema Integrado de Gestión Financiera).

 

Source: Martin, F. P. and A-M. Fernandez (2021). Background Orientation Paper prepared for FSDO/UNDESA on INFF building block 3: Monitoring and Review (M&R) Analytical Framework and Country Case Studies, IDEA, Canada.

Action Area 4: Leverage insight and lessons learned from peers and regional/ global knowledge-sharing platforms

To support actions related to all the above-mentioned areas and facilitate the shift toward more effective and efficient INFF monitoring and review, governments should leverage insight and knowledge from others who have already gone through the process. While each country situation will be different, implementation experiences from peers can provide valuable lessons on what may or may not work, and on how to best overcome specific challenges.

Since 2019, when the first pioneer countries began their INFF journeys, INFFs have come to be considered and implemented in over 70 countries worldwide. The INFF Knowledge Platform provides a digital space where country experiences and lessons from early implementers can be shared and accessed, and represents the basis of a growing community of practice. The INFF Dashboard tracks the operationalisation of INFFs at the national, regional and global levels. It contains data and information on the ways in which different countries across the globe have started to design and implement various elements of INFFs, ranging from the inception phase to each of the four INFF building blocks. With specific reference to the monitoring and review building block, it provides the status of the monitoring system for the country’s financing strategy, its key elements (such as public financial management information systems and private sector monitoring initiatives), and information on any planned or ongoing capacity building support for monitoring and review. The UN system is also working to foster regional and global exchanges, ensuring peer learning at the regional level and facilitating knowledge transfer at the global level.

Annex 1. Monitoring and review checklist

Existing monitoring and review systems:

Underpinning aspects of existing systems:

Buy-in

Roles and responsibilities

Data systems and available capacity