Special Report
No 25, 2019

Data quality in budget support: weaknesses in some indicators and in the verification of the payment for variable tranches

About the report: Budget support is a form of EU aid which involves transferring money to the national treasury of a partner country, subject to that country’s compliance with agreed conditions for payment. Budget support payments are made in the form of either fixed or variable tranches. The amounts paid from variable tranches depend on the performance achieved by partner countries, which is measured by pre-defined performance indicators. We examined whether the Commission used relevant and reliable performance data for disbursing budget support variable tranches. We conclude that one third of the performance indicators reviewed had design weaknesses, which allowed for different interpretations as to whether targets had been achieved. Furthermore, the Commission’s assessment of whether variable tranche indicators had been met was not always reliable. We make a number of recommendations to improve the formulation of indicators, increase the use of outcome indicators and improve the verification of the performance data used to disburse variable tranches.
ECA special report pursuant to Article 287(4), second subparagraph, TFEU.


Executive summary

I

Budget support is a form of EU aid which involves transferring money to the national treasury of a partner country, subject to that country’s compliance with agreed conditions for payment. With an average of €1.69 billion in annual payments, the EU is the largest provider of budget support. Budget support payments are made in the form of either fixed or variable tranches. Approximately 44 % of EU budget support payments relate to variable tranches. This proportion is as high as 90 % in some EU neighbourhood countries. The amounts paid from variable tranches depend on the performance achieved by partner countries, which is measured by pre-defined performance indicators.

II

We examined whether the Commission used relevant and reliable performance data for disbursing budget support variable tranches. We conclude that for one third of the performance indicators reviewed, design weaknesses prevented results from being measured objectively, thereby undermining their relevance. Furthermore, the Commission’s assessment of whether variable tranche indicators had been met was not always reliable, leading to some insufficiently justified payments.

III

While the variable tranche performance indicators were aligned to the partner countries’ sector development strategies, most of them were focused on short-term actions rather than longer-term results, including progress over time towards the sustainable development goals. Furthermore, more than one third of the indicators were vaguely defined or had incorrect baselines, or none at all. This allowed for different interpretations as to whether targets had been achieved, making the analysis of the disbursement requests more complex and less objective.

IV

Most variable tranches reviewed had sufficiently ambitious targets and thus provided the intended effect of incentivising partner countries to advance in their reform agendas. The few exceptions we found concerned indicators with targets which were easily achievable, or which had been achieved via the work carried out by other donors or by external experts paid by the EU. The number of indicators used in each variable tranche was too high in 6 out of the 24 contracts, which further complicated the disbursement process.

V

The disbursement requests for variable tranches contain an analysis on the fulfilment of agreed conditions and performance indicators. These requests are prepared by the partner countries and thus, the reliability of the underlying performance data depends on the capacity of these countries to produce it. We found that in only 5 of the 24 selected contracts did the Commission draw explicit conclusions as to whether the performance data needed to calculate the variable tranche indicators was reliable. When analysing disbursement requests, the EU Delegations carried out a variety of procedures to verify the reliability of this data. Some of these procedures do not ensure that variable tranche payments are based on reliable data, and as a result some payments are not fully justified.

VI

Based on our re-performance of the Commission’s assessments regarding the achievement of indicators and the recalculation of the variable tranche payments for a total of €234 million, we found discrepancies amounting to €16.7 million. Of this amount, €13.3 million was insufficiently justified, or not in line with contract provisions. An amount of €3.4 million was paid without actual progress. Additionally, €26.3 million for three variable tranches was paid to Moldova without sufficiently documenting the reasons supporting these payments.

VII

We make six recommendations to the Commission in order to:

  • Increase the use of outcome indicators in variable tranches;
  • Improve the formulation of performance indicators;
  • Safeguard the incentive effect of variable tranches;
  • Simplify the disbursement process for variable tranches;
  • Improve the assessments of the countries’ capacity to provide performance data used in variable tranches;
  • Improve the verification of the performance data used to disburse variable tranches.

Introduction

The concept of budget support

01

Budget support is a form of EU aid which involves transferring money to the national treasury of a partner country, subject to that country’s compliance with agreed conditions for payment. The funding thus received enters the partner country’s budget, and can be used as the partner country sees fit. The Commission describes budget support as a means of delivering effective aid and durable results in support of EU partners’ reform efforts and the sustainable development goals (SDGs).

02

In addition to the transfer of financial resources, budget support involves: (i) a dialogue with the partner country on reforms or development results, which budget support can assist, (ii) an assessment of progress achieved, and (iii) capacity development support. Budget support represents a shift away from the traditional focus on activities (e.g. projects) towards results-oriented aid.

03

The EU (both through the EU budget and the European Development Funds) is the largest provider of budget support globally. During the 2014-2017 period, the EU committed around 11 % of its bilateral development aid budget to budget support (see Annex I): an annual average of around €2.13 billion. In 2017, it provided budget support to 90 countries and territories, which received a total of €1.8 billion. Across all ongoing budget support contracts, the total committed amount is €12.7 billion. Figure 1 shows a breakdown of these figures by region both for the EU general budget and for the European Development Funds.

Figure 1

Ongoing budget support commitments and disbursements at the end of 2017 by region

(in million euros)

Source: ECA, based on EC Budget Support Trends & Results 2018.

04

EU budget support requires that relevant and credible policies are in place and implemented effectively by the recipient country. According to the Financial Regulation, a country may be considered eligible for budget support when1:

  • The partner country’s management of public finances is sufficiently transparent, reliable and effective;
  • The partner country has put in place sufficiently credible and relevant sectoral or national policies;
  • The partner country has put in place stability-oriented macroeconomic policies;
  • The partner country has put in place sufficient and timely access to comprehensive and sound budgetary information.
05

The European Commission uses three types of budget support contracts.

  1. The Sustainable Development Goals contract (SDGC), which supports national policies and strategies in achieving progress towards the sustainable development goals.
  2. The Sector Reform Performance contract (SRPC), which supports specific sectoral reforms.
  3. The State and Resilience Building contract (SRBC), which is provided for countries in fragile situations to ensure vital state functions or to support processes towards democratic governance.
06

The vast majority of the European Commission’s budget support programmes are based on SRPCs (74 % of all ongoing budget support commitments in 2017). In terms of funding, the four largest sectors supported using SRPCs are: education, agriculture and rural development, health, and energy.

Variable tranches as an incentive to achieve results

07

Prior to each disbursement, the Commission analyses the fulfilment of general conditions attached to the budget support contract. These conditions are in most cases those related to the eligibility criteria for receiving budget support (see paragraph 04). Budget support payments are made in the form of either fixed or variable tranches. Fixed tranches are paid either in full (if all conditions are met) or not at all (if one or more conditions are not met). Variable tranches are used to create incentives for partner countries to improve policy delivery, and are paid based on performance achieved in relation to specified performance indicators and targets, if the general conditions are all met. They can be paid either in full or in part. The performance indicators used for variable tranches can be selected from among the monitoring systems already used by the partner country, or through a common performance assessment framework agreed with the partner country and other donors.

08

Different types of performance indicators can be used for public policies (see Box 1). In general, the Commission recommends the use of outcome indicators, but other types of indicators can also be appropriate depending on the specific context of the partner country or sector.

Box 1

Performance indicator types with examples

| Type | Definition* | Illustrative examples of indicators from the audited contracts** |
| --- | --- | --- |
| Inputs | Financial, human and material resources that are mobilised for the implementation of the programme | Budget allocated for female-specific projects |
| Process | The policy and regulatory actions taken | EU-compliant regulations adopted on border control of food |
| Outputs | The immediate and concrete consequences of the resources used and measures taken | Public buildings renovated; new commercial mechanisms at local level; Teacher Management Information System designed |
| Outcomes | Results at the level of beneficiaries | Quality education provided to Syrian pupils in host communities and camps; increased percentage of communities served by regular passenger transport services; health insurance coverage for the near-poor |
| Impact | The consequences of the outcomes in terms of impact on the wider objective | Reduction of infant mortality rate; reduction of coca crops area; improved control and reporting of expenditure |

Source: *European Commission Budget Support guidelines 2017 and ** ECA.

09

Each performance indicator has an associated financial value. One way the Commission calculates the amounts disbursed in the variable tranche is by adding up the amount associated with each performance indicator met by the country. This means that the more indicators a partner country meets, the higher the proportion of the variable tranche paid.
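As a minimal illustration of this additive calculation, the sketch below uses hypothetical indicator amounts that are not taken from any actual contract:

```python
# Hypothetical variable tranche: each indicator carries a financial value and a
# yes/no assessment of whether its target was met (partial scoring also exists
# in practice, but is omitted here for simplicity).
indicators = {
    "indicator_1": {"amount_eur": 2_000_000, "target_met": True},
    "indicator_2": {"amount_eur": 1_500_000, "target_met": False},
    "indicator_3": {"amount_eur": 1_000_000, "target_met": True},
}

# One way the Commission calculates the disbursement: sum the amounts associated
# with the indicators whose targets were met.
disbursement = sum(i["amount_eur"] for i in indicators.values() if i["target_met"])
print(f"Variable tranche to be paid: EUR {disbursement:,}")  # EUR 3,000,000
```

The more indicators the partner country meets, the larger this sum, and hence the larger the share of the variable tranche that is paid out.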

10

In 2017, 44 % of the Commission’s payments for budget support contracts related to variable tranches2. This proportion is as high as 90 % in some EU neighbourhood countries. Figure 2 shows a breakdown by country of the variable tranches paid for SRPCs in 2017.

Figure 2

Sector Reform Performance Contracts, variable tranche disbursements 2017

Source: ECA, based on EC data.

The disbursement of variable tranches

11

The process for disbursing a variable tranche starts with a disbursement request by the partner country. This request includes an analysis of the extent to which the related performance indicators have been met. The EU Delegations analyse the requests and prepare an evaluation note. Based on this note, and after the approval of the budget support steering committee, the Commission decides the amount of the variable tranche that should be paid (see Figure 3).

12

The partner country’s performance analysis accompanying a disbursement request is based on data from its own monitoring and evaluation (M&E) systems. These systems are therefore the primary source of information for the disbursement of variable tranches.

Figure 3

Disbursement process

Source: ECA, based on EC Budget Support guidelines.

Audit scope and approach

13

The EU Consensus on Development adopted in 2017 recognises the central place of budget support in fostering partner countries’ efforts towards achieving the sustainable development goals (SDGs). Considering the importance of variable tranches in budget support payments and the fact that the performance data on which those payments are based originate from the recipient countries, incorrect or unreliable data would imply that budget support payments do not reward actual results as intended. This would seriously weaken the very purpose of budget support. The focus of this audit was therefore to scrutinise the quantity and quality (namely relevance and reliability) of the performance indicators used in variable tranches and the Commission’s analysis of payment requests.

14

The main audit question was: “Did the Commission use relevant and reliable performance data for disbursing budget support variable tranches?” In order to answer this question, we broke it down into the following sub-questions.

  1. Did the budget support contracts include indicators that allow relevant results to be monitored effectively in the supported sectors?
  2. Did the Commission effectively verify the reliability of performance data in the requests for disbursements of variable tranches?
15

The audit covered a sample of 24 sector reform performance contracts (see Annex II) in the eight partner countries with the largest disbursements of variable tranches paid in 2017. These countries received 43.29 % of the total variable tranche payments for that year. The variable tranches audited for the selected contracts involved a total of 248 performance indicators. Annex III details the payments made in 2017 and the sectors audited.

16

For the selected contracts, we examined the Commission’s assessment of the capacity of the countries’ systems to provide reliable performance data. We also checked the design of the budget support contract provisions, in particular the quality of the performance indicators used in the variable tranches. Finally, we examined the Commission’s verifications of the data in the variable tranche disbursement requests. When examining these verifications, we carried out a documentary review of the payment files prepared by the Commission and re-assessed the achievement of targets and the calculation of the variable tranche payments. We then compared our conclusions with those of the Commission.

17

The audit included visits to three countries: Jordan, Georgia and Bolivia. During these visits, carried out in February and March 2019, we interviewed Commission staff, representatives of national authorities and other donors/stakeholders. In addition to the procedures carried out for the rest of the countries selected, in these three countries we re-performed the Commission’s verifications of the variable tranche disbursement requests and the related payments and compared our results with the results of the analysis done by the Commission. Furthermore, we cross-checked the performance data declared by the partner countries with other sources of evidence from external experts and other donors to assess its reliability.

Observations

Relevance of variable tranche indicators weakened by their design

18

We examined whether: (i) the performance indicators are relevant for achieving the budget support contracts’ objectives and are aligned with country policies; and (ii) they provide a sound basis for monitoring meaningful results.

Indicators are consistent with country sector strategies, but mostly focus on short-term actions rather than longer-term results

19

Budget support is regarded by the Commission as a results-based means of delivering aid. Whereas project-based aid is disbursed for eligible expenditure, budget support variable tranches are disbursed when partner countries meet the general conditions and achieve previously agreed results, measured by selected performance indicators. Therefore, the relevance of variable tranches depends on whether these performance indicators measure meaningful results.

20

As presented in Box 1, there are five types of indicators. Input, process and output indicators are most relevant for the day-to-day management of public spending programmes. Outcome and impact indicators measure longer-term effects, such as the progress of reforms and programmes towards set objectives; for example, the SDGs. Figure 4 includes an example of different types of indicators used for one of the budget support programmes in Bolivia.

Figure 4

Result chain and different types of performance indicators

Source: OECD Development Assistance Committee result chain, used in EU external actions.

21

The budget support guidelines allow the use of any of the above types of indicator. According to the Commission, though, preference should be given to outcome indicators, because they encourage evidence-based policy making, protect political space for partner countries to choose their own policies and strategies for achieving them, and stimulate a demand for high-quality statistical information. The greater the confidence in the partner government’s ability to produce reliable data, the more emphasis should be placed on outcome indicators. We also consider impact indicators useful for measuring results, in particular the achievement of the SDGs.

22

Our analysis of 248 indicators in the audited contracts shows that they are aligned with the sector strategies of the partner countries. However, variable tranche disbursements are mainly based on input, process and output indicators, which together account for 87 % of the total number of indicators. Only 33 indicators (13 %) measured outcome and impact – see Figure 5.

Figure 5

Type of indicator used for variable tranches in the audited programmes

Source: ECA.

23

We found that most outcome and impact indicators were used in countries with lower and lower-middle income economies, while in upper-middle income countries, such as Jordan and Georgia, which had a better capacity to produce performance data, only four outcome indicators and no impact indicators were used. This shows that indicator selection is not necessarily linked to countries’ development status.

24

The input indicators used in the audited variable tranches related mainly to the procurement of goods (18 cases) and increases in budgetary allocations (9 cases). Increases in budgets or the procurement of equipment might have the potential to trigger changes in a particular area of interest, but they do not automatically mean that meaningful progress will be achieved.

25

The process indicators may play a useful role, in particular when the policy is aimed at achieving changes in the regulatory framework. However, as indicated in the Guidelines, these should not focus only on processes, but should also measure qualitative aspects, e.g. what an entity to be set up is expected to cover, etc.4 This was not always the case: Box 2 presents some examples of indicators that did not specify any minimum quality requirements regarding the content or the structure of the information to be submitted. This makes it difficult to ensure that variable tranches reward good-quality processes.

Box 2

Examples of process indicators with no qualitative specifications

In the contract for the Moldova visa liberalisation action plan (contract 15), one indicator relates to a regulatory framework approved for the mandatory registration of reported offences, without any reference to the quality/content of this framework.

In the energy contract for Jordan (contract 4), the target for one indicator refers to “a multi-stakeholder policy dialogue structure led by Ministry of Energy and Mineral Resources is operationalised and meets regularly” without any additional specification on how often the meetings should be held.

One third of indicators do not allow results to be measured objectively

26

The indicators used to measure progress in the implementation of a public policy need to be specific, have quantified and measurable targets, and when applicable, a reliable baseline. Otherwise, it is not possible to objectively measure progress achieved. Indicators, baselines and targets must be agreed when budget support contracts are formulated and specified in the financing agreement. Although the Guidelines state that changes to the indicators should be avoided, targets might need to be amended during contract implementation to reflect new circumstances, or to correct errors in the definition of baselines or the calculation of indicators.

27

Our analysis shows that 72 indicators (i.e. 29 % of our sample) were not sufficiently specific. Non-specific indicators are especially problematic during the process of analysing disbursement requests: there is a risk of conflicting judgements as to whether targets have been achieved. This can in turn lead to a variety of different results being arrived at when calculating the variable tranche amount to be paid.

28

Some 29 % of the indicators were vaguely formulated, with no quantified targets, using words such as “improve”, “increase the focus” and “provide evidence”. We observed that contracts in the agriculture sector included the highest proportion of specific indicators (86.5 %), whereas in the Public Financial Management (PFM) sector only 60 % of indicators were specific. Box 3 gives examples of non-specific indicators and the implications for the calculation of variable tranche disbursements.

Box 3

Examples of non-specific indicators/targets

  • In the contract on “Support to the second phase of education reform in Jordan” (contract 1), the target for indicator 5 was: “6 new schools equipped and operational with adequate means/learning environment and adequate human resources”. What constituted “adequate”, and how this should be quantified for the calculation of the payment, was not sufficiently specified in the financing agreement and was therefore open to different interpretations. As a consequence, the external expert analysing the target achievement for the Commission had to develop his own methodology to assess these criteria. This means that these criteria had not been set out in advance in the financing agreement, and had thus not been agreed with the partner country.
  • Indicator 2.1 of the contract on “Support to public finance policy reform” in Georgia (contract 8) reads as follows: “The Ministry of Finance organises a series of public discussions on fiscal governance […]”. This indicator does not specify the number or character of the meetings.
29

We also found good examples where the Commission had made efforts to measure progress for indicators which were not easy to analyse. In the case illustrated in Box 4, the Commission led the way for the entire donor community in the area of education.

Box 4

Measuring the quality of education – a good example

In the contract “Budget support to the Jordanian Ministry of Education to deal with the Syrian refugee crisis” the Commission used the indicator ”Quality education provided to Syrian pupils in single- and double-shift schools in host communities and camps comparable to Jordanian standards and advocated in the country”.

This was an outcome indicator measuring the quality of education provided to Syrian students. The value of the indicator was obtained through a survey carried out in a sample of schools. In each school, the quality of education was measured by assessing ten different factors (such as lesson observations, involvement of parents and students, equal representation between boys, girls, refugees and Jordanians) on a four-point scale. The scores obtained (quality performance scores) were used to issue the schools with grades from A+ (good) to C- (in need of improvement).

This indicator was the first to measure the quality of education in Jordan; it has subsequently been used by other donors and introduced into the Common Results Framework in Jordan.

However, when it came to checking the fulfilment of the indicators, we identified weaknesses (see Box 10).
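As an illustration of how a composite score of the kind described in Box 4 might be aggregated, the sketch below averages the scores of the assessed factors and maps the result to a grade band. Only the ten-factor assessment, the four-point scale and the A+ to C- grading are taken from Box 4; the factor names, the equal weighting and the grade boundaries are assumptions made purely for illustration.

```python
# Illustrative only: factor names, equal weighting and grade boundaries are assumed;
# Box 4 specifies only that ten factors are scored on a four-point scale and that
# the resulting quality performance scores are converted into grades from A+ to C-.
factor_scores = {
    "lesson_observation": 3,      # hypothetical scores on a 1-4 scale
    "parent_involvement": 2,
    "student_involvement": 4,
    "equal_representation": 3,
    # ...the remaining factors of the ten-factor assessment would be scored likewise
}

quality_performance_score = sum(factor_scores.values()) / len(factor_scores)

def grade(score: float) -> str:
    """Map an average score to an illustrative grade band (boundaries assumed)."""
    bands = [(3.5, "A+"), (3.0, "A"), (2.5, "B"), (2.0, "C"), (0.0, "C-")]
    return next(label for threshold, label in bands if score >= threshold)

print(quality_performance_score, grade(quality_performance_score))  # 3.0 A
```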

No baselines or incorrect ones on 41 % of progress indicators

30

Specifically and unambiguously defined indicators are not the only prerequisite for the effective measurement of results. Progress is only measurable if the targets set can be compared with the situation before the intervention: in other words, with a baseline. Amongst the 85 indicators analysed for which baselines were necessary (i.e. for those indicators which analysed progress5), we found problems in 35 of them related to: (i) a lack of necessary baseline values (15), and (ii) incorrect or outdated baselines (20).

31

There were 15 indicators with no identified baselines. These were spread across four (contracts 6, 7, 8 and 17) of the 24 selected budget support contracts. The main reason for the lack of baselines was that the partner countries had not monitored the situation before the start of the budget support contract. However, we found examples where, in a similar situation, the Commission had used the first variable tranche of the budget support contract to ask the partner country to calculate the baseline. In other cases, the Commission had carried out the necessary work (i.e. surveys) to calculate the missing baselines. We consider these to be good practices. An example of the work carried out by the Commission to enable accurate assessment of the progress achieved for the relevant indicator in subsequent tranches is described in Box 5.

Box 5

Setting baselines

An example of good practice in setting baselines by substituting missing data with other sources of information is indicator 3 of the sector reform contract to increase the performance of Rwanda’s energy sector (contract 19). This indicator is measured throughout the contract with an annual survey. However, at the time the Financial Agreement was being drawn up, this survey was not available, so a baseline value did not exist. In order to set the baseline value, the EU Delegation carried out a comprehensive analysis and cross-checked alternative sources of data such as the Biomass Energy Strategy from 2009, the Global Alliance for Cooking Stove assessment from 2012, and the Wood fuel Integrated Supply/Demand Overview Mapping report. On this basis, the EU Delegation was able to define a relevant baseline.

32

In addition, 20 indicators had incorrect or outdated baselines. In 11 of these cases, new data relevant for the correct calculation of the baseline became available after the budget support contract had been signed. Although it is possible to amend the related financing agreement in order to reflect the correct baseline, the Commission had not done so in these cases. The use of incorrect or outdated baselines also resulted in indicators having lower targets than the actual baselines. Box 6 contains some examples.

Box 6

Examples of indicators with absent or incorrect baselines

Lack of baselines

For the “Employment and vocational education and training” contract in Georgia, (contract 7), no baselines were defined when the financing agreement was signed in 2014 for any of the five indicators measuring progress. As a result, the baseline for the required increase was zero, which means that any progress reported could be considered as sufficient to reach the target. For example, the contract included targets to increase the relative number of teachers receiving initial and continuous type of training. However, no data was available on the number of teachers already receiving the type of training measured. Furthermore, the concepts of ”initial” and ”continuous” training were developed only in 2016, so it was not possible to know the baseline situation when the targets were set. As no baselines were available, the Commission considered broader developments in the education sector instead of the indicators' values.

Baselines not disaggregated

In some cases, the baseline was provided, but it was not sufficiently disaggregated to enable progress to be measured. The target for indicator 1 of a contract with Bolivia (contract 9) required specific institutions to train a certain number of staff in 2016. The baseline included information about the number of training courses offered in 2013. However, it did not provide the number of staff trained by particular institutions, so it was not sufficiently disaggregated. Additional documentation provided to us showed that, depending on the training centre considered, the number of staff trained in 2013 already exceeded the two targets set for 2016, in one case by 2 % and in the other by 42 %. In this case, the lack of a properly disaggregated baseline resulted in modest targets being set.

Incorrect baselines

In the “Education Sector Plan Support Programme” in Pakistan (contract 24), the baseline for indicator 5 related to the number of students benefiting from an existing scholarship scheme. The target, however, was a percentage of eligible girls receiving stipends in a timely manner. The baseline was therefore not directly linked to the target set.

Indicators generally provide the intended incentive effect, but there are too many

33

A primary objective of budget support is to incentivise partner countries to follow the path of agreed reforms. To this end, the indicators used for variable tranches should require a meaningful effort from the recipient country. The associated targets should also strike the right balance between ambition and ease of achievement6.

34

We found that most variable tranches reviewed had the effect of incentivising partner countries to implement certain aspects of their development strategies. However, the targets used for 11 of the selected indicators were very easy to achieve. This number includes four indicators whose targets were set very low due to the use of incorrect baselines (see paragraphs 31-32). Some examples of indicators with a limited incentive effect are presented in Box 7. Furthermore, for 12 additional indicators involving three contracts in Moldova (contract 16), Bolivia (contract 10) and Pakistan (contract 23), the targets were mostly achieved with support provided by technical assistance paid by the EU or other donors. In our view, such indicators have limited ownership and their incentive effect is weak, because they do not require any significant involvement by the partner countries.

Box 7

Examples of indicators with easy-to-achieve targets

In a contract with Bolivia (contract 9), part of the variable tranche was made conditional upon the institution responsible for implementing the strategy against drug trafficking organising two plenary meetings during 2015. Holding regular plenary meetings to monitor the strategy is part of that institution’s normal activity, not something that needs to be encouraged through a variable tranche indicator.

The target for indicator 4 of the same contract is the production of a report demonstrating the partner country’s compliance with the EU budget support conditions. This report, however, is part of the standard disbursement process for budget support and thus should not have been considered as a variable tranche indicator.

35

The Guidelines suggest that the number of variable tranche indicators should generally range from 3 to 10. Having too many indicators dilutes the incentive effect and makes monitoring more complicated. The audited contracts had between 4 and 34 indicators per tranche7. These indicators were further broken down into sub-indicators, often with several targets. For example, the contract to provide support to implement a visa liberalisation action plan in Moldova (contract 15) included 95 independent targets. Such a high number of indicators and targets is not conducive to a focus on the main policy objectives of the budget support contracts. Although problems linked to the high number of indicators were acknowledged in internal Commission documents, this did not influence the formulation of contracts (see Box 8).

Box 8

High number of indicators

The preparatory documents for the contract “Support to agriculture and rural development” in Moldova, (contract 17) mention that “designing few and focused conditions is crucial in Moldova, as having many and complex conditions can contribute to the failure of sector budget support”. This aspect was disregarded during the design, as there were 28 composite “conditions/criteria/activities for disbursement” included in the contract for the disbursement of the 2017 variable tranche, which were further divided into 39 sub-conditions.

The quality of the Commission's verification of indicator fulfilment varied, leading to some payments being insufficiently justified

36

The disbursement requests submitted by the recipient countries contain an analysis of progress in the sectors supported by their budget support contracts and information on the fulfilment of agreed conditions and performance indicators. The dossier usually includes a report on progress in a given field (sector), separate reports on fulfilment of every condition and fiches concerning each indicator, with supporting evidence (e.g. letters from a statistical office, reports on surveys etc.). See Figure 3 describing the budget support disbursement process.

37

Partner countries’ requests are based on data from their own monitoring and evaluation (M&E) systems. These systems are therefore a primary source of information for disbursement decisions. However, countries have varying capacities and systems in place for collecting, storing, analysing and using data. Consequently, before starting budget support operations, the Commission needs to assess the systems in place to produce performance indicators that will be used for subsequent variable tranche payments. In particular, EU Delegations must determine whether weaknesses in statistical systems, availability of data and policy analysis significantly undermine the validity of countries’ disbursement requests.

38

For the contracts reviewed, we examined whether, when selecting the performance indicators for the variable tranches, the Commission had soundly assessed the reliability of the performance data. We also reviewed whether the Commission had performed a thorough examination of the variable tranche disbursement requests and correctly calculated the amount to be disbursed.

The Commission did not conclude on the countries’ capacity to produce data needed for indicators

39

The EU Delegations are asked to provide an overview of the partner country’s monitoring and evaluation systems (for the country in general and for the particular sector for SRPC), assessing whether the country’s public policy is in line with the EU objectives and whether institutional capacity is considered sufficient to implement the policy. Furthermore, the Guidelines require the reliability and availability of data to be analysed, and the weaknesses of the statistical systems to be appraised, before budget support contracts are drawn up.

40

We analysed the Commission’s assessments using criteria established by the European Statistical Office (Eurostat). Eurostat has developed a tool called Snapshot to help EU Delegations assess the strengths and weaknesses of national statistical systems in developing countries. We found that the Commission’s budget support guidelines cover the key requirements of this tool. Nevertheless, Snapshot is more comprehensive and provides detailed explanations of how to measure particular areas8, which are relevant for the assessment of statistical systems. This tool is generally unknown to the staff in the Delegations.

41

Our review of the Commission’s assessment of monitoring and evaluation systems showed that, in practice, the Commission generally describes and assesses the elements mentioned in the Guidelines, but these elements are scattered around several documents. However, even though certain sector-specific weaknesses were mentioned in the majority of contracts (18 out of 24 contracts), only in the five contracts in Jordan did the Commission explicitly draw conclusions as to the reliability of the performance data needed to calculate the variable tranche indicators9. Drawing a conclusion on the reliability of data is important for indicator selection, monitoring and disbursement analysis.

The Commission verified the reliability of the data supporting the disbursement requests, but not always thoroughly

42

The EU Delegations must analyse the disbursement requests presented by partner countries (including the values of the related performance indicators) before disbursing the variable tranche payment. We found a variety of verification procedures, some of which did not provide the necessary assurance to justify the subsequent payments.

43

Overall, the Commission applied three different types of verification procedures: (i) desk review and arithmetical re-calculation of the indicator values using the data provided by partner countries, (ii) the same method complemented by field visits to verify the reliability of the data provided, and (iii) outsourcing the calculation of performance indicators or the verification of data provided by the partner country to external experts.

44

The EU Delegation officials carried out field visits to complement the analysis and verify some of the data provided in 14 of the 24 audited contracts (see Box 9), and hired, sometimes in addition to their own field visits, external experts to analyse the fulfilment of the conditions for the variable tranches in 16 of the contracts. In the neighbourhood countries the use of experts was systematic, while in the remaining countries audited, the Delegations used experts in only 5 out of 13 contracts.

45

The budget support guidelines recommend that experts be called in to assist verification exercises where serious doubts exist about the quality of the data provided. However, without any clear conclusions about the partner countries’ capacity to produce such data (see paragraphs 39-41), it is difficult to decide whether such expert assistance is really needed. For the contracts audited, the average cost of expert verification missions was approximately €110 000.

Box 9

Commission’s verification of disbursement requests

To assess the disbursement request for the sector reform contract to increase the performance of Rwanda’s agriculture sector (contract 18), the EU Delegation carried out a comprehensive analysis of the information submitted by the Government of Rwanda and complemented it with field visits and requests for clarifications to national authorities. Specifically, in the case of indicator 5a, the target for the first year was 80 000 Ha of land to be used for agroforestry in certain areas. The Delegation carried out a field visit that showed that the information submitted was not correct, as the activities in the area were not limited to agroforestry. Consequently, the Commission withheld the payment related to this indicator in the tranche.

46

We found that for 6 of the 24 selected contracts, the Commission did not effectively perform additional verification of the data used as the basis for the payment of the variable tranches. This was mostly because the EU Delegations relied fully on the accuracy of the performance data provided by partner countries, or on the verification work carried out by external experts on behalf of the Commission, without any further verification of the data provided. In these cases, it is more difficult for the Commission to identify unreliable performance data in the disbursement requests, thereby increasing the risk of unjustified variable tranche payments. Box 10 presents an example of shortcomings in an external review which were not detected by the Commission.

Box 10

Examples of shortcomings found in an external expert’s work

For contract 2, supporting the Ministry of Education in Jordan to deal with the Syrian refugee crisis, the Commission hired an expert to assess the data declared by the Ministry of Education for some of the variable tranche indicators. The expert validated the data declared by the Ministry with field visits to a sample of 30 schools. Based on the expert report, the Commission disbursed the variable tranche.

We reviewed the expert’s work, visited some of the sampled schools, and identified the following issues:

  1. The sampling of schools was not random, but was based on recommendations from an external consultant team with UNESCO and approved by the Ministry of Education. There was a risk that the sample was biased.
  2. Even though the expert actually visited the 30 sampled schools, the data was only compared for 17 of them. This was because the Ministry did not have the information for the remaining schools. Therefore, the conclusions drawn by the external expert on the reliability of the ministry’s IT system were based on insufficient evidence.
  3. The field checks on staff working in the schools referred only to teachers, but the target for indicator 1 referred also to non-teaching staff, so the conclusions obtained on this indicator were incomplete.
  4. The expert concluded that the field data confirmed the ministry’s data. For indicators 1 (number of staff) and 2 (number of students), this conclusion was reached by comparing the total results for the sample. However, the individual results per school exhibited significant differences (positive and negative), which are offset when calculating the totals, as illustrated in the sketch below.
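The offsetting effect described in point 4 can be shown with a minimal sketch; the school names and figures below are hypothetical and not taken from the expert’s report:

```python
# Hypothetical per-school comparison: ministry IT-system figures vs. figures
# counted during field visits. The per-school differences are large, but they
# cancel out almost completely when only the totals are compared.
ministry_data = {"school_A": 410, "school_B": 350, "school_C": 240}
field_data    = {"school_A": 360, "school_B": 395, "school_C": 245}

per_school_diff = {s: field_data[s] - ministry_data[s] for s in ministry_data}
print(per_school_diff)  # {'school_A': -50, 'school_B': 45, 'school_C': 5}
print(sum(ministry_data.values()), sum(field_data.values()))  # 1000 1000: totals match
```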

For some variable tranches, the performance data provided by the partner countries did not justify the payments made

47

Out of a total of €234 million in variable tranche payments reviewed, our assessment of the evidence supporting the fulfilment of variable tranche indicators differed from the Commission’s for five out of eight countries, by a total of €13.3 million. For one country (Pakistan), the difference represented 19 % of the variable tranche payments reviewed. Furthermore, we also found that the Commission disbursed €3.4 million to two countries based on indicators linked to incorrectly set baselines. Although there was a contractual obligation to pay, as the target value had been met, no actual progress had been achieved in the area measured by the indicator. Table 1 summarises these amounts for each country visited.

Table 1 – Discrepancies from the Commission’s assessments

| Country | Reviewed variable tranche (in €) | Differences from our assessment (in €) | Contract and indicators’ reference | Reason for discrepancies (by indicator) | Amounts paid without actual progress (in €) |
| --- | --- | --- | --- | --- | --- |
| Bolivia | 32 800 000 | 0 | – | – | 0 |
| Ethiopia | 29 520 000 | 0 | – | – | 2 000 000 (Contract 14: Ind. 7) |
| Georgia | 19 400 000 | 1 000 000 | Contract 6: Ind. 1.2 and 1.7 | 1.2 Achieved 2 months after deadline; 1.7 Target not met | 0 |
| Jordan | 45 750 000 | 6 000 000 | Contract 1: Ind. 5 | Achieved 2 years after deadline | 0 |
| Moldova | 26 345 111 | 1 000 000 | Contract 16: Ind. 2.1 and 2.2 | 2.1 Achieved 1 month after deadline; 2.2 Achieved 3 months after deadline | 0 |
| Pakistan | 25 665 625 | 4 968 750 | Contract 23: Ind. 5, 6 and 8; Contract 24: Ind. 4, 6.2 and 6.3 | 5, 6 & 8 Targets not met; 4 & 6.3 Incorrect evidence; 6.2 Target not met | 0 |
| Rwanda | 27 667 500 | 332 500 | Contract 20: Ind. 2 | Achieved 1 year after deadline | 1 437 500 (Contract 19: Ind. 5; Contract 20: Ind. 4) |
| Vietnam | 27 000 000 | 0 | – | – | 0 |
| TOTAL | 234 148 236 | 13 301 250 | – | – | 3 437 500 |

Source: ECA.
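As a quick cross-check of the totals in Table 1 against the figures quoted in the executive summary and in paragraph 47, the sketch below uses only the amounts shown in the table:

```python
# Amounts taken directly from Table 1 (in euros).
reviewed = [32_800_000, 29_520_000, 19_400_000, 45_750_000,
            26_345_111, 25_665_625, 27_667_500, 27_000_000]
differences = {"Georgia": 1_000_000, "Jordan": 6_000_000, "Moldova": 1_000_000,
               "Pakistan": 4_968_750, "Rwanda": 332_500}
no_progress = 2_000_000 + 1_437_500  # Ethiopia + Rwanda

print(sum(reviewed))                            # 234 148 236 (≈ EUR 234 million reviewed)
print(sum(differences.values()))                # 13 301 250 (≈ EUR 13.3 million)
print(sum(differences.values()) + no_progress)  # 16 738 750 (≈ EUR 16.7 million)
print(differences["Pakistan"] / 25_665_625)     # ≈ 0.19 (the 19 % quoted for Pakistan)
```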

48

The main issues we found related to: (i) targets achieved after the deadlines set; (ii) measurement of results based on incorrectly set baselines; (iii) incorrect or insufficient evidence provided to justify the fulfilment of the indicators; and (iv) targets not achieved at all. Figure 6 shows the types and proportions of discrepancies found from the Commission’s assessments.

Figure 6

Types of discrepancies

Source: ECA.

49

One of the cases where we considered that incorrect evidence was provided by the partner countries is described in Box 11.

Box 11

Evidence based on biased sample

The target for indicator 6.3 of the Education contract in Pakistan (contract 24) was that students receiving vouchers should achieve a school attendance rate of at least 80 %. The evidence provided by the partner country showed an attendance rate of 87 %, which was higher than the target value, so the Commission considered this target to have been met. According to the Technical Assistance Compliance report, the evidence provided was supposed to cover all students throughout the year. However, our analysis showed that the national authorities had taken only a quarter of the students receiving vouchers into account when calculating the attendance rate for all of them. Furthermore, this sample was taken at the beginning of the school year, when attendance is highest. We concluded therefore that the supporting evidence was not robust, as the sample was biased.

In the above case, the Commission considered the targets achieved and made the payment in full.

50

In four contracts we found five cases of indicators11 whose targets had been achieved after the deadline set in the financial agreements. Furthermore, for five additional indicators12, we consider that partner countries did not meet the targets; however, the Commission deemed them to have been fulfilled. Box 12 provides examples of both cases.

Box 12

Targets met after deadlines, or partly or not met

The target was achieved after the deadline for indicator 5 of the Education contract in Jordan (contract 1). The target for this indicator was the construction of six new schools by 2015. The schools were only finished in 2017 and the variable tranche was paid in December 2017. The reason for this considerable delay was the lengthy procurement procedures which resulted in the late implementation of the construction works by national authorities.

To take into account these delays, the Commission extended the implementation period of the contract up to December 2017 but did not modify the deadline for the construction of the schools (2015).

The target of indicator 6 of the Education contract in Pakistan (contract 23) included two activities: the design of the English language curriculum and the approval of textbooks for certain grades. We found that only half of the textbooks had been approved at the time the disbursement was requested, and therefore consider that the second part of the target should not have been considered as achieved.

The target for indicator 1.7 of the Agriculture contract in Georgia (contract 6) was the adoption of a state programme on bio-organic production certification schemes. The government considered this indicator as fulfilled due to the adoption of its State Programme for Tea Plantation Rehabilitation. However, the stated aim of that programme is to support the effective utilisation of tea plantations in Georgia, increase tea production (including bio tea production) and enhance self-sufficiency and export potential, not to establish bio-organic certification schemes as such. We consider this target as not fulfilled, as the programme does not specifically target the certification of bio products. This was also the position of the external reviewer.

In all the above cases, the Commission made the payment in full.

51

We found three cases where the Commission had made disbursements in accordance with the provisions of the financial agreements, since the partner countries had achieved the agreed targets. However, due to the use of incorrectly set baseline or target values, there was no actual progress in the targeted sectors. Box 13 provides further details on some of these cases.

Box 13

Targets set using incorrect baselines – no actual progress achieved

Indicator 7 of the Transport contract in Ethiopia (contract 14) relates to the percentage of overloaded trucks. The target set was to reduce this percentage to 9 % from a baseline value of 11 %. The result achieved at the end of the period examined was 6 %, so the target was attained and the corresponding amount paid in accordance with the provisions of the financing agreement. However, based on the information provided by the partner country in the disbursement request, the actual baseline value was 6 %. In reality, there was no progress in reducing the proportion of overloaded trucks.

A second instance of an incorrect baseline leading to payment without sufficient actual progress is indicator 5 of the Energy contract in Rwanda (contract 19). This indicator measures the share of electricity generated from renewable sources in the energy mix. The baseline value used in the contract was 292 GWh generated in 2015. This value for the baseline year was wrong: according to data reported by the national authorities, it should have been 368 GWh. The target set for the audited variable tranche was an additional 14.5 GWh generated from renewable energy sources compared with the baseline. The result achieved for this indicator in the audited period was 361.5 GWh, leading to the payment of the corresponding amount even though electricity generation from renewable sources had in fact decreased compared with the actual baseline.
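As a minimal illustration of the arithmetic behind these two findings, using only the figures quoted in Box 13 (the variable names are ours):

```python
# Ethiopia, contract 14, indicator 7: share of overloaded trucks (%).
contract_baseline = 11.0   # baseline written into the financing agreement
actual_baseline = 6.0      # baseline implied by the disbursement request
target, result = 9.0, 6.0
target_met = result <= target               # True: payment contractually due
actual_progress = actual_baseline - result  # 0.0: no real reduction achieved

# Rwanda, contract 19, indicator 5: electricity from renewables (GWh).
contract_baseline_gwh = 292.0
actual_baseline_gwh = 368.0                 # value reported by the national authorities
target_gwh = contract_baseline_gwh + 14.5   # 306.5 GWh
result_gwh = 361.5
target_met_gwh = result_gwh >= target_gwh   # True: payment contractually due
actual_change_gwh = result_gwh - actual_baseline_gwh  # -6.5 GWh: generation fell

print(target_met, actual_progress, target_met_gwh, actual_change_gwh)
```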

52

In addition to the discrepancies quantified in Table 1, we could not confirm the correctness of two variable tranche payments due to the lack of sufficient evidence to support the values reported for five indicators13 (see the examples in Box 14). These indicators represent a disbursed amount of €3.77 million.

Box 14

Lack of sufficient evidence

In contract 5, supporting public finance management in Jordan, indicator 1.2 related to gaps in the training of the Internal Control Units (ICU) staff in different national ministries and agencies. The target was to train the ICU staff who had not participated in the training during the previous year (there was no numerical target). An external expert hired by the Commission found that over 85 % of the staff from all ICUs had attended an additional training course offered, and therefore concluded that the indicator had been achieved. As the expert report did not specify how the 85 % was calculated, we could not re-perform the calculation or confirm that all gaps were covered by the training. There was no information or data available on what was included in the 85 %.

Indicator 3 in the community development programme contract in Pakistan, (contract 22) measured the share of budgetary allocations for community-driven local development (DCLD) projects in the cost estimates of the district development strategies (DDS). However, the district cost estimates reported could not be traced to the DDS provided as a supporting document. Similarly, the report produced by the expert hired by the Commission to assess this indicator does not present a reconciliation between the district cost estimates and the DDS. Therefore, we could not confirm the reported result.

Insufficiently documented payments to Moldova

53

Although not linked directly to the verification of performance indicators, we found that the Commission had paid three variable tranches to Moldova, totalling €26.3 million, without sufficiently documenting the reasons supporting these payments (see Box 15).

Box 15

Documentation of payments to Moldova

The budget support financing agreements contain a right for the Commission to suspend the financing agreement if the partner country breaches an obligation relating to respect for human rights, democratic principles and the rule of law and in serious cases of corruption (Art 236(4) of the Financial Regulation applicable to the general budget of the Union, July 2018).

Due to concerns about the state of democracy in the country, in July and October 2017, the Commission decided to postpone the payment of several variable tranches and stated that “the timing of the payment would be clarified taking into account the respect of effective democratic mechanisms, the rule of law and human rights in Moldova.” The main reason for this decision was the impending adoption of the new electoral law in Moldova, which was not consistent with the recommendations of the Venice Commission, an advisory body of the Council of Europe in matters of constitutional law.

Nevertheless, a payment authorisation was issued on 11 December 2017. It was not supported by an assessment demonstrating the improvement of the democratic mechanisms and human rights in Moldova, which was the main reason for withholding the payment in the first place.

Conclusions and recommendations

54

We examined whether the Commission used relevant and reliable performance data for disbursing budget support variable tranches. We conclude that for one third of the performance indicators reviewed, design weaknesses prevented results from being measured objectively, thereby undermining their relevance. Furthermore, the Commission’s assessment of whether variable tranche indicators had been met was not always reliable, leading to some insufficiently justified payments.

55

We found that the variable tranche performance indicators were well aligned to the partner countries’ sector development strategies. However, most of them were focused on short-term actions rather than longer-term results, including progress towards the sustainable development goals. Only 13 % of the 248 indicators we reviewed measured outcomes or impacts in the supported sectors (see paragraphs 22 to 25). The use of outcome indicators would enable the Commission to better measure longer-term results in the supported sectors, including progress towards achieving the Sustainable Development Goals.

Recommendation 1 – Increase the use of outcome indicators in variable tranches

The Commission should increase the proportion of variable tranches that are paid based on the achievement of outcome indicators.

Timeframe: end of 2021

56

More than one third of the indicators were vaguely defined or had incorrect baselines, or none at all. This allowed for different interpretations as to whether targets had been achieved, making the analysis of the disbursement requests more complex and less objective (see paragraphs 26 to 32).

Recommendation 2 – Improve the formulation of performance indicators

The Commission should enhance the quality control arrangements in order to ensure that performance indicators of variable tranches measure the results achieved by partner countries in an objective manner. Particular attention should be paid to:

  1. Using performance indicators that are specific and do not allow for different interpretations;
  2. Using correct and up-to-date baseline values and quantified targets.

Timeframe: end of 2021

57

An important objective of budget support variable tranches is to provide incentives for partner countries to advance in their reform agendas. Most variable tranches reviewed contained indicators whose targets struck the right balance between being ambitious and achievable, and we consider that they provided the intended incentive effect. The few exceptions we found concerned indicators with targets which were easily achievable, often due to the use of incorrect baselines, or which were achieved via the work carried out by other donors or by external experts paid by the EU (see paragraphs 33 to 34).

Recommendation 3 – Safeguard the incentive effect of variable tranches

The Commission should:

  1. update baseline information prior to contract signature or correct the baseline values during contract implementation if necessary, by amending the budget support contract;
  2. avoid situations in which the partner country achieves targets exclusively due to EU-funded technical assistance.

Timeframe: end of 2021

58

The number of indicators used in each variable tranche was often too high, beyond the number recommended by the Commission’s guidelines. This further complicated the disbursement process (see paragraph 35).

Recommendation 4 – Simplify the disbursement process for variable tranches

The Commission should refrain from using sub-indicators in order to limit the actual number of indicators to the maximum described by the guidelines.

Timeframe: end of 2021

59

Variable tranche disbursement requests contain performance data showing the extent to which the agreed conditions and performance indicators have been fulfilled. Since the requests are prepared by the partner countries, the reliability of the underlying performance data depends on the capacity of these countries to produce it. We found that the Commission generally assessed the capacity of partner countries by examining the main elements of their monitoring and evaluation systems, as required by its budget support guidelines. However, only in 5 of the 24 selected contracts did the Commission draw conclusions as to whether the performance data needed to calculate the variable tranche indicators was reliable (see paragraphs 36 to 41). Drawing a conclusion on the reliability of data is important for indicator selection, monitoring and disbursement analysis.

Recommendation 5 – Improve the assessments of the countries’ capacity to provide performance data used in variable tranches

When designing a budget support operation, the Commission should assess the reliability of the performance data that is to be used as a basis for the disbursement of a variable tranche. The assessment should reach an explicit conclusion as to whether or not the systems used to produce these data are sufficiently reliable, and it could draw on existing assessments carried out by other recognised bodies.

Timeframe: end of 2021

60

We found that the EU Delegations carried out a variety of verification procedures when analysing disbursement requests. In some cases, EU Delegation staff carried out field visits to verify the data provided by partner countries, while in others they relied fully on this data, or on the external reviews carried out by experts on behalf of the Commission, without further verification work. The latter approach does not provide the necessary assurance to justify subsequent variable tranche payments (see paragraphs 42 to 46).

Recommendation 6 – Improve the verification of the performance data used to disburse variable tranches

The Commission should:

  1. review the underlying evidence supporting the performance data provided by partner countries in the disbursement request, unless it has already explicitly concluded that this data is reliable;
  2. when using external reviews, require in the terms of reference the verification of the reliability of key performance data provided by partner countries. Before disbursing the variable tranche, verify that the experts have complied with this requirement.

Timeframe: end of 2021

61

When we re-performed the Commission’s assessments of whether indicators had been achieved and recalculated the variable tranche payments, we found discrepancies with the amounts paid by the Commission. Overall, based on the performance information available, we identified discrepancies of €16.7 million out of the total of €234 million of variable tranche payments reviewed. Of this amount, €13.3 million was insufficiently justified or not in line with contract provisions, and €3.4 million was paid without any actual progress having been achieved. In addition, €26.3 million for three variable tranches was paid to Moldova without the reasons supporting these payments being sufficiently documented (see paragraphs 47 to 53).
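For illustration only, the sketch below shows the kind of pro-rata recalculation performed when cross-checking a variable tranche payment against the indicators met. It assumes a simplified model in which each indicator carries an equal (or explicitly given) weight and the payable amount is proportional to the weighted share of targets achieved; the function name, figures and weighting are hypothetical, and actual financing agreements define their own scoring and weighting rules.

```python
# Hypothetical illustration of a pro-rata variable tranche recalculation.
# The amounts, indicator names and equal-weighting assumption are invented
# for demonstration purposes; real financing agreements set out their own
# methodology for scoring indicators and calculating the payable amount.

def recalculate_tranche(tranche_amount, indicator_scores, weights=None):
    """Return the payable amount for a variable tranche.

    indicator_scores: mapping of indicator name -> achievement score in [0, 1]
    weights: optional mapping of indicator name -> relative weight
    """
    if weights is None:
        weights = {name: 1.0 for name in indicator_scores}
    total_weight = sum(weights.values())
    achieved = sum(weights[name] * score for name, score in indicator_scores.items())
    return tranche_amount * achieved / total_weight

# Example with invented figures: a 10 million euro tranche with four equally
# weighted indicators, two fully met, one half met and one not met,
# gives a payable amount of 6.25 million euro.
payable = recalculate_tranche(10_000_000, {"ind1": 1.0, "ind2": 1.0, "ind3": 0.5, "ind4": 0.0})
print(f"Payable amount: EUR {payable:,.0f}")  # Payable amount: EUR 6,250,000
```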

This Report was adopted by Chamber III, headed by Ms Bettina JAKOBSEN, Member of the Court of Auditors, in Luxembourg at its meeting of 12 November 2019.

For the Court of Auditors

Klaus-Heiner Lehne
President

Annexes

Annex I – Share of budget support in bilateral ODA commitments

Countries 2014 2015 2016 2017 Average (2014-2017)
Austria 0.69 % 1.01 % 0.10 % 1.24 % 0.76 %
Belgium 2.50 % 1.85 % 1.41 % 0.15 % 1.48 %
Czechia 1.52 % 0.00 % 0.00 % 0.44 % 0.49 %
Denmark 3.83 % 0.00 % 0.73 % 0.91 % 1.36 %
Finland 3.75 % 3.89 % 0.00 % 2.98 % 2.65 %
France 2.86 % 1.86 % 6.11 % 9.30 % 5.03 %
Germany 1.29 % 4.55 % 2.12 % 3.42 % 2.84 %
Greece 1.04 % 0.00 % 0.00 % 0.00 % 0.26 %
Hungary n/a 0.65 % 0.00 % 0.00 % 0.22 %
Ireland 2.30 % 3.76 % 0.00 % 0.00 % 1.52 %
Italy 1.28 % 0.40 % 0.24 % 0.56 % 0.62 %
Luxembourg 3.21 % 0.81 % 2.93 % 1.98 % 2.23 %
Netherlands 0.67 % 0.00 % 0.00 % 0.00 % 0.17 %
Poland 0.00 % 0.00 % 0.00 % 0.00 % 0.00 %
Portugal 0.55 % 0.71 % 0.52 % 0.58 % 0.59 %
Slovakia 0.00 % 0.00 % 0.00 % 0.00 % 0.00 %
Slovenia 0.00 % 0.00 % 0.00 % 0.00 % 0.00 %
Spain 1.33 % 1.29 % 0.17 % 0.30 % 0.77 %
Sweden 2.39 % 0.13 % 0.00 % 1.96 % 1.12 %
United Kingdom 2.09 % 1.21 % 0.18 % 0.01 % 0.87 %
EU Commission 9.31 % 12.71 % 14.67 % 9.08 % 11.45 %
EU average 2.03 % 1.66 % 1.39 % 1.57 % 1.66 %
EU average without institutions 1.65 % 1.11 % 0.73 % 1.19 % 1.17 %

Source: ECA, based on https://stats.oecd.org/

Annex II – Audited contracts

Contract number Country Supported sector
Contract 1 282613 Jordan Education
Contract 2 365198 Education
Contract 3 377271 Energy
Contract 4 389306 Energy
Contract 5 357967 Public finance management
Contract 6 387662 Georgia Agriculture
Contract 7 344313 Vocational education
Contract 8 361908 Public finance policy reform
Contract 9 363227 Bolivia Fight against drugs
Contract 10 368977 Water
Contract 11 337591 Agriculture
Contract 12 377182 Agriculture
Contract 13 383001 Ethiopia Health
Contract 14 367551 Transport
Contract 15 348701 Moldova Visa liberalisation action plan
Contract 16 353323 Public finance policy reform
Contract 17 371907 Agriculture
Contract 18 376376 Rwanda Agriculture
Contract 19 375269 Energy
Contract 20 364033 Environment
Contract 21 357701 Vietnam Health
Contract 22 337112 Pakistan Agriculture
Contract 23 289807 Education
Contract 24 356359 Education

Annex III – Disbursements of variable tranches in 2017 for sector reform performance contracts

Countries Total variable tranche 2017 (in €) % Sectors audited Audited amounts (in €)
Morocco 120 984 995 18.75 %
Jordan 69 506 667 10.77 % Education, Energy, Public Administration Reform 45 750 000
Georgia 37 900 000 5.87 % Agriculture, Employment, Public Finance 19 400 000
Bolivia 32 800 000 5.08 % Agriculture, Fight Against Drugs, Water and Sanitation 32 800 000
Ethiopia 29 520 000 4.57 % Health, Transport 29 520 000
Moldova 29 345 111 4.55 % Agriculture, Justice, Public Finance 26 345 111
Rwanda 27 667 500 4.29 % Agriculture, Energy, Environment 27 667 500
Vietnam 27 000 000 4.18 % Health 27 000 000
Pakistan 25 665 625 3.98 % Agriculture, Education 25 665 625
Tunisia 25 000 000 3.87 %
Albania 20 775 000 3.22 %
Niger 17 850 000 2.77 %
Bangladesh 16 500 000 2.56 %
Colombia 15 000 000 2.32 %
Botswana 14 510 000 2.25 %
Honduras 12 030 000 1.86 %
South Africa 10 466 458 1.62 %
Senegal 10 450 000 1.62 %
Burkina Faso 9 700 000 1.50 %
Kyrgyzstan 9 500 000 1.47 %
Algeria 9 000 000 1.39 %
Benin 8 000 000 1.24 %
Indonesia 7 500 000 1.16 %
Ukraine 7 500 000 1.16 %
Cambodia 7 200 000 1.12 %
Armenia 7 000 000 1.08 %
Ghana 6 200 000 0.96 %
Nepal 6 000 000 0.93 %
Peru 5 880 000 0.91 %
Greenland 4 634 634 0.72 %
Laos 4 000 000 0.62 %
Guyana 3 800 000 0.59 %
Dominican Republic 2 687 500 0.42 %
Samoa 2 360 238 0.37 %
Falkland Islands 1 000 000 0.15 %
Tonga 375 000 0.06 %
TOTAL 645 308 728 100 %

Source: ECA (countries selected for the audit are shown in bold).

Annex IV – Summary assessment

BOLIVIA ETHIOPIA PAKISTAN RWANDA VIETNAM
Contract 9 Contract 10 Contract 11 Contract 12 Contract 13 Contract 14 Contract 22 Contract 23 Contract 24 Contract 18 Contract 19 Contract 20 Contract 21
Comprehensive capacity assessment? N N N P N P N N N P P N P
Number of indicators (per 2017 annual tranche) 8 10 8 6 6 10 6 8 8 8 7 4 8
Number of indicators paid in 2017 from previous years 0 0 0 0 0 0 0 0 0 0 4 0 0
Independent targets 24 10 12 12 6 10 6 8 15 8 11 4 12
Indicators not specific 1 1 2 2 1 1 1 1 2 0 0 0 0
Without baseline (ECA: there should be a baseline) 0 0 0 0 0 0 0 0 0 0 0 0 0
Wrong baseline 1 0 0 0 1 10 0 0 1 0 1 2 0
Targets too modest 1 1 0 0 0 1 0 0 2 0 0 1 0
Field mission(s) carried out by EU Delegations to assess fulfilment of conditions? N N Y Y N Y N N N Y N Y N
External experts used to assess fulfilment of conditions? N N N N N N Y Y Y Y N N Y
Arithmetical errors in disbursements? N N N N N N N N N N N N N
Methodology for calculating the payment was applied correctly? Y Y Y Y N Y Y N N Y Y Y Y
Discrepancies with ECA assessment as to target/sub-targets fulfilled 2.2 and 3.3 3.1 and 5 N N N 7 N 5, 6 and 8 4, 6.2 and 6.3 N 5 2 and 4 N
Discrepancies with ECA assessment as to amount to pay Only impact in further disbursement No impact in payment. Perform. >80 % N N N 2 M€ N 1.50 M€ 3.47 M€ N 1.2 M€ 0.57 M€ N
GEORGIA JORDAN MOLDOVA
Contract 6 Contract 7 Contract 8 Contract 1 Contract 2 Contract 3/4 Contract 5 Contract 15 Contract 16 Contract 17
Comprehensive capacity assessment? N Y Y N N Y P Y Y P
Number of indicators (per 2017 annual tranche) 11 15 14 0 5 9 10 34 12 28
Number of indicators paid in 2017 from previous years 0 3 0 2 0 0 0 3 0 0
Independent targets 12 34 21 2 6 10 20 95 22 39
Indicators not specific 0 7 3 2 0 1 5 35 4 4
Without baseline (ECA: there should be a baseline) 1 5 3 0 0 0 0 0 0 6
Wrong baseline 1 0 0 1 1 0 1 0 0 0
Targets too modest 0 0 1 0 0 0 0 0 3 0
Field mission(s) carried out by EU Delegations to assess fulfilment of conditions? Y Y Y Y Y N Y Y Y Y
External experts used to assess fulfilment of conditions? Y Y Y Y Y Y Y Y Y Y
Arithmetical errors in disbursements? N N N Y N N N Y N N
Methodology for calculating the payment was applied correctly? Y Y Y Y N Y N N Y Y
Discrepancies with ECA assessment as to target/sub-targets fulfilled Ind. 1.2 and 1.7 N N Ind. 5 N N N Y (all indicators) N N
Discrepancies with ECA assessment as to amount to pay 1 M€ N N 6 M€ N N N 5.1 M€ 6.4 M€ 14.8 M€

NOTE: Y: Yes; N: No; P: Partially.

Replies of the Commission

Executive summary

I

Budget support is considered a package which includes financial transfers, policy dialogue, capacity building and performance monitoring. These elements are all important for the effectiveness of the instrument. Fixed tranches are subject to progress against general conditions derived from the budget support eligibility criteria, while the amount of the variable tranches is proportional to performance against specific indicators.

II

The Budget Support Guidelines define the criteria to be applied when defining variable tranche indicators and the process for verifying them. Preference is given to using well-defined indicators from partner countries’ policies and performance monitoring frameworks.

III

As variable tranche indicators are designed to be measured on an annual basis, the targets to be met are by nature short-term. The Commission needs to identify targets that can reasonably be achieved from one year to the next. This does not prevent it from promoting greater use of outcome indicators, especially in sectors that have benefitted from long-term EU assistance, but a mix of different types of indicators needs to be ensured.

The Commission’s database capturing all performance indicators used in budget support contracts during 2014-2018 shows that a balanced mix of different types of indicators (input, process, output and outcome/longer-term result indicators) is or has been used. The sample audited contained more programmes approved before 2014, which were hence less focused on longer-term results.

IV

The budget support guidelines recommend using between three and ten indicators. The Commission will reinforce the message that a high number of indicators may lead to a loss of policy focus and a more complicated assessment of the disbursement request. Nevertheless, in exceptional cases, more than ten indicators could be accepted, if justified by the policy framework and partner country preferences.

V

The Commission considers that the reliability of partner countries’ statistical systems is analysed in the assessment of compliance with the public policy eligibility criteria for budget support and should also be reflected in the Risk Management Framework. The recently revised template of the disbursement note requires reporting, with each disbursement request, on updates to the country’s analytical capacity and data quality.

VI

The Commission considers that the payment to Moldova took into account the positive evolution, at the time of payment, of the respect for effective democratic mechanisms, the rule of law and human rights in Moldova, but acknowledges that this could have been better documented.

VII

First indent: The Commission accepts the recommendation.

Second indent: The Commission accepts the recommendation.

Third indent: The Commission accepts the recommendation.

Fourth indent: The Commission accepts the recommendation.

Fifth indent: The Commission accepts the recommendation.

Sixth indent: The Commission accepts the recommendation.

Introduction

01

See reply to the executive summary - paragraph I.

02

In addition to promoting results while using countries’ systems and aligning with their policies, budget support also fosters domestic accountability.

Observations

19

The Commission confirms that budget support disbursements are made “ex post” to reward good performance. In this context, as the ECA rightly points out, the general conditions, i.e. progress with the implementation of the relevant public policy, public financial management reform, transparency and macroeconomic stability, are all important in assessing results.

21

The design of variable tranches must suit the country and policy context. Indicators should be chosen according to their relevance in that context. Preference should be given to outcome indicators when applicable and generally, a combination of different types of indicators should be sought, including input indicators.

The annual frequency of budget support disbursements as well as the partner countries’ ability to control outcomes and impact should be taken into account when defining indicators.

22

An internal Commission assessment of the full set of indicators used in variable tranches (a total of 3 642 indicators) covering the period from 2014 to 2018 shows that 33.7 % of the indicators are outcome indicators, 26.8 % output indicators, 35.4 % process indicators and the remaining small share (4.1 %) is made up of input and impact indicators.

The sample audited contained more programmes approved before 2014, which were hence less focused on longer-term results. A more balanced use of different indicators is currently the practice.

23

The selection of indicators is also linked to the maturity of the public policy and a number of context-specific issues (attribution problems, sector governance, division of labour among development partners, etc.).

It should be noted that, since 2017, Jordan is no longer considered an upper-middle-income country but a lower-middle-income one, according to the World Bank classification.

24

The Commission considers that input indicators may play a useful role in certain cases, as they serve in the initial stages of a reform to help set the conditions for longer-term results. In budget support programmes since 2014, the share of input indicators in variable tranches is only 4 % (see reply to paragraph 22).

25

Process indicators are important to assess progress in the capacities and good governance of a sector. Additionally, when the entire budget support programme is considered, qualitative aspects are looked at when assessing progress with the overall implementation of the public policy. They can also be addressed through dialogue and capacity building, hence the importance of not reducing the impact of budget support only to variable tranche indicators.

Box 2 – Examples of process indicators with no qualitative specifications

With reference to Moldova (contract 15: Visa Liberalisation Action Plan)

The visa liberalisation action plan (VLAP) is an exercise closely scrutinised by the European institutions and Member States, with a very strong external (EU) monitoring system in place. While the policy matrix contains no reference to the regulatory framework to be adopted, the quality of the framework is clearly defined in the related acquis communautaire, so the framework (the visa liberalisation benchmarks) is defined in other documents. It should be noted that five VLAP reports, showing satisfactory progress, were presented to the Parliament and the Council.

With reference to Jordan (contract 4: Energy support)

The focus of this target was to ensure that the policy dialogue structure was established and made operational (meaning that it met regularly and was not only set up administratively). In order to ensure ownership by the government, the Commission allowed the partner country to decide the exact structure, composition and frequency of meetings.

27

The Commission highlights that the ECA’s findings concerning insufficiently specific indicators stem from a variety of underlying reasons, as the examples in paragraph 28 and Box 3 illustrate. In the majority of cases, these did not lead to different conclusions between the ECA and the Commission on the assessment of the indicator.

Box 3 – Examples of non-specific indicators/targets

With reference to Jordan (contract 1):

The external expert analysing the target’s achievement did not fully develop their own criteria and relied on criteria already detailed in the Technical and Administrative Provisions (TAPs) of the Financing Agreement to determine that the school equipment was adequate, such as the requirement that schools be constructed and equipped according to the Jordan 2018 guidelines and provided with solar panels for the water heaters and other renewable energy and energy efficiency systems. The TAPs also set out that newly appointed teachers must have followed at least the teacher training induction course.

Moreover, the “adequate” character of the means/learning environment and human resources in those schools needed to be kept flexible in order to adapt to a very fluid crisis situation, with hundreds of thousands of new refugee pupils who needed to be brought to school. Finally, the “adequate” character was discussed in detail during the regular policy dialogue held between the EU Delegation and the Ministry of Education.

With reference to Georgia (contract 8):

The specific objective of the indicator was to raise awareness among stakeholders regarding the Fiscal Rules and Fiscal Governance, and it was intended to promote a culture of transparency and accountability towards citizens. The assessment was thorough and was based on meeting minutes and interviews with the stakeholders.

Box 4 – Measuring the quality of education – a good example

This practice of measuring indicators related to the quality of education is essential, and the model initiated in this budget support programme was subsequently replicated by other donors in Jordan. It remains a central focus of all subsequent EU support to the Jordanian education system.

This indicator is now a part of the Education Strategic Plan 2018-22 and the relevant unit in the Ministry of Education received training on the methodology to take the assessments further, nationwide.

It is important to stress though that collecting data for an indicator should become part of the country’s regular data collection exercise, avoiding one-off, costly and lengthy methods carried out only for the budget support programme.

30

The Commission agrees that baselines should be defined for indicators for which they are relevant and updated, if necessary.

31

See Commission reply to paragraph 30.

32

The Commission agrees that baselines should be updated with relevant data, when feasible, keeping in mind the transaction costs of amending budget support contracts.

Box 6 – Examples of indicators with absent or incorrect baselines

With reference to Georgia (contract 7):

The “Employment and vocational education and training” contract includes four indicators which refer to an increase over time (e.g. in training provided). In two cases, the Financing Agreement refers to a baseline as at 31 December 2013, which was not available at the time of signing of the Financing Agreement.

As regards the specific case mentioned here, no baselines were defined for the indicators in the Financing Agreement. Therefore, it was necessary to draw on additional sources to assess indicator 2.2.1 (15 % increase in teachers receiving initial training in line with the new policy on VET teachers’ development) and indicator 2.2.2 (15 % of teachers receiving lifelong training). As the indicators did not have baselines, an increasing trend alone could have been considered sufficient for payment. Therefore, the evaluators and the Commission considered developments in the sector in order to reach a meaningful conclusion, given the nature of the indicators.

With reference to Bolivia (contract 9):

It is to be noted that although targets for 2016 were less ambitious compared to the baseline, targets for 2017 and 2018 were more demanding, thus ensuring the intended results.

34

The Commission points out that EU capacity-building activities generally aim to contribute to a sustainable increase in capacity in partner countries, even after the budget support programme ends. Their scope is wider, and they complement rather than replace the incentive provided by the indicators. The Commission considers that this was the case for the three contracts mentioned by the ECA.

Box 7 – Examples of indicators with easy-to-achieve targets

It is important to underline that indicators 4 “Reinforcement of the Bolivian Observatory of Drugs” and 7 “Institutional framework development of CONALTID” (Consejo Nacional de lucha contra el tráfico ilícito de drogas) both relate to strengthening CONALTID’s role in inter-institutional coordination in order to improve the effectiveness of the supported strategy.

Given the previous lack of inter-institutional coordination, this indicator is a useful proxy to measure the level of the Government’s commitment to the strategy.

Although the indicator (in contract 9) may seem modest, it is very important in the given context, as it incentivises the necessary coordination between the different ministries involved in the fight against drug trafficking and related crimes. It is an example where, knowing the local context, the EU Delegation recognised the importance of consolidating the coordination function of a new institution (CONALTID). The ambition of an indicator therefore needs to be assessed in its specific context. The same applies to indicator 4, which is considered proof that CONALTID was playing its coordinating role and that the monitoring system was working in a sustainable way.

35

The number of indicators under the Visa Liberalisation Action Plan (VLAP) in Moldova stems from the policy dialogue and subsequent agreement with the partner country. In this specific case, preference was given to including all indicators in the Government’s policy matrix. This was considered by the authorities as an additional incentive to implement the whole Action Plan.

Box 8 – High number of indicators

The policy matrix in Moldova includes seven conditions or main indicators (as established in the financing agreement), corresponding to seven areas of the national strategy. There are then 28 sub-indicators, which are specific and relevant for the sound implementation of the strategy.

37

At the design phase, the Commission assesses the information pertaining to data reliability in order to decide which source of verification is the most appropriate and when it becomes available. Weaknesses in the statistical system of the supported sectors call for mitigation measures, e.g. reviews by external experts or complementary assistance from the EU (or other donors) to improve statistical capacity.

39

See Commission reply to paragraph V.

41

The Commission considers that by combining the analysis of the public policy eligibility criterion and the Risk Management Framework, it is able to obtain sufficient information on the quality of the partner country’s monitoring and evaluation system.

42

A variety of verification procedures is employed to guarantee a level of assurance that justifies subsequent payments. See reply to paragraph 43.

43

For each disbursement, the Commission applies a combination of different types of verification procedures to assess compliance. The Delegations and HQ services carry out a desk review of all the supporting documents for each disbursement, and field visits may also be undertaken when necessary. Equally, external experts can be employed when the nature of the information is highly specialised or when assessing the achievement of targets requires strong qualitative judgement.

44

External expert missions are to be used in a complementary manner. Their added value derives from their independence and additional expertise; they should not substitute for the Delegation’s direct verification and decision, but allow the results to be cross-checked.

45

The Commission shares the ECA’s analysis that the quality of data from partner countries should be an important factor when deciding whether to use experts for verification. However, the use of external experts may be justified even if the country’s monitoring system is reliable. In certain cases, the EU needs highly technical expertise in order to verify specific criteria relevant for calculating indicators.

46

Budget support, as an instrument of partnership and trust, works in several sectors, where development partners have been relying on the regular reporting of partner countries.

The Commission finds that, in the given cases, supporting documents provided by the implementing partner, combined with the external expert’s assessment and clarifications on demand from the authorities, provided sufficient information to cross-check and reach a conclusion on the reliability of the data.

Box 10 – Examples of shortcomings found in external expert’s work

(a) Sampling for contract 2 in Jordan was done in close collaboration with other relevant partners, including UNESCO. Sample selection criteria were based on elements, including:

  • Ministry of Education public schools in urban areas (including rented schools);
  • Schools and learning spaces in refugee camps;
  • Double and single shift schools (high density);
  • Representation of primary and secondary cycles;
  • Representation in EMIS system pilot sample (OpenEMIS, UNESCO);
  • Correlation of selected schools with the concentration of Syrian refugees in the region/city.

Thus, proposals for the inclusion of schools were made in the interest of covering a broad variety of criteria and of ensuring the inclusion of 8 directorates with a high concentration of Syrian students. The Ministry of Education was not directly involved in selecting the schools for the sample, in order to avoid any possible bias in the selection process. Rather, it was the consultant team, together with UNESCO, who made the proposals after selecting schools from the schools’ database (not the EMIS, which was not functional at that time), while the Ministry of Education of course needed to approve the sample in order to facilitate access to the schools.

(b) All 30 schools were included in the database for the field verification mission; however, the corresponding mission report mentioned only 17 schools (pages 14 and 16/17), since comparable EMIS data were available only for those 17 at the time of the mission. The external experts’ conclusions were thus drawn from the best sample available at that time.

(d) The consultants looked at the actual school registers and the actual enrolment on the day of the visit, and that is what was recorded in the database in the respective columns. The Commission considers that such differences are to be expected, in a very fluid crisis context where school population and attendance were in flux.

47

With reference to Pakistan, the Commission agrees that for four indicators (indicators 5 and 6 of contract 23 and indicators 4 and 6.3 of contract 24) the evidence supporting the results had weaknesses, and it will remedy this in the future.

For indicator 8 of contract 23 (Sindh education programme), the Commission considers that the evidence provided supported the fulfilment of the targets.

The target for indicator 8 was considered met on the basis of the PEACe (Provincial Education Assessment Centre) annual work plan, with a proposed budget for both PEACe and SESLOAF, as well as the 74-page “Completion report on implementation of Sindh Education SESLOAF”, which supported the conclusion that SESLOAF was implemented to improve student learning.

The Commission accepts that the indicator could have been more clearly formulated, which led to different calculations, although these produced only marginal differences in the results achieved. Nevertheless, whichever method is used, the results achieved are significant in terms of the scaling-up of the voucher scheme, which has resulted in many more children attending school.

For indicator 4, the Commission agrees that there are minor anomalies in the available statistics. This weakness was already identified at the time of project identification and formulation. Consequently, the Commission provided technical assistance for the Education Management Information System to address this. Hence, the Commission took appropriate and efficient measures to mitigate the risk arising from the deficiencies identified in the statistics.

As regards Georgia and the reference to the seed law, indicator 1.2 (contract 6), the Commission acknowledges the slight delay of two months. The Commission considered that the additional time was used to improve the quality of the process (a more participatory and inclusive process, involving all concerned stakeholders) and the quality of the result (the seed law now complies with international and EU standards and sets a solid basis for a well-regulated sector). In addition, the Commission considered the decision of the Government to fast-track activities to compensate for the delayed adoption of the law and to ensure quick implementation (preparation of secondary legislation even before the law was formally adopted, and a shortened timeframe for the certification of wheat, only 6 months after the entry into force of the seed law). Taking all of this into consideration, the Commission decided that such efforts should be rewarded, despite the short delay.

Box 11 – Evidence based on biased sample

Regarding the evidence provided for indicator 6.3 of the KP education contract in Pakistan (contract 24), it is not the technical assistance that determines how the required evidence is to be interpreted.

The Financing Agreement does not specify a sampling method for measuring the attendance rate. Therefore, the Commission relied on the school attendance rate as already measured, which was higher than the target value, and thus considered the target achieved.

50

The Commission considers that in three of the five cases (indicator 1.2 of contract 6, and indicators 2.1 and 2.2 of contract 16), the slight delay of one to three months in meeting the deadline was due to additional qualitative steps performed by the implementing authority to ensure high-quality performance (i.e. designing a law that was adopted in line with best public administration practices following an extensive consultation process) and carried no repercussions for the following stages of the reform.

It would have been counter-productive not to acknowledge the merits of the partner’s additional efforts by insisting on a hard deadline. In the fourth case (indicator 5 of contract 1), see reply to Box 12.

Box 12 – Targets met after deadlines, partly or not met

With reference to Jordan (contract 1):

In 2015, the contract was extended until the end of 2017 specifically to allow the government to finalise the construction of six schools and the fulfilment of indicator 2.2, as allowed by the Technical and Administrative Provisions of the Financing Agreement, which state (art. 2.2) that “in the event that a portion of the funds should thereafter remain unspent, the Beneficiary and the European Commission may agree to one additional payment based upon an assessment of the outstanding targets, conducted along the same principles as for the previous two variable tranches. This additional disbursement, if justified, would be made by the end date of the operational implementation phase and could be added to the last variable tranche payment”. This was the case.

  • 1/ The initial deadline of two years was set in a political context of providing crisis support in an exceptional situation. It was very ambitious, considering the timeframes required for tendering public sector works and the effort required for building and equipping six schools. This was well recognised by the Jordanian authorities, who proactively requested an extension of the deadline when this indicator was introduced, as well as later during implementation. The Commission monitored the implementation closely and proceeded with the necessary extensions to the financing agreement, adjusting to the situation.
  • 2/ The Commission had made important political commitments to support Jordan in coping with the consequences of the Syrian crisis, through various pledging conferences and high-level political declarations throughout the life of this contract. Adding indicator 5 to this financing agreement was part of the Commission package under these pledges.
  • 3/ The disbursement was released against achieved targets only. With the exception of one school, which was finalised in 2017, the other five schools gradually took in Syrian pupils as of 2016. Considering the overall number of Syrian refugees already in Jordan at the time, this was a commendable effort to provide refugee children with access to schooling.

With reference to Pakistan (contract 23), the target for indicator 6 was largely met, as the English language curriculum was developed and the textbook manuscripts prepared.

With reference to Georgia (contract 6):

As regards indicator 1.7, the target was met in accordance with the provisions of the Financing Agreement as regards the formulation of the target and its source of verification, which refers to the adoption of a state programme by decree. The State Programme for Tea Plantation and Rehabilitation aimed, amongst other things, at increasing tea production, including bio tea production. It was also considered that the choice of tea was strategic due to its significant potential for organic production. Although the indicator could have been better formulated, withholding the payment would have departed from the provisions of the Financing Agreement and breached our obligations vis-à-vis the Georgian authorities.

Box 13 – Targets set using incorrect baselines – no actual progress achieved

With reference to Ethiopia (contract 14): given the substantial progress made in reducing the overloading of trucks in recent years, the Commission considered that the payment was justified, despite the re-assessment of the 2011/2012 baseline in a sector review done close to the disbursement date. In fact, long-term objectives planned towards 2020 were achieved much earlier than expected.

At the time of the Financing Agreement, the Commission fixed the baseline on the basis of the best information available, and it had to pay based on what was signed by both parties. The Commission maintains that there were legal grounds for payments based on the indicator.

Box 14 – Lack of sufficient evidence

With reference to Jordan (contract 5):

Indicator 1.2 (in contract 5, supporting public finance management in Jordan) was assessed by reviewing the attendance lists signed by the participants in the training courses offered to the staff of the Internal Control Units (ICU). It should be highlighted that setting up the ICU is one of the main recent achievements in public finance management, supported by the present EU programme.

With reference to indicator 3 in the community development programme (contract 22, Pakistan), the calculations of the district cost estimates were neither explicit nor specifically linked to the district development strategies (DDS) for the payment over the financial year 2016/2017. This will be corrected and for the financial year 2018/2019 the district cost estimates will be linked to the DDS for the final payment planned for 2020.

53

See Commission reply to Box 15.

Box 15 – Documentation of payments to Moldova

The Commission maintains that the payment was justified, but acknowledges that the positive evolution, at the time of payment, of the respect for effective democratic mechanisms, the rule of law and human rights in Moldova could have been better documented.

Conclusions and recommendations

54

The Budget Support Guidelines define the criteria to be applied when defining variable tranche indicators and the process for verifying them. Preference is given to using well-defined indicators from partner countries’ policies and performance monitoring frameworks.

55

The Commission’s database capturing all performance indicators used in budget support contracts during 2014-2018 shows that a balanced mix of different types of indicators (input, process, output and outcome/longer-term result indicators) is or has been employed. The sample audited contained more programmes approved before 2014, which were hence less focused on longer-term results. A more balanced use of different types of indicators is currently the practice.

Recommendation 1 – Increase the use of outcome indicators in variable tranches

The Commission accepts the recommendation.

The Commission agrees with the recommendation to make more use of outcome indicators, when appropriate. Outcome indicators are, nonetheless, in certain cases not compatible with an annual disbursement schedule and are also confronted with attribution problems, i.e. the Government is not in control of meeting the targets.

Recommendation 2 – Improve the formulation of performance indicators

The Commission accepts the recommendation.

Recommendation 3 – Safeguard the incentive effect of variable tranches

The Commission accepts the recommendation.

As regards point b, EU capacity-building contracts generally aim to contribute to a sustainable increase in capacity in partner countries. Their scope is wider, and they complement rather than replace the incentive provided by the indicators.

Recommendation 4 – Simplify the disbursement process for variable tranches

The Commission accepts the recommendation.

59

The Commission considers that the reliability of partner countries’ statistical systems is analysed in the assessment of compliance with the public policy eligibility criteria for budget support and should also be reflected in the Risk Management Framework. The recently revised template of the disbursement note requires reporting, with each disbursement request, on updates to the country’s analytical capacity and data quality.

Recommendation 5 – Improve the assessments of the countries’ capacity to provide performance data used in variable tranches

The Commission accepts the recommendation.

60

The Commission uses a combination of verification procedures for each disbursement request: a desk review by Delegation and HQ staff of all the supporting evidence, additional exchanges of information with the authorities, field missions where relevant, and the use of external experts where there is a need for specialised expertise.

Recommendation 6 – Improve the verification of the performance data used to disburse variable tranches

The Commission accepts the recommendation.

The Commission agrees with point a, to review the underlying evidence supporting the performance data if this data has not been explicitly declared reliable.

The Commission also accepts point b.

61

As explained in the reply to Box 12, the Commission considers that the amount of EUR 6 million for Jordan should not be included in the overall amount of discrepancies, as this payment was made following achievement of the target and within the time period granted by the extension of the Financing Agreement.

For Pakistan, Georgia and Moldova, please see reply to Box 15 and paragraph 47.

Audit team

The ECA’s special reports set out the results of its audits of EU policies and programmes, or of management-related topics from specific budgetary areas. The ECA selects and designs these audit tasks to be of maximum impact by considering the risks to performance or compliance, the level of income or spending involved, forthcoming developments and political and public interest.

This performance audit was carried out by Audit Chamber III External action/Security and justice, headed by ECA Member Hannu Takkula, supported by Turo Hentila, Head of Private Office, and Helka Nykaenen, Private Office Attaché; Alejandro Ballester Gallardo, Principal Manager; Piotr Zych, Head of Task; Eva Coria Paramas, Roberto Ruiz Ruiz, Erika Söveges and Nita Tennila, Auditors. Richard Moore provided linguistic support.


Endnotes

1 Article 236(1) of the Financial Regulation applicable to the general budget of the Union, July 2018.

2 EC Budget Support – Trends and Results, 2018, p. 61.

3 With the exception of Morocco, which is the subject of an ECA Special Report to be published in 2020.

4 Budget support guidelines, 2017, p. 138.

5 In our sample there are 163 indicators with targets which do not require a baseline, because they do not measure the development of a given variable (e.g. ‘a law approved’ or ‘number of meetings held’). These indicators are not taken into consideration for this assessment.

6 Budget support guidelines, 2017, p. 139.

7 This number does not include indicators from previous years’ tranches.

8 Snapshot provides analysis methods for the following key areas necessary for assessing statistical systems: (i) The legal, institutional and strategic frameworks supporting the production of statistics and monitoring at national and sector level; (ii) The adequacy of resources (i.e. quantity and quality of human resources, equipment, financing); (iii) The determinants of data quality (i.e. quality commitment, professional independence, impartiality, objectivity, methodology and appropriate procedures); and (iv) Relations with users (i.e. relevance, accessibility).

9 Contracts 1, 2, 3, 4 and 5.

10 Contracts number 2, 5, 13, 14, 23 and 24.

11 Indicator 2 of Contract 20, indicator 1.2 of Contract 6, indicator 5 of Contract 1 and indicators 2.1 and 2.2 of Contract 16.

12 Indicators 5, 6 and 8 of Contract 23, indicator 6.2 of Contract 24 and indicator 1.7 of Contract 6.

13 Indicators 3 and 6 of Contract 22 and indicators 1.2 and 3 of Contract 5.

Timeline

Event Date
Adoption of Audit Planning Memorandum (APM) / Start of audit 20.11.2018
Official sending of draft report to Commission (or other auditee) 20.09.2019
Adoption of the final report after the adversarial procedure 12.11.2019
Commission’s (or other auditee’s) official replies received in all languages 5.12.2019

Contact

EUROPEAN COURT OF AUDITORS
12, rue Alcide De Gasperi
1615 Luxembourg
LUXEMBOURG

Tel. +352 4398-1
Enquiries: eca.europa.eu/en/Pages/ContactForm.aspx
Website: eca.europa.eu
Twitter: @EUAuditors

More information on the European Union is available on the internet (http://europa.eu).

Luxembourg: Publications Office of the European Union, 2019

PDF ISBN 978-92-847-4030-7 ISSN 1977-5679 doi:10.2865/36578 QJ-AB-19-023-EN-N
HTML ISBN 978-92-847-4036-9 ISSN 1977-5679 doi:10.2865/44301 QJ-AB-19-023-EN-Q

© European Union, 2019.

Reuse is authorised provided the source is acknowledged.
For any use or reproduction of photos or other material that is not under the copyright of the European Union, permission must be sought directly from the copyright holders.

GETTING IN TOUCH WITH THE EU

In person
All over the European Union there are hundreds of Europe Direct Information Centres. You can find the address of the centre nearest you at: https://europa.eu/european-union/contact_en

On the phone or by e-mail
Europe Direct is a service that answers your questions about the European Union. You can contact this service by freephone (00 800 6 7 8 9 10 11; certain operators may charge for these calls) or via the contact form at https://europa.eu/european-union/contact_en.

FINDING INFORMATION ABOUT THE EU

Online
Information about the European Union in all the official languages of the EU is available on the Europa website at: https://europa.eu/european-union/index_en

EU Publications
You can download or order free and priced EU publications at: https://op.europa.eu/en/web/general-publications/publications. Multiple copies of free publications may be obtained by contacting Europe Direct or your local information centre (see https://europa.eu/european-union/contact_en)

EU law and related documents
For access to legal information from the EU, including all EU law since 1952 in all the official language versions, go to EUR-Lex at: http://eur-lex.europa.eu/homepage.html?locale=en

Open data from the EU
The EU Open Data Portal (http://data.europa.eu/euodp/en/data) provides access to datasets from the EU. Data can be downloaded and reused for free, both for commercial and non-commercial purposes.