Publish What You Fund



AidWatch Aid Transparency Survey Methodology

Donors were assessed against two criteria: (I) their commitment to transparency, and (II) the availability of a series of specific types of information, assessed through a survey (see Box 1 for more details). This section sets out the methodology and sources used, as well as the limitations and challenges of this approach. This is the first time this survey approach has been applied to aid transparency. Feedback and suggestions on how to improve it would be much appreciated, as the aim is to extend and roll out this methodology over the next year.

Box 1: Criteria used to assess aid transparency

Commitment to transparency
- Existence of a Freedom of Information Act (FOIA)
- Engagement in the emerging best practice on aid transparency (IATI)

Survey of the availability of 35 specific types of information
- 4 organisation level questions – for the donor's biggest aid agency (e.g. aid allocation procedures; total development budget)
- 3 country level questions – for the donor's biggest recipient country (e.g. country strategy papers, annual audit)
- 18 activity level questions – for one project in the donor's biggest recipient country (e.g. title, total overall cost, sectors the project contributes to)

Commitment to transparency

Information was collected on each donor's commitment to aid transparency, assessed in terms of a) whether or not they have a Freedom of Information Act; and b) specifically on aid transparency, the donor's engagement with emerging best practice being developed in the International Aid Transparency Initiative (IATI).

Existence of a Freedom of Information Act (FOIA)

The data source for the existence of a FOIA (or equivalent policy) was the September 2009 Fringe Intelligence research into the status of FOI legislation in Europe and beyond. Access Info Europe, who specialise in access to information, also checked this data to ensure it was up to date. Donors were scored 1 for yes or 0 for no.
Countries classified as not having a FOIA are those where there is no law (Cyprus), where it is only in draft (Luxembourg, Spain) or where it has been adopted but is not in force (Malta).

Engagement in the International Aid Transparency Initiative

Engagement in IATI was selected as a proxy for commitment to aid transparency. While other mechanisms could have been selected, there is, for example, no way of assessing current levels of disclosure to TR–AID, which is being developed by the Commission. The OECD's Creditor Reporting System (CRS) is explicitly a reporting mechanism, not designed as a transparency or disclosure tool, and thus it does not contain current information. It is therefore not a source for establishing the timeliness of information, a key component of aid transparency. IATI, on the other hand, is specifically designed as a transparency mechanism, and it also enables more efficient and comprehensive reporting to both the CRS and TR–AID.

This information was collected from the IATI website (for signatory status) and by request from the IATI Secretariat (for plans for implementation), and is correct as of 29 April 2011. The scoring was as follows:
- 2 = Implementing IATI – has begun publishing data to the IATI Registry (UK) or has informed the IATI Secretariat that it will do so before HLF-4 on 29 November 2011 (Denmark, European Commission, Finland, Netherlands, Sweden)
- 1 = Signed but no implementation schedule or plans to do so before HLF-4 (Germany, Ireland)
- 0.5 = Observer to IATI (France)
- 0 = No engagement to date

Availability of 35 specific types of information

The survey was designed to sample and collate data about the publication of key types of aid information for each donor and agency in ways that generate a comparable, robust data source that is specific, detailed and verifiable.
National platforms of civil society organisations assessed the availability of 35 specific types of information at a) donor agency or organisational level (4 questions), b) recipient country level (3 questions) and c) project or activity level (18 questions). This is by no means exhaustive but was designed to examine the availability of information at all stages from policy to implementation, including design, evaluation and reporting.

Information availability was judged by whether a specific piece of information was:
- Always published – for organisation and country level questions: consistently or regularly; for activity level questions: for all projects in the recipient country.
- Sometimes published – for organisation and country level questions: inconsistently or irregularly; for activity level questions: for some projects in the recipient country.
- Not published, but collected

The results used are the amounts of information that are published always and sometimes. The ranking is derived from how many types of information donors always publish, added to the score for commitment to transparency (FOIA and IATI). Where scores are equal, the donor that published more types of information sometimes was placed higher; information that is published only sometimes is given a lower valuation.

Quality Control Process

The survey involved several steps to ensure that the results provided as comparable and robust an assessment of donors' transparency as possible:
- Donor organisation selection: National platforms selected the relevant donor's largest or primary aid agency (for example, Germany's GIZ and the Netherlands' Ministry of Foreign Affairs), because it was thought this was most likely to consistently provide the most information across donors.
- Country selection: National platforms then selected the current largest aid recipient country for that aid agency.
If the current largest recipient country of aid from the agency was not known, the current largest recipient country of aid from the donor government as a whole was selected. If this was also unknown, then the most recent OECD DAC figures (2009) were used to identify the recipient country to survey.
- Activity level: National platforms answered questions about the availability of 35 specific types of information by looking at the donor's website. Data was actually collected on 36 information types, but due to comparability problems question 12 has not been used.
- Review: Responses were then reviewed by Publish What You Fund to ensure each piece of information was evidenced and standardised across the surveys. If information was not provided, an additional search of agency websites in English and the local language was conducted. If there was a difference in the amount of information provided in English compared to the local language, we used whichever provided the larger amount of information.
- Donor check: National platforms were then asked to send the surveys to the relevant donor agency to check. Agencies were given a deadline of two weeks to reply, but replies were still accepted and actively sought for another two weeks.
- Verification: Donor replies were verified. The URLs provided were checked to ensure that all scores of "published" were accurate.
In several cases the URL provided as supporting evidence did not show the information claimed, so the results were downgraded: to "sometimes published" if the information was published only for a few projects, or to "collected" if the information was not published for any projects. Donor websites were then checked again to see whether any more information was available, to ensure that the maximum amount of information was found through the process.

Challenges, limitations and lessons learned

As mentioned above, this is an initial attempt to develop and apply this survey methodology to aid transparency, drawing on experience and approaches in the right to information field as well as in aid surveying. A number of specific challenges were faced, which are set out below. Feedback and suggestions would be extremely welcome.

Donor gaps – key data gaps include:
- Coverage of European donor countries: three countries are missing from this survey, namely Bulgaria, Ireland and Romania.
- Coverage of agencies is limited: only the largest agency of each donor was surveyed. Coverage would ideally be extended for donors with several agencies, such as France and Germany.
- Insufficient time for donors to respond: a number of agencies did not respond to the survey results sent to them. For the EC, Lithuania, the Netherlands and Poland, the data was collected too late to give the donors an opportunity to reply. Additional searches were conducted in these cases in an attempt to ensure accurate results. We apologise for this, and these results should be considered in this light.
- The finding on the levels of information collected but not published is problematic. In a number of cases, donors did not respond, and the judgement that an item was collected was instead based on the respondent's existing knowledge.
- The information types assessed are not a comprehensive list of all the information and data donors collect or make available.
Impressionistically, the research did not identify areas which were systematically available but not captured by this list, but this is not a systematic finding.
- The survey did not look at the format in which the information was provided. So, while Denmark receives a similar score to Sweden and a better score than Estonia, the information Sweden and Estonia provide about their aid activities is more useful because it is provided in a machine-readable format.
- A binary yes/no assessment of FOIA is clearly not ideal: not all legislation, nor the implementation of that legislation, is of the same standard. The lack of a systematic assessment of FOIA quality is being addressed by Access Info Europe, who are currently undertaking research to develop such an index.
- Question 12 (Does this donor publish the type of finance given? e.g. grant, loan, export credit, debt relief) was excluded from the final results. It was interpreted quite differently by different national platforms, sometimes being answered as "published systematically" where the question was answered only implicitly rather than explicitly stated. In this case, national platforms' own answers were accepted.
- Lack of current data: often respondents could not find current data, or did not realise that what they had found was out of date. Generally such data was rejected; however, it was accepted for a limited number of questions where no more current data existed and it appeared that the project was still in operation.
- As with any questionnaire working across different languages and contexts, some questions were interpreted differently, were not clear to respondents or were insufficiently explained.
Generally we managed to alleviate this in the standardisation process; however, for Q30 (Is the tender for the activity published?), we are aware that there were particular challenges of interpretation that might mean procurement data has been missed.
- There may be information that we, or even the donors themselves, missed due to poorly designed and hard to navigate websites. Given the importance of accessibility in making the investment in publication useful, in future we would include some qualitative assessment of how easy it is to find information on websites.
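As an illustration, the scoring and ranking rules described above (commitment score from FOIA and IATI engagement, plus the count of always-published information types, with the sometimes-published count as tie-breaker) can be sketched in a short script. The donor names and figures below are hypothetical, not taken from the survey results.

```python
# Illustrative sketch of the survey's scoring and ranking rules.
# All donor data below is made up for demonstration purposes.

def commitment_score(has_foia, iati_level):
    """FOIA scores 1 (yes) or 0 (no); IATI engagement scores 2, 1, 0.5 or 0."""
    return (1 if has_foia else 0) + iati_level

def rank_donors(donors):
    """Rank by always-published count plus commitment score;
    where scores are equal, more sometimes-published types ranks higher."""
    return sorted(
        donors,
        key=lambda d: (
            d["always"] + commitment_score(d["foia"], d["iati"]),
            d["sometimes"],
        ),
        reverse=True,
    )

donors = [
    # Donors A and B tie on the primary score (24), so the
    # sometimes-published count decides their relative order.
    {"name": "Donor A", "foia": True,  "iati": 2,   "always": 21, "sometimes": 5},
    {"name": "Donor B", "foia": True,  "iati": 1,   "always": 22, "sometimes": 3},
    {"name": "Donor C", "foia": False, "iati": 0.5, "always": 22, "sometimes": 8},
]

for d in rank_donors(donors):
    print(d["name"])
# → Donor A, Donor B, Donor C
```

Sorting on a tuple makes the tie-breaking explicit: Python compares the primary score first and only consults the sometimes-published count when the primary scores are equal.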