Description
This evaluation received an overall score of 3.46 out of 5 on the Likert-type scale used to assess the quality of government evaluations, reflecting the view that it was conducted to a fairly good standard.

The planning phase was used well to formulate the study, and some important adaptations to the approach were made. The Department collaborated effectively with an external evaluation specialist to develop the preparatory framework for the study, and the evaluation team, which included an international Credentialed Evaluator, provided strategic guidance on the evaluation design at the outset. This led to the upfront acknowledgement that a quantitative impact analysis would be infeasible owing to a lack of sufficient quantitative information on the programme's impact. This quality assessment therefore scored the planning and design stage highest, at 3.82.

The follow-up use and learning stage was also particularly well executed, as the Department has already begun engaging key stakeholders and drafting an improvement plan on the basis of the recommendations; this stage was scored at 3.60.

The evaluation report was well written and clear about the evaluation approach and methodology. In the absence of quantitative data, the analysis relied on identifying key issues and themes emerging from qualitative feedback. Even with this adaptation of the approach, the evaluation yielded important insights that help the Department understand how to improve the programme's reach, impact and effectiveness during rollout.

All parties interviewed for this assessment felt that there was a very good communication channel through which the Department could coordinate with the evaluators to ensure alignment in the approach underpinning the study. This contributed to the evaluation scoring fairly well on the 'coordination and alignment' (3.52) and 'free and open evaluation process' (3.63) overarching considerations. Further, the involvement of an external evaluation specialist in overseeing the evaluation helped the Department build internal expertise in implementing evaluations and ensure compliance with best practice, which contributed to the good score on the 'partnership approach' criterion (3.60).

Some held the view that this impact evaluation may have been premature, given that the MAP has been running for only three years. However, the evaluation identified key information and data gaps, and addressing these can only assist the Department in ensuring that future evaluations have the secondary data needed for the kind of rigorous quantitative impact analysis that this evaluation could not achieve. This quality assessment supports the view held by the parties interviewed that the evaluation was conducted to a good standard and was timely for the Department's strategic planning to improve the MAP's design effectiveness.