The Metaevaluation Dashboard

To summarise the most relevant aspects of an evaluation, we have created the “Meta-evaluative Dashboard”, which reveals what happened during the evaluation, using the evaluation report itself as its source. The Dashboard visually presents the information given in the report about the methodological choices taken. After a thorough analysis and a process of deep synthesis, the most important elements of an evaluation methodology have been selected. The Dashboard was conceived to be as generic as possible, but it has the potential to be customized according to your needs.

Complexity Assessment

According to Patricia Rogers, the complexity (or simplicity) of an intervention and its context can be defined by seven issues:

- Focus – Are there multiple objectives (complicated) or emergent objectives (complex)?
- Involvement – Are there multiple decision makers (complicated), or is it flexible and shifting (complex)?
- Consistency – Is there a standard delivery (simple), does it need to adapt to context in ways that can be identified in advance (complicated), or can these only be known through implementation (complex)?
- Necessariness – Is the intervention the only way to achieve the intended impacts?…
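To illustrate how such an assessment could be recorded, here is a minimal sketch in Python; the dimension names follow Rogers’ list above, while the ratings and the summary rule are illustrative assumptions rather than part of the Dashboard itself.

```python
# A minimal sketch, assuming the complexity assessment is recorded as a simple
# rubric. The ratings below are illustrative assumptions for a hypothetical
# intervention, not real data.

RATINGS = ("simple", "complicated", "complex")

complexity_assessment = {
    "Focus": "complicated",        # multiple objectives
    "Involvement": "complex",      # flexible and shifting decision makers
    "Consistency": "complicated",  # adaptation identifiable in advance
    "Necessariness": "complicated" # other routes to the impacts also exist
}

# One possible way to summarise: report the most complex rating found.
assert all(r in RATINGS for r in complexity_assessment.values())
overall = max(complexity_assessment.values(), key=RATINGS.index)
print(f"Most complex dimension rating: {overall}")
```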

Participation scan

Evaluation reports usually declare with pride that they have followed a participatory approach. However, participation has many levels and layers. Having defined the main stakeholders who might participate in the evaluation exercise, we map their involvement in each of the evaluation phases; darker shades indicate higher levels of responsibility. With such a participation scan, the Dashboard can show which stakeholders were involved in each evaluation phase and to what extent (leading, or simply as informants) they actually participated.
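Conceptually, the scan can be thought of as a stakeholder-by-phase matrix. The sketch below uses hypothetical stakeholders, phases and a four-level scale; these are assumptions for illustration, not part of the Dashboard specification.

```python
# A minimal sketch, assuming the participation scan is stored as a matrix of
# stakeholders by evaluation phases. Names and the four-level scale are
# illustrative assumptions, not taken from a real report.

PHASES = ["Design", "Data collection", "Analysis", "Reporting", "Use"]
LEVELS = {0: "none", 1: "informant", 2: "consulted", 3: "leading"}

# Each row is a stakeholder; each value is the participation level per phase
# (higher values would map to darker shades in the Dashboard).
participation = {
    "Donor":         [3, 0, 1, 2, 3],
    "Implementer":   [2, 3, 2, 1, 2],
    "Beneficiaries": [0, 1, 0, 0, 1],
}

def render_scan(matrix: dict) -> None:
    """Print the participation scan as a simple text table."""
    print(f"{'':<16}" + "".join(f"{p:>16}" for p in PHASES))
    for stakeholder, levels in matrix.items():
        print(f"{stakeholder:<16}" + "".join(f"{LEVELS[l]:>16}" for l in levels))

render_scan(participation)
```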

Mixed-methods Scan

To map the methods and techniques actually used, the Dashboard displays the number of techniques used in each phase. These are shown against the entire timeframe of the evaluation, making it easy to assess how the techniques complement each other and whether mixed methods (multi-methods) or a mono-method were used. This example only includes icons for some of the most commonly used methods; other methods could be used and would be indicated accordingly by their icons.
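The sketch below shows one way such a count could be derived from the report; the phases and technique names are hypothetical and only serve to illustrate the mixed- versus mono-method distinction.

```python
# A minimal sketch, assuming the report lists which techniques were used in
# each phase. Phase and technique names are illustrative assumptions.

techniques_by_phase = {
    "Design":          ["document review"],
    "Data collection": ["survey", "semi-structured interviews", "focus groups"],
    "Analysis":        ["descriptive statistics", "thematic coding"],
}

for phase, techniques in techniques_by_phase.items():
    label = "mixed methods" if len(techniques) > 1 else "mono-method"
    print(f"{phase}: {len(techniques)} technique(s) ({label})")

total = sum(len(t) for t in techniques_by_phase.values())
print(f"Total techniques across the evaluation: {total}")
```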

Sampling decisions

Almost every evaluation study has to make decisions about sampling, as it would not be cost-effective to reach the whole population of potential informants. To give a quantitative picture of these decisions, the Dashboard gathers the estimated number of potential sources, then records how many of each type of source were finally consulted by the evaluators, and finally shows the resulting percentage. It also shows whether purposive or random sampling was used, as always according to the evaluation report.
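The sketch below illustrates the calculation with hypothetical figures; the source types, numbers and strategies are assumptions, not data from any real evaluation.

```python
# A minimal sketch, assuming the sampling summary compares the estimated
# number of potential sources with the number finally consulted, and records
# the sampling strategy stated in the report. All figures are illustrative.

sources = [
    # (source type, estimated potential, actually consulted, strategy)
    ("Project staff",    25,  20, "purposive"),
    ("Beneficiaries",  1200, 150, "random"),
    ("Partner NGOs",     15,   6, "purposive"),
]

for name, potential, consulted, strategy in sources:
    coverage = 100 * consulted / potential   # percentage of sources reached
    print(f"{name:<15} {consulted}/{potential} consulted "
          f"({coverage:.1f}%, {strategy} sampling)")
```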

Credible evidence

To assess how credible the evidence found is, a mix of alternative strategies should be adopted, so as to be reasonably sure that the findings reflect reality. Not all of them are needed, but the mix should be complementary and convincing in terms of causality or causal inference.

Based on: Davidson, J. & Rogers, P. (2010) Causal inference for program theory evaluation. http://genuineevaluation.com/causal-inference-for-program-theory-evaluation/

Evaluative synthesis

Inspired by Jane Davidson, this is one of the features that most clearly differentiates an evaluation from a research study. Having an evaluative synthesis, understood as a systematic judgement, is the essence of evaluation. Many evaluations do this to some extent, defining the evaluation criteria and questions, but higher levels of evaluative synthesis would require defining what “good” means in each particular context and the evidence that would demonstrate each element, which is not so common.

Source: Davidson, J. (2014) It’s the very core of evaluation and makes or breaks

Evaluation Standards

A mix of different standards and codes of conduct has been compiled, defining key aspects that should be taken into account and, at the same time, can be easily checked as specific behaviors.
