In response to the challenges facing education in South Africa, a range of actors aim to drive change through interventions that can positively impact education. Many interventions take an approach that pushes back against discourses centred on the failure of learners, focusing instead on the systems that should be in place to support learners, with a sensitivity to, and understanding of, the complexities that learners face. In a similar spirit, this article aims to think about fidelity with regard to some of the constraints and pressures that exist in the system of education intervention: that is, to reflect on the components of the system of interventions and on the system in which interventions occur. We offer this reflection from our perspective as monitoring and evaluation (M&E) partners on recent education projects. We have seen up close the hard work and care that such interventions require, and have experienced their challenges and complexities.
Let us begin by briefly exploring the concept of implementation fidelity, which is key in M&E. Implementation fidelity refers to the degree to which an intervention or programme is delivered as intended. Monitoring implementation fidelity is the means by which the delivery of an intervention is held accountable to the research, aims, and theory of change by which it was designed.
By understanding and measuring whether an intervention has been implemented with fidelity, evaluators, researchers, and practitioners can better understand how and why an intervention works and the extent to which outcomes can be improved. Implementation fidelity contributes to the robustness of evaluations of a programme’s impacts: the more expansive the measurement and assessment of fidelity, the more it can contribute to attributing outcomes to the programme, and to understanding how and why the intervention works and where it can be improved. Fidelity analysis, paired with a record of implementation, also allows for the transportability of the programme: if the programme is shown to produce results and the mechanisms of implementation are well documented, then there is a blueprint for reproducing the programme elsewhere. Moreover, implementation fidelity analysis allows the programme to be deconstructed and analysed piece by piece, identifying the components in which it was successful (“pockets of success”) and the components that produced challenges. This capacity for piece-by-piece evaluation also makes it easier to identify the effects of individual components of implementation on mediating variables, which may be valuable information: as an analysis of implementation, and thus of how results are achieved (or not), it may reveal other relationships and effects that are of interest and/or importance[1]. Implementation fidelity analysis is, in short, an assessment of the extent to which an intervention was delivered as per the design. This allows for stronger evidence when attributing results to the intervention, a greater understanding of how those results came about, and a more piece-by-piece understanding of those workings.
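To make the piece-by-piece idea concrete, the following is a minimal sketch in Python of how component-level fidelity might be recorded and summarised. The component names, session counts, quality ratings, and the 0.8 threshold are all invented for illustration; established frameworks, such as the one reviewed by Dusenbury et al., work with richer dimensions (adherence, dose, quality of delivery, participant responsiveness).

```python
# A minimal, hypothetical sketch of component-level fidelity scoring.
# Component names, session counts, and scores are illustrative only.

from dataclasses import dataclass

@dataclass
class Component:
    name: str
    planned_sessions: int    # sessions specified in the design
    delivered_sessions: int  # sessions actually delivered
    quality: float           # observer-rated quality of delivery, 0-1

    @property
    def adherence(self) -> float:
        """Share of the planned 'dose' actually delivered (capped at 100%)."""
        return min(self.delivered_sessions / self.planned_sessions, 1.0)

    @property
    def fidelity(self) -> float:
        """A simple composite: adherence weighted by delivery quality."""
        return self.adherence * self.quality

components = [
    Component("teacher coaching", planned_sessions=20, delivered_sessions=18, quality=0.9),
    Component("graded readers", planned_sessions=30, delivered_sessions=30, quality=0.7),
    Component("parent workshops", planned_sessions=8, delivered_sessions=3, quality=0.8),
]

for c in components:
    # An arbitrary illustrative threshold separates "pockets of success"
    # from components that produced challenges.
    flag = "pocket of success" if c.fidelity >= 0.8 else "challenge area"
    print(f"{c.name}: adherence={c.adherence:.0%}, fidelity={c.fidelity:.2f} ({flag})")
```

Even a toy summary like this shows why component-level records matter: the composite for the whole programme can look acceptable while one component (here, the hypothetical parent workshops) has quietly collapsed.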
The education intervention environment can see, to varying extents, significant gaps in fidelity. These gaps, particularly the more significant ones, often do not appear by accident but as the result of constraints and pressures. These programmes face limitations in resources, particularly in finances and time. Their timeframes are not only subject to the implementer’s evaluation of the time required to develop, prepare, and run the project but are often subject to external funding or research cycles. This is a clear area for potential problems. Project timeframes are only so malleable and are external to the particularity of the intervention. This is exacerbated in cases of external funding, where the project designer has to abide by the timeframe of funders. Simply put, the timeframes of interventions often precede their design, which is then adjusted to fit. This incentivises programme designs that suit these timescales, and programmes that require less preparation time in general. Preparation here refers to the preparation for implementation after the proposal for the intervention has been approved. The paper Education in Africa: What Are We Learning?[2], written by Evans and Acosta, offers a meta-analysis of 145 empirical studies across Africa from 2014 onwards. It also identifies the timescale of studies as an area that limits learning in the field, though its focus is on the brief timeframe for the measurement of impacts. Evans and Acosta state that, “The vast majority of interventions measure outcomes within 12 months of the onset of the intervention, with little information on the longer run time path of impacts”. While they note that studies at a policy level show greater long-term impacts, the point remains that interventions, and their measurements, are structured by their timeframes in ways that are more restrictive than they need to be.
There are also constraints on what resources implementers are able to provide. Financial constraints always threaten to distort the intention of the design. Keeping in mind the expansiveness and complexity of pedagogy, small changes made in accordance with the budget can have profound effects on the process and outcomes. That is to say, a programme cannot always be reduced to the sum of its parts; it is often rooted in theory and evidence concerning not only the effects of its components but the details of their interrelation and calibration. In other words, adapting to constraints and the pressure of expediency allows the intervention to move forward and deliver solutions to immediate problems, but threatens to undermine its theoretical underpinnings. Insofar as the programme deviates from the interrelation and calibration of components grounded in pedagogical research, the aims of the programme should be re-evaluated. This is another point that overlaps with Evans and Acosta who, while noting that there is a shift to context-based design, highlight the importance of attending to the theoretical underpinnings of the study and connecting findings to theory. They quote Duflo, who states that “[o]ur models give us very little theoretical guidance on what (and how) details will matter”. This article aims to highlight that it is precisely the implications of the details, and the theoretical underpinnings, that threaten to be overlooked in the name of expediency.
Given the constraints of resources and the particularities of each school, it is often the innovative and complex components of the design that go missing. Restricted development time and restricted funding, coupled with the effects of alterations and substitutions within the pedagogical ecosystem of the design and the challenges that schools face, create a tendency towards simplified interventions. Again, this is not a critique of the choice to simplify: often, it is a response to widespread and well-documented issues that could undermine programmes that did not attend to them. There is a constellation of variables involved, and the impact can vary greatly and in nuanced ways. This article is not arguing that these interventions have no capacity to impact schools and student performance. In fact, as has been said, this ‘simplification’ is the utilisation of documented responses to documented problems. What is being reflected upon is the question of programme fidelity and the tendency towards programmes that respond with similar strategies to a similar core range of challenges. These interventions are not designed merely as the rollout of such strategies but as part of the exploration of possibilities in education.
Schools also have their own systems and processes, as well as constraints and challenges. These are not consistent across schools, and every school offers a unique context for programme implementation. Project design must adjust to a variety of constraints while also attending to the relationship with the school and to interpersonal and professional relationships. Intervention designs must integrate themselves into schools in a way that recognises the school’s staff, their systems, and their challenges, and that acknowledges that when the intervention ends the staff will largely remain at the schools.
This piece is based on a relatively small sample of interventions but, again, aims to reflect on the more structural components of fidelity gaps and to offer a consideration of these challenges. A paper titled A review of South African primary school literacy interventions from 2005 to 2020, authored by Meiklejohn et al.[3], offers a meta-analysis of 24 papers covering 21 literacy interventions in South Africa. Their search was restricted to peer-reviewed journals included in the Sabinet database, and the paper acknowledges the limitations of this search. It approaches the reflection on interventions from a different angle to this article and offers critical insights that are complementary to this discussion; as a meta-analysis, it is also a useful point of reference here.
As they put it:
It is noteworthy that many literacy interventions are driven by non-governmental organisations (NGOs) and are reliant on donor funding to be implemented. In many cases, donors are proponents of specific literacy approaches and their resources are used to implement interventions that fit within these approaches or agendas.
Similar to the discussion above, the role of funders and organisations, while key to capacitating interventions, is posited as a limiting factor on implementation designs; Meiklejohn et al. add that funding organisations and other capacitating organisations are proponents of particular literacy approaches. While their paper does not explicitly speak to the tendency towards ‘basics’, there is some evidence of this in their findings. For example, they highlight the common features of the programmes and indicate that these align with those advocated by the World Bank. As we have argued, the issue is not necessarily the repetition of elements but that, when faced with well-documented constraints, these elements tend, in our experience, to crowd out new elements and to disrupt the design insofar as it aims to build on existing knowledge. In concluding, Meiklejohn et al. say:
It is thus hardly a surprise that this systematic review has indicated that the response to the literacy crisis in South Africa has generally been ad hoc, uncoordinated and somewhat NGO/donor-driven. The flip side of this is that, in spite of a few attempts to make an impact on a large scale,[…] the government has been unable and/or unwilling to deal effectively with pervasive literacy challenges. There is little evidence of large-scale, coordinated interventions implemented over sustained periods to make the required impact on national literacy levels. (p. 10)
This is largely in agreement with what has been discussed in this article. Their meta-review found the repetition of core elements, the impact of funding and NGOs, and an uncoordinated, ad hoc approach from stakeholders. This article aims, further, to identify the constraints on implementers and project designers. Meiklejohn et al. are not positing that a singular, government-coordinated approach is the way forward, as their recommendation appears alongside the critique, already mentioned, that NGOs being proponents of specific literacy approaches, and of approaches advocated for by the World Bank, may itself cause problems. A review of South African primary school literacy interventions from 2005 to 2020, as previously stated, offers useful reflections, not least on the coordination of information and learnings from these interventions. Responding to the constraints highlighted in this article, this piece will now explore a few discussion-starting possibilities. These possibilities are geared towards expanding the boundaries and constraints, or reconfiguring them in ways that allow for greater variety in programmes and avoid the pitfalls promoted by the current structure. They are not posited as definitive solutions and, no doubt, appear here in a simplified form that would have to be further articulated in practice.
How can we work towards decreasing the repetition of fidelity failures and increasing the variety of designs, thus allowing for more expansive interventions and a more expansive evidence base for interventions and intervention possibilities? Just as the intervention space is moving towards greater attention to the context of implementation, there is space for further reflection on how this is connected to the context and structure in which interventions are developed and managed, and, in so doing, for opening up the potential for a reinvigorated education research and intervention environment.
- Longer timeframes and more financial support
There is the option of longer intervention cycles with greater funding. Knowing that funding is a complex issue, there is also the option of shifting the balance between resources and the number of projects. For example, instead of funding two interventions over two years, fund one intervention over the two years, with the first six months dedicated to development and preparation. This provides greater time for development and preparation, and concentrates two years’ worth of implementation funds into eighteen months of implementation plus six months of preparation after the acceptance of the proposal. The point here is the reconfiguration of timeframe and funding. Without an increase in total funding, this means fewer projects but more funding per project. This is a difficult decision and would require consideration of the extent of the shift, with the hope that findings from this approach can generate more large-scale and sustainable interventions. Again, beyond the repetition of core elements, the more expansive and nuanced approaches that longer cycles might facilitate offer greater opportunities for robust approaches that reach beyond what are often identified as the core issues. Greater time for preparation, even without additional financial assistance, is also worthwhile. Or, more ambitiously, one might consider structures that can accommodate timeframes that emerge from the design itself, and a greater emphasis on evaluating longer-term effects.
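As a rough illustration of the trade-off (all figures below are invented, in a hypothetical currency), the reconfiguration concentrates the same implementation funds into fewer, better-prepared months:

```python
# Purely illustrative arithmetic for the timeframe/funding reconfiguration.
# All figures are invented for the sake of the example.

total_budget = 2_000_000  # two years' worth of implementation funds

# Configuration A: two one-year projects run back to back, no dedicated prep time.
projects_a = 2
budget_per_project_a = total_budget / projects_a  # 1,000,000 each
months_a = 12
per_month_a = budget_per_project_a / months_a     # ~83,000 per month

# Configuration B: one two-year project with six months of preparation.
prep_months_b = 6
months_b = 24 - prep_months_b                     # 18 months of implementation
per_month_b = total_budget / months_b             # ~111,000 per month

print(f"A: {projects_a} projects, {per_month_a:,.0f}/month, 0 months' preparation each")
print(f"B: 1 project, {per_month_b:,.0f}/month, {prep_months_b} months' preparation")
```

The cost is plainly visible in the sketch too: one project's worth of findings instead of two, which is why the extent of the shift would need careful weighing.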
- Earlier school involvement
It is known that each school has unique factors and circumstances. The general model sees implementers develop an intervention, then select schools, then adjust the programme to the selected schools. There is the opportunity to select schools at an earlier stage in the development of the programme design, including the schools in the process and/or designing the programme with a more direct understanding of constraints, as opposed to the more ad hoc process we currently see. In some cases, it may be argued that this merely displaces some of the gap: from a gap between implementation and design to a gap between theory and project design, insofar as the design responds to limitations at an earlier stage. But the displacement from implementation to design is key. Again, this is not to say that this will expose ‘bad design’; it is rather a call for reflection on the design stage and the choices made, to identify what is a failure of implementation, what is a failure of design, and what theoretical elements might guide decision-making. It may be argued that this authorises a greater level of programme differentiation. But it also yields more precise and intentional documentation of differentiation and the reasoning behind it. This then allows for a clearer understanding of how much differentiation was a result of constraints, and thus an organised response, and how much was a failure of implementation.
- Greater collaboration in general (pedagogy experts, M&E, pooling of resources)
Schools are worth singling out for an earlier role in the project, but there is opportunity for more and greater collaboration in general. These interventions are not a fixed set of tasks but attempts to address challenges and to learn from those attempts. While different stakeholders have different roles in the collaboration process, the more attention is given to co-conceptualising the collaboration, the greater the opportunity to align objectives and the processes to achieve them. The work of considering the various interests creates the opportunity to evaluate the design from multiple perspectives, and there is the opportunity to use resources more effectively through collaboration. From an M&E perspective, this includes assisting in identifying a project’s key components, levers of change, and indicators, and setting up systems to monitor those points and provide feedback in a format that implementers can easily use. Bringing in M&E collaboration at an earlier stage allows monitoring tools to be used and synchronised for both internal processes and evaluation. Through the overlaying of various mappings of the design, a more complex picture is likely to emerge, from which each stakeholder can contextualise and reflect on their original mapping. This collaboration also allows for more adaptability. It was stated earlier that seemingly small changes in implementation can have profound effects on the design and should invite a re-evaluation of aims. With greater collaboration with experts in pedagogy, for example, there is space for a constantly updated view of the pedagogical implications of the fidelity question.
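As a hypothetical sketch of what such a shared mapping could look like in practice, the snippet below links design components to levers of change and indicators, then prints implementer-friendly feedback. All component names, targets, and observed values are illustrative assumptions, not drawn from any actual project.

```python
# A hypothetical sketch of a shared monitoring map: each design component
# is tied to its lever of change and a monitorable indicator, so that
# M&E feedback lands directly on the programme's theory of change.
# All names, targets, and observed values are illustrative assumptions.

monitoring_plan = {
    "teacher coaching": {
        "lever_of_change": "improved instructional practice",
        "indicator": "coaching sessions delivered per term",
        "target": 6,
        "observed": 4,
    },
    "graded readers": {
        "lever_of_change": "increased reading practice",
        "indicator": "books taken home per learner per month",
        "target": 2,
        "observed": 2,
    },
}

def feedback(plan: dict) -> None:
    """Print a simple, implementer-friendly status line per component."""
    for component, m in plan.items():
        status = "on track" if m["observed"] >= m["target"] else "attention needed"
        print(f"{component} -> {m['lever_of_change']}: "
              f"{m['observed']}/{m['target']} {m['indicator']} ({status})")

feedback(monitoring_plan)
```

The design choice the sketch is meant to surface is that the map, however it is actually stored, is co-authored: implementers, pedagogy experts, and M&E partners each overlay their own mapping of components and levers, and divergences between the overlays are themselves useful findings.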
Reference List:
Dusenbury, L., Brannigan, R., Falco, M., & Hansen, W. B. (2003). A review of research on fidelity of implementation: Implications for drug abuse prevention in school settings. Health Education Research, 18(2), 237–256. https://doi.org/10.1093/her/18.2.237
Evans, D. K., & Acosta, A. M. (2021). Education in Africa: What Are We Learning? Journal of African Economies, 30(1), 13–54. https://doi.org/10.1093/jae/ejaa009
Meiklejohn, C., Westaway, L., Westaway, A. F. H., & Long, K. A. (2021). A review of South African primary school literacy interventions from 2005 to 2020. South African Journal of Childhood Education, 11(1). https://doi.org/10.4102/sajce.v11i1.919
[1] Dusenbury et al., 2003
[2] Evans and Acosta, 2021
[3] Meiklejohn et al., 2021