Adaptiveness as a Clunky Dance
The pandemic provides a reminder of how dynamic the world can be. It continues to surprise and challenge the creativity of even the most experienced program implementers working in community settings. In almost all of the public health interventions I know, the pandemic has forced a re-think of implementation plans. It has also posed critical challenges for how evaluation as a field can be adaptive and nimble when faced with flux and the need to drastically change the playbook. The pandemic provides an opportunity for us to ask how evaluations can be adapted to be helpful in a time of crisis. Over the past month, I have had a chance to connect online with multiple implementers working in both community and policy settings. Despite the severity of the crisis, the creativity of many organizations in adapting to the pandemic has been striking.
One program implementer refers to her organization’s adaptiveness as a “clunky dance.” Another individual refers to “muddling towards an authentic response.” This blog is written from the perspective that evaluators have a chance to learn from our implementation partners’ nimbleness and to incorporate their adaptiveness into the frameworks we use to value interventions. Given the challenges some community organizations are experiencing, evaluators might also have a role to play in creating an enabling environment by raising questions about changing needs and the need for systemic, coordinated responses.
Based on multiple dialogues with policy and community partners, I raise six sets of questions:
1. Narratives of programmatic adaptations
Most interventions have had to adapt during the pandemic. One relatively straightforward way in which evaluation can be responsive is to highlight, through simple, succinct narratives, the adaptations that have occurred, the drivers of those adaptations, and whether they were successful in responding to the emerging crisis. A number of the programs I am aware of are already documenting such adaptations. This can be both straightforward and challenging, because it requires coordination between program and evaluation teams. Attention needs to be paid to the constraints facing the programs, the timeframes within which the programs had to adapt, the doors that the pandemic closed on earlier service delivery platforms, and how new platforms were created to keep providing services. Evaluation teams need to leverage such narratives and help programs tell their stories systematically. What are exemplars of good evaluation stories of the adaptiveness/nimbleness of specific interventions?
2. Measuring nimbleness and adaptiveness
Few evaluation frameworks include adaptiveness and nimbleness as criteria for judging the success of interventions. As an example, it is interesting that the newly revised DAC evaluation criteria (from the OECD’s Development Assistance Committee; https://www.oecd.org/dac/evaluation/daccriteriaforevaluatingdevelopmentassistance.htm) do not include either nimbleness or adaptiveness as desired attributes of interventions or as criteria for evaluation.
Here is my attempt at defining these terms. Adaptiveness can be defined as the extent to which an intervention morphs to respond to changes in its context; in my judgment, adaptiveness is related to the contextual awareness of an intervention. Nimbleness can be defined as the agility with which an intervention responds to changes in perceived needs; there is a temporal aspect to nimbleness. Based on a quick scan of the literature, neither of these constructs has been a key focus of evaluation frameworks. The pandemic provides an opportunity to learn from how programs have both conceptualized and operationalized adaptiveness and nimbleness. It reminds us that the needs of individuals are dynamic, and that the opportunities for community, food, social protection, and health systems to respond to such needs are constrained by multiple factors. I have found dialogues with community organizations helpful in surfacing critical ideas on how we can measure adaptiveness and nimbleness. I raise this issue because evaluators have an opportunity to elevate the salience of adaptiveness and nimbleness as important criteria in evaluating interventions. What are innovative examples of measures of adaptiveness/nimbleness of programmatic responses during the pandemic?
3. Deeper learnings about strengthening systems
Interventions are embedded within broader systems. No intervention is an island. This pandemic has highlighted the need to better understand the connections between an intervention and its underlying systemic contexts and supportive structures. One of the most striking insights arising from dialogues with policy and community partners has been that this crisis can potentially accelerate systemic and coordinated responses to health and social problems. There is a literature on crisis-driven learning: https://onlinelibrary.wiley.com/doi/abs/10.1111/j.1468-5973.2009.00578.x
Some stakeholders have mentioned that the focus of the systemic response has sharpened during this crisis. Others have noted that coordination between different organizations has actually been enhanced during this pandemic. However, still other organizations have described how the crisis has paralyzed their ability to respond to needs and limited their capacity to coordinate with other organizations. My view is that evaluation as a field has a role to play in exploring if and how crisis-focused coordination can be enhanced. Even in the happy situation where coordination has been enhanced, evaluation can still play a role by exploring what is likely to happen when the intensity of the crisis recedes. Are there examples of evaluations that have taken a developmental approach to enhancing coordination between organizations at this time of crisis?
4. Re-thinking theories of systemic change
Many stakeholders have also conjectured that the crisis has helped them re-imagine what is possible with system-level efforts. A number of community organizations are beginning to re-think their own theories of change, the roles of different partners, and how their organization relates to the overall system. In some settings, the pandemic has served to highlight and build support for interventions focused on poverty and social isolation in a way that did not exist before the pandemic. It may be argued that responding to the pandemic has provided a greater understanding of the needs of the vulnerable and a deeper lived experience of what social isolation means. If this is the case, what does this suggest for refining interventions focused on poverty and social isolation? Are there examples of intervention-level theories of change that have been refined in response to adapting to the pandemic?
5. Minimal components needed for an intervention to work
Evaluations often focus on the optimal set of components needed for an intervention to produce favorable impacts. We rarely focus on the minimal set of components required for an intervention to function. An important insight from a few community organizations has been that this pandemic has taught them lessons about how to simplify their programs; it has also provided insights on the minimal set of components required to meet the needs of individuals, especially the most disadvantaged. In the Behavioral Medicine literature, there is the concept of MINC (minimum intervention needed for change; https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3958586/). Once the intensity of COVID-19 has abated, a focus on the minimal set of components needed might help us think about our social programs in a richer light, especially if the focus is on using limited resources to address the needs of disadvantaged stakeholders. Can a focus on the minimal set of components needed to produce change help sharpen a focus on meeting the needs of the disadvantaged given limited resources?
6. Intersectionalities and cracks in the network
One consistent piece of feedback from a number of community organizations is that COVID-19 has helped surface or highlight key cracks in the existing “networks of care.” There is a need to pay attention to individuals who fall through such cracks, and also to focus on individuals who fall at the ‘intersections’ of multiple categories of deprivation. A number of interventions have adapted to respond to such intersectionalities and perceived cracks in the network. Over time, there is a need to re-think how our strategies address the needs of individuals who fall through such cracks. While many interventions focus on the needs of disadvantaged individuals and groups, in my experience their theories of change rarely incorporate, in sharp focus, the mechanisms by which the interventions can address the needs of individuals who fall at such intersections.
Similarly, there is also a need to move beyond static definitions of vulnerability towards paying attention to the dynamics of vulnerability (https://www.ncbi.nlm.nih.gov/pubmed/23549696). The issue is not just that interventions meet the needs of the vulnerable, but also that they help disrupt the generative mechanisms that COVID-19 might have amplified (https://www.brookings.edu/blog/future-development/2017/06/19/pandemics-and-the-poor/). Now, more than ever, there is a need to pay attention to syndemic processes (https://www.thelancet.com/series/syndemics) between poverty and social problems like homelessness that might exacerbate problems over time. Some important evaluation questions to consider are: Do the proposed solutions incorporate the lived realities of individuals who fall through the system’s cracks? Are the systems, structures, and processes being set up to address the needs of such individuals consistent with the needs and expectations of the clients they are intended to serve? Does the proposed solution pay attention to the dynamics of vulnerability that might be especially acute for marginalized individuals during and after the pandemic?
As we go forward, one ethical principle that should guide our work is that measurement and valuing should not interfere with the programmatic response during this crisis.
The above set of issues should be seen as questions, opportunities, and challenges for the field of evaluation. In my view, much of the recent measurement work during this time of crisis has surfaced the problem space of COVID-19: the pandemic’s impacts on individuals, especially highly vulnerable individuals. This problem space has also raised deep questions about how our existing systems of care might not be adequate. Accompanying this crisis-driven problem space, however, has been a creative response: programs have had to adapt in a timely manner, as best as possible within existing constraints, to respond to growing sets of needs. As a field, evaluation needs to help tell the story of such emergent solutions. A focus on the solution space (https://www.sciencedirect.com/science/article/abs/pii/S0149718912000213) can help grow the salience of evaluation as a useful tool in bridging problems and solutions during times of crisis.