Challenges of Sustainability, Inequities, and Discontinuities

Category: Webinar 1

Webinar 1 | Rogers Mutie

The “Other” Competencies for Effective Practice & Pandemic Response


VIDEO


SLIDES


SPEAKER SUMMARY

  • Inside the mind of an evaluation practitioner – the difference between theory and actual practice
  • Impact of pandemics, crises, and rapidly changing contexts on evaluation practice
    • Disruption to normal processes
    • Uncertainty about the future
    • Discontinuity of core evaluation services
    • Inequalities unearthed or exacerbated
    • Redundancy as some services become unneeded
  • Implications:
    • Evaluation practice demands more than the conventional set of competencies
    • Competencies previously thought of as peripheral become CORE
    • Evaluator training must respond to the need for additional skills
    • Evaluator training must build broad-based capabilities
  • Three broad domains of additional competencies – REFEREE skills:
    • Relational Competencies
    • Enabling Competencies
    • Foundational Competencies

Webinar 1 | Thomas Schwandt

Evaluation Competencies in the Ages of Discontinuity


VIDEO


SLIDES


SPEAKER SUMMARY

Five competencies:

  1. Engage the broader architecture of evaluating practices
    • Consider alliances with other applied researchers who are engaged in activities of valuing, judging, and recommending.
    • Develop capacities to work in transdisciplinary/interdisciplinary teams: working with a variety of knowledge producers and blurring boundaries between practices to engage integrative knowledge in addressing complex problems.
  2. Do collaborative knowledge work
    • Participatory approaches draw on the experiential and relational understanding of practitioners who are themselves part of the situations they engage with.
    • Learn to work in uncertain futures, deal with contingent scenarios, and address the wellbeing of underprivileged people in times of social and political change.
    • Expanded notions of collaborative knowledge work include collective social learning, co-production, collaborative adaptive management, and triple-loop learning.
  3. Expand the repertoire of questions evaluators ask
    • The focus tends to be on questions of what works, effectiveness, efficiency, and impact, and to some extent sustainability.
    • We also need to ask questions like: What assumptions underlie our understanding of the problem we are addressing and our efforts to address it? Who gains and who loses from what we plan to do or have done? What should we do to address the potential exclusion and marginalization of peoples in our activities?
  4. Develop epistemic fluency
    • We work on real-world problems, and to do so we require different combinations of specialized and context-dependent knowledge – different ways of knowing.
  5. Develop ethical and political fluency
    • Developing ethical fluency involves developing moral expertise and a capacity for normative analysis: the competency to state and clarify moral questions and to provide justified answers to them.
    • Moral expertise involves conceptualizing and elaborating on the meaning of the norms, values, and ends at stake in a particular intervention.
    • Developing political fluency means that evaluators focus on the political dimensions of acting and learning, and learn to deal with policy discord and moral disharmony.

Current training in evaluation largely assumes that evaluators are dealing with values in a world of facts. We need far more attention to the value dimension than to the factual dimension.

Webinar 1 | Benita Williams

Rethinking Evaluator Competencies in an Age of Discontinuity


VIDEO


SLIDES


SPEAKER SUMMARY

  • Diverse competencies on our teams are extremely important – great for professional development, as we learn from each other on the job
  • Our response to the manifestation of discontinuities in our work and the competencies required:
    • Change approach mid-project – Developmental Evaluation vs. Implementation and Outcome Evaluation. Understanding different approaches was critical.
    • Change role – Facilitator and Problem-solving partner, sometimes Activist vs. Evaluator and Subcontractor. Process competencies become very important.
    • Change focus – What is the big picture, and what else is happening, vs. what is the project trying to achieve? Requires an understanding of systems theory and complexity.
    • Get comfortable with uncertainty.  Requires a resilient disposition comfortable with constant change and chaos.
  • Developing evaluation competencies is a systems thing! 
  • What changed with capacity development?
    • VOPE activities moved online – Networking less effective, circles narrowed.
    • Academic programs with some contact time moved fully online – Peer learning and networking diminished.
    • Staff in organizations worked offsite more often and more discontinuously – Mentoring and on-the-job training reduced.
    • Building teams tended to be with established relationships – Referrals to “weak ties” reduced.
  • Erosion of the settings for capacity building: this disruption is going to continue for some time, and our evaluation systems are going to become more fragile. This has implications for equity and diversity.
  • Implications for training evaluators in 2021 and beyond?
    • Resilience and Adaptability, at individual as well as system level.

Webinar 1 | Ray Pawson

COVID, Complexity, Counterfactuals, & Calamitous Conclusions: A Provocation


VIDEO


SLIDES


SPEAKER SUMMARY

  • Limits of social suppression interventions exposed through complexity theory – eight modes:
    • Disparate Command and Control Systems
    • Interaction and Emergence
    • Policy Discord and Moral Disharmony
    • Contextual Heterogeneity
    • Implementation Heterogeneity
    • Ambiguity in Relations and Guidelines
    • Temporal Change in Public Attitudes
    • Exit and Sustainability Effects
  • A Calamitous Conclusion?
    • No public policy has ever been subject to more effort, management, investment, and scrutiny than the social interventions to overcome COVID-19
    • Yet, only a halting, intermittent solution was provided
  • Complexity dynamics and the oscillating impact of major policy interventions: Have we seen the merry-go-round before?
  • Is there light at the end of the tunnel?
    • For policymakers: Remember you are designing complex, adaptive, self-transformative systems. The key task is to try to anticipate the complexity dynamics.
    • For evaluators: Remember you are researching complex, adaptive, self-transformative systems. The key task is to try to trace the complexity dynamics.

Webinar 1 | Madhu Khetan

A Program Designer’s Response to the Pandemic: A View from an Indian Lens


VIDEO


SLIDES

No slides are available for this presentation.


SPEAKER SUMMARY

  • The pandemic has given rise to different vulnerabilities and inequities (e.g., in India, the lockdown prompted a flood of reverse migration). It has also brought about many changes in the functioning of public systems and of the services we assumed would always be available.
  • Any assumptions we make about the context may no longer be valid.  An understanding of the change context becomes very important as well as factoring this into our evaluations.
  • M&E has been largely project-focused and project-driven, and less focused on unravelling and bringing to light new vulnerabilities or rising inequities. Doing so requires more independence in framing the M&E agenda, enlarging its scope to look not only at program activities but also at new issues that may have arisen.
  • COVID-19 has spurred us to use data for developmental purposes and internal learning. For this, acceptance of the data both within the organization and outside becomes critical, so building collaboration – and all the skills it requires – assumes much greater significance.
  • Real-time monitoring has assumed a much greater role, and as evaluators we need to proactively equip ourselves with new technologies. Generally, there is resistance to accepting the data, and the first step in bringing about change involves thinking about how to overcome this resistance. More collaboration, sharing, and transparency about our methods is needed, and we need to be more rigorous with our data validation.
  • COVID has encouraged us to become more conscious of the importance of humanitarian work. Our monitoring systems have also been nudged to respond to the demands of measuring humanitarian work and integrating it with the other kinds of development work we have been doing.
  • There is a need for leadership in evaluation because any one organization may not have either the capacities or the bandwidth to do it on their own.