Embedding Developmental Evaluation in activating Social Movements and Systems Change. This is the first in a series of blog posts that aim to introduce the experience and value of implementing Developmental Evaluation in Social Movement and Systems Change efforts.

EIT Climate-KIC defines system(s) innovation as "integrated and coordinated interventions across whole value chains in economic, political and social systems, based on a portfolio of deliberate and connected innovation experiments".

In this first post, we focus on how to start with Developmental Evaluation.

Why use Developmental Evaluation?

A lot has been written about how traditional evaluation approaches are not suitable for evaluating innovation and systems change efforts. They are designed for programmes and projects that run linear processes in stable contexts, and that aim to find solutions to defined problems using well-known practices. In these cases, evaluation assesses outcomes after implementing solutions, focusing on accountability to understand: Did we do what we said we would do? Did we deliver the planned results?

But how do you evaluate in complexity? How do you assess the value of an innovation process that is nonlinear, addressing a complex problem, operating in a context characterised by unknowns and where solutions are emergent?

Developmental Evaluation (DE) offers an approach that can support innovation processes through real-time feedback loops and helps capture emergent impact. DE goes beyond accountability and focuses on:

  • Testing hypotheses: is the approach we have taken helping us reach our goals?
  • Learning: what is working? what is not working? What needs to change? What do we need to keep doing? Learning and reflection spaces are key to continuously iterate and adapt our activities to generate the desired impact.
  • Communication: what is happening? What impact are we generating? Internal communication also keeps internal teams aligned and external communication is key to engage stakeholders in the movement building and systems change.
  • Management: what are our hypotheses to activate the transformation? What are the activities that we will do? How are we iterating the activities that we are doing? This last element is about capturing and keeping track of how DE informs the systems change process.

What are we involved in?

Agirre Lehendakaria Center (ALC) and EIT-Climate KIC (EIT-CKIC) are working alongside regional teams formed by public institutions, private organisations, NGOs, associations and citizens to explore a DE approach that supports a number of Just Transformation interventions across Europe. We are working within the framework of EIT-CKIC’s Deep Demonstration and ALC’s Open Innovation Platform approaches.

Currently, we are embedding the DE approach in the work that teams are conducting and we are starting to reflect on initial lessons learned that we share below.

What are we learning?

Getting started - too early to evaluate?

In traditional evaluation approaches the evaluator is called in at project completion, impact is captured and results are presented. This evaluation cycle hampers learning: it comes too late to inform decision making on key elements such as strategy and the directionality of the vision.

DE is constant. It starts at the beginning of a project and follows it throughout. It is about situating evaluation within the complexity of social change to support decision making from the earliest stages of an intervention. However, this comes with challenges: how do you evaluate the added value of a process when you don’t know which road you are going to take, nor where you might find yourself along the way?

Introducing Developmental Evaluation to the regional teams has meant working through a mindset shift in how evaluation is normally perceived. We often heard: ‘we are not really sure we have any results to share and we feel it is too early to evaluate’.

Thus, the different elements that form a DE, with learning as a central component, had to be explained throughout the process. We are taking the first steps into evaluation together, learning by doing, and coming to understand the value that DE brings through exploration, experience and experimentation.

We have initiated DE reflections from the beginning of our regional transformation processes, in the initial stages where system change efforts are still being identified. We are focusing on:

  • Support efforts to understand the system that teams are engaging with and seeking to transform. Here data is being collected through Deep Listening to understand transformation opportunities and to support decision making.

  • Document discussions and decisions relevant to assessing what is emerging in the process, and to evaluating the chosen methodology so it can be iterated as needed to activate system change. Documenting what is happening through a complex and sometimes difficult innovation process is a challenge. Specific templates have been set up and are being iterated to capture discussion points and lessons learned.

Collecting and making sense of the data - where to start with DE?

As initial hesitation about Developmental Evaluation data collection evaporates, new obstacles arise: how do you make sense of the data collected?

DE can be explored using data that is already collected through other processes. You can also set up specific data collection processes to support decision making if needed. However, in complex innovation processes, the amount of data captured and deemed relevant can be overwhelming. Thus, it has been helpful to set up a structure of recurrent reflection sessions to discuss and reflect on collected data at different levels: team, regional and inter-regional.

Approaches that we are testing in these reflection sessions:

  • Sensemaking sessions following “What, So What, Now What” structure.
  • Guiding Questions: a set of guiding questions that can support teams in reflecting on the information that needs collecting, identifying gaps in current activities and work streams, and analysing collected data.

For each phase, we have defined objectives and guiding questions:

Phase: Intent

Objective: Supporting identification of the Current System, the System Vision and how to get there.

Guiding Questions:

  • What narrative is emerging for how we can envision a system change?
  • What are the barriers and enablers for moving the process forward?
  • What baseline is emerging for understanding the current situation/system?
  • What criteria emerge to tell whether or not we are progressing towards a future system (system change milestones)?
  • What processes generate enthusiasm? Why?
  • What criteria are emerging to identify relevant stakeholders?
  • What potential cause-and-effect relationships, interdependencies, possibilities and opportunities for breakthrough are emerging?

Phase: Frame

Objective: Defining the field and supporting development of portfolio principles and a portfolio brief.

Guiding Questions:

  • How are new sources of innovation or innovation actions informing the positions we are interested in?
  • What initial relationships with potential innovation partners are being built?
  • What insight from reconnaissance is used to generate design principles?
  • How is progress towards developing a portfolio brief occurring, in terms of relevant problem spaces, positions, design principles and potential innovation actions?

We are currently involved in framing and testing information structures to feed into decision making. Watch this space for a future blog on this.

Developmental evaluation as a collaborative approach - what does it take to be a good Developmental Evaluator?

In DE, the evaluator is not an evaluation expert – the evaluation exercise is a team effort, wherein every member of the innovation team plays a role in the evaluation process. The evaluation team combines inside knowledge of the organisations involved with an outsider’s ability to bring in new knowledge and ask open questions.

In systems change initiatives, the roles of designers and evaluators are interconnected. We find ways to create and maintain a healthy innovation tension. Too many changes and too much uncertainty, and the team will get lost; too little change and you will be doing what has already been done.

We have developed a number of roles we are currently testing: Info Gatherer, Sensemaker, DE Challenger and Critical Friend. Getting into these roles and the mindset they require involves learning and unlearning, and it fundamentally takes time. We need to remind ourselves: “You can’t produce a baby in one month by getting nine women pregnant”.

In particular, the role of the Critical Friend is a DE challenge. Being a Critical Friend can be frustrating and tiring. It is about finding a way to say things constructively, creating the confidence to be honest, and finding the energy to constantly challenge what is being developed (even what you yourself are doing! Don’t forget that it is easy and comfortable for all of us to fall into the moulds we know).

Perspectives moving forward

In the following months, we will continue to prototype, test and iterate DE processes, tools and approaches. In our DE roles we will continue to reflect and learn:

  • What added value does the evaluation bring? Using DE requires a specific mindset and understanding what the evaluation brings to the innovation process. In our experience, the added value lies in holding the process together and asking questions that help guide challenge-owners in their decision making.
  • How to embed evaluation and make it sustainable in the longer term? Understanding culture and identity is key: what are the elements that will activate the change in each context and what might be the blockers? ALC calls this cultural element the K factor, a key element to explore in order to activate systemic change.

We are living the process, and we will share the experiences and lessons learned in the following blog posts. Watch this space.

Written by: Mikkel Nedergaard <mikkel.nedergaard@climate-kic.org> and Ione Ardaiz Osacar <ione@agirrecenter.eus>