
Exploring complexity and the implications for leadership and decision making in a changing world. In conversation with Marco Valente.

One of the nicer aspects of exploring, reading and learning about systems thinking and complexity is discovering interesting blogs, papers and books, and getting in touch with their authors to learn more about their points of view and their journeys into systems.

Some months ago, one of my Google searches brought me to a blog on complexity written in 2017 by Marco Valente. Marco wrote it as a way to take stock and organize his thinking around some of the key concepts of complexity and systems.

It really resonated with me. The amount of information and ‘must reads’ on systems and complexity is overwhelming, and it can be quite a challenge to organize that information and draw out the implications (the ‘so what’) for the research and advisory work I do in international development.

Exploring some of the implications of complexity, Marco Valente

I was curious, and reached out to Marco to hear from him about his journey into complexity and systems thinking: what is he learning, and how does he make sense of it all? Marco has been living in Malmö for the past five years and is a team member with Cultivating Leadership. He works as a facilitator of multi-stakeholder and leadership dialogues with a diverse range of partners, and brings complexity thinking into these spaces of sharing and exchange.

Marco, thank you for making time for this conversation. When did you start to be interested in complexity and systems, and why? 

It’s been a long and non-linear journey. I have always had a deep curiosity about understanding the world. I studied social science at the University of Salerno, and the courses I took were meant to provide the tools for understanding the world – so as to be able to act in it for the better. At that time I genuinely believed that approaches such as linear causality and multiple linear regression were good enough to understand the social systems in the world. My interest in understanding “what makes societies better” continued to expand the more I learned. So, I went on to study sustainability science at BTH University, Sweden, in the Master’s in Strategic Leadership towards Sustainability (MSLS). During those studies I took a deep dive into ‘systems thinking’. I learned about causal loop diagrams and saw for the first time non-linear causality, tipping points, feedback loops, and so on. The main insight for me was that there was a different way to look at problems: connecting the dots, seeing the whole system, and looking at the root causes through a different lens. When I think today about those studies, I realize that I was taught a very specific approach to systems thinking: in its mainstream applications it still aimed to figure out the world with exactitude and accuracy, and to find the right answers. A few years later, I came across the work of Dave Snowden and Brenda Zimmerman, which made me reconsider almost everything I thought I knew about systems.

What caught your attention from their work and why did it change your perspective on systems thinking?

An assumption among some of us was that systems archetypes could help us understand the challenges at hand. Reading their work made me realize that the view I had was too simplistic: a system representation is always a rough approximation. Systems cannot be fully mapped or visualized; they are just too complex. A visualization through causal loops and lines between elements of the system is useful, but it is not the same as understanding a system.

Getting to Maybe by Brenda Zimmerman et al., and Dave Snowden’s foundational Cynefin articles opened my eyes to the shortcomings of a narrow perspective on systems thinking that was part of the academic courses I had taken. Zimmerman’s book highlighted how our attempts to create positive social change could not rely on a mechanistic worldview and made me realize that some of the systems change theories I had studied and researched were often rooted in a mechanistic paradigm. Snowden’s Cynefin made clear to me the fundamental difference between the predictable and the unpredictable in the world, and that we need appropriate lenses depending on the nature of the problem we are looking at. All in all, reading Zimmerman and Snowden made me realize how infinitely complex, messy and unpredictable the world is, and made me continue my search for ideas, concepts and theories about complex adaptive systems. 
 

The Cynefin framework

I am new to the field of systems and complexity. I feel that there is something right in systems and complexity thinking, but I also see that using words such as systems, systemic and complexity in my research work can put colleagues off. One colleague told me recently that the ideas and methods around systems thinking for researching education problems in low- and middle-income countries are interesting but also abstract, and that it is not always clear how to use them in a policy research project. What I struggle with is describing in words what my instinct tells me: that a systems lens on social and development problems leads to much richer insights than a reductionist analysis of a problem separated from the systems and relationships that cause it.

I see what you mean. It’s a dance. On one hand we want to be rigorous and to make sure we use precise language. At the same time, we don’t want to scare people away with too much technicality, because we are both experts and novices at seeing complexity (this is one of the many paradoxes that I find in complexity). 

Thankfully, people around me see that the ‘usual’ ways of analysing problems have shortcomings and struggle with the rapid acceleration and increasing complexity of modern life. Probably I am lucky, because in my work most of my clients readily admit that the world is messy, unpredictable and non-linear, and we take as a starting point that we need very new approaches.

How do you think we should look at the problems in today’s world to try to solve them? What new capabilities and lenses are required? 

To be fair, it is easier to describe the shortcomings of the traditional methods and approaches used to analyse social systems than to come up with new approaches and methods. Complexity science is relatively new, and the praxes that many people are trying out are newer still. I am very inspired by so much good work happening today, but I also realize that we are in new territory.

First, it’s important to understand when a problem is predictable and when it’s not. In my trainings and with clients I use some questions to help with this. For example: does the system resemble a zoo or a jungle? Is it more like a steamboat in calm waters, or like a kayak going through rapids? Is it a game of chess, or more like the game of life?

The biggest challenge I see in the people I work with during my trainings is to stop applying a mechanistic logic to a problem that is complex, and to accept that the logic they have been taught – of a predictable and stable world – does not apply to complex and unpredictable systems. Sonja Blignaut put it perfectly: it’s like trying to survive in a jungle when all you know is the management book written for a zoo.

Let me give you a couple of examples. A complexity-informed worldview tells us that solutions to a specific problem are context-dependent. And yet most research grant applications or development project proposals ask applicants to state something along the lines of, “How could this solution scale in different contexts?” I appreciate the intent, which is to rapidly scale up solutions that seem to work, but we know that complex behaviours cannot simply be scaled and adopted elsewhere! It is possible to standardize and ‘scale’ the manufacturing of ready-made bungalows because that is not a complex problem. But it is not possible to scale the fabric of cultural, social and economic relationships that form a system.

A second example. All EU funding applications must state the specific outcomes of the project. It might be possible to project outcomes for simple to complicated problems. But projects that address complex problems cannot state a priori, and in detail, what the outcomes will be. I think that funders know this; their operational and decision-making systems for the most part prefer to think along straight input–output–outcome lines, and by and large we play the game to access the resources we need for our research and project work.

What you are saying about funders reminds me of the many governments in high-, low- and middle-income countries that put a lot of effort and resources into producing five-year development plans describing in great detail what the complex systems of ministries, line agencies, laws, policies and people (in other words, a government, a society) will do – all neatly measured in terms of indicators, milestones and timelines. It seems to me that the messiness of a society or a government system and its development challenges does not fit into this logic.

The five-year plans! Let me clarify some things here. If we believe in human agency and free will (I do! Do you?), a vision of the future can be a powerful force of individual and collective inspiration. A vision becomes problematic when it is confused with a plan, with clearly identified steps. This tends to happen when a vision becomes set in stone and closes us off from the constant changes that emerge in the contexts of which we are part. A locked vision leads immediately to a locked strategy defined by predetermined goals – and research strongly suggests that this can backfire. It is almost as if the locked vision, strategy and plans are taken for the territory, but they are not.

So, the issue is not so much designing a vision of the future, or a strategy and plans, but how we use them once we accept the principles and uncertainty that are built into complexity thinking. Alfred Korzybski remarked that “the map is not the territory”: the abstraction derived from something is not the thing itself. A vision and a strategy are abstractions, simplifications of reality and intent – the map, not the territory.

That is another paradox of complexity: you need to plan for an uncertain world. One of the books that best highlights this for me is Simple Habits for Complex Times, by Jennifer Garvey Berger and Keith Johnston (full disclosure: Jennifer is the CEO of the organization that I work with). In their book, they describe a vision as a sense of destination, with some guardrails and boundaries around it. Imagine two scenarios. You need to go to the bookstore from your house: you place a pin on the GPS map and it shows with precision the shortest route and where to turn. That is a destination and a path to get there in a predictable world. Now imagine you are in a forest and need to head north before dark, without instruments. With a minimum of orientation, you know the angle you need in relation to the sun. Without charted maps, you have to make moment-by-moment choices about where to cross a small river and which places to avoid, but you still have a clear sense of where you are heading. That is a lot more like a vision as direction and boundaries, and less like a 10-step pathway to success with each step clearly pre-determined.

In your work you design and facilitate dialogues for people to make progress on complex challenges. You also work with leadership capability development. What do complexity and systems thinking mean for leadership and decision making?

Everything. It means that we need to critically examine some of our approaches, and rethink the role of leaders, the place for expertise, the assumptions about rational decisions, and much more. Let’s touch on a few points. 

The role of leaders in complexity, as I see it, is increasingly about navigating paradoxes and polarities. They need to create (not set in stone) visions for an uncertain world. They need to look at big trends and statistical data and yet be open to granular data from anywhere. They need to navigate between the technical challenges – products, supply chains, technologies – and the adaptive, human challenges of how teams can be innovative together, and of how people respond differently to ambiguity (some love it, and probably more hate it).

Decision making in complexity needs to let go of assumptions of deterministic science and predictability. As a facilitator, I have come to see the power of tapping into the larger wisdom of groups when appropriate. For making sense of a system, we can gather statistical, big-picture data while adding rich, nuanced, local pictures of what is going on at the more granular level. For decision making we do something similar: leaders create boundaries and outline some minimum specifications. In this way we honour local, tacit knowledge in distributed decisions, while also enriching our understanding of patterns and weak signals from anywhere in the organization. I think, for example, of international NGOs focused on poverty eradication and inequality, and how they could embrace complexity. Leaders will probably need to place greater emphasis on deliberate experimentation with small, controlled actions before scaling up anything in a system that we do not understand yet. But for such experiments we will need a culture of sharing and experimenting in complexity, and for that I bet you will need to educate the donors and your other funders as well!
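To make Marco’s point about small experiments concrete for myself, I later sketched the logic as a toy simulation in Python. It is a minimal sketch with invented numbers (my own illustration, not Cultivating Leadership’s method, nor anyone’s actual practice): run several small, cheap probes in parallel, observe only a noisy signal from each, then amplify the promising ones and dampen the rest before scaling anything.

```python
import random

# A toy sketch with made-up numbers, not anyone's actual method:
# run several small probes in parallel, observe a noisy signal from each,
# then amplify the promising ones and dampen the rest before scaling.

def run_probe(true_effect, rng, noise=1.0):
    """One small experiment; we only ever see a noisy signal of its effect."""
    return true_effect + rng.gauss(0, noise)

rng = random.Random(7)
true_effects = [rng.uniform(-1, 2) for _ in range(5)]  # unknown to the team
budgets = [1.0] * len(true_effects)  # minimum specification: small, equal bets

signals = [run_probe(effect, rng) for effect in true_effects]
# Amplify what shows promise, dampen what does not - no big bet on one plan.
next_budgets = [b * 2.0 if s > 0.5 else b * 0.2 for s, b in zip(signals, budgets)]

for i, (s, b) in enumerate(zip(signals, next_budgets)):
    print(f"probe {i}: observed signal {s:+.2f} -> next-round budget {b:.1f}")
```

Even this crude sketch shows the shape of the practice: no single probe is trusted as “the” answer; the portfolio of probes, read together, is what informs the next round of bets.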

Marco Valente, thank you very much.

Photo credit: delfi de la Rua on Unsplash


The 72.000.000 question

The US presidential election, our digital future, and a question for the friends at ODI’s Digital Societies programme

A week has passed since Joe Biden and Kamala Harris won the US presidential election. Donald Trump has not yet conceded and has launched a flurry of probably ineffectual court cases. Votes are being recounted in Georgia but they are not going to change the outcome.

I have listened to many commentaries since the election: the NYT’s The Daily, the Guardian’s Politics Weekly and Today in Focus, Talking Politics with David Runciman. They all agree that support for Donald Trump has not weakened since 2016; it has even increased. On the other hand, a huge number of Democrats cast their vote for Biden and Harris, which tipped the election in their favour. That shows in the popular vote too, where Biden received ca. 77.000.000 votes vs. 72.000.000 for Trump.

During the last few days I have found myself thinking about those 72.000.000 people who voted for Trump. A large part of those voters are older, white, non-college educated, and living in smaller towns and rural areas. They are angry. They believe that the economic and political system has let them down. Their factories have closed. Their jobs have gone overseas or have been taken by immigrants. Trump listens to them and speaks to their anger and fears.

The economy of today (and tomorrow) is marked by an increase in productivity and a decrease in the demand for labour. This trend is pushed by digital technologies and automation and will only accelerate during this century. So, what will happen to those 72.000.000? Early retirement for tens of millions? Not likely. Tens of millions of bullshit jobs, in the words of the late Prof. David Graeber? Probably.

If this is the trend, are we going to see more populist and right-wing leaders voted into office as hundreds of millions of citizens around the world are left stranded by the digital revolution?

I realise this is quite a bleak view of the future, but that number, the 72.000.000, left me with questions about our democratic future which are not simple to answer.

So, I turn to the friends at ODI’s Digital Societies programme: what do you think?

Photo credit: Trump rally in Wildwood NJ by David Todd McCarty on Unsplash.


‘Make it Happen’ in Development: Sensemaking and Portfolios of Options in the Governance Space. In conversation with Emilia Lischke

The design and implementation of development initiatives in support of policy and governance reforms often contends with systems of interlinked problems and non-linear and unpredictable change processes (Harry Jones 2011). During the last decade or so, there has been recognition that the development sector needs to move away from linear results-based logic and adopt more politically aware, experimental and adaptive approaches to address complex governance challenges. 

But what does it mean in practice?

Duncan Green has written that most of the contributions around adaptive programming come from ‘academics and think tankers’. They often offer useful principles but do not always suggest concrete steps to the people who design and implement programmes and projects. 

In January 2020, I participated in a portfolio sensemaking workshop organised in Helsinki by the Finnish Innovation Fund (SITRA). Gina Belle from the Chôra Foundation was one of the presenters. She introduced the foundation’s work on designing portfolios of strategic options and on portfolio sensemaking.

To me, that presentation opened the door to a method and process that, on the one hand, spoke the language of systems and the complexity of development issues, and on the other, provided a concrete way to focus, learn, reflect and inform strategic decisions and the adaptation of a programme or project.

So, I started to read more about it and also got in touch with Chôra to learn more about their work. Today I am speaking with Emilia Lischke, who is a director at the Chôra Foundation, to hear from her about the origin of the method and also her experiences introducing it to the United Nations Development Programme (UNDP) in Malawi.

Exploring social systems transformation, Emilia Lischke

Emilia, when did you come across portfolios and sensemaking and why did you find these concepts and ideas interesting?

I came across Chôra’s Portfolio Approach in 2015 at an Australian bank and insurance company called Suncorp. I was on a year-long Fintech fellowship at that time, exploring how customer-focused disruptions in financial services create new opportunity spaces to rethink the relationship between financial services and corporate responsibility. For three months at Suncorp, I joined the Risk and Innovation Division, led by Kirsten Dunlop. This is when I first met Gina Belle and Luca Gatti and became acquainted with their deep understanding and practice of designing and managing Strategic Portfolios. Since then, we have spread the idea and application of a strategic portfolio approach to the non-profit and development space, co-designing and managing a variety of Portfolios around cities, trust, governance and tourism, to mention a few.

How do you define a portfolio? What is sensemaking? And what happens when we bring them together?

Portfolios remind us that there is no single magic bullet for solving problems; it is the combination of trying out many things and approaches, in a way that enables learning from experience and adapting. No one knows what can and will ultimately catch a system’s attention or shift people’s thinking and behaviour. Portfolios give us a range of experiences and, literally, options to choose from when deciding on the next step. What makes Chôra’s approach to portfolios unique is that we link the value of portfolios to strategic decision making, adaptation and commitment to action. Here sensemaking plays a key role – it is the process by which we make sense of the experience of evolving portfolios and build them up into genuine solution-discovery systems.

You are testing the method with the UNDP country office in Malawi. Can you tell me more about this?

We have been working with the UNDP country office in Malawi since 2019. We started with Sensemaking to help them strategically reposition their governance portfolio. In the course of that we designed new interventions that are being implemented with the government of the country, and we are continuing with Sensemaking as a method to dynamically manage their Portfolio. What we have been observing over time, having just recently completed our third sensemaking experience, is how learning and meaning are huge catalysers in human systems. Sensemaking unsettles some of our most hard-wired limitations on agency, hierarchy and voice in the corporate and organisational world. It unlocks potential in teams and individuals and creates new impetus for collaboration, change, bold thoughtfulness and exploratory endeavours even in the least likely places – and I think it is exactly these least likely places that need it the most.

A key concept underpinning the Chôra Foundation’s Portfolio Sensemaking method is that when we address complex social and policy challenges, we have to have freedom to experiment, and within it, fail. That seems a difficult thing to do or suggest to development organisations aligned with results-based logic.

Under controlled lab conditions I can see how this notion of failure and experimentation can lead to incremental problem solving and eventually great discovery. The scientist designs and performs an experiment and measures its effects. If unsatisfied, she can always stop, return to the starting point, and start again with the intent of making a deliberate small adjustment in her setup. But the more interesting question is whether that notion of experimentation and failure is useful for those doing the hard job of addressing complex and evolving social and policy challenges out in the world. They cannot go back in time and repeat the same intervention, nor can they control the environment to single out the thing that worked or did not. Such is the nature of policy and social challenges. They need an operating model built on principles of adaptation, dynamic exploration and constant reframing. There is no end point and therefore no failing or winning. Every action will have some sort of effect: positive and negative, intended and unintended. It is not the effects per se, but how we build on them and what we make of them, that will make a difference. This is where experimentation falls short of the messy and interesting complexity of time and changing contexts.

I was recently in a conversation with Nora Bateson. She made a point that made me think quite a bit. Problems do not appear out of thin air. They are the result of the behaviour of a system. Thus, I think that in order to address a problem, one has to find a way to influence and change the way the systems behave. How do you support development initiatives to shift the focus from trying to address problems head-on to influencing how systems behave? 

Imagine a darts game where a problem and the solution are clearly defined. The player has several shots, and the result is objectively observable, measurable and reliable – here success and failure are clearly defined. If you see the challenges and interventions in the development world in this way, I see how one can entertain the idea of experimentation and failure. However, others would argue that society and life are complex adaptive systems. Constantly changing, they do not follow a predetermined principle of points increasing towards the centre of a board. On the contrary, their centre shifts and jumps, and areas and boundaries move, fade and reappear somewhere else. And if you start to look at it this way, then you realise that no matter how many darts you have, they won’t help you discover where that fleeting central point went or where it might appear next. You literally need to play the game differently: not shooting at it but becoming part of it. You play and dance with it, immerse yourself in it and learn to sense your way into its behaviour, its needs and possibilities, and derive from there a deeply informed sense of action and resolution.
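Emilia’s image of a shifting centre stayed with me, so afterwards I tried to capture it in a toy simulation (my own sketch in Python, with made-up numbers – not Chôra’s method): a player who locks in an aim once drifts ever further from a moving target, while one who keeps re-aiming at the latest observation stays close.

```python
import random

# A toy model of my own, not Chôra's method: the "centre of the board"
# drifts every round. One player fixes an aim once and never updates;
# the other re-aims at the last observed position each round.

def simulate(rounds=1000, drift=0.5, seed=42):
    rng = random.Random(seed)
    target = 0.0
    fixed_aim = target      # the locked-in plan: decided once, never revised
    last_seen = target      # the adaptive player's latest observation
    fixed_err = adaptive_err = 0.0
    for _ in range(rounds):
        target += rng.uniform(-drift, drift)  # the system keeps changing
        fixed_err += abs(target - fixed_aim)
        adaptive_err += abs(target - last_seen)
        last_seen = target                    # learn from what just happened
    return fixed_err / rounds, adaptive_err / rounds

fixed, adaptive = simulate()
print(f"average miss with a fixed plan:         {fixed:.2f}")
print(f"average miss when re-aiming each round: {adaptive:.2f}")
```

The numbers are invented, but the pattern is the point: against a drifting target, the fixed plan’s error grows without bound, while the re-aiming player’s error stays roughly the size of one round’s drift.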

Emilia Lischke, thank you very much.

Originally posted on Systems Change Finland blog

Photo credit: Alina Grubnyak on Unsplash


A shift in perception: exploring ways to work with complexity in international development

I really enjoyed going back this week to this 2016 paper by ODI’s Anne Buffardi: When Theory Meets Reality.

Anne Buffardi, exploring ways to work with complexity

I went back to Anne’s paper because I am doing some work on defining principles that can help design and set up monitoring, evaluation and learning (MEL) systems for development projects and programmes that tackle complex social and governance problems.

I have been quite involved over the last few years with the discussions in various communities: PDIA, Doing Development Differently, adaptive programming, and to some extent thinking and working politically.

There are a lot of discussions and conversations about principles that can help shift the way development initiatives are designed, managed and implemented (PDIA is a bit different: in addition to principles, it also proposes concrete tools to manage uncertainty and complexity). These principles speak to the shift of perception needed in the international development community: moving from a mindset that sees problems and possible solutions in isolation from the contexts and systems that enable them, to one that starts dancing with the systems in which they operate.

This shift is important and necessary, but translating these principles into practical actions and ways of working is difficult. Anne mentions some of these challenges in her Methods Lab paper.

I face the same challenge in my work on problem-driven political economy and on MEL systems design. I am on board with the principles – I agree with and believe in them – but then struggle to apply them to, for example, coming up with concrete outcome indicators that speak to the complexity and uncertainty the project is trying to influence.

Anyhow, the journey continues, and the paper Anne wrote in 2016 has many insights that resonate with me. I am pasting below the key messages of Anne’s paper. Go online and download it. It is a great read.

  • Over the last 50 years, repeated calls for adaptive learning in development suggest two things: many practitioners are working in complex situations that may benefit from flexible approaches, and such approaches can be difficult to apply in practice.
  • Complexity thinking can offer useful recommendations on how to take advantage of distributed capacities, joint interpretation of problems and learning through experimentation in complex development programmes.
  • However, these recommendations rely on underlying assumptions about relationships, power and flexibility that may not hold true in practice, particularly for programmes operating in a risk averse, results-driven environment.
  • This paper poses guiding questions to assess the fit and feasibility of integrating complexity-informed practices into development programmes.

Here is an extract from the conclusions of the paper:

“Complexity thinking can offer useful insights on how to approach situations and challenges faced by development practitioners. However, these recommendations rely upon underlying assumptions about relationships, power and flexibility that may not always hold true in practice, where organisations have established internal and external ways of working. By posing guiding questions to assess the fit and feasibility of integrating complexity-informed practices, this paper aims to help development programmes identify which dimensions of complexity are most appropriate and realistic to address; and, how they can work within their operating constraints to take advantage of distributed capacities, joint interpretation of problems and experiential learning.

Classifying which practices are more and less feasible enables decision-making and management styles to be explicit, rather than couched in aspirational rhetoric about sharing and collaboration, learning and adaptation, which is not realistic in practice. If undertaken before programme implementation, it can help identify decision-making and communication mechanisms that can facilitate interaction among dispersed actors, create incentive structures that reward learning as well as delivery, and allocate sufficient time and resources for data collection, analysis and interpretation to enable monitoring to inform decision-making.”

Photo credit: Unsplash


From Cold Data to Warm Data

Warm Data are contextual and relational information about complex systems. In other words, warm data involve transcontextual information about the interrelationships that integrate a complex system, as well as interwoven complex systems.
Warm Data Lab

I wrote to a colleague yesterday. During a meeting he had asked about a research paper he had written with others. He was rightly worried about making general claims without making explicit the limitations of the work.

His question stayed with me. Yesterday evening I spoke about it with my wife at the end of our dinner. About six months ago, she participated in one of the Warm Data Lab trainings organised by Nora Bateson, and the points she made during our conversation were inspired by that training.

Every piece of research has limitations. Every piece of research is like a sketch of a tiny part of the complex reality in which we live. As a sketch, it can only give a glimpse into the reality and systems that we observe and live in. These sketches help draw (or begin to draw) maps of reality, but, as Nora Bateson says, maps are not the territory.

Research findings are just data and information; they are not knowledge. As such they are cold data. To gain meaning and influence systems, they need to become warm data. To become warm data they need to come alive as part of conversations with the people who are affected by the problems the research has identified. They have to become almost like the fuel of dialogues and of new links and relationships between individuals. Warm data help surface the hidden knowledge of the people affected by the problems we research. Warm data can change systems because they become part of the web of co-development of those systems, and part of our collective ability to perceive, discuss, and try to address complex issues.

The journey continues.

Photo credit:  Sveta Fedarava on Unsplash


Three questions about MEL for projects dealing with complexity

Last week I wrote to the Peregrine Discussion Group on Better Evaluation with a couple of questions on the M&E or MEL of initiatives that try to address complex social and development problems. I received many very interesting replies, which I am posting below together with my questions.

My questions

1) What do you think are the key principles that we need to have in front of us when designing MEL systems for complex and adaptive (and uncertain) projects and programmes?

2) Do we need a different MEL system and tools?

3) What do you suggest I should read on this topic and question?

The answers

Russell Gasser

1) What do you think are the key principles that we need to have in front of us when designing MEL systems for complex and adaptive (and uncertain) projects and programmes?

Complexity is not cleanly separated from, and unrelated to, the causal and chaotic domains. The boundaries are not clear lines much of the time, and trying to put clean separators between what is complex and what is not is itself a causal/linear approach to a complex issue. Trying to define a fixed set of MEL principles for the complex domain is a causal/linear approach that is necessarily doomed to failure, both because (a) it assumes that it is possible to predict what will be needed, and (b) it is based on an assumption that a finite range of methods can cover every case. Both of these assumptions are false in the complex domain – the failure of these assumptions is one way to define what complexity actually is, and to separate it from “complicated”, “difficult to understand” and “nothing like what I have experienced before”. Neither of the first two is complex (other than in the everyday conversational use of “complex”), and the third may be complex but actually describes the previous experience of the person, not the complexity of the situation.

Causality is a one-way street in the complex domain – it can only be seen with hindsight. If something is truly “complex”, then measurement of predicted results can only really measure the degree of luck or skill in guessing the outcome, not the value or success of the intervention.

Perhaps the most important question to answer for the real world is: how can we create the link from funders who are locked into the linear/causal model to funding and using evaluation that is based on complexity? Until you get complex-domain evaluation (or MEL) funded, and until there is the possibility that organizations change what they are doing as a result of evaluation in the complex domain, it remains academic research that lacks clear application. Nothing wrong with academic research, but this forum isn’t likely to be the best place to address it.

2) Do we need a different MEL system and tools?

MEL is not a single process or system. The toolkit is already huge. Some of the processes and tools are suitable for use in the complex domain, some are not. Answering this question has the same issues as mentioned above: (a) it assumes that it is possible to predict what will be needed, and (b) it is based on an assumption that a finite range of methods can cover every case. A better approach might be to ask: “What do we need to know and understand in order to be able to decide which tools and methods are likely to be useful in the specific complex intervention we are looking at?” The specific nature of the complex domain means that we can only really consider one intervention at a time.

A really important feature of complexity is that solutions are NOT transferable. The MEL that works on one complex intervention is unlikely to be useful without changes for other complex interventions (that is the nature of complexity), and we can be certain that it will not be useful for all possible interventions in the complex domain.

Monitoring is about the measurement of what is happening. There may be no reason to significantly change this from previous practice, beyond being far more obsessive about knowing what is going on right now and being able to respond quickly to build on success and damp down failure. If monitoring is not providing useful information in the causal domain, then that won’t change when moving to the complex domain.

Evaluation in the complex domain can’t ask “did we achieve what we expected to achieve?”, as that is largely meaningless. But evaluation can still be any one of a range of approaches, and parts of the OECD DAC criteria are still useful (e.g. “unforeseen consequences” in impacts). Evaluation in the complex domain can usefully ask questions about what changes resulted from what aspects of the intervention, what was cost-effective and what was a waste of resources, how the intervention approach could be improved, etc. Techniques like Outcome Harvesting and various storytelling techniques that are capable of identifying what really made a difference – what changed as a result of an intervention, and for whom – may be able to capture outcomes and impacts if scope and focus are correctly set.

Learning is still much the same. It involves many of the same core skills for people to appropriate new information, internalize analysis and consequences, reinforce to embed the knowledge, and be willing to change beliefs and practices in the light of new information and experience. If a causal/linear system is not supporting learning and producing new insights and approaches within an organization, then transferring it to the complex domain won’t change the lack of learning. Selecting who might learn what, and for what purpose, needs even more detailed attention as complexity increases. Politics and social interaction are examples of interactions that are mostly in the complex domain; insight into “learning” in political and social contexts may provide useful guidance.

3) What do you suggest I should read on this topic and question?

Dave Snowden’s Cynefin Framework publications and videos, and the Cognitive Edge website. If you are interested in the theory behind the Cynefin view of complexity, then Max Boisot (a tough read…). Also:

  • Jonny Morell’s blog
  • John Mayne, specifically on Attribution by Contribution
  • The Outcome Harvesting website and blogs
  • The Better Evaluation website on “What is evaluation”

Any web search engine will turn up plenty of other content on complexity as you search for these, and will probably lead you to other useful sources – if you find anything really useful then maybe you could share it on Peregrine?


Silva Ferretti 

Current RBM approaches are totally not fit for purpose. They allow very little space to understand the meaning of results within context and the processes through which they are achieved. Unfortunately, changing approach requires rewiring our way of thinking: from an idea of linear change (if I do A then B happens) and control (I plan, I achieve) towards an idea of complexity (many diverse forces drive change, and they can play out differently) and adaptiveness (we have a purpose, we navigate options).

It also requires us to put our principles first, and to understand and integrate different worldviews. 

As some students in a training course told me: “We really had to stretch and twist our thinking. We feel we, ourselves, changed.”

If done properly, principled M&E exploring complex and adaptive systems is not only about setting a different way to collect data or process them. 

It is about asking the whole organization to think and work differently in relation to change. To decolonize our thinking. 

And this is the real challenge. 


Bruce Boyes

Hi Arnaldo,

Following on from Russell’s good advice, I would suggest exploring iterative impact-oriented monitoring, as recommended in one of the Overseas Development Institute (ODI) complexity publications, see https://realkm.com/2020/08/17/taking-responsibility-for-complexity-section-3-2-2-iterative-impact-oriented-monitoring/

Because of the uncertainty and unpredictability that occur in complex environments, continuous and iterative M&E, with the related adaptive-management readjustment of actions aimed at achieving the original impacts, is much more effective than hoping for the best and then doing MEL at the end. For an example of where I’ve successfully used this iterative MEL approach see https://realkm.com/2016/03/03/case-study-an-agile-approach-to-program-management/

For an analysis of RBM in the context of complexity see Box 2 in https://realkm.com/2019/11/11/managing-in-the-face-of-complexity-part-3-1-tailoring-management-approaches-to-complex-situations/


Thomas Winderl

I agree: RBM hasn’t turned out to be the revolution in adaptive management that it was supposed to be. But my feeling is: it’s not the idea that is flawed, but the implementation. In most cases, RBM turned into a fill-out-the-boxes, “do the same but give it a different name”, increasingly bureaucratic exercise.

That’s a shame. I think there IS value to the basic idea of RBM. My proposal to ‘rescue’ RBM: Follow five simple rules. These are:

1. A relentless focus on outcomes, outcomes, and outcomes

2. Keep adapting what you do based on what you learned from outcome monitoring

3. Involve stakeholders at every step to create quick feedback loops

4. Budget for outcomes and outputs, not for activities

5. And overall: keep RBM simple, but not simplistic

I wrote a longer post about this a few months ago. If you want to see a more detailed argument, check out http://results-lab.com/five-rules-for-results-based-management.


John Hoven

Bruce, have you investigated qualitative methods for iterative, evidence-based investigation of cause-and-effect in a one-of-a-kind situation? The dominant method, process tracing, is used almost exclusively for backward-looking causal inference. However, it is easily adapted to forward-looking design and implementation of a context-specific development / peacebuilding project. Iterative updating of the causal theory proceeds at the pace of learning (every week or two, not every 6 or 12 months.) See https://www.researchgate.net/publication/324507012_Adaptive_Theories_of_Change_for_Peacebuilding.


Alix Tiernan

Hi Arnaldo

This is one of our favourite questions 😊.

Here are some practical thoughts on your three questions:

1) Key principles

a) Design the system around processes that repeat regularly and often, and expect to review project documentation at least twice a year, allowing for new strategies, approaches, and activities to creep in, and other originally planned ones to fall out – that’s the essence of adaptation

b) Try to avoid using targets if you possibly can – or if you can’t, then try to use the targets as a comparison to where you are at but NOT as a performance management tool

c) Find creative ways of triangulating and validating your data to ensure that you are taking account of all the different and changing variables and voices (i.e. complement surveys with research studies, focus group discussions in communities with meetings with ministers, etc.)

2) Different MEL system and tools

a) you can adapt results frameworks to an adaptive approach but you need to identify some very good qualitative data collection mechanisms that allow you to take into account the effect of the complex context on your project/programme (we have been using Outcome Harvesting for that)

b) make sure you don’t use targets as a performance management tool, because not hitting your target could be the best possible indicator that you are being adaptive!

c) we like using theories of change instead of logframes or results frameworks, because they don’t have to be linear and don’t prescribe specific outcomes – whatever tool you use, make sure you have a way of picking up unexpected results of the work that relate to the expected outcomes, and that you analyse why they happened. Ideally that analysis would then lead to adaptation of further implementation, in an iterative manner.

3) What to read

I will just point you to this document: https://www.odi.org/publications/11191-learning-make-difference-christian-aid-ireland-s-adaptive-programme-management-governance-gender, which we co-authored, but there are many more from ODI exploring adaptive programming, if you google it (it’s sometimes called adaptive programming, sometimes adaptive management). 

Hope these (rather hands on) thoughts are helpful.


Bob Williams

I’m sorry if I appear to be trolling everyone on this list. Maybe I’ve been around too long and have seen some of the histories and backstories of the ideas being discussed here. My understanding is that adaptive management was developed and adopted because of the widespread failure of RBM, rather than RBM being part of the adaptive management story.

RBM has been criticised as a really bad idea, both in theory and practice, at least since the mid-1990s. I recall a hugely negative evaluation of the use of RBM in the UN agencies written at least 15 years ago. It concluded that RBM was a very bad idea because it wasn’t sufficiently adaptive for the kinds of circumstances that UN agencies face. The formal response from the UN was that RBM was a problem of implementation, not of the basic concept, without giving a single piece of evidence. Hence my dismay to find myself working with a very large UN agency last year that was just introducing RBM at enormous cost.

None of which necessarily undermines the advice given by Thomas.  Just that a bit of history (or at least my version of it) might help a little.


Leslie Fox

… and yet, here we are with most of the major multi- and bilateral donor agencies, not to mention a good number of international and local CSOs – those who pay our consultant fees – very much committed to an RBM approach. I’ve worked with most of these organizations, and while evaluations have required an RBM approach (achieving results vis-à-vis measuring change in indicators), none of them has been limited to a single methodology to arrive at a full picture of whether change has taken or is taking place, or why. I believe it’s called mixed methods.


Frank Page

Hi Arnaldo,

I am going to take a bit of a different tack than others who have answered this question more from the technical side. I am going to look at the institutional side. 

The simple answer I would give is to turn MEL systems into MIS – because, ultimately, M&E data/information is management information. 

For organizations to adapt quickly and effectively in complex environments, a number of organizational practices are required. These include (this list is not complete): combining authority and responsibility in each position, as far down the chain of command and in the right places in the organization as possible (in other words, and I like this term, all positions have the agency to do their job and improve their performance), and then giving people access to the information they need to make the best decisions.

In other words, all staff – from field workers, to project managers, to programme managers, to VPs and CEOs – should be monitoring and evaluating performance in relation to the vision and mission, and making adjustments themselves, as low down the ladder as possible. The M&E function of analysis (at least) needs to be integrated into their jobs, not placed in a separate department. Doing this also moves much of the learning function into the chain of command, and the parts of the learning function that don’t go into the chain of command could very well go under R&D.

The current practice of separating programme and MEL functions introduces many coordination problems, some serious, that are difficult to overcome.

Thus, when M&E becomes MIS, it focuses on collecting original data, pulling data from the organization and its stakeholders, pushing data and information to where they are needed, and allowing those in the chain of command to pull the data/information they need. Then, those who are responsible for actually achieving goals in a complex environment have the information to make changes and adapt quickly.

Photo credit: Tim Johnson on Unsplash