Complex vs. Complicated Systems
Heidi Burgess and Guy Burgess
This video describes the difference between complicated systems and complex systems, and explains why that difference is extremely important when considering how best to respond to difficult or intractable conflict situations. In short, we advocate an "ecosystem-based approach" to intractable conflicts instead of the mechanical model that is often used to design conflict interventions.
- What do you see as the key challenges posed by complexity that we need to address?
- And, what do you see as promising strategies for meeting those challenges?
Years ago, when we started work on the Beyond Intractability system, we convened a series of conferences in which we brought together experts on various aspects of the problem to talk about what should be included in a comprehensive knowledge base on the subject. One of the attendees at these conferences was Wendell Jones, who taught me about complex adaptive systems. The notion here was that it is useful to divide the world into two fundamentally different kinds of systems, and that doing so goes a long way toward illuminating the problems posed by intractable conflict. This was a really transformative experience for me: we have literally spent the last 10 to 15 years trying to figure out how to essentially retool our approach to conflict problems in a way that focuses on complex adaptive systems.
Now, the key to Wendell's argument is a distinction between complicated systems and complex systems. Complicated systems can be very complicated - but they're deterministic. All of the components of a complicated system react to one another and to changing conditions in reliably predictable ways. The laws governing the system - and here we're talking about the laws of physics, electronics, chemistry - are stable, and they are applied the same way every time. The other key thing about complicated systems is they don't have a mind of their own; they don't have goals. They have properties, and they react in certain ways, but they don't have their own agenda that determines how they respond to conditions.
The way to think about complicated systems is to think of mechanical systems: things that people build, tools, and elaborate machines, as well as things that we describe using mechanical metaphors, like the chain of command. The chain of command is a way of applying to a social system the sort of deterministic approach that you would use to design a physical system.
And the contrast to that is something called complex systems, or complex adaptive systems. Here you have multitudes of independent actors, each seeking to advance their own self-interest based on their image of their environment, often through ambiguous decision processes. Such systems are not designed; they evolve. And there are no central control points. They of course exist in the context of complicated physical systems - the planet Earth, with its lithosphere, hydrosphere, and atmosphere - but fundamentally, they're driven by plants and especially animals that sense their environment, have their own set of priorities, and decide how they're going to react to that environment.
So for complex systems, you want to think of organic and social systems, and similar metaphors. When you're trying to describe such a system, you'll do much better if you use organic metaphors. Another way to think about the distinction is to imagine two kinds of a super-duper, ultra pool game. With a complicated system, you have one player who's trying to set up the perfect shot. If he gets everything lined up just right, he'll hit the ball, and all the other balls will bounce in just the right places, so the end result will be whatever he wants it to be. And often, when we talk about policymakers just needing the right strategy - if only they had the right strategy, everything would work out - the implicit assumption is that the conflict is a complicated system, and that it is therefore possible to have the right strategy. In reality, what you have is a complex system, which is a different kind of pool game. What you've got is not one player, but a multitude of folks playing simultaneously, each trying to get the balls arranged in a different way for their own purposes. And the laws of Calvinball apply instead - that is to say, the rules keep changing, and that's part of what makes the game so unpredictable.
So, once you start recognizing that conflict systems look like this, you've got a series of very daunting challenges to deal with. The first, of course, is social complexity. You've got all of these independent actors, all pursuing their own agendas and their own self-interest, often using very flawed decision-making rules. How you reconcile all of that gets very difficult.
It's even more difficult because there often are no win-win solutions. What people want can be fundamentally incompatible. There is often no zone of possible agreement - that is, no overlap between what the two parties want - because the things they want are inherently competitive.
The other thing that you'll have is ruthless, Machiavellian actors: folks who really seek power over others and are willing to engage in a winner-take-all competition that can be pretty brutal and sometimes deadly. So, you need a system that can deal with these folks as well.
There's also complexity associated with the ways in which people think. On one level you have social complexity, but the human brain is also a complex organ that has all sorts of nonrational and irrational ways of thinking about things. Because of this, it's not enough to just come up with a solution that will make sense based on rational cost-benefit calculations.
The world is also so complex that we can't even begin to design actions that will get us where we want to go without relying on experts. And there are a whole set of reasons why it's very hard to rely on experts or find ones that are trustworthy.
But it's more than that: there's also the sheer scale of the problem. And this is what is utterly daunting. I was once on a panel with a physicist who was involved in the Manhattan Project. He was telling us that one of the real privileges of being a physicist is to understand, deep down inside, what's meant by orders of magnitude, that is, factors of 10. He went on to explain that the Hiroshima bomb was four orders of magnitude more powerful than a conventional TNT bomb, the so-called blockbuster. Or, to envision it another way, walking around your neighborhood at roughly 1.7 miles an hour is four orders of magnitude slower than buzzing around the planet on the International Space Station. Now, the thing that's really daunting is that the difference between your standard mediation triad - party A, party B, and a mediator - and even a moderate-sized conflict, say, Israel and Palestine, is roughly seven orders of magnitude. That is a gigantic quantitative difference, and any quantitative difference that big is a qualitative difference. We need strategies that can operate on that scale.
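As a quick check on this arithmetic, the gap between two quantities in orders of magnitude is just the base-10 logarithm of their ratio. Here is a minimal sketch in Python; the bomb yields and speeds come from the examples above, and the population figure for Israel and Palestine is our own rough assumption of about 13 million people:

```python
import math

def orders_of_magnitude(small: float, large: float) -> float:
    """How many factors of 10 separate two quantities."""
    return math.log10(large / small)

# Hiroshima bomb (~15 kilotons of TNT) vs. a "blockbuster" (~1 ton of TNT):
print(round(orders_of_magnitude(1, 15_000), 1))        # prints 4.2

# Walking (~1.7 mph) vs. the ISS orbiting at ~17,000 mph:
print(round(orders_of_magnitude(1.7, 17_000), 1))      # prints 4.0

# A 3-person mediation triad vs. ~13 million people in Israel and Palestine:
print(round(orders_of_magnitude(3, 13_000_000), 1))    # prints 6.6, roughly 7
```

The last line is the point: scaling up an intervention by a factor of several million is not more of the same; it is a qualitatively different problem.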
There is also chaos: you take simple, predictable systems and add them together, and you don't have to add very many before you get something that seems totally chaotic and unpredictable. In Boulder, at the National Center for Atmospheric Research, they have a display in the lobby that explains why they can't predict the weather. You take the most predictable of simple machines, a pendulum, hang another pendulum from it, and set it swinging, and it will produce the most bizarre and apparently unpredictable behavior. The other thing about chaos, though, is that it's not totally unpredictable. It's unpredictable within limits. The way Kenneth Boulding used to describe it is that you need to be prepared to be surprised about the future. But then again, you don't have to be dumbfounded. There are things that you know that constrain the future in ways that are very useful.
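That bounded unpredictability can even be sketched numerically. The following is a minimal illustration using the logistic map, a textbook one-line chaotic system, as a stand-in for the double pendulum (which takes more machinery to simulate):

```python
# Sensitive dependence on initial conditions, shown with the logistic map
# x -> r * x * (1 - x), a standard one-dimensional chaotic system.
def logistic_trajectory(x0, r=4.0, steps=30):
    """Iterate the logistic map `steps` times starting from x0."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

a = logistic_trajectory(0.2)          # one starting point
b = logistic_trajectory(0.2000001)    # perturbed in the seventh decimal place

# The trajectories begin indistinguishable but quickly drift apart,
# yet both stay between 0 and 1. Unpredictable, but only within limits.
print(abs(a[1] - b[1]))    # still tiny after one step
print(abs(a[30] - b[30]))  # vastly larger than the initial perturbation
```

Both runs stay inside the unit interval no matter what: surprising, as Boulding says, but not dumbfounding.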
And then, there's the notion of influence limits. I sort of have this image, which I tried to draw here, that we all live in influence clouds, and our ability to influence others diminishes with distance from ourselves. This isn't just straight physical distance. It's distance in terms of personal contact, and in the world of the Internet, that may involve contacts on the other side of the planet. But still, we only influence a relatively small number of relatively immediate associates.
Society actually is this gigantic sea of influence clouds, and there's not really any clear place where anybody's got enough influence to push more than a little bit of it in whatever direction they want. So, you need a strategy for dealing with conflict that works in such a squishy environment.
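One way to make this "influence cloud" picture concrete is a toy model in which influence decays with social distance, meaning hops through a contact network. Everything here - the example network, the decay rate, and the exponential form - is a hypothetical illustration, not something from the talk:

```python
from collections import deque

def influence_cloud(contacts, source, decay=0.5):
    """Toy model: influence falls off exponentially with the number of
    hops separating someone from `source` in a contact network."""
    hops = {source: 0}
    queue = deque([source])
    while queue:                          # breadth-first search for distances
        person = queue.popleft()
        for other in contacts.get(person, []):
            if other not in hops:
                hops[other] = hops[person] + 1
                queue.append(other)
    return {person: decay ** d for person, d in hops.items()}

# A tiny hypothetical network: A knows B and C, B knows D, D knows E.
network = {"A": ["B", "C"], "B": ["A", "D"], "C": ["A"],
           "D": ["B", "E"], "E": ["D"]}
print(influence_cloud(network, "A"))
# A has full influence over itself (1.0), half over B and C,
# a quarter over D, and only 0.125 over E, three hops away.
```

Even in this toy, the source can push hard on immediate contacts but barely touches anyone a few hops out - which is exactly the squishiness described above.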
Now, part of what we want to do with the seminar is actually start thinking about how one does that. I don't think it's an impossible challenge, although it seems awfully daunting. Again, we'll go back to Kenneth Boulding, whom we'll quote a lot, and who in many ways is the inspiration for the whole approach that we're taking here. As he used to say, the greatest catastrophe to occur to the social sciences was the success of celestial mechanics. For millennia, humans had looked out at the heavens and couldn't figure them out. And then they figured out that a couple of simple equations would explain the movement of the solar system, and the planets, and everything, with astonishing accuracy. So, folks started looking for those same kinds of equations for social problems. That is like thinking about the social system as a complicated system driven by simple, deterministic rules, when in reality it is vastly squishier, and all of the rules of complexity apply.
So what Boulding advocated, and what we're going to spend a fair amount of time developing in conjunction with the seminars, is an ecosystem-based approach. This approach thinks about the way the world works not in terms of straight physical laws, but in terms of the evolution of social and biological systems and the eco-dynamics that govern how they emerge.
At this point, there are two questions that we'd like to raise.
- What do you see as the key challenges posed by complexity that we need to find better ways of addressing?
- What do you see as the most promising strategies for meeting those challenges?
- Slide 4: Steam turbine. By Siemens Pressebild. CC-BY-SA-3.0, via Wikimedia Commons. Oil gas station blueprints. Attribution: Joy Oil Co Ltd [Public domain], via Wikimedia Commons. Manhattan Project Organization Chart. Attribution: US Army (Defense's Nuclear Agency, pp. 4-5) [Public domain], via Wikimedia Commons.
- Slide 6: Birds, Copyright Guy Burgess; Crowd shot. By Al Jazeera English (Crowd shot). CC BY-SA 2.0, via Wikimedia Commons.
- Slide 7: Billiard balls. By Andrzej Barabasz (Chepry) (Own work). CC BY-SA 3.0, via Wikimedia Commons.
- Slide 8: Billiard balls. By Andrzej Barabasz (Chepry) (Own work). CC BY-SA 3.0, via Wikimedia Commons. "Calvinball" taken from: http://calvinandhobbes.wikia.com/wiki/Calvinball
- Slide 9: Ukraine crisis By Amakuha (Own work) CC BY-SA 3.0, via Wikimedia Commons. Gaza explosion: By Al Jazeera CC BY 3.0, via Wikimedia Commons. Conflict Map from page 134 Coleman. The Five Percent. NY: Public Affairs. 2011.
- Slide 10: Newbury Racecourse, crowd: By Barry Skeates from newbury, UK (NRC 11). CC BY 2.0, via Wikimedia Commons. European Parliament. CC BY-NC-ND 2.0 © European Union 2014 - European Parliament.
- Slide 14: StFX Physical Sciences Lab. By StFX (StFX) [CC0], via Wikimedia Commons.
- Slide 15: Badger explosion. Photo courtesy of National Nuclear Security Administration / Nevada Site Office [Public domain], via Wikimedia Commons. Atlantis taking off on STS-27. By NASA [Public domain], via Wikimedia Commons. Silhouettes by Gilbert Bages from the Noun Project. Permission: CC
- Slide 19: Heliosynchronous orbit. By Heliosynchronous_Orbit.png: Brandir XZise. CC BY-SA 3.0, via Wikimedia Commons.
Copyright © 2016 Guy Burgess and Heidi Burgess
Guy Burgess and Heidi Burgess, Co-Directors
UCB 580, University of Colorado, Boulder, CO 80309-0580, (303) 492-1635, firstname.lastname@example.org