Tuesday, February 27, 2007

Kauffman's Rules, 1-7

by Sara

My apologies for the two days of quiet -- I've had one hell of a winter cold since Friday, and then completely lost my network connection all day Monday. But I'm starting to pull out of the cold, and the network problem seems to be fixed, so I'm back. (And Dave will be, too, come Thursday or so.)

One of the first things futures studies faculty try to pound into the little puddin' heads of budding futurists is that the world isn't built of separate pieces and parts, and history can't ever be reduced to a list of Great Men and Great Events. Rather, they tell us, the world is a vast interlocking matrix of complex systems -- and one of the biggest keys to cultivating good foresight lies not in examining the specific properties of each part, but in examining the relationships between the parts, and the way they function together as a whole to create a given situation. Look at it this way, and it becomes much easier to see what's working, what's breaking, what's likely to happen next, and what needs to change for a better outcome to occur.

This is the basic idea behind systems theory, which has been around in its modern form since the late 1950s, when Jay Forrester founded the System Dynamics Group at MIT (where, in the early 70s, a team of his proteges produced the landmark World3 computer model, and with it a best-selling book called Limits to Growth that was one of the earliest warnings of a global ecological crisis). Since then, a wide variety of disciplines have realized that systems thinking offers some singular tools for getting ahold of the complexities of our ever-more-chaotic universe, helping us focus on what actually matters, and distilling unwieldy problems down to their essence so that the right solutions can emerge.
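
To make that concrete: a system dynamics model is really just a handful of "stocks" (quantities that accumulate) wired together with feedback loops. Here's a toy sketch in Python -- emphatically not World3, just my own two-stock illustration with made-up numbers -- of how a population coupled to a finite resource produces the overshoot-and-collapse pattern the Limits to Growth team warned about:

    # A toy stock-and-flow model in the spirit of Forrester's system dynamics.
    # This is NOT World3 -- just two coupled stocks (population and a finite
    # resource) with invented numbers, to show how feedback produces
    # overshoot and collapse.

    def simulate(steps=200, dt=1.0):
        population = 100.0    # stock 1: consumers
        resource = 10000.0    # stock 2: a finite, non-renewable resource
        birth_rate = 0.03     # per-capita growth when resources are abundant
        consumption = 0.5     # resource units each person uses per step

        history = []
        for t in range(steps):
            # Scarcity feeds back on both flows: growth slows and
            # mortality rises as the resource base is drawn down.
            scarcity = resource / 10000.0
            births = birth_rate * population * scarcity
            deaths = 0.01 * population * (1.0 - scarcity)
            used = min(consumption * population, resource)

            population += (births - deaths) * dt
            resource -= used * dt
            history.append((t, population, resource))
        return history

    for t, pop, res in simulate()[::20]:
        print(f"t={t:3d}  population={pop:8.1f}  resource={res:9.1f}")

Run it and you'll watch the population climb, overshoot, and crash as the resource runs out -- not because any single part failed, but because of how the parts are wired together.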

One of the cool things about studying the behavior of systems is that all of them -- economic, ecological, biological, political, cultural, or mechanical -- consistently succeed or fail in much the same ways. This observation greatly simplifies our understanding of the world, once we learn to look for the common recurring patterns. As a quick way of teaching this awareness, an early systems teacher named Draper Kauffman set down 28 rules that seem to apply to the behavior of all kinds of systems. (It's kind of like those little "101 Life Lessons" books you get at Borders, only this one encapsulates the life philosophy of a bunch of systems geeks at MIT.)

They're not scientific laws, exactly, but a list of rules of thumb that should be kept in mind by anybody who's trying to suss out how a system works, and how to make it change so it's more to your liking. And since that's pretty much everybody who reads this blog and wonders how we can restore a political and social system that's obviously in tremendous flux, I thought I'd offer y'all a set for your own cognitive toolkit.

Twenty-eight rules is a lot, so I'll start with the first seven tonight, and add the rest in future posts. Each numbered item opens with the rule itself, followed by Kauffman's original commentary; the paragraph after each one is mine. Here we go:

1. Everything is connected to everything else. Real life is lived in a complex world system where all the subsystems overlap and affect each other. The common mistake is to deal with one subsystem in isolation, as if it didn't connect with anything else. This almost always backfires as other subsystems respond in unanticipated ways.

This is something that most educated Americans understood intuitively as we approached the invasion of Iraq: there were interrelationships at work that the White House hadn't accounted for, and pulling one string (removing Saddam) was going to create a cascade of effects that nobody could foresee -- though many of us knew it wouldn't be good. This was a triumph of foresight on the part of the American left, and a catastrophic failure of it on the right.

2. You can never do just one thing. This follows from rule #1: in addition to the immediate effects of an action, there will always be other consequences of it which ripple through the system.

Never follow a leader who can't lay out at least four possible scenarios for the second- and third-order effects of a proposed change. Not just one best-case scenario: you want to see a worst-case, a most-likely-case, and an off-the-wall case, too. If they haven't done these "what-if" thought exercises, they're not the person to be leading the change.

3. There is no "away." Another corollary of #1. In natural ecosystems, in particular, you can move something from one place to another, you can transform it into something else, but you can't get rid of it. As long as it is on the Earth, it is part of the global ecosystem. The industrial poisons, pollutants, insecticides, and radioactive materials that we've tried to "throw away" in the past have all too often come back to haunt us because people didn't understand this rule.

A lot of the people and problems Dave writes about came about because people haven't yet given up on the naive fantasy that there is, in fact, an "away." We can send the brown and black folks "away," and that'll fix it. We can put criminals "away" in jail, and the things they learn there will never touch us. We can send our pollution "away" down the stream, where only the orcas will choke on it. We get in a lot of trouble when we overestimate the size of this tiny blue ball, and start thinking that there's anywhere on it that's far enough "away" to hide our crimes against nature and each other.

4. TANSTAAFL: There Ain't No Such Thing As A Free Lunch. Years ago, bars used to offer a "free lunch" as a way to draw customers. Of course, the drinks in those bars cost twice as much, so the lunches weren't really "free" at all. Similarly, in complex systems, what looks like the cheapest solution to a problem often turns out to be the most expensive one in the long run. TANSTAAFL is a way of saying, "Don't expect something for nothing -- there's always a hidden cost somewhere."

Fossil fuels have been a big free lunch, until we found out that there was no "away" with those, either. And now we're going to get to spend the next 50 years trying to pay for that long lunch. There are a couple lunches that look considerably cheaper right now -- biofuels and nukes among them -- but anybody who thinks those are going to be free is kidding themselves, too.

5. Nature knows best. Natural ecosystems have evolved over millions of years, and everything in them has a role to play. Be very suspicious of any proposal to alter or eliminate an apparently "useless" part of the system. If it looks useless, that just means that you don't understand its function, and the risk of doing harm is that much greater. When in doubt, be careful, and always try to find a "natural" solution to a problem if at all possible.

This rule is expressed in terms of biological systems, but it applies to mechanical, economic, and political systems, too. Consider the fate of the American financial system in the 80s when New Deal-era regulations were repealed, or the state of our media since the original FCC rules were gutted.

People forgot those laws were there for a damned good reason, and let themselves get talked out of them. And now we may never get them back. In this case, it wasn't nature that knew best -- but older governments that had a much clearer grasp of the public good than our present ones do. (And isn't it a conservative value to start from the assumption that the traditional order is the way it is for very good reasons, and shouldn't be tampered with unless you're very sure about what you're doing?)

6. It ain't what you don't know that hurts you; it's what you DO know that ain't so. Beware of false assumptions about system behavior. When we are sure of something, we usually don't bother to look for proof that it is true and we may be blind to evidence that it is false. We are much more likely to make really big blunders when we act on false assumptions than when we are uncertain and aware of our own uncertainty.

Journalists wince when we hear this one; most of us get tripped up by the wrong stuff we know on an all-too-regular basis. But the Bush adventure in Iraq is the ultimate morality play here, too.

7. "Obvious solutions" do more harm than good. All complex systems use negative feedback to negate external changes in the system. If you try to change something in the direct, "obvious" way, the system is going to treat your efforts like any other outside influences and do its best to neutralize them. The more energy you waste fighting the system head on, the more energy it will waste fighting back, and any gains you make will be only temporary at best. Finally, if you try hard enough and long enough, you will exhaust the system's ability to fight back--at which point the system will break down completely.

This lesson applies to almost every battle we're fighting these days. Implicit in this is that viral assaults (like the netroots) will be met by strong immune reactions; but the battle will go to the side that adapts fastest and has to divert the least energy to the struggle. It also suggests that real change (like, say, on global warming) isn't going to start happening until the rationalizations, partial solutions, and chances for limited change within the current system have been completely exhausted.
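
Rule 7 is easy to see in miniature, too. Here's another toy sketch (my numbers, not Kauffman's): a system with a set point and a limited capacity for corrective feedback. Push it gently and it quietly absorbs the push; push it harder than it can correct, and it never returns to equilibrium at all:

    # A minimal illustration of rule 7: negative feedback neutralizes an
    # outside push -- until the push exceeds the system's corrective
    # capacity, at which point the system stops recovering. Toy numbers.

    def run(push, steps=50):
        state = 0.0       # displacement from the system's preferred set point
        capacity = 5.0    # the most correction the feedback can apply per step
        for _ in range(steps):
            # Feedback pushes back proportionally, but only up to capacity.
            correction = max(-capacity, min(capacity, -0.8 * state))
            state += push + correction
        return state

    for push in (1.0, 4.0, 6.0):
        final = run(push)
        status = "absorbed" if abs(final) < 10 else "overwhelmed -- system breaks down"
        print(f"push={push}: final displacement {final:6.1f}  ({status})")

Note how sharp the transition is: at push=4.0 the system still holds near its set point, but at push=6.0 it never recovers. Head-on force doesn't bend a system gradually -- it either gets neutralized or the system breaks outright.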

OK: Kauffman's Rules, one thru seven. If you're intrigued by systems thinking and want to know more, one good place to start is The Fifth Discipline by Peter Senge, who applies these principles to organizations in one of the more remarkable management books I've ever read.
