PDF Summary: Thinking in Systems, by Donella H. Meadows

Book Summary: Learn the key points in minutes.

Below is a preview of the Shortform book summary of Thinking in Systems by Donella H. Meadows. Read the full comprehensive summary at Shortform.

1-Page PDF Summary of Thinking in Systems

Thinking in Systems is an introduction to systems analysis. Many aspects of the world operate as complicated systems, rather than simple cause-effect relationships. Understanding how systems work is key to understanding why they tend to produce problems that are stubbornly resistant to improvement.

This book teaches you how to start viewing the world in terms of systems, why we tend to misunderstand complex systems, and how to intervene most effectively in systems. Learn the systems way to view problems as diverse as the war on drugs, harvesting renewable resources, and business monopolies.

(continued)...

  • If they are equal, the population will stay the same.

Different circumstances can drive the relative strength of the birth or death loop:

  • Data suggests that as countries get wealthier, birth rates fall. Therefore, poorer countries with high current birth rates may not retain high birth rates as their economies develop.
  • A lethal, contagious disease could drastically increase the death rate. For instance, during the HIV/AIDS epidemic, projections of populations in areas with high HIV prevalence had to account for higher mortality.
  • Birth rate could also fall due to social factors, such as lower interest in raising children or fertility issues.
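(Shortform example: the competing birth and death loops can be sketched as a short simulation. The rates below are illustrative, not from the book.)

```python
# A population stock with a reinforcing birth loop and a balancing
# death loop. Whichever loop is stronger determines the trend.

def simulate_population(initial, birth_rate, death_rate, years):
    population = initial
    for _ in range(years):
        births = population * birth_rate   # more people -> more births (reinforcing)
        deaths = population * death_rate   # more people -> more deaths (balancing)
        population += births - deaths
    return population

growing = simulate_population(1000, birth_rate=0.03, death_rate=0.01, years=10)
steady  = simulate_population(1000, birth_rate=0.02, death_rate=0.02, years=10)
print(round(growing))  # birth loop dominates: population rises above 1000
print(round(steady))   # loops equal: population stays at 1000
```

Dropping the birth rate below the death rate would make the balancing loop dominate and shrink the stock, matching the wealth and disease scenarios above.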

More Complicated Systems

Read the full summary to learn:

  • How one stock plus two balancing loops can model a thermostat keeping a room’s temperature steady
  • How delays introduce oscillations into system behavior, as with a car sales manager trying to keep her inventory consistent
  • How to model extraction of a non-renewable resource, such as fossil fuels
  • How to model extraction of a renewable resource, such as fish in the sea

Why Systems Perform Well

Systems are capable of accomplishing their purposes remarkably well. They can persist for long periods without any particular oversight, and they can survive changes in their environment. Why is that?

Strong systems have three properties:

  • Resilience: the ability to bounce back after being stressed
  • Self-organization: the ability to make itself more complex
  • Hierarchy: the arrangement of a system into layers of systems and subsystems

Designing systems that lack these three properties leads to brittleness, causing the systems to fail under changing circumstances.

Resilience

Think of resilience as the range of conditions in which a system can perform normally. The wider the range, the more resilient the system. For example, the human body fights off disease from foreign agents, repairs itself after injury, and survives across a wide range of temperatures and food conditions.

The stability of resilience comes from feedback loops that can exist at different layers of abstraction:

  • There are feedback loops at the baseline level that restore a system. To increase resilience, there may be multiple feedback loops that serve redundant purposes and can substitute for one another. They may operate through different mechanisms and different time scales.
  • Above the baseline loops, there are feedback loops that restore other feedback loops—consider these meta-feedback loops.
  • Even further, there are meta-meta feedback loops that create better meta-loops and feedback loops.

At times, we design systems for goals other than resilience. Commonly, we optimize for productivity or efficiency and eliminate feedback loops that seem unnecessary or costly. This can make the system very brittle—it narrows the range of conditions in which the system can operate normally. Minor perturbations can knock the system out of balance.

Self-Organization

Self-organization means that the system is able to make itself more complex. This is useful because the system can diversify, adapt, and improve itself.

Our world’s biology is a self-organizing system. Billions of years ago, a soup of chemicals in water formed cellular organisms, which then formed multicellular organisms, and eventually gave rise to thinking, talking humans.

Some organizations quash self-organization, possibly because they optimize toward performance and seek homogeneity, or because they’re afraid of threats to stability. This can explain why some companies reduce their workforces to machines that follow basic instructions and suppress disagreement.

Suppressing self-organization can weaken the resilience of a system and prevent it from adapting to new situations.

Hierarchy

In a hierarchy, subsystems are grouped under a larger system. For example:

  • The individual cells in your body are subsystems of the larger system, an organ.
  • The organs are in turn subsystems of the larger system of your body.
  • You, in turn, are a subsystem of the larger systems of your family, your company, and your community, and so on.

In an efficient hierarchy, the subsystems work well more or less independently, while serving the needs of the larger system. The larger system’s role is to coordinate between the subsystems and help the subsystems perform better.

The arrangement of a complex system into a hierarchy improves efficiency. Each subsystem can take care of itself internally, without needing heavy coordination with other subsystems or the larger system.

Problems can result at both the subsystem and the larger-system level:

  • If the subsystem optimizes for itself and neglects the larger system, the whole system can fail. For example, a single cell in a body can turn cancerous, optimizing for its own growth at the expense of the larger human system.
  • The larger system’s role is to help the subsystems work better, and to coordinate work between them. If the larger system exerts too much control, it can suppress self-organization and efficiency.

How We Fail in Systems

We try to understand systems to predict their behavior and know how best to change them. However, we’re often surprised by how differently a system behaves than we expected.

At the core of this confusion is our limited comprehension. Our brains prefer simplicity and can only handle so much complexity. We also tend to think in simple cause-and-effect terms and on short timelines, which prevents us from seeing the full ramifications of our interventions.

These limitations prevent us from seeing things as they really are. They prevent us from designing systems that function robustly, and from intervening in systems in productive ways.

Systems with similar structures tend to have similar archetypes of problems. We’ll explore two examples of these; the full summary includes more.

Escalation

Also known as: Keeping up with the Joneses, arms race

Two or more competitors have individual stocks. Each competitor wants the biggest stock of all. If a competitor falls behind, they try hard to catch up and be the new winner.

This is a reinforcing loop—the higher one stock gets, the higher all the other stocks aim to get, and so on. It can continue at great cost to all competitors until one or more parties bows out or collapses.

A historical example is the Cold War, in which the Soviet Union and the United States monitored each other’s munitions and pushed to amass the larger arsenal, at a cost of trillions of dollars. A more pedestrian example: advertising between competitors can become increasingly prevalent and obnoxious as each tries to gain more attention.
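(Shortform example: the escalation loop can be sketched as a short simulation in which each competitor responds to the other’s last observed stock, aiming to exceed it by a margin. All numbers are illustrative.)

```python
# Two competitors in a reinforcing escalation loop: each round, each
# side raises its stock to beat the other's last observed stock.

def escalate(stock_a, stock_b, margin, rounds):
    for _ in range(rounds):
        target_a = stock_b + margin    # A responds to B's stock
        target_b = stock_a + margin    # B responds to A's stock
        stock_a = max(stock_a, target_a)
        stock_b = max(stock_b, target_b)
    return stock_a, stock_b

a, b = escalate(stock_a=100, stock_b=80, margin=10, rounds=5)
print(a, b)  # both stocks end far above where either side started
```

Note the ratchet: neither side ever decreases its stock, so any balancing force must come from outside the loop, such as a negotiated stop or one side opting out.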

Fixing Escalation

The solution is to dampen the feedback wherein competitors are responding to each other’s behaviors.

One approach is to negotiate a mutual stop between competitors. Even though the parties might not be happy about it or may distrust each other’s intentions, a successful agreement can limit the escalation and bring back balancing feedback loops that prevent runaway behavior.

If a negotiation isn’t possible, then the solution is to stop playing the escalation game. The other actors are responding to your behavior. If you deliberately keep a lower stock than the other competitors, they will be content and will stop escalating. This does require you to be able to weather the stock advantage they have over you.

Addiction

Also known as: dependence, shifting the burden to the intervenor

An actor in a system has a problem. In isolation, the actor would need to solve the problem herself. However, a well-meaning intervenor gives the actor a helping hand, alleviating the problem with an intervention.

This in itself isn’t bad, but in addiction, the intervenor helps in such a way that it weakens the ability of the actor to solve the problem herself. Maybe the intervention stifles the development of the actor’s abilities, or it solves a surface-level symptom rather than the root problem.

The problem might appear fixed temporarily, but soon enough, the problem appears again, and in an even more serious form, since the actor is now less capable of solving the problem. The intervenor has to step in and help again to a greater degree. Thus the reinforcing feedback loop is set up—more intervention is required, which in turn further weakens the actor’s ability to solve it, which in turn requires more intervention. Over time, the actor becomes totally dependent on—addicted to—the intervention.
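(Shortform example: the dependence loop can be sketched as a short simulation in which the intervenor covers whatever gap the actor can’t handle, and each round of help erodes the actor’s ability. The numbers are illustrative.)

```python
# Addiction loop: intervention covers the actor's shortfall, but helping
# erodes the actor's ability, so the required intervention keeps growing.

def addiction_loop(problem, ability, erosion, rounds):
    interventions = []
    for _ in range(rounds):
        gap = max(0.0, problem - ability)  # what the actor can't solve alone
        interventions.append(gap)          # intervenor steps in to cover it
        ability *= (1 - erosion)           # dependence: unused ability atrophies
    return interventions

help_per_round = addiction_loop(problem=10.0, ability=8.0, erosion=0.2, rounds=5)
print([round(h, 2) for h in help_per_round])  # grows every round
```

Each round the intervenor must do more because the actor can do less, which is the reinforcing loop described above.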

An example is elder care in Western societies: families used to take care of their parents, until nursing homes and social security came along to relieve the burden. In response, people became dependent on these resources and became unable to care for their parents—they bought smaller homes and lost the skills and desire to care.

Fixing Addiction

When you intervene in a system:

  • Try to first diagnose the root cause of the issue. Why is the system unable to take care of itself?
  • Then design an intervention that will solve the root cause, and that won’t weaken the system’s ability to take care of itself.
  • After you intervene, plan to remove yourself from the system promptly.

More System Problems

Read the full summary to learn more common system problems:

  • Policy resistance, where a policy seems to have little effect on the system because the actors resist its influence. Example: The war on drugs.
  • The rich get richer, where the winner gets a greater share of limited resources and progressively outcompetes the loser. Example: monopolies in the marketplace.
  • Drift to low performance, where a performance standard depends on previous performance, instead of having absolute standards. This can cause a vicious cycle of ever-worsening standards. Example: a business loses market share, each time believing, “well, it’s not that much worse than last year.”

Improving as a Systems Thinker

Learning to think in systems is a lifelong process. The world is so endlessly complex that there is always something new to learn. Once you think you have a good handle on a system, it behaves in ways that surprise you and require you to revise your model.

And even if you understand a system well and believe you know what should be changed, actually implementing the change is a whole other challenge.

Here’s guidance on how to become a better systems thinker:

  • To understand a system, first watch to see how it behaves. Research its history—how did this system get here? Get data—chart important metrics over time, and tease out their relationships with each other.
  • Expand your boundaries. Think in both short and long timespans—how will the system behave 10 generations from now? Think across disciplines—to understand complex systems, you’ll need to understand fields as wide as psychology, economics, religion, and biology.
  • Articulate your model. As you understand a system, put pen to paper and draw a system diagram. Put into place the system elements and show how they interconnect. Drawing your system diagram makes explicit your assumptions about the system and how it works.
  • Expose this model to other credible people and invite their feedback. They will question your assumptions and push you to improve your understanding. You will have to admit your mistakes and redraw your model, which trains your mental flexibility.
  • Decide where to intervene. Most interventions fixate on tweaking mere numbers in the system structure (such as department budgets and national interest rates). There are much higher-leverage points to intervene, such as weakening the effect of reinforcing feedback loops, improving the system’s capacity for self-organization, or resetting the system’s goals.
  • Probe your intervention to its deepest human layers. When probing a system and investigating why interventions don’t work, you may bring up deep questions of human existence. You might blame people in the system for being blind to obvious data, thinking that if only they saw things as you did, the problem would be fixed instantly. But this raises deeper questions: How does anyone process the data they receive? How do people view the same data through very different cognitive filters?

Want to learn the rest of Thinking in Systems in 21 minutes?

Unlock the full book summary of Thinking in Systems by signing up for Shortform.

Shortform summaries help you learn 10x faster by:

  • Being 100% comprehensive: you learn the most important points in the book
  • Cutting out the fluff: you don't spend your time wondering what the author's point is.
  • Interactive exercises: apply the book's ideas to your own life with our educators' guidance.

Here's a preview of the rest of Shortform's Thinking in Systems PDF summary:

PDF Summary Introduction: Seeing Things as Systems

...

Understanding the underlying system and how it behaves may be the best way to change the system.

Cause and Effect Isn’t Enough

When we try to explain events in the world, we tend to look for simple cause and effect relationships.

  • An oil company is blamed for greedily driving up the price of oil.
  • When you get sick, you blame the cold virus for attacking your body.
  • Drug addiction is blamed on the weak fortitude of the people addicted to drugs.

This simplicity is reassuring in some ways. Turn this knob, and you solve the problem—easy. In turn, it becomes easy to blame people who do not turn the knob the way you want it to be turned.

However, reality tends to be more complex than simple cause-and-effect relationships, because events are the product of complicated systems. A system consists of a large set of components and relationships; its behavior is not the result of a single outside force, but rather of how the system is set up.

  • The oil company’s actions could not cause global oil prices to rise, if the system didn’t allow it to exert this control. This relates to how readily people consume oil, the lack of viable energy...

PDF Summary Part 1: What Are Systems? | Chapter 1: Definitions

...

Elements don’t need to be tangible things. The system of a football team also consists of intangibles like the pride that fans have for their team, the reputation of players in the league, or the motivation to practice.

You can define an endless number of elements in any system. Before you go too deep down this rabbit hole, start looking for the interconnections between elements.

Interconnections

Interconnections are how the elements relate to each other. These interconnections can be physical in nature. Take the football team again:

  • The players line up in a particular formation, with specific roles in specific places
  • The individual players pass the football to each other
  • The players maneuver themselves against and around the opposing team’s players
  • The team’s fans surround the players in a circular arena to watch the game

Interconnections can also be intangible, often through the flow of information.

  • During a game, the coach receives information from the field, then decides on a strategy and communicates that strategy to the quarterback
  • A television broadcaster communicates the information about the game to an audience...

PDF Summary Chapter 1-2: System Behavior

...

  • The bathtub starts off empty. You plug the drain and turn on the faucet. This causes the water level (or the stock) to rise.
  • When the bathtub is full, you turn off the faucet. The water level stays the same, because water is neither flowing in nor out.
  • You open the drain. The water level starts decreasing.
  • At some point halfway through draining, you turn the faucet back on. The water is flowing in at the same rate that it’s leaving, so the water level stays the same.

This behavior can be put on a graph, which visualizes the system over time.

thinking-in-systems-bathtub.png

Systems thinkers use graphs to understand the trend of how a system changes, not just individual events or how the stock looks currently.

The bathtub should be an intuitive model, and it’s simple: it represents just one stock, one inflow, and one outflow. But from this basic example you can derive a few general properties of systems:

  • If the inflows exceed the outflows, the stock will rise.
  • If the outflows exceed the inflows, the stock will fall.
  • If the outflows balance the inflows, the stock will stay the same, at...
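(Shortform example: the bathtub’s rules reduce to a one-line stock update—each step, the level changes by inflow minus outflow. Flow rates below are illustrative, in liters per minute.)

```python
# Replays the faucet/drain sequence: fill, hold, drain, then balance.

def run_bathtub(steps):
    level, history = 0.0, []
    for inflow, outflow in steps:
        level = max(level + inflow - outflow, 0.0)  # a stock changes only via its flows
        history.append(level)
    return history

sequence = (
    [(10, 0)] * 5 +   # faucet on, drain plugged: level rises
    [(0, 0)] * 2 +    # faucet off, drain plugged: level holds
    [(0, 8)] * 2 +    # faucet off, drain open: level falls
    [(8, 8)] * 3      # faucet matches drain: level holds again
)
levels = run_bathtub(sequence)
print(levels[4], levels[6], levels[8], levels[-1])  # 50.0 50.0 34.0 34.0
```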


PDF Summary Chapter 2-1: Building More Complicated Systems

...

The stock and flow diagram looks like this:

thinking-in-systems-thermostat.png

So how does the system behave? It depends on which balancing loop is stronger:

  • If the room insulation is airtight and the furnace is strong, the heating loop is much stronger. The temperature will be consistently maintained near the thermostat setting (say, 68°F).
  • If the room is very leaky (say, a window is broken) and the furnace is weak, the cooling loop is much stronger. The temperature will hover close to the outside temperature (say, 30°F).

Where exactly the room stabilizes its temperature depends on the relative strength of the balancing loops. The stronger loop will drive the stock closer toward its setpoint. The general takeaway: in a system with multiple competing loops, the loop that dominates the system determines the system’s behavior.

One implication of two competing loops is that the stock levels off at a point near to the stronger loop’s setpoint, but not exactly at it. If a thermostat is set to 68°F, the room temperature will level off slightly below 68°F, because heat continues to leak...
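(Shortform example: the two competing balancing loops can be sketched as a short simulation. The coefficients are illustrative; note that even the tight room settles a bit below the 68°F setpoint, because leakage never stops.)

```python
# Two balancing loops acting on one stock (room temperature): heating
# pulls toward the setpoint, leakage pulls toward the outside temperature.

def simulate_room(temp, setpoint, outside, heat_gain, leak_rate, steps):
    for _ in range(steps):
        heating = heat_gain * max(0.0, setpoint - temp)  # balancing loop 1
        leakage = leak_rate * (temp - outside)           # balancing loop 2
        temp += heating - leakage
    return temp

# Strong furnace, tight insulation: the heating loop dominates.
tight = simulate_room(temp=50, setpoint=68, outside=30,
                      heat_gain=0.5, leak_rate=0.05, steps=200)
# Weak furnace, leaky room: the cooling loop dominates.
leaky = simulate_room(temp=50, setpoint=68, outside=30,
                      heat_gain=0.05, leak_rate=0.5, steps=200)
print(round(tight, 1), round(leaky, 1))  # settles just below 68 vs. near 30
```

The stock settles where the two loops’ pulls cancel out, so the stronger loop drags the equilibrium toward its own setpoint.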

PDF Summary Chapter 2-2: Two-Stock Systems

...

thinking-in-systems-nonrenewable-resource.png

But we’ll add one complication to this system—the oil gets harder to extract as the stock goes down. In the real world, this is true—the oil gets deeper and needs more drilling to access, or it becomes more dilute and more costly to purify.

This builds a more complicated feedback loop that ties the two stocks together:

  • The more oil that is extracted, the more the profit, and the more reinvestment into capital.
  • The more capital, the faster oil is extracted, and the lower the stock.
  • The lower the stock, the more costly it becomes to extract oil, and the lower the profit.
  • The lower the profit, the lower the capital investment rate.

This feedback loop leads to a predictable behavior of the system:

  • While the oil is easy to extract, profit grows and is reinvested in capital.
  • At a certain point when the stock gets low enough, the marginal barrel of oil becomes unprofitable to extract.
  • Since profits have dwindled, it no longer makes sense to reinvest in capital stock. The stock depreciates over time, which...
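(Shortform example: the oil-and-capital feedback loop can be sketched as a short simulation. All parameters—price, cost curve, depreciation—are illustrative, not from the book; the point is the boom-and-bust shape of the extraction path.)

```python
# Two stocks: oil (depleted by extraction) and capital (grown by
# reinvested profit, shrunk by depreciation). Extraction cost rises
# as the oil stock falls, eventually killing profit and then capital.

def simulate_oil_field(oil, capital, years):
    extractions = []
    for _ in range(years):
        extraction = min(oil, 0.2 * capital)         # more capital -> faster extraction
        cost = 200.0 / max(oil, 1.0)                 # lower stock -> pricier per unit
        profit = extraction * (1.0 - cost)           # sale price fixed at 1.0 per unit
        oil -= extraction
        capital = 0.95 * capital + max(0.0, profit)  # depreciation + reinvestment
        extractions.append(extraction)
    return extractions

path = simulate_oil_field(oil=2_000, capital=100, years=60)
peak_year = path.index(max(path))
# Extraction booms while oil is cheap to reach, peaks, then busts.
print(peak_year, round(max(path), 1), round(path[-1], 1))
```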

PDF Summary Part 2: Understanding Systems | Chapter 3: Why Systems Perform Well

...

  • Even further, there are meta-meta feedback loops that create better meta-loops and feedback loops.

To understand this, consider the human again. The body has baseline feedback loops that regulate our breathing and injury repair without our thinking about it. Above this, we also have a brain that can consciously regulate our behavior, discover drugs that influence our bodily feedback loops, and design economies that help us discover drugs. The human body is thus a remarkably resilient system.

Problems from Ignoring Resilience

At times, we design systems for goals other than resilience. Commonly, we optimize for productivity or efficiency. This can make the system very brittle—it narrows the range of conditions in which the system can operate normally. Minor perturbations can knock the system out of balance.

Examples include:

  • Modern agriculture, which has biologically selected for cows that produce more milk than is natural. To compound the problem, farmers then apply growth hormones to stimulate milk production, which further diverts the body’s resources from health. The cow has become less resilient, is more susceptible to sickness, and thus requires...

PDF Summary Chapter 4: Why We Don’t Understand Systems

...

Understanding how the system behaves over time can help you deduce the structure of the system, which can in turn help you predict the behavior into the future.

(Shortform examples:

  • If you looked at the performance of the stock market over time, you might see that it’s increased by double-digit percentages over the past three years. In this context, the daily small movements have little importance. Instead, you can focus on understanding the economic system that causes it to grow steadily over time, despite a bevy of world events.
  • If you looked at a football team’s performance over time, instead of focusing on individual games, you may better understand the system of how the team works—how players are recruited and trained, the influence of coaching, and how the team is managed by its owners.)

Limitation #2: Ignoring Nonlinearities

In our daily lives, we see the world act linearly. There is a straight-line relationship between two things. For example:

  • An object that is twice as heavy requires twice as hard a push to move it.
  • If you earn a salary, your bank account increases the same amount each month you work.
  • If it takes one hour to...

PDF Summary Chapter 5: How We Fail in Systems

...

Furthermore, like typical balancing feedback loops, each actor’s behavior is proportional to how far the stock is from the actor’s setpoint. The stronger one actor pulls the stock to its favored direction, the stronger the other actors try to pull back to the center. You might think of this like a game of tug of war.

The system state thus is pulled tightly in multiple directions by all the actors. But since the system stock isn’t at any one actor’s preferred setpoint, everyone is dissatisfied with the situation.
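(Shortform example: the tug of war can be sketched as a short simulation in which each actor pulls the stock toward its own setpoint with force proportional to the gap. Setpoints and strengths are illustrative.)

```python
# Policy resistance: actors with different setpoints pull on one stock.
# The stock settles at a strength-weighted compromise that satisfies nobody.

def tug_of_war(stock, actors, steps):
    """actors: list of (setpoint, strength) pairs."""
    for _ in range(steps):
        net_pull = sum(s * (target - stock) for target, s in actors)
        stock += 0.1 * net_pull  # small step in the direction of the net pull
    return stock

actors = [(100, 1.0),  # wants the stock high
          (0, 1.0),    # wants the stock low
          (50, 0.5)]   # wants it in the middle
final = tug_of_war(stock=20, actors=actors, steps=200)
print(round(final, 1))  # 50.0 -- the strength-weighted average of setpoints
```

The actors pulling toward 100 and 0 both end far from their setpoints, yet any unilateral extra pull is offset by the others pulling back, which is the stalemate described above.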

Examples of Policy Resistance

The War on Drugs

The war on drugs has multiple actors with different setpoints for the system stock of drug supply:

  • Drug addicts want the drug supply high.
  • Police want the drug supply low.
  • Drug suppliers want the drug supply in the middle to stabilize prices.
  • Most citizens who don’t use drugs just want to be safe and may not care about drug supply.

When one actor gains an advantage, the other actors pull the system back to where it was. For example, the police might increase border patrols to seize stockpiles of drugs. A number of events happen in sequence:

  • The lower supply raises prices.
  • ...

Why are Shortform Summaries the Best?

We're the most efficient way to learn the most useful ideas from a book.

Cuts Out the Fluff

Ever feel a book rambles on, giving anecdotes that aren't useful? Often get frustrated by an author who doesn't get to the point?

We cut out the fluff, keeping only the most useful examples and ideas. We also re-organize books for clarity, putting the most important principles first, so you can learn faster.

Always Comprehensive

Other summaries give you just a highlight of some of the ideas in a book. We find these too vague to be satisfying.

At Shortform, we want to cover every point worth knowing in the book. Learn nuances, key examples, and critical details on how to apply the ideas.

3 Different Levels of Detail

You want different levels of detail at different times. That's why every book is summarized in three lengths:

1) Paragraph to get the gist
2) 1-page summary, to get the main takeaways
3) Full comprehensive summary and analysis, containing every useful point and example

PDF Summary Part 3: Changing Systems | Chapter 6: Twelve Leverage Points

...

  • The Federal Reserve tinkers with interest rates, but this hasn’t ever stopped economic cycles from happening.

Adjusting parameters doesn’t work well because there are stronger system effects at play, such as feedback loops, incentive structures, and delays. If a system is dominated by a runaway loop, tweaking linear parameters does not meaningfully change its behavior.

A minority of parameters can become effective leverage points when they affect higher leverage points in this list, such as the growth rate in a reinforcing feedback loop or the time delay. But the author argues these are rarer than most people think.

11: Stocks

We’ve learned that stocks are buffers that can stabilize the system over fluctuating flow rates. Your bank account is a stock of money that helps you withstand volatility in your income and expenses.

Changing the stock changes the behavior of the system. You can stabilize a system by increasing the stock, but this comes at the cost of efficiency—larger stocks cost more to build or maintain. In contrast, you can increase efficiency by decreasing the stock, but this comes at the cost of lower robustness.

  • For example,...

PDF Summary Chapter 7: Improving as a System Thinker

...

  • Human dignity
  • Contentment in life
  • Hope and inspiration
  • National pride

If you want to build a quantitative system model, you may need to put your unquantifiable elements onto a quantitative scale (such as measuring human dignity on a scale of 1 to 10).

Expand Your Boundaries

As discussed earlier, we are prone to drawing artificially narrow boundaries when understanding systems. To really appreciate a system’s complexity and guide it to a good outcome, you’ll need to relentlessly expand the boundaries by which you perceive the world.

The author focuses on three boundaries in particular.

Expand Your Boundaries of Time

As a society, we tend to fixate on the short-term. In how few years can this investment pay off? How do we get faster growth, sooner?

Systems, of course, can persist over decades, centuries, and much longer time scales. Focusing on the short-term is like hiking a treacherous path by staring down at your feet.

Instead, try thinking in centuries. How will this system behave 10 generations from now? System behavior 10 generations in the past affects your life today; system behavior today will affect lives 10 generations from now.

...