Do your decisions often backfire because something unexpected comes up? How can you keep unintended consequences from throwing off your plans?
Life is unpredictable, and every decision we make has the potential to backfire. According to Peter Bevelin, the author of Seeking Wisdom, we can minimize the unintended consequences of our decisions by engaging in systems thinking.
Let’s explore two concepts from systems thinking that can help us avoid unexpected consequences.
Concept 1: Limiting Factors Affect the Larger System
First, Bevelin emphasizes that most systems have a limiting factor that affects the performance of the entire system. A limiting factor is an element of a system that many other aspects of the system depend on. If the limiting factor fails or is slow, then the whole system will also fail or operate slowly. When you’re trying to improve or grow a system, you should identify its limiting factor and improve it, thereby improving the system as a whole. When you fail to understand the limiting factors of the systems in your life, your efforts to improve them are doomed to fail.
Let’s explore Bevelin’s concept of limiting factors by thinking of a common system: your home and family. Imagine you recently had a child and hope to continue growing your family, but rent in your city is high. The city you live in is a limiting factor in your home system: if you keep paying such high rent, you may never be able to afford a larger home to accommodate a larger family. To change this limiting factor, you could move to a city with lower rent so you can afford a larger living space.
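To make the limiting-factor idea concrete, here’s a minimal sketch in Python (our illustration, not Bevelin’s; the components and numbers are invented): overall performance is capped by the weakest component, so improving anything other than the bottleneck leaves the system unchanged.

```python
# A minimal sketch (our illustration, not Bevelin's): model a system as a
# set of interdependent components, each with a capacity score. Overall
# performance is capped by the weakest component (the limiting factor).

def system_performance(components):
    """The whole system can only perform as well as its weakest part."""
    return min(components.values())

def limiting_factor(components):
    """Identify the component that currently caps the whole system."""
    return min(components, key=components.get)

home = {"income": 90, "living_space": 40, "childcare": 70}

print(limiting_factor(home))     # living_space caps the system
print(system_performance(home))  # 40

# Improving a non-limiting factor changes nothing overall...
home["income"] = 120
print(system_performance(home))  # still 40

# ...but improving the limiting factor (e.g., moving somewhere with
# lower rent) lifts the whole system until a new limit takes over.
home["living_space"] = 80
print(system_performance(home))  # 70 (childcare is the new limiting factor)
```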
Concept 2: Actions Produce Far-Reaching, Unintended Effects
Furthermore, Bevelin argues that because you exist in complex systems, your actions can produce far-reaching consequences, including unintended or unexpected consequences. If you fail to predict your actions’ consequences, one of your decisions may cause a ripple effect that produces negative outcomes elsewhere in the system.
(Shortform note: A term people sometimes use to describe the unpredictability of consequences is “the butterfly effect.” The term was coined by the meteorologist Edward Lorenz, whose weather predictions led him to conclude that a small change in a system could produce unexpected, large outcomes elsewhere in the system. He reasoned that the flap of a single butterfly’s wings in one region of the world could alter air currents enough to contribute to a tornado elsewhere.)
Let’s illustrate Bevelin’s idea of unintended, far-reaching consequences using the example of the family as a system. Imagine you accept a promotion at your company so you can earn a higher income and better provide for your family. You become so busy in your new role that you spend less time at home. Your partner is left with more parenting responsibilities, and this imbalance creates tension in your relationship. Your children are upset by this tension, and they begin to act out at school.
To increase the likelihood that your actions within a system will lead to your intended outcomes, Bevelin claims that you should try to predict the far-reaching consequences of your actions. For example, before taking the promotion you’re offered, discuss with your partner how it may affect them. Then, the two of you could brainstorm ways to reduce the parenting burden on your partner. For instance, you could ask other family members to help watch the kids or enroll your children in a fun after-school program.
(Shortform note: In Thinking in Systems, Donella Meadows offers a visual technique for better understanding systems, which could improve your ability to predict your actions’ far-reaching consequences. She recommends that you draw a diagram of your system by mapping out each of its parts and how they’re connected. Doing so forces you to notice how your system’s elements interrelate, helping you anticipate how one change might trigger another elsewhere in the system. For example, if you’re a manager who’s restructuring your department, create an organizational chart that visualizes the hierarchies within your department and the relationships among your employees.)
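Meadows’s mapping exercise can also be sketched in code. Below is a hypothetical Python version of the family-system example (the graph and its edges are our assumptions, not taken from either book): represent the system as a directed graph of “X affects Y” links, then trace everything a single change could ripple out to.

```python
from collections import deque

# A hypothetical map of the family system from the example above: each
# key "affects" the elements it points to. The edges are our assumptions.
system_map = {
    "take promotion": ["time at home"],
    "time at home": ["partner's workload"],
    "partner's workload": ["relationship tension"],
    "relationship tension": ["children's behavior"],
    "children's behavior": [],
}

def ripple_effects(system_map, change):
    """Breadth-first trace of every element a single change can reach."""
    affected, queue = set(), deque([change])
    while queue:
        node = queue.popleft()
        for downstream in system_map.get(node, []):
            if downstream not in affected:
                affected.add(downstream)
                queue.append(downstream)
    return affected

for effect in sorted(ripple_effects(system_map, "take promotion")):
    print(effect)
# children's behavior, partner's workload, relationship tension, time at home
```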
Bevelin offers the caveat that it’s impossible to closely examine the entire system surrounding your actions, since systems are extremely complex. Rather than spending your time trying to predict every effect your actions will have (which is impossible), expect that your actions will have unintended, negative consequences, and plan for those outcomes.
Bevelin shares a strategy from Warren Buffett for planning for unexpected, negative outcomes: Build safety factors into your predictions. A safety factor is a buffer you add to your prediction; if your prediction turns out to be too optimistic, the buffer helps you avoid disaster. Buffett claims that his company won’t buy a stock when its estimated value is only slightly more than its price. Instead, the company builds a safety factor into its purchasing decisions by buying a stock only if its estimated value is significantly higher than its price.
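As a rough illustration of this margin-of-safety logic (the 30% threshold and the prices below are our assumptions, not Buffett’s actual figures), the rule reduces to a simple check: buy only when your estimated value exceeds the price by a buffer large enough to absorb errors in your own estimate.

```python
def should_buy(price, estimated_value, safety_margin=0.30):
    """Buy only if estimated value beats the price by the safety margin.

    The 30% margin is an illustrative assumption, not Buffett's figure;
    the point is that the buffer absorbs error in your own estimate.
    """
    return estimated_value >= price * (1 + safety_margin)

print(should_buy(price=100, estimated_value=105))  # False: too close to call
print(should_buy(price=100, estimated_value=150))  # True: a real buffer

# Even if the optimistic estimate of 150 were 20% too high (about 120),
# the purchase would still leave room above the price paid.
```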
An Additional Way to Plan for Unintended, Negative Consequences

Donella Meadows’s ideas on resilience in Thinking in Systems offer additional guidance on how to guard your system against unintended, negative consequences. While Buffett’s safety factor strategy may work well for predictions that involve calculations that can easily be adjusted to provide a buffer, Meadows’s ideas may work well for situations that don’t involve numeric predictions, such as making a plan to improve your mental health. According to Meadows, the best way to guard your system against unexpected outcomes is to make it resilient in advance. She defines a resilient system as one that can perform well in a wide range of situations, both positive and negative. Additionally, resilient systems have built-in backup mechanisms that can serve as a safety net. For example, you could make your mental health resilient by investing in multiple levels of support. You could engage in therapy, go on medication, and identify people in your life who can serve as your support system. That way, if one level of support unexpectedly failed (for instance, if you forgot to refill your medication), you’d have other backup systems that could prevent you from experiencing a crisis.
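To see why redundancy creates resilience, here’s a hedged sketch (the support layers and their availability below are invented for illustration): the system tries each independent support in order, so no single failure produces a crisis.

```python
# A hedged sketch of resilience as redundancy (the support layers and
# their availability are invented): try each independent support in
# order, so no single failure produces a crisis.

def seek_support(supports):
    """Return the first available support; the rest act as a safety net."""
    for name, is_available in supports:
        if is_available():
            return name
    return None  # every layer failed: the system wasn't resilient enough

supports = [
    ("medication", lambda: False),        # e.g., you forgot to refill it
    ("therapy", lambda: True),            # the backup layer still holds
    ("friends and family", lambda: True),
]

print(seek_support(supports))  # therapy
```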