The solution? Reject the tyranny of or and embrace the genius of and. Leave the sides behind. Look for a third narrative based on common ground instead of compromise.
Roger Martin, author of The Opposable Mind, calls this process “integrative thinking.” Integrative thinkers don’t break a problem into separate pieces and work on them one by one. Instead, they see the entire architecture of the problem—how the various parts fit together, and how one decision affects another. By resolving the tensions, or oppositions, that launched the problem to begin with, they can craft a holistic solution. Often this means rejecting the urge for certainty and grappling with the messiness of paradox.
Physicist Niels Bohr was fascinated by the discovery that electrons could be in one place, then appear in another place, with no apparent movement in between. How wonderful, he said, that scientists had met with a paradox, since now they could make real progress. Paradox, ambiguity, and conflict can be the triggers for innovation and discovery, provided we don’t settle for the quick compromise or remain boxed in by our beliefs.
“Beliefs aren’t nurture,” says Costa. “They’re nature.” They’re a basic human need. Yet we need to be aware that throughout history, whenever knowledge has been difficult to acquire, or when ambiguity has overwhelmed our biological ability to deal with it, we’ve “run home to mama.” We’ve defaulted to the emotional comfort of belief rather than using our rational brains to see the problems in their full, awesome complexity. If we hope to escape the fate of previous civilizations such as the Romans, Mayans, and Khmer, we’ll need to look for bigger answers.
Thinking whole thoughts
Drawing a picture in a visually realistic way is not really a drawing problem. It’s a seeing problem. Until we can clearly see what’s in front of us, free of misleading beliefs and partial knowledge, our picture will necessarily be distorted or fragmented. Painter Robert Irwin said, “Seeing is forgetting the name of the thing seen.” As soon as we label something, we put it in a box and move it to another part of our brain. We stop seeing it as it really is.
Seeing and thinking are related concepts. We claim to see what people mean; we look for answers; we envision a solution; we follow a line of thought; we draw conclusions; and with any luck we connect the dots. In trying to make these connections, we’re searching for patterns that show us how objects and events are linked, or need to be linked, in order to make sense. We’re looking for the emergence of a complete picture.
Leonardo da Vinci epitomized this relationship between seeing and thinking, as amply illustrated in his notes. His scientific insights came straight from his passion for drawing; he drew things to understand them. He was fascinated by the repeating patterns of nature, intensely curious about the fundamental experience of being human in the natural world. He made hundreds of drawings of the human eye, of whirlpools in streams, of sound moving through the air, of the similarities and differences among people, plants, and animals. He was trying to see how things are connected, and the way nature continually transforms itself. He was searching for a unified vision of the world.
We might well call Leonardo the father of holistic invention. His approach to art and science, and to feeling and thinking, was both simultaneous and seamless. The ideal of the Renaissance Man doesn’t suggest that we learn everything about everything, but that we see the world as an interconnected system of systems, instead of separate parts.
On a good day, this is exactly what designers do. They observe a situation—a product, service, experience, process, communication, or business model—then devise new components, new relationships, and new interactions that reshape the situation into something better. Their metaskill of visualization—of seeing how to see—makes this transformation possible.
Since the peephole of consciousness is so small, most people find it easier to focus on a single tree than a whole forest. But if our goal is to reshape a situation, we need to see the trees, the forest, and the relationships among them. We’ve been playing checkers when we really need to play three-dimensional chess. As design thinker John Thackara says, we need to employ macroscopes as well as microscopes. We need to understand how the parts interact if we want to improve the larger situation. The whole is not the sum of the parts, and merely improving the parts can court unwelcome surprises.
As a young designer, I used to wonder about the common remark, “I’m no artist—I can’t even draw a straight line.” A straight line had always seemed to me evidence of noncreativity, the refuge of the logical, the prosaic, the unartistic. Why would anyone think drawing a straight line was an important skill? We have T-squares for that! Someone finally took me aside and said, “It’s just an expression.” Oh.
Yet straight-line thinking has co-opted Western thought to the point that we have trouble understanding cause and effect. We forget that the world isn’t linear. It’s full of arcs and loops and spirals. It often seems more like a Rube Goldberg contraption than a Newtonian equation. We can pull a lever here and get an unintended consequence over there.
For example, today the developed world is fighting recession. Should we balance the books with a massive austerity program? Or grease the wheels with a series of stimulus packages? Both “solutions” might seem logical, but either one could bring the roof down on our heads. Complex problems are embedded in complex systems, making them impossible to solve using linear thinking. We send aid to foreign countries to fight poverty, only to find we’re feeding corruption instead. We formulate new drugs for viruses, only to find that the viruses mutate into stronger ones. We develop pollution-free nuclear energy, only to end up with a nuclear waste problem that could haunt us for ten thousand years.
With complex problems, things are not always what they seem. Trying to turn a nonlinear world into a linear world for our emotional comfort is usually a bad idea, because linear planning only works with problems that can’t resist our plans. People, viruses, and atomic particles don’t conform to linearity—they have a way of fighting back. They seem to mock our naïve attempts to analyze them. As science-fiction writer Poul Anderson once said, “I have yet to see any problem, however complicated, which, when looked at in the right way, did not become still more complicated.”
The Industrial Revolution has been a triumph of reductionist thinking. By focusing on narrow problems, we’ve learned how to move large weights over long distances, increase our production of food, eradicate a whole raft of diseases, transport people through the air, communicate instantly around the world, and perform any number of miraculous feats. Yet to the extent we’ve been thinking in fragments instead of whole thoughts, our solutions have only created bigger problems that are now beyond our comprehension. We now have to grapple with pollution, obesity, overpopulation, terrorism, climate change, and recession, to name just a few of our ailments.
The only way to address these so-called wicked problems—slippery conundrums that disappear around the corner when you try to follow them—is to heed the advice of philosopher Ludwig Wittgenstein. “Don’t get involved in partial problems,” he said. “Always take flight to where there is a free view over the whole single great problem, even if this view is not a clear one.” In other words, think in whole thoughts instead of fragments. Step back from the drawing board and notice the relationships among the lines, the edges, the angles, and the shapes. Check them against reality, one by one and all together.
This mode of seeing is variously known as systems thinking, adaptive thinking, cybernetics, and holistic thinking. Wicked problems don’t easily yield to hard analysis, linear logic, or propositional knowledge. They’re more likely to give up their secrets to observation, intuition, and imagination. Like an artist composing a canvas, a systems thinker squints at a problem to see the complete picture instead of the components.
How systems work
A system is a set of interconnected elements organized to achieve a purpose. For example, the plumbing in your house is a system organized to deliver clean water and flush waste water away. A business is a system organized to turn materials and labor into profit-making products and services. A government is a system that’s organized to protect and promote the welfare of its citizens. When you think about it, even a product like a movie is a system, designed to create a theatrical experience for an audience.
A system can contain many subsystems, and it can also be part of a larger system. How you define the system depends on where you draw its boundary. And where you draw its boundary depends on what you want to understand or manipulate. If you’re the director of a movie, for example, you’ll want to draw a boundary around the whole project so that you can manage the relationships among the story elements, the locations, the sets, the performances, the technical production, the schedule, the costs, and so on. You might also draw boundaries around subsystems like individual scenes, stunts, and camera moves, so you can control the various relationships within each of those, as well as their relationships to the whole.
The structure of a system includes three types of components: elements, interconnections, and a purpose. It also includes a unique set of rules—an internal logic—that allows the system to achieve its purpose. The system itself, along with its rules, determines its behavior. That’s why organizations don’t change just because people change. The system itself determines, to a large extent, how the people inside it behave. It’s useless to blame the collapse of the banking industry on individual executives or specific events. The very structure of the banking system is to blame, since it’s tilted in favor of corrupt actors and selfish behaviors. Therefore we might think about redesigning the system so corruption is not so easy or profitable. Or we might make improvements to the larger system in which it operates, say capitalism itself. We might even question the cultural norms and beliefs that gave rise to 20th-century capitalism in the first place.
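If you happen to think in code, here’s a minimal sketch of that anatomy in Python. It isn’t anything from systems theory proper, and the class and names are my own invention; it simply shows a system as elements, interconnections, and a purpose, with the behavior living in the links rather than in the parts.

```python
from dataclasses import dataclass, field

@dataclass
class System:
    """A system: elements, interconnections, and a purpose."""
    purpose: str
    elements: set = field(default_factory=set)
    links: dict = field(default_factory=dict)  # interconnections: who affects whom

    def connect(self, source: str, target: str) -> None:
        """Record that one element influences another."""
        self.links.setdefault(source, []).append(target)

# Hypothetical example: household plumbing as a system.
plumbing = System(purpose="deliver clean water and flush waste water away")
plumbing.elements |= {"supply line", "water heater", "shower", "drain"}
plumbing.connect("supply line", "water heater")
plumbing.connect("water heater", "shower")
plumbing.connect("shower", "drain")
```

Notice that the purpose and the links, not the individual parts, define the system; swap out the water heater and it’s still plumbing.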
To improve a complex system, you first have to understand it—at least a little. I say “a little” because complex systems are always part mechanics, part mystery. Take the example of a company. Like all systems, a company has inflows and outflows, plus feedback systems for monitoring their movements. A CEO might have a good grasp of the system’s elements (its divisions, product lines, departments, competencies, key people), its interconnections (communications, distribution channels, partnerships, customer relationships), and its purpose (mission, vision, goals). She might also understand the system’s operational rules (processes, methodologies, cultural norms). But it’s still difficult for any one person to know how the company is actually doing in real time. So she uses the system’s feedback mechanisms (revenues, earnings, customer research) to get a read on the situation.
But there’s a catch.
The feedback mechanisms in most systems are subject to something called latency, a delay between cause and effect, or between cause and feedback, in which crucial information arrives too late to act upon. While the revenue reports are up to date in a traditional sense, they only show the results of last quarter’s efforts or last year’s strategy. They’re lagging indicators, not leading indicators, of the company’s actual progress. By the time she gets the numbers, the situation is a fait accompli. Any straightforward response based on the late information is likely to be inadequate, ineffective, or wrong.
How can she work around this problem? By looking at the company as a system instead of individual parts or separate events. She can anticipate the eventual feedback by finding out which indicators are worth watching.
For example, she might watch levels of brand loyalty as an indicator of future profit margins. Or products in the pipeline as an indicator of higher revenues. Or trends in the marketplace as an indicator of increasing demand. Of course, leading indicators can be tricky, since they’re only predictions. But if she keeps at it, continuously comparing predictions with outcomes, she can start to gain confidence in the indicators that matter. This is known as “getting a feel” for the business.
In systems theory, any change that reinforces an original change is called reinforcing or positive feedback. Any change that dampens the original change is called balancing or negative feedback.
For example, if a company’s product starts to take off, other customers may notice and jump on the bandwagon. This is an example of reinforcing feedback. If the product keeps selling, it may eventually lose steam as it runs out of customers, meets with increasing competition, falls behind in its technology, or simply becomes unfashionable. These are examples of balancing feedback. By keeping an eye on these two feedback loops, the CEO can get ahead of the curve and make decisions that mitigate or reverse the situation.
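A toy simulation makes the interplay visible. This is only a sketch, and the market size, starting customers, and adoption rate below are invented numbers, not data. The reinforcing loop is word-of-mouth, where customers recruit customers; the balancing loop is saturation, where fewer prospects remain as the market fills up.

```python
# Toy model of reinforcing vs. balancing feedback in product adoption.
# All numbers are invented for illustration.
market_size = 10_000   # total potential customers: the balancing limit
customers = 100.0      # early adopters
adoption_rate = 0.4    # strength of word-of-mouth: the reinforcing loop

for quarter in range(1, 13):
    # Reinforcing feedback: existing customers recruit new ones.
    # Balancing feedback: growth slows as the market saturates.
    new = adoption_rate * customers * (1 - customers / market_size)
    customers += new
    print(f"Q{quarter:2d}: {customers:8,.0f} customers (+{new:,.0f})")
```

Run it and you see the signature S-curve: a slow start, an explosive middle, a flattening end. A CEO watching only last quarter’s totals sees growth right up until the balancing loop takes over.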
But let’s get back to the problem of latency. Every change to a system takes a little time to show up. This is best illustrated by the classic story of the “unfamiliar shower.” Imagine you’re staying at the house of some friends, and you go to use their shower for the first time. It’s a chilly morning. You turn on the taps, step in, and suddenly jump back. Whoa! The water comes out like stabbing needles of ice! So you gingerly reach in again and turn the hot up and the cold down. No change. Okay, one more adjustment and—ahhh—the water warms up enough to stand under the shower head. Then, just as suddenly, the reverse occurs. The water turns violently hot and this time you jump all the way out. “Yowww!” you scream, waking up the house.
What just happened? In systems-thinking terms, the delay between cause and effect—between adjusting the taps and achieving the right temperature—deprived you of the information you needed to make appropriate changes to the hot and cold water. After a few times using the shower (assuming you were allowed to stay), you learned to wait for the feedback before fiddling with the taps. And you later learned where the taps should end up to achieve the right temperature.
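The same dynamic is easy to simulate. In this sketch, where every constant is invented, feedback about the water temperature arrives three time steps after you move the taps, and the impatient bather adjusts on every step anyway:

```python
from collections import deque

# Toy model of the "unfamiliar shower": feedback arrives after a delay,
# so naive adjustments overshoot. All constants are invented.
target = 38.0                # comfortable temperature, degrees C
tap = 10.0                   # temperature being mixed at the taps right now
delay = 3                    # steps before mixed water reaches the shower head
pipe = deque([tap] * delay)  # water already in transit through the pipe

for step in range(15):
    felt = pipe.popleft()    # what you feel reflects an old tap setting
    pipe.append(tap)
    # Impatient reaction: adjust based on stale feedback, every single step.
    tap += 0.8 * (target - felt)
    tap = max(5.0, min(60.0, tap))  # taps only go so far in either direction
    print(f"step {step:2d}: feels {felt:5.1f}, taps set to {tap:5.1f}")
```

The printout lurches from icy to scalding and back. Change one line so the bather waits out the delay before each adjustment, and the temperature settles. The system didn’t change; the response to latency did.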
Here are some other examples of system delays:
The state government raises taxes on business. The immediate result is more revenue for public works, but over time businesses pull out and fewer move in, thereby lowering the revenue available for public works.
A mother is concerned that her children may be exposed to danger if they’re allowed to roam the neighborhood freely, so she keeps them close and controls their interactions with friends. At first this keeps them safe, but as they grow older they suffer from impaired judgment in their interactions with the broader world.
A company is hit by an industry downturn and its profits begin to sag. It reacts quickly by laying off a number of highly paid senior employees. While this solves the immediate problem, the talent-starved company falls behind its competitors just as the economy picks up.
A student feels that his education is taking too long, so he drops out of college and joins the workforce. He makes good money while his college friends struggle to pay their bills. Over time, his lack of formal education puts a cap on his income while his friends continue up the ladder.
A salesman meets his quota by pressuring customers to buy products that aren’t quite right for their needs. The next time he tries to make a sale, he finds them less agreeable.
A bigger child learns that she can bully the other children in school. At first this feels empowering, but over time she finds she’s excluded from friendships with other children she admires.
The common thread in all these stories is that an immediate solution caused an eventual problem. None of the protagonists could see around the corner because they were thinking in short, straight lines.
Our emotional brains are hardwired to overvalue the short term and undervalue the long term. When there’s no short-term threat, there’s no change to our body chemistry to trigger fight or flight. If you pulled back your bedsheet one night and found a big, black spider, your brain would light up like a Christmas tree. But if you were told that the world’s population will be decimated by rising ocean waters before the year 2030, your brain would barely react. We’re genetically attuned to nearby dangers and unimpressed by distant dangers, even when we know that the distant ones are far more dangerous. For example, I know I should earthquake-proof my house in fault-riddled California, but today I have to oil the squeaky hinge on my bedroom door. It’s driving me crazy.
Latency is the mother of all systems traps. It plays to our natural weaknesses, since our emotional brain is so much more developed than our newer rational brain. It takes much more effort to think our way through problems than to merely react to them. We have to exert ourselves to override our automatic responses when we realize they’re not optimal.
In this way, systems thinking isn’t only about seeing the big picture. It’s about seeing the long picture. It’s more like a movie than a snapshot. My wife can predict the ending of a movie with uncanny accuracy after viewing the first few minutes. How? Through a rich understanding of plot patterns and symbolism, acquired over years of watching films and reading fiction, that lets her imagine the best resolution. In other words, she understands the system of storytelling.
But there’s more to systems than watching for patterns, loops, and latency. You also have to keep an eye on archetypes.
Grandma was right
The first rule of systems is that they create their own behavior. During the protests of the late 1960s, students around the world took to the streets and raised their voices against the “system.” The overall system, they said, not individual people, was to blame for the broken culture. Civil rights, women’s rights, war, overpopulation, and the environment were seen as tangled strands in a single hairball. While this instinct was correct, hairballs don’t untangle easily. If you pull a strand on one side, you may find that another strand gets tighter on the other side. The only solution is to undo the knots one by one. And, of course, it helps to know what kind of knot you’re dealing with. Is it a simple overhand knot, or a real granny knot?