Metaskills: Five Talents for the Robotic Age




  For Eileen

  CONTENTS

  Preface

  Prologue

  Ten questions

  THE MANDATE

  The arc of human talent

  The innovation mandate

  Where are the jobs?

  The Robot Curve

  A crisis of happiness

  The obsolete industrial brain

  Wanted: Metaskills

  Congratulations, you’re a designer

  The future in your hands

  FEELING

  Brain surgery, self-taught

  When the right brain goes wrong

  The magical mind

  Leonardo’s assistant

  The uses of beauty

  Aesthetics for dummies

  It’s not business—it’s personal

On what do you base your opinion?

  SEEING

  The tyranny of or

  Thinking whole thoughts

  How systems work

  Grandma was right

  The primacy of purpose

  Sin explained

  The problem with solutions

  The art is in the framing

  DREAMING

  Brilliant beyond reason

  The answer-shaped hole

  There be dragons!

  A most unpleasant young man

  The play instinct

  Dreaming together

  The bolt upright moment

  Six tests of originality

  MAKING

  Il discorso mentale

  The no-process process

  Every day is Groundhog Day

  The discipline of uncluding

  The art of simplexity

  A reality check

  Sell in, not out

  The big to-do list

  LEARNING

  Impossible is nothing

  The joy zone

  What’s the mission?

  A theory of learning

  Climbing the bridge

  Creativity loves company

  Unplugging

  The scenic road to you

  A MODEST PROPOSAL

1. Shut down the factory

  2. Change the subjects

  3. Flip the classroom

  4. Stop talking, start making

  5. Engage the learning drive

  6. Advance beyond degrees

  7. Shape the future

  Epilogue

  Acknowledgements

  Notes

  Index

  Copyright

  PREFACE

  What happens when a paradigm shifts? Do we simply wake up one day and realize that the past seems irreversibly quaint? Or do financial institutions fail, governments topple, industries break down, and cultures crack in two, with one half pushing to go forward and the other half pulling back?

  This is a book about personal mastery in a time of radical change. As we address our increasing problems with increasing collaboration, we’re finding that we still need something more—the bracing catalyst of individual genius.

  Unfortunately, our educational system has all but ruled out genius. Instead of teaching us to create, it’s taught us to copy, memorize, obey, and keep score. Pretty much the same qualities we look for in machines. And now the machines are taking our jobs.

  I wrote this book to cut cubes out of clouds, put our swirl of societal problems into some semblance of perspective, and suggest a new set of skills to address them. While the problems we face today can be a source of hand-wringing, they can also be a source of energy. They can either lead to societal gridlock or the most spectacular explosion of creativity in human history.

  One thing’s for sure: There’s no going back, no secret exit, no chance of stopping the clock. The only way out is forward. Our best hope is that once we see the shape of our situation, we can turn our united attention to reshaping it. It won’t require a top-down strategy or an international fiat to get the transformation going. Just a relative handful of people—maybe people like you—with talent, vision, and a few modest tools.

I’ve divided the book into seven parts. The first is about the mandate for change. The next five are the metaskills you’ll need to make a difference in the postindustrial workplace: feeling, seeing, dreaming, making, and learning. The last is a set of suggestions for educational reform, written from the perspective of a hopeful observer.

  As you read about the metaskills, take comfort in the knowledge that no one needs to be strong in all five. It only takes one or two talents to create a genius.

  —Marty Neumeier

  PROLOGUE

  The Midi-Pyrénées, southern France. A spray-painted silhouette of a prehistoric hand, positioned low on the wall at Pech Merle, seems at odds with the other images in the cave—the fluidly stylized drawings of horses, mammoths, reindeer, and other herd animals of the prehistoric hunt. In fact, the hand stencils are the only naturalistic references to human beings at all, and the only subjects of any kind shown actual size. You could place your own hand over the stenciled hand of a cave painter, and even after 25,000 years of human evolution, your hand would fit.

  The stencils are highly mysterious. Why would artists with enough skill to conjure magnificent animals in full motion, and who entered the caves to make paintings only after practicing outside the caves, bother to tag the wall with a simple stencil that a kindergartener could make on the first try? Were the hands the equivalent of personal signatures, or perhaps clan symbols? Or were they examples of ancient graffiti, painted by nonartists after the “real” artists had left? Why were there no images of human beings drawn in the same style? And what exactly were the cave paintings for?

No one can say for sure, but here’s a theory that fits the facts: The paintings were designed as a kind of magical mystery show to inspire greatness in the hunt. The caverns were prehistoric cathedrals, special places of elevated consciousness where the hunters could psych themselves up for the coming hunt. Animals, not humans, were the subjects, because animals were what the hunters revered. The hunters respected the animals’ immense power and beauty in a world where humans were lower on the food chain.

When Pablo Picasso came to view the caves of southern France, he couldn’t help but notice a particular nuance. The ancient artists had deftly arranged their two-dimensional images over the natural bumps and fissures of the stone, giving them a subtle depth. When viewed in the flickering light of a candle, voilà! the dimensionalized animals came to life, appearing to fly across the walls. The caves, in this construction of the facts, were nothing less than the ancient version of our 3D cinemas. On a good night (or with the right drugs) they were probably much better than our 3D cinemas.

  And the hand silhouettes? Poignant symbols of gratitude for a unique and surprising gift. No other animal of their acquaintance could fashion tools, hunt with weapons, or cast motion pictures onto cave walls. The human hand, with its sensitive fingers and opposable thumb, made all this possible. So for about ten thousand years our ancestors enshrined their thanksgiving in hundreds of caves, from Africa to Australia, to remind us of who we are and where we came from. They’re reaching out as if to say: “This hand made this drawing.”

8.7 million homes, North America. “It’s a poor workman who blames these” was the clue that Alex Trebek read from the television monitor. It was February 16, 2011, in the very last round of a three-day Jeopardy! marathon that pitted IBM’s “Watson” computer against two human champions, Ken Jennings and Brad Rutter. Ken was the biggest money winner of all time. Brad held the record for the longest winning streak. To rack up a score, a contestant must be the first to hit the buzzer with the right answer, or more precisely, the right question, since the game begins with the answer.

Even before Alex finished reading the clue, Watson’s 2,880 parallel processor cores had begun to divvy up the workload. Who or what is “workman”? What does poor mean in this context? Is workman penniless? Maybe out of a job? Meanwhile, other processors got busy parsing the sentence. Which word is the subject? Which is the verb? If this word is a noun, is it a person or a group? Making the task trickier, Jeopardy! clues are displayed in all capital letters, so Watson had to figure out whether “WORKMAN” was a proper noun or a common noun.

Despite knowing almost nothing, at least in the human sense of knowing, Watson’s massively parallel processing system held a distinct advantage over its human counterparts. It was fast. At 33 billion operations per second, it could search 500 gigabytes of data, or the equivalent of one million books, in the blink of an eye. It could also hit the buzzer in less than eight milliseconds, much faster than a human hand.

  Yet Watson was programmed not to hit the buzzer unless it had a confidence level of at least 50 percent. To reach that level, various algorithms working across multiple processors returned hundreds of hypothetical answers. Another batch of processors checked and rechecked these answers against the stored data, assigning probabilities for correctness. During the three seconds that these operations took, Watson’s onstage avatar, a color-shifting globe with expressive lines fluttering across its face, gave the distinct impression of someone thinking deeply.

  Then the buzzer.

  “What are tools?” answered Watson in a cheerful computer voice. The confidence rankings for the top candidates had been “tools” at 84 percent, “Yogi Berra” at 10 percent, and “explorer” at 3 percent. So tools it was.
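That buzz-or-pass decision, generating candidate answers in parallel, scoring each for confidence, and responding only above the 50 percent threshold, can be sketched in a few lines of Python. This is a toy illustration, not IBM’s actual DeepQA code; the three candidates and their scores are the ones just cited, and the question-style phrasing of the two runners-up is mine.

    # Rank the hypothetical answers by confidence; buzz only if the
    # best one clears the 50 percent threshold described above.
    candidates = {
        "What are tools?": 0.84,
        "Who is Yogi Berra?": 0.10,
        "What is an explorer?": 0.03,
    }
    BUZZ_THRESHOLD = 0.50

    def decide(candidates, threshold):
        """Return the top-ranked response if it clears the threshold, else None."""
        best, confidence = max(candidates.items(), key=lambda kv: kv[1])
        return best if confidence >= threshold else None

    print(decide(candidates, BUZZ_THRESHOLD) or "(no buzz)")  # -> What are tools?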

  “You are right for $2,000,” said Alex.

  By the end of the game Watson had passed Ken’s $19,200 and Brad’s $21,600 to win with $41,413, becoming the first nonhuman champion of Jeopardy!

Below his written answer to the Final Jeopardy question, Ken had scrawled a footnote: “I, for one, welcome our new computer overlords.”

  THE MANDATE

  The arc of human talent

Over the last 13 billion years or so, the universe has been under the thumb of entropy. Entropy is the force that causes the usable energy in a system to decrease over time. It’s a tendency for things to become disorderly, lose their purposeful integrity, and finally die or simply become meaningless. Think of decaying orbits, dying suns, rotting plants, rusting iron, forest fires, old newspapers, or the destructive path of war.

  Standing in opposition to entropy is life. Life is the impulse to fight against entropy. Remember Victor Laszlo in Casablanca? “If we stop breathing, we’ll die. If we stop fighting our enemies, the world will die.” Our common enemy is entropy; the impulse to resist entropy is called extropy.

The battle between entropy and extropy is literally a life-and-death struggle. Creatures are born, fight for life, and die in huge numbers year after year. Yet considering the enormous power of entropy, the battle is going pretty well. Individual lives end, but important life lessons are passed down through DNA. In the case of humans, additional learning is passed down through various forms of culture. Since the day our hominid ancestors first fashioned a stone blade, around two and a half million years ago, the human race has evolved into the most awesome entropy-fighting species on Earth.

  If evolution continues on its path towards increasing order, complexity, and beauty, entropy will slowly recede into the shadows. At least that’s the theory. Whether it happens or not will depend a great deal on what humans do in this very century. At seven billion strong, we’re now the most populous mammals on the planet with the possible exception of rats. And since the rats aren’t likely to be game changers, it’s up to us. We’re the ones that will make or break the future.

  The arc of human evolution is really the arc of human talent. For our purposes, we can define talent as an inherited and learned ability to create beautiful things—whether they're tools, objects, experiences, relationships, situations, solutions, or ideas. If the outcomes are not beautiful, the maker is demonstrating creativity, but not necessarily talent. Talent works on a higher level than creativity. It requires highly developed “making” skills. And making skills can only be learned by making things.

  The name Homo erectus means upright man, but the real evolutionary advantage of erectus was not walking. It was working. When our hominid ancestors came down from the trees, their hands were finally freed to do other things. This, in turn, encouraged evolutionary changes to their hands. The human hand, with its articulate fingers and opposable thumb, turned out to be the lever that launched what we now call technology.

  Our long history matters because it tells us what evolution designed us to do. If our evolutionary purpose were only to eat and reproduce, we wouldn’t have needed large brains or opposable thumbs. We wouldn’t have needed technology or art. We wouldn’t have developed an interest in social networking or space travel. We wouldn’t have changed our biology as much as we have—more than any other mammal in history. Therefore we must be designed for something else.

  Does this mean we’re born with a purpose?

  While some might say yes, I say no. I think we come into this world with a set of evolution-derived capabilities that both suggest and limit what we can do. If any of us have a guiding purpose, it’s only because we’ve chosen one. Which, as I’ll go into later, might not be a bad idea.

If you’re looking for a ready-made purpose, the transhumanists have a breathtaking one for you. They believe that humankind’s purpose is to accelerate evolution beyond its current biological limitations, so that increasing levels of intelligence, complexity, and love will spread throughout the universe. They see modern-day humans as midwives in this process. The life forms that will take it from here, in the transhumanist view, will be human-machine combos and human-made biological beings. The exact point at which machines will supersede humans is called the Singularity, a term coined by information theorist John von Neumann. They predict it will occur sometime between 2030 and 2050.

  Why would someone believe this? Hard to say. But the underpinnings of this notion are rational, and worthy of serious consideration no matter what conclusions you draw from them.

Think about this: The world’s ability to store, communicate, and compute information has grown at annual rates of at least 23 percent since 1986. The total amount of digital information is now increasing tenfold every five years. A total of five exabytes of data existed in 2003. Today the world is generating the same amount every two days. If we could put this data on CD-ROMs and pile them up, the stack would extend past the moon. The amount of information in existence is around 1.27 zettabytes; a zettabyte is 1,000 exabytes, and a single exabyte is the equivalent of some 4 billion books.
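The stack-of-discs image is easy to sanity-check. Here’s a quick back-of-the-envelope calculation in Python; the 700-megabyte capacity and 1.2-millimeter thickness are standard CD-ROM figures, and 384,400 kilometers is the average distance to the moon.

    ZETTABYTE = 10**21                  # bytes
    data = 1.27 * ZETTABYTE             # all digital information, per the estimate above
    cd_bytes = 700 * 10**6              # standard CD-ROM capacity
    cd_thickness_km = 1.2e-6            # a 1.2 mm disc, expressed in kilometers

    discs = data / cd_bytes             # about 1.8 trillion discs
    stack_km = discs * cd_thickness_km  # about 2.2 million kilometers
    print(stack_km / 384_400)           # roughly 6 times the distance to the moon

Even with generous rounding, the pile overshoots the moon several times over.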

In a recent two-month period, more videos were uploaded to YouTube than have been aired since 1948 on the big three networks. Wikipedia has generated over 13 million articles in more than 200 languages in a single decade. This is what can happen when technology is open to everyone.

  The democratization of knowledge is a profit platform, too. Amazon has a goal of making every book ever printed available in any language in under 60 seconds. Google’s mission is to organize all the world’s information, not just book information, and make it universally accessible.

  Ten years ago, former IBM chief Lou Gerstner noted that complexity was spiraling upward faster than the capability of humans to handle it. “Therefore,” he said, “the infrastructure itself—from end to end—will have to be reengineered to have the ability to perform many tasks that require human intervention today. What is coming is a kind of computing that will take its cue from the human autonomic nervous system.” He characterized this as a “self-awareness” that would allow systems to defeat viruses, repel attacks, and repair themselves on the fly. Today our electronic networks are so rich and complex that they’re beginning to behave like biological systems.

In John von Neumann’s day, computing was a simple, step-by-step affair that precluded any humanlike abilities such as pattern recognition, learning, or self-awareness. Yet only a half century later, IBM and the US government are investing in “cognitive computer chips” that can perform massively parallel processing—the same kind that occurs in the human brain. These new chips consume very little power and have a fundamentally different design, using an arrangement of 256 neuron-like nodes to make a “neurosynaptic core.”

Watson was a jury-rigged version of a massively parallel computer chip. Some critics dismissed IBM’s accomplishment on the premise that a room-sized computer is not a viable alternative to a three-pound human brain. But keep in mind that today’s iPhone has as much processing power as a whole Cray supercomputer had only 25 years ago. In 25 more years it will be the size of a blood cell.

Growing computer power is enabling four interconnected technologies, which in turn are driving exponential change. These are information technology, nanotechnology, genetics, and robotics. Kevin Kelly, in his excellent book What Technology Wants, has labeled this ever-growing community of tools the technium. Speaking of computer chips, he says that “tiny synthetic minds no bigger than an ant’s know where on Earth they are and how to get back to your home (GPS); they remember the names of your friends and translate foreign languages [and] unlike the billions of minds in the wild, the best of these technological minds are getting smarter by the year.”

  The math behind the technium is similar to Moore’s Law, the 1965 prediction that the amount of computing power you can buy for a dollar will double every 18 months. Gordon Moore’s formula has surprised everyone with its consistency. Looking back, if Apple had invented the iPod in 1961 instead of 2001, each unit would have cost $3 billion and required a trailer to haul it around. Now the device costs only $50 and can fit on a watchband.

  “We’re up to something like the 28th doubling of computing power,” says Joel Garreau, author of Radical Evolution. “Doubling is an amazing thing. It means that each step is as tall as all the previous steps put together.” This amounts to an exponential increase of about 100 million times. “You’ve never seen a curve like that in all of human history.”
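Both halves of Garreau’s point, that each doubling equals everything that came before, and that a few dozen doublings compound into a nine-digit factor, fall out of simple arithmetic; in fact 2 to the 28th power is about 268 million, so “about 100 million times” is, if anything, conservative. A toy loop in Python makes it concrete:

    total = 1                   # computing power at the start, in arbitrary units
    gains = []                  # size of each individual step
    for step in range(28):      # 28 doublings
        gains.append(total)     # each doubling adds as much as everything so far
        total += total

    print(total)                        # 268,435,456, i.e. 2**28
    print(gains[-1] - sum(gains[:-1]))  # 1: the last step just tops all previous steps combined

The same math cross-checks the iPod example above: 1961 to 2001 spans 40 years, or roughly 26 doublings at Moore’s 18-month pace, and 2 to the 26th is about 67 million, the same order of magnitude as the fall from $3 billion to $50 per unit.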