One trend I actively follow is automation. In 20 years, what jobs will be done by robots? With new advancements in technology, it’s no longer just routine data-entry and labor jobs that are at risk. Robots can now analyze emotions, perform surgical operations, and even drive semi-trucks across the country.
In my book Chasing Excellence, which I co-wrote with Lee & Associates founder Bill Lee, I discuss this topic. I argue, as Geoff Colvin does in the article below, that this is the wrong question to ask. Instead, we should be asking, “What are the activities that we humans…will simply insist be performed by other humans, even if computers could do them?”
We both believe:
– We can trust human leaders. CEOs, politicians, judges: we trust humans to make leadership decisions. If you are a leader in your field, your job will be safe for a long time.
– Humans can collaborate. Teamwork is vital to our society. A team of people working together not only knows how to solve a problem but can also identify which problems are worth solving.
– Humans like interacting with other humans. This is the big one. How often do you call a customer service number only to get frustrated with the automated voice system? We like talking to humans. Period.
Want to discuss what this means for you or your business? Give me a call.
602.954.3762
Humans are underrated
By GEOFF COLVIN
July 23, 2015
As the Pepper robot from Softbank scurries about your home or office, it reads your emotions by your words, tone of voice, facial expressions, and body language. It then responds in all those ways; its hands and posture in particular are remarkably expressive. If you thought emotions were beyond the competencies of robots, you were right for a long time. But no more.
Maybe you believe that humans uniquely will always have to perform the highest-stakes, most delicate and demanding tasks in our lives, such as surgery. But researchers at the University of California at Berkeley are training a robot to identify and cut away cancerous tissue—not like today’s surgical robots, which are actually tools used by human surgeons, but entirely on its own.
Or perhaps you figure technology, for all its wonders, is just nibbling away at the edges of human employment. There aren’t that many surgeons, after all. But in May, Daimler began testing the first self-driving semitruck on the roads of Nevada. The No. 1 job among American men, held by 2.9 million of them, is truck driver. Not that women are safe. Technology will continue to devour clerical and office tasks, and the No. 1 job among U.S. women, held for now by 3 million of them, is administrative assistant.
The greatest anxiety troubling workers today is embodied in a simple question: How will we humans add value? Popular culture is obsessed by it. Humans, a new series on the AMC network, spins a story from the promise and perils of eerily humanoid robots called synths. That seems to be Hollywood’s 2015 theme of the year. Think of Ex Machina (humanoid robot outsmarts people, kills a man, enters society as a person) or Terminator Genisys (Arnold Schwarzenegger’s humanoid robot must again save the world) or Avengers: Age of Ultron (humanoid robot tries to eradicate humanity) or Chappie (bad guys try to destroy humanoid robot police officer who is reprogrammed to think and feel). The big idea is always the same: For good or ill, machines become just like people—only better.
We humans have good reason to be uneasy. Strange things are happening in the economy. Ever fewer men of prime working age—the group that historically has been the most thoroughly employed—are working (see chart), and while several factors are feeding the trend, most economists believe that advancing technology is one of them. In factories and offices, on construction sites and behind counters, technology keeps doing more jobs better than people.
[Chart: Why are so many men not working? The share of U.S. men in their prime working years who aren’t employed has risen sharply since 1980, through recessions and expansions—a dramatic and unprecedented long-term shift in employment. Many economists believe that technological unemployment is an important factor in the trend, suggesting it’s unlikely to turn around soon. Source: St. Louis Fed, from OECD data]
Fear of technological unemployment is as old as technology, and it has always been unfounded. Over time and across economies, technology has multiplied jobs and raised living standards more spectacularly than any other force in history, by far. But now growing numbers of economists and technologists wonder if just maybe that trend has run its course. That’s why former Treasury Secretary Lawrence H. Summers says these issues will be “the defining economic feature of our era.”
How will we humans add value? There is an answer, but so far we’ve mostly been looking for it in the wrong way. The conventional approach has been to ask what kind of work a computer will never be able to do. While it seems like common sense that the skills computers can’t acquire will be valuable, the lesson of history is that it’s dangerous to claim that there are any skills computers cannot eventually acquire. The trail of embarrassing predictions goes way back. Early researchers in computer translation of languages were highly pessimistic that the field could ever progress beyond its nearly useless state as of the mid-1960s; now Google translates written language for free, better all the time thanks to feedback from human users, and Skype translates spoken language in real time, for free. Hubert Dreyfus of MIT, in a 1972 book called What Computers Can’t Do, saw little hope that computers could make significant further progress in playing chess beyond the mediocre level then achieved, but IBM’s Deep Blue beat world champion Garry Kasparov in 1997. Economists Frank Levy and Richard J. Murnane, in an excellent 2004 book called The New Division of Labor, explained how driving a vehicle requires such complex split-second judgments that it would be extremely difficult for a computer ever to handle the job; Google introduced its autonomous car six years later. Harvard psychologist Steven Pinker observed in 2007 that “assessing the layout of the world and guiding a body through it are staggeringly complex engineering tasks, as we see by the absence of dishwashers that can empty themselves or vacuum cleaners that can climb stairs.” Yet iRobot soon thereafter was making vacuum cleaners and floor scrubbers that find their way around the house without harming furniture, pets, or children, and was also making other robots that climb stairs; it could obviously make machines that do both if it believed demand were sufficient. And the Armar IIIa robot, developed at Karlsruhe Institute of Technology in Germany, can unload (and load) the dishwasher.
The pattern is clear. Extremely smart people note the overwhelming complexity of various tasks, including some, like driving a car, that people handle almost effortlessly, and conclude that computers will find mastering them terribly tough. Yet over and over it’s just a matter of time until the feat is accomplished, often less time than anyone expects. We just can’t get our heads around the notion of computer processing power doubling every two years. At that rate, infotech power increases by a factor of a million in 40 years. The computing visionary Bill Joy likes to point out that jet travel is faster than walking by a factor of 100, and that changed the world. Nothing in our experience prepares us to grasp a factor of a million. At the same time, increasingly sophisticated algorithms let computers handle complex tasks using less computing power. So year after year we reliably commit the same blunder of underestimating what machines will do.
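That factor-of-a-million figure follows directly from the doubling rate. Here is a minimal back-of-the-envelope check, a sketch in Python that assumes only a clean two-year doubling period and nothing else about the underlying hardware:

```python
# If processing power doubles every 2 years, a 40-year span contains 20 doublings,
# and 2**20 is roughly a million.
years = 40
doubling_period_years = 2

doublings = years // doubling_period_years   # 20 doublings
growth_factor = 2 ** doublings               # 1,048,576

print(f"{doublings} doublings -> power grows by a factor of {growth_factor:,}")
```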
Yes, figuring out what computers will never do is an exceedingly perilous route to determining how humans can remain valuable. A better strategy is to ask, What are the activities that we humans, driven by our deepest nature or by the realities of daily life, will simply insist be performed by other humans, even if computers could do them?
Humans will remain in charge
A large category of those activities comprises roles for which we demand that a specific person or persons be accountable. A useful example is making decisions in courts of law, which we will require that human judges render for quite a long time to come. It’s an example in which the human vs. computer question is not hypothetical. Parole decisions are made by judges in some countries, such as Israel, where researchers investigated how those decisions are influenced by the critical human issue of lunch. Over the course of a day, the judges approve about 35% of prisoners’ applications for parole. But the approval rate declines steadily in the two hours before lunch, almost to zero just before the lunch break. Immediately after lunch, it spikes to 65% and then again declines steadily. If you’re a prisoner, the number of years you spend behind bars could be affected significantly by whether your parole application happens to be the last one on the judge’s stack before lunch or the first one after. Data-driven algorithms have proved superior to human judges and juries in predicting recidivism, and it’s virtually certain that computer analysis could judge parole applications more effectively, and certainly less capriciously, than human judges do. Yet how would you rate the chances of that job getting reassigned from judges to machines? The issue isn’t computer abilities; it’s the social necessity that individuals be accountable for important decisions. Similarly, it seems a safe bet that those in other accountability roles—CEOs, generals, government leaders at every level—will remain in those roles for the same reason.
Humans must work together to set collective goals
In addition, humans rather than computers will have to solve some problems for purely practical reasons. It isn’t because computers couldn’t eventually solve them. It’s because in real life, and especially in organizational life, we keep changing our conception of what the problem is and what our goals are. Those are issues that people must work out for themselves, and, critically, they must do it in groups. Partly that’s because organizations include many constituencies that must be represented in problem solving, and partly it’s because groups can solve problems far better than any individual can.
Only humans can satisfy deep interpersonal needs
A more important category of people-only work comprises the tasks that we must do with or for other humans, not machines, simply because our most essential human nature demands it, for reasons too deep even to be articulated. We are social beings, hardwired from our evolutionary past to equate personal relationships with survival. We want to work with other people in solving problems, tell them stories and hear stories from them, create new ideas with them, because if we didn’t do those things on the savanna 100,000 years ago, we died. The evidence is clear that the most effective groups are those whose members most strongly possess the most essentially, deeply human abilities—empathy above all, social sensitivity, storytelling, collaborating, solving problems together, building relationships. We developed these abilities of interaction with other people, not machines, not even emotion-sensing, emotion-expressing machines. We may enjoy the Pepper robot, but we didn’t evolve to interact with it.
[Photo: A U.S. Army officer meets with local elders in Afghanistan. The U.S. military has realized that its most important work is now conducted in “the human domain”; it’s ahead of most other institutions in training skills of personal interaction. Robert Nickelsberg—Getty Images]
We want to follow human leaders, even if a computer could say all the right words, which is not an implausible prospect. We want to hear our diagnosis from a doctor, even if a computer supplied it, because we want to talk to the doctor about it—perhaps just to talk and know we’re being heard by a human being. We want to negotiate important agreements with a person, hearing every quaver in his voice, noting when he crosses his arms, looking into his eyes.
To look into someone’s eyes—that turns out to be, metaphorically and quite often literally, the key to high-value work in the coming economy.
It isn’t just theory. Changes in the nature of work of exactly this type are happening on a significant scale. Ask employers which skills they’ll need most in the next five to 10 years, as the Oxford Economics research firm did, and the answers that come back do not include business acumen, analysis, or P&L management—left-brain thinking skills that computers handle well. Instead, employers’ top priorities include relationship building, teaming, co-creativity, brainstorming, cultural sensitivity, and ability to manage diverse employees—right-brain skills of social interaction. Those responses fit well with big-picture data on how Americans work today vs. how they worked in the 1970s. The biggest increases by far have been in education and health services, which have more than doubled as a percentage of total jobs; professional and business services, up about 80%; and leisure and hospitality, up about 50%. The overall trend is a giant employment increase in industries based on personal interaction. That’s why Oracle group vice president Meg Bear says, “Empathy is the critical 21st-century skill.”
Other research supports that impression. The McKinsey Global Institute found that from 2001 to 2009, transaction jobs (bank teller, checkout clerk) decreased by 700,000 in the U.S., and production jobs decreased by 2.7 million. But jobs of human interaction—doctors and teachers, for example—increased by 4.8 million. All those trends have continued. The institute reported that interaction jobs have become “the fastest-growing category of employment in advanced economies.”
No one should be surprised. Harvard professor William H. Bossert, a legendary figure at the school with wide-ranging interests in math and biology, taught a pioneering computer science course for undergraduates in the early 1970s, the first such course ever offered at Harvard. He devoted his final lecture to the future of computing and its likely effects. Intel had just produced its first chip, and people were worried about computers eliminating jobs. Bossert’s emphatic response was that computers would indeed eliminate jobs, and we should be grateful because we could then focus on the essence of being human, doing what we were meant to do. That observation led him to a memorable conclusion: “If you’re afraid that you might be replaced by a computer, then you probably can be—and should be.”
It has taken a while, but the large-scale takeover of many thinking tasks by computers, leaving people with the deeply human tasks of social interaction, is becoming a broad phenomenon.
Since the dawn of the Industrial Revolution—the machine age—much human success has derived from our being machine-like. For decades, most of the physical work in factories and the mental work in offices were repetitive and routine. They were designed to be that way; that’s why Henry Ford complained, “Why is it every time I ask for a pair of hands, they come with a brain attached?” It was the kind of work for machines to do, only the machines of the era couldn’t do it. The machines improved, slowly at first, then rapidly, driven by the ever-quickening advance of infotech. Now they can actually do most of the machine work of our world.
As a result, the meaning of great performance has changed. It used to be that you had to be good at being machine-like. Now, increasingly, you have to be good at being a person. Great performance requires us to be intensely human beings.
To put it another way: Being a great performer is becoming less about what you know and more about what you’re like.
The emerging picture of the future casts conventional career advice in a new light, especially the nonstop urging that students study coding and STEM subjects—science, technology, engineering, math. It has been excellent advice for quite a while; eight of the 10 highest-paying college majors are in engineering, and those skills will remain critically important. But important isn’t the same as high-value or well-paid. As infotech continues its advance into higher skills, value will continue to move elsewhere. Engineers will stay in demand, it’s safe to say, but tomorrow’s most valuable engineers will not be geniuses in cubicles; rather they’ll be those who can build relationships, brainstorm, collaborate, and lead.
[Chart: As demand for empathy grows, supply shrinks. Researchers analyzed 72 studies that measured empathy in about 14,000 college students since 1979 and found a broad decline over time. Their empathy seems unlikely to increase; separate research suggests this quality declines with age. Source: Sarah Konrath, Edward H. O’Brien, and Courtney Hsing, “Changes in Dispositional Empathy in American College Students Over Time: A Meta-Analysis,” Personality and Social Psychology Review (2010)]
As a changing economy revalues human skills, it seems logical to see the trend as the latest step in a long progression: For centuries people have improved their living standards by mastering new skills that a new economy rewards. But the skills that are becoming most valuable now, the skills of deeply human interaction, are not like those other skills. Learning to be more socially sensitive is not like learning algebra or how to operate a lathe or how to make a well-functioning blog in WordPress. That means that some people will have a much easier time adapting than others will.
On average, women are better at many of these increasingly valuable skills than men are. Overall, they reliably score higher on tests of empathy and social sensitivity than men do. Since research shows that the best-performing groups tend to be those whose members are best at those skills, it follows that groups with a higher proportion of women tend to do better. In fact, some research shows that groups consisting entirely of women are more effective than groups that include even one man.
That doesn’t mean that men are doomed to irrelevance. Within genders are enormous differences in the interpersonal abilities that people bring to adulthood, even before any training they may receive, which for most people is little or none. Everyone can get better, but it will be hard for some people, and some simply won’t want to do it. It isn’t about what they know. It’s just the way they are.
Southwest Airlines once hired a high-level employee for its information technology operations and quickly began to suspect it had made a mistake. After he’d been on the job for only a week or so, the company’s HR chief asked him how things were going.
“People here are strange,” he replied. “They want to talk to me in the hallway! They ask how my day has been, and they really want to know! And I just want to go back to my cube and work.”
An IT guy who wants to be left alone in his cube is not exactly a surprise. It’s practically a stereotype. But it was a big problem at Southwest.
This company succeeds in one of the world’s most miserable industries. It prospers because, as its managers have always understood, it knows the value of human interaction externally and internally. The ability of employees to engage customers with humor, energy, and generosity is crucial to creating value in an experience that is not, on its face, all that appealing. For employees who work strictly with one another behind the scenes, the business is as grindingly competitive as it is for any other airline, and doing the job is not a walk in the park. Co-workers who ask about each other and like to tell a joke are key to keeping everyone going.
So an employee who’s uninterested in human interaction is trouble. His immediate depressive effect on those around him, bad enough by itself, could start to spread. Even if it doesn’t, it’s a problem. The company’s culture is a big reason, maybe the main reason, that so many people want to work there. It’s why, when the company has 3,000 jobs to fill, it gets 100,000 applications. If a newly hired young person comes to work on his first day and meets this guy, he’ll conclude that the Southwest culture isn’t at all what he had thought. He’ll be unhappy, possibly resentful, and he’ll spread the word.
So Southwest’s managers decided that their new IT guy, despite his excellent credentials, had to go. He was dismissed in short order.
For people like him, life will be increasingly difficult. Organizations used to have places for them, in solid middle-class jobs in offices and factories. But those are the jobs that technology is already taking over rapidly. As the shift in valuable skills continues, organizations are finding not only that they have no jobs for the disengaged and socially inept, but also that such people are toxic to the enterprise and must be removed.
The Cleveland Clinic learned a similar lesson. Over the past five years it has developed a pathbreaking and dramatically effective program to train all employees and contractors in empathy and relationship building. The clinic found that a few of its people were in the wrong business. “Off-board people who don’t belong,” concluded Dr. James Merlino, who led the transformation effort. “One disengaged employee who does not support the organization or the mission can have negative consequences for an entire department. The hardworking and engaged employees will resent these people being around.” When the human experience is what counts most, one wrong human is one more than you can afford.
[Photo: High-value skills: Medical researchers conducting clinical trials of an Ebola vaccine in Sierra Leone confer after interviewing community leaders about cultural issues that could affect the trials; without deep cultural knowledge, obtained in person, the trials might not succeed.]
The current transformation of how people create value is historically quite sudden. Most people’s essential skills remained largely the same from the emergence of agriculture 12,000 years ago to the dawn of the Industrial Revolution in the mid-18th century. The transition to an industrial economy in the Western nations, and the accompanying shift in skill values, took well over 100 years. The subsequent turn to a knowledge-based economy took most of the 20th century. Now, as technology gallops ahead with longer strides every year, the transition to the newly valuable skills of empathizing, collaborating, creating, leading, and building relationships is happening faster than corporations, governments, education systems, or most human psyches can keep up with. That’s disorienting, and it gets more so as the fundamental nature of value shifts from what you know to what you’re like.
As economies have evolved over the centuries, we’ve always looked outward to get the new skills we require, to elders, schools, trainers, and employers that knew and could teach us what we needed to know. Now, for the first time, we must also look inward. That’s where we find the elements of the skills we need next. Developing those abilities will not be easy or comfortable for some, and it is likely to get harder for everyone, because as the abilities become more valuable, standards will rise. Even those who are good at them will have to get better.
If the prospect sounds worrying, it shouldn’t. On the contrary, it’s wonderful news. Just think of what we’re being asked to do—to become more essentially human, to be the creatures we once were and were always meant to be. Odd as it may sound, that’s a significant change from what we’re used to. For the past 10 generations in the developed world, and shorter but still substantial periods in many emerging economies, most people have succeeded by learning to do machine work better than machines could do it. Now that era is ending. Machines are increasingly doing such work better than we ever could. We face at least the opportunity to create new and better lives.
Staking our futures to our profoundest human traits may feel strange and risky. Fear not. When you change perspectives and look inward rather than outward, you’ll find that what you need next has been there all along. It has been there forever.
In the deepest possible sense, you’ve already got what it takes. Make of it what you will.