Why we shouldn’t be worried about robots taking our jobs just yet

By Michael Jones, University of Cincinnati

The U.S. economy added 2.7 million jobs in 2015, capping the best two-year stretch of employment growth since the late 1990s and pushing the unemployment rate down to five percent.

But to listen to the doomsayers, it’s just a matter of time before the rapid advance of technology makes most of today’s workers obsolete – with ever-smarter machines replacing teachers, drivers, travel agents, interpreters and a slew of other occupations.

Almost half of those currently employed in the U.S. are at risk of being put out of work by automation in the next decade or two, according to a 2013 University of Oxford study, which identified transportation, logistics and administrative occupations as most vulnerable.

Does that mean that these formerly employed workers will have nowhere to go? Is the recent job growth a last gasp before machines take over, or can robots and workers coexist?

Research and recent history suggest that these concerns are overblown and that we are neither headed toward a rise of the machine world nor a utopia where no one works anymore. Humans will still be necessary in the economy of the future, even if we can’t predict what we will be doing.


Rise of the Luddites

Today’s apprehension about technology’s effect on the labor force is nothing new.

The anxiety began in the early 1800s when textile workers, who later became known as Luddites, destroyed machinery that reduced the need for their labor. The fact that calling someone a Luddite today is considered an insult is proof that those worries were largely unfounded. In fact, labor benefited right alongside productivity throughout the 19th and 20th centuries.

Some worry that this dynamic has changed. Larry Summers, formerly the president of Harvard and director of the White House’s National Economic Council, for example, recently changed his tune about the unalloyed benefits of technology.

“Until a few years ago, I didn’t think this was a very complicated subject; the Luddites were wrong and the believers in technology and technological progress were right. I’m not so completely certain now,” he said.

Derek Thompson, a senior editor at The Atlantic, sums up the arguments for why this time automation will replace labor permanently in an article titled “A World Without Work.”

First, the share of economic output that is paid to labor has been declining. Second, machines are no longer merely augmenting human work; they are rapidly encroaching on work that until now only humans could do. Finally, the declining share of prime-age men (25 to 54 years old) in the workforce points to a more permanent end of work.


Crying wolf

My own look at the data suggests that just as the critics of the past were crying “wolf,” so are the pessimists of today.

Yes, it’s true that from 1980 to 2014, workers’ share of output fell from nearly 58 percent to just over 52 percent – evidence that Thompson believes shows that labor’s importance is in a slow decline.

However, recent work by Benjamin Bridgman, an economist at the Bureau of Economic Analysis, has demonstrated that once depreciation and production taxes are taken into account, the story for U.S. workers doesn’t seem as pessimistic. While the most recent data show that the U.S. net labor share has fallen over time, as recently as 2008, the share was the same as in 1975.

Because of the rapid pace of technological improvements, capital depreciates at a faster rate. Companies, or owners of capital, must therefore spend a larger share of profits to repair or replace obsolete technology. As a result, labor’s declining share of output is directly correlated with the increasing share of output spent on technology. Since 1970, the share of our nation’s output spent on technology replacement has increased from just under 13 percent to more than 15 percent.

In addition, whenever production taxes (e.g., property, excise and sales taxes) rise, the share of output paid to labor will fall. As a result, while the gross labor share of income has declined, much of that decline can be explained by technological improvements and changes in government policy.
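To make Bridgman’s adjustment concrete, here is a minimal sketch of the arithmetic in Python, using invented numbers purely for illustration (they are not his actual estimates): the gross labor share divides compensation by all of value added, while the net share first subtracts depreciation and production taxes from the denominator.

```python
# Illustrative sketch of gross vs. net labor share.
# All figures below are hypothetical, chosen only to show the mechanism.

def gross_labor_share(compensation, gross_value_added):
    """Labor compensation as a share of gross value added."""
    return compensation / gross_value_added

def net_labor_share(compensation, gross_value_added, depreciation, production_taxes):
    """Labor compensation as a share of value added net of
    depreciation and production taxes (Bridgman-style adjustment)."""
    return compensation / (gross_value_added - depreciation - production_taxes)

# Two hypothetical snapshots of an economy (arbitrary units)
early = dict(compensation=5.8, gross_value_added=10.0, depreciation=1.3, production_taxes=0.6)
later = dict(compensation=6.25, gross_value_added=12.0, depreciation=2.3, production_taxes=1.0)

for label, snap in (("early", early), ("later", later)):
    g = gross_labor_share(snap["compensation"], snap["gross_value_added"])
    n = net_labor_share(**snap)
    print(f"{label}: gross share {g:.1%}, net share {n:.1%}")

# early: gross share 58.0%, net share 71.6%
# later: gross share 52.1%, net share 71.8%
# With these made-up numbers, the gross share drops by about six percentage
# points while the net share barely moves, because faster depreciation and
# higher production taxes absorb the difference.
```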


Replace or complement?

Machines are indeed replacing humans – and replicating what we thought were uniquely human skills – at a faster rate than many of us thought possible until recently.

For example, at the beginning of the 21st century, few people would have imagined that a computer could beat the best human in the world at Jeopardy. And yet, in 2011, IBM’s supercomputer Watson did exactly that by beating two former Jeopardy superstars, Ken Jennings and Brad Rutter.

But a focus on technology’s substitutionary (or replacement) role fails to appreciate how it can also be complementary. Job losses in some occupations will certainly continue, but they will be accompanied by gains in different fields, just as in the past.

Watson is a case in point. In 2012, a year after Watson’s Jeopardy victory, IBM formed a partnership with the Cleveland Clinic to assist physicians and improve the speed and accuracy of medical diagnosis and treatments. In this case, Watson augments the skills of physicians, creating more demand for doctors with access to the supercomputer.

The biggest risk is that this will polarize the labor market, as the demand for workers grows on both the high and low ends of the education spectrum. It’s a trend that economist David Autor has documented going back to 1979. Highly skilled individuals in managerial, professional and technical occupations have seen employment gains, as have service jobs that require little education (in part because it’s difficult to automate the work of hairstylists or janitors).

While this polarization of jobs can have negative short-term effects in the middle of the distribution, it is a mistake to overstate the long-term consequences.


What’s really happening to all the men

Finally, it is true that since 1967, the share of men aged 25–54 without work has more than tripled, from five percent to 16 percent.

But the reasons they’re not working have less to do with the rise of the machines than we’re being led to believe. According to a New York Times/CBS News/Kaiser Family Foundation poll of Americans without jobs, 44 percent of the men surveyed said there were jobs in their area they thought they could get but weren’t willing to take. In addition, around a third of those surveyed (including women) indicated that a spouse, food stamps or disability benefits provided another source of income.

An unwillingness to relocate geographically may also help explain the decline in labor force participation. In a 2014 survey of unemployed individuals, 60 percent said that they were “not at all willing” to move to another state.

These findings suggest that while the U.S. boasts the most job openings since the government began tracking them nationwide (5.6 million), many of those without work don’t want to apply for one reason or another.

It’s not man versus machine yet

These figures and polls paint a very different picture of the actual problem. In addition to geographic constraints and to spousal and government income supports that reduce the incentive to work, we also have a skills gap. Fortunately, this is a problem we can overcome with better education and training, rather than resigning ourselves to an irreversible decline in the share of jobs that require a human.

During the most recent recession, there was a decline in construction and manufacturing jobs, which typically required lower levels of education, and an increase in health care and professional service jobs, which often require advanced degrees.

Instead of wringing our hands and blaming technology, we should be rolling up our sleeves to ensure that people who lose their jobs to technology are being retrained. This also requires patience – recognizing that it will take time for these workers to be reemployed in higher-skilled jobs.

Until the number of job openings declines and remains persistently low, we should be careful about pitting man against machine.


Michael Jones, Assistant Professor, Educator in Economics, University of Cincinnati

This article was originally published on The Conversation. Read the original article.
