Think technology is disrupting the job market like never before? Think again.

A new report from the Information Technology and Innovation Foundation analyzes the US labor market from 1850 to the present and finds that we are in an era of unprecedented calm. And that's not good.

Tesla vehicles are assembled by robots at the Tesla Motors Inc. factory in Fremont, Calif., in 2016.

Joseph White/Reuters/File

May 10, 2017

What if I told you that, contrary to the alarming headlines and eye-catching infographics you may have seen ricocheting around social media, new technologies aren’t shaking up the labor market very much by historical standards? You might think I was as loopy as a climate-change denier and suggest that I open my eyes to all the taxi drivers being displaced by Uber, the robots taking over factories, and artificial intelligence doing some of the work lawyers and doctors used to do. Surely, we are in uncharted territory, right?

Right, but not in the way you think. If you study the US labor market from the Civil War era to present, you discover that we are in a period of unprecedented calm – with comparatively few jobs shifting between occupations – and that is a bad sign. In fact, this low level of “churn” is a reflection of too little, not too much technological innovation: Lack of disruption is a marker of our historically low productivity growth, which is slowing improvement in people’s living standards.

A new report from the Information Technology and Innovation Foundation (ITIF) examines this trend in detail using large sets of US Census data that researchers at the Minnesota Population Center have curated to harmonize occupational classifications over long periods. ITIF’s analysis quantifies the growth or contraction of individual occupations, decade by decade, relative to overall job growth, and it assesses how much of that job churn – whether growth or contraction – is attributable to technological advances. The report concludes that, rather than increasing, the rate of occupational churn in recent years has been the lowest in American history – only about one-quarter to one-third of the rate we saw in the 1960s, depending on how you measure contracting occupations.
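To make the idea concrete, here is a minimal sketch of what an occupational churn metric of the kind described above might look like. This is not ITIF's actual methodology – the function name, the toy data, and the exact formula (absolute deviation of each occupation's growth from economy-wide growth, summed and scaled by starting employment) are illustrative assumptions only.

```python
def occupational_churn(start_jobs, end_jobs):
    """Illustrative churn metric (not ITIF's actual formula).

    start_jobs / end_jobs: dicts mapping occupation -> employment count
    at the start and end of a decade. Returns churn as a fraction of
    start-of-decade employment: how far occupations deviated from
    simply tracking overall job growth.
    """
    total_start = sum(start_jobs.values())
    total_end = sum(end_jobs.values())
    overall_growth = total_end / total_start  # economy-wide growth factor

    churn = 0.0
    for occ, n_start in start_jobs.items():
        n_end = end_jobs.get(occ, 0)
        expected = n_start * overall_growth   # if it had tracked the economy
        churn += abs(n_end - expected)        # jobs shifted in or out
    return churn / total_start

# Toy decade: one occupation collapses while another booms.
decade_start = {"pin setters": 100, "mechanics": 100, "clerks": 100}
decade_end   = {"pin setters": 10,  "mechanics": 250, "clerks": 110}
print(round(occupational_churn(decade_start, decade_end), 3))  # -> 0.844
```

A decade in which every occupation grew at the economy-wide rate would score zero; large scores mean heavy reshuffling between occupations, which is the "churn" the report finds at historic lows.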


Has technology eliminated a lot of jobs? Yes, absolutely. In fact, ITIF’s analysis shows there was not a single decade from 1850 through 2010 in which technology didn’t destroy more jobs than it directly created (although the last 15 years have seen the highest ratio of jobs created by technology to jobs destroyed by technology since World War II). Yet the US economy has created jobs at a steady, robust rate as the workforce has grown, and unemployment usually has been low.

As new technologies eliminate jobs in some occupations, they create new employment elsewhere. For example, the number of automotive mechanics rose from zero in 1900 (before Henry Ford started the Ford Motor Company) to 425,000 just 30 years later, as millions of Americans bought cars and needed them repaired. Car mechanic jobs peaked at over 1.8 million in 2000. But as new cars have continued to improve, and as they have incorporated more sophisticated electronic systems that keep them from needing as many tune-ups, the number of mechanics’ jobs has been falling – to around 1.5 million as of 2015 – even as the number of cars has increased and they have been staying on the road longer. 

That is not to say there are no jobs available for the people who used to be auto mechanics. But for the most part, new jobs are not in occupations directly related to producing or servicing new technologies. Instead, technological change mostly creates new jobs indirectly – as the knock-on benefit of boosting productivity: New innovations allow workers and firms to produce more, so their wages go up and prices go down, which increases spending and in turn creates more jobs throughout the economy, from cashiers and construction workers to designers and engineers.

Think for a minute about bowling alleys. Before World War II, tens of thousands of men and boys had jobs setting up the pins after bowlers knocked them down. Then AMF developed the automatic pin-setting machine and those jobs soon vanished. The men and boys who used to set pins manually didn’t go to work at AMF. Instead, the productivity of bowling alleys increased, so their prices went down, and bowlers now had a bit more cash to spend on other things. That’s where the men and boys who used to set pins went: into jobs providing things their former customers now wanted to buy. It was the same for elevator operators who were displaced by self-service lifts and for railroad brakemen, motion picture projectionists, telephone operators, and generations of workers in other occupations made more productive by technology.

But now there is an overwrought school of thought that says today’s technological disruptions – powered by the Internet, robots, and artificial intelligence – are somehow completely different from anything that has come before, because they are purportedly automating every occupation in sight. A much-ballyhooed study by Oxford University researchers Carl Benedikt Frey and Michael A. Osborne set the tone in 2013 when it trumpeted the jarring conclusion that 47 percent of US employment is at risk of computerization. The study wasn’t peer reviewed, and in the bright light of day its methodology turned out to be conspicuously flawed. (Just a few of the more laughable examples of the people it assumed would be displaced by robots: fashion models, manicurists, barbers, and, my personal favorite, school bus drivers. Anyone want to let their middle-school child ride an autonomous school bus without an adult present?)


Others have actually claimed that Frey and Osborne are underestimating the likely impact of technology. Silicon Valley gadfly Vivek Wadhwa asserts that 80 to 90 percent of US jobs will be eliminated in the next 10 to 15 years. Still others, like the opinion-leading World Economic Forum, regularly tweet legitimate-looking statistics that break down the risk of job loss in excruciating detail. (Computer programmers allegedly face a 48 percent probability their jobs will be automated; for personal financial assistants, the probability is 58 percent; and for bartenders, it is 77 percent. Wait. Bartenders? Who’d want to banter about sports or politics with an AI? Hmm…)

Aside from being methodologically suspect and, as ITIF shows, ahistorical, this false alarmism is politically dangerous, because it feeds the notion that we should pump the brakes on technological progress, avoid risk, and maintain the status quo – a foolish formula that would lock in economic stagnation and ossify living standards. Policymakers certainly can and should do more to improve labor-market transitions for workers who lose their jobs. But if there is any risk for the near future, it is that technological change and productivity growth will be too slow, not too fast.

So, let’s all take a deep breath and calm down. Labor market disruption is not abnormally high; it’s at an all-time low, and predictions that human labor is just a few more tech “unicorns” away from redundancy are vastly overstated, as they always have been.

Robert D. Atkinson (@RobAtkinsonITIF) is president of the Information Technology and Innovation Foundation, a leading science- and tech-policy think tank.