by Carl R. Tannenbaum, Chief Economist, Northern Trust
History suggests that it is better to embrace progress than hinder it.
Editor’s Note: This is an updated version of a piece we originally published in 2017, well before the hype surrounding artificial intelligence (AI) gathered momentum. The material is even more pertinent today than it was seven years ago.
The first computer I ever used was a monster. I have no idea how they got the device up to the second floor of my grade school (which was more than 80 years old and had no elevator), and the lights in the library flickered every time we turned the thing on. To bring the machine to life, we had to dial a number on an adjacent telephone, wait for a tone and then plunge the receiver into a set of suction cups.
Interestingly, the first thing I ever did on that contraption was play Mancala, a game invented by Bedouins thousands of years ago using sand pits and stones. The computer defeated me the first few times, but then I began to recognize patterns in its play that made its moves easy to anticipate. After I began to beat it consistently, I rapidly lost interest because the opponent was overly robotic.
That vintage computer seems as primitive today as the idea of playing games in the sand was back in the 1970s. Computers have become much more powerful, and they are able to learn from their experiences; today’s software plays chess and Gomoku better than the best human players. With artificial intelligence now being trained for commercial applications, many are wondering what else machines might eventually do better than people. And that prospect is creating economic unease around the world.
Mankind has always had mixed feelings about machines.
The transition from man to machine is a long-running industrial theme. Two hundred years ago, steam-powered mills began replacing human weavers (prompting destructive retaliation from the Luddites). In the 1800s, the advent of automated reapers disrupted agriculture. In the last century, robots took deep root on the factory floor. And, most recently, artificial intelligence has started to transform a number of professions.
Economic theory suggests workers should welcome technological advances, not fear them. Anything that makes labor more productive raises its value, which generally results in better pay, more leisure time, or both. At a time when global productivity growth has been slow, the push for additional efficiency should be seen as a good thing.
Despite the inexorable march of innovation, employment has continued to rise. When change occurs, it is fairly easy to identify the jobs that might be at risk but much harder to anticipate the ones that will arise to take their places. Yet those new opportunities have always appeared, and market economies adapt to embrace them. Thirty years ago, few would have foreseen that thousands of people would be working on cybersecurity, gene-based therapies or driverless logistics. But all are growth fields today.
Further, the developed world has a demographic problem. Postwar generations are transitioning into retirement, birth rates are low and global attitudes towards immigration are under review. With labor force growth slowing to a crawl, advancing automation would seem to be an elegant solution. The example of long-haul logistics is often cited here: driverless transportation could compensate for substantial impending retirements from the ranks of truck drivers and train engineers.
Nonetheless, there seems to be significant apprehension about current advances in AI. The science seems to be moving ahead rapidly, affecting work across a broad range of industries. The race to disrupt has captured the imagination of venture capitalists, but strikes fear into the hearts of the masses.
It has traditionally been assumed that human beings would transition from repetitive work to more cerebral work as automation took deeper root. Programmers would provide instructions to the machines, technicians would monitor their performance and analysts would interpret the immense amounts of data collected in the process.
Technological change can create complicated politics.
But advances in artificial intelligence have enabled machines to analyze outcomes, communicate in natural language and direct their own actions. Interestingly, a good deal of programming is now being done by AI itself. Extrapolating from these developments, observers have created lists of jobs that will become obsolete (in whole or in part) in the coming decades.
This evolution turns previous waves of automation on their heads: more educated workers in developed countries are the most vulnerable this time around, and fields like finance, engineering and the law are among the most exposed.
Questions also surround who will reap the rewards of the impending transformation. AI is forecast to usher in a new era of productivity growth, which should raise the marginal value of labor. But there is some concern that the value unlocked by the new technology will be concentrated in the hands of a few companies and individuals, and not the population at large.
Add in concerns about the use of AI by cybercriminals, and you have the makings of broad-based anxiety. And that feeling can affect what people do inside the voting booth.
While today’s threats seem pressing and proximate, it could be many years (or even decades) before we see broad-based worker displacement. New concepts require long periods of testing and adoption before they reach full potency. Some may prove too costly, while others may be hindered by social or regulatory restrictions.
To assuage concerns, some have proposed curbs on the application of AI and restrictions on its access to sensitive data. But history suggests that it is better to embrace progress than hinder it. There is certainly more work to do in many societies to sustain equality of opportunity, but few would recommend a policy of banning backhoes so that people can go back to digging foundations with shovels.
Nonetheless, I am hedging my bets. Futurists suggest there is a 50% chance that economists will be superseded by computers, but less than a 1% chance that dentists will. Goodbye, Keynes, hello, cavities.
Copyright © Northern Trust