Will A.I. Take Your Job?

The workers were furious. Believing that new mechanical looms threatened their jobs, they broke into factories, seized the machinery, dragged it into the street and set it afire, with widespread public support and even the tacit approval of the authorities.

That was in 1675. And those English textile workers were neither the first nor the last in the long procession of workers who feared that labor-saving devices would destroy their jobs. Several centuries earlier, the adoption of the fulling mill had caused an uproar among workers forced to find other occupations. Almost exactly 60 years ago, Life magazine warned that the advent of automation would make “jobs go scarce.” Instead, employment boomed.

Now, the launch of ChatGPT and other generative A.I. platforms has unleashed a tsunami of hyperbolic fretting, this time about the fate of white-collar workers. Will paralegals — or maybe even a chunk of lawyers — be rendered superfluous? Will A.I. diagnose some medical conditions faster and better than doctors? Will my next guest essay be ghostwritten by a machine? A breathless press has already begun chronicling the first job losses.

Unlike most past rounds of technological improvement, the advent of A.I. has also birthed a small armada of non-economic fears, from disinformation to privacy to the fate of democracy itself. Some suggest, in all seriousness, that A.I. could have a more devastating impact on humanity than nuclear war.

When it comes to the economy, including jobs, the reassuring lessons of history (albeit with a few warning signals) are inescapable. At the moment, the problem is not that we have too much technology; it’s that we have too little.

We’ve had forms of artificial intelligence, broadly defined, for millenniums. The abacus, thought to have been invented in Babylonia more than 4,000 years ago, replaced more laborious methods of mathematical calculation, saving time and therefore reducing work.

When I began my career in marketing in the 1980s, we had only hand-held calculators to help with our numerical analysis, which we painstakingly wrote in pencil on large sheets of paper (hence the term “spreadsheets”) and which were then typed by a secretarial pool. Any changes meant redoing the entire spreadsheet. Now, all that happens with the click of a mouse.

Less than three decades ago, library-type research could require hours of combing through dusty volumes; now it takes just a few keystrokes. Not surprisingly, the number of librarians has been flat since 1990, while total employment has grown by more than 40 percent.

Other job categories have almost completely disappeared. When was the last time you talked to a telephone operator? Or were conveyed by a manned elevator? In place of these and so many other defunct tasks, a vast array of new categories has been created. A recent study co-authored by the M.I.T. economist David Autor found that approximately 60 percent of jobs in 2020 were in occupations that didn’t exist in 1940.

And so the Great American Jobs Machine ground on. In the decade after Life magazine decried the robot invasion, the United States created 20.2 million jobs, and today the unemployment rate sits at 3.6 percent, a hair above its 50-year low. The number of Americans employed in finance, for example, has boomed even as computers, Excel and other technologies have made them far more productive.

Higher worker productivity translates into higher wages and cheaper goods, which together mean greater purchasing power, which stimulates more consumption, which in turn induces more production, which creates new jobs. That, essentially, is how growth has always happened.

This makes A.I. a must-have, not just a nice-to-have. We can only achieve lasting economic progress and rising standards of living by increasing how much each worker produces. Technology — whether in the form of looms or robots or artificial intelligence — is central to that objective.