Analysis #013 · September 9, 2025 · 5 min read

What History Tells Us About Technology and Inequality

Every major general-purpose technology in history (the printing press, steam power, electrification, computing) has been followed by claims that it would either democratize prosperity or concentrate it catastrophically. Both claims have usually been partially right, for different people, at different times. The historical pattern is more instructive than either utopian or dystopian framing.


The pattern across industrial transitions

The Industrial Revolution provides the clearest historical case. The first 60-80 years of industrialization in Britain (roughly 1760-1840) were associated with stagnant or declining real wages for most workers, rising inequality, and devastating dislocation of traditional crafts and agricultural livelihoods. The term "Engels' Pause," coined by economic historian Robert Allen, describes this period of technological advance without broad welfare gains.

After that pause, real wages rose substantially for workers who had accumulated factory skills, urbanization raised living standards broadly, and the productivity gains eventually diffused into wages through labor organization and competitive labor markets. The technology ultimately lifted all boats, but the transition period involved enormous costs borne almost entirely by specific people in specific places.

The skill premium cycle

Economic historians Claudia Goldin and Lawrence Katz documented a recurring pattern: new technologies initially create a premium for skills that are rare and complementary to the technology. Over time, education and training systems catch up, the supply of skilled workers increases, and the premium compresses. Historically, this cycle has taken 15 to 30 years to complete.

The computing revolution followed this pattern. In the 1980s and 1990s, workers with computer skills earned large premiums. By the 2010s, basic computer literacy was standard and the premium had narrowed. The leading edge of the premium shifted to data science, machine learning, and software engineering. AI appears to be accelerating this cycle: skills that earned large premiums in 2020 are being partially automated by 2025.

What makes this time different

The concern with AI that history can't fully adjudicate is that previous general-purpose technologies automated physical or routine cognitive labor while creating demand for more complex cognitive labor. AI is, for the first time, directly engaging complex cognitive tasks. If the ceiling of tasks subject to automation is higher this time, the mechanism by which labor historically captured technology gains (moving up the skill ladder into newly created demand) faces a genuine structural challenge.

History offers grounds for cautious optimism: in aggregate and over long time periods, technology has created at least as many jobs as it has destroyed. It offers less comfort on the distributional question: the aggregate gains have repeatedly been concentrated while the losses were specific and acute. Getting the policy right during the transition matters more than the long-run equilibrium.


