The Part They Leave Out
Or why not everyone will be able to retrain for AI
In 1944, Margaret and Eleanor worked in the same calculating room at the Moore School in Philadelphia. Both were mathematics graduates from Penn. Both computed artillery shell trajectories by hand. Same education, same job, same salary. They took the same lunch hour.
Then ENIAC arrived. It calculated a single trajectory in thirty seconds. Margaret and her colleagues took twelve hours.
Six women were selected to learn how to operate the machine. Margaret was one. Eleanor was not.
The official reason Eleanor gave was practical: her mother was ill and needed care. This was true. What was also true was that Margaret happened to be in a section where the officer coordinating the selection could observe her work. Eleanor was on a different shift. Margaret mentioned she was curious about machines. Eleanor never got the chance to be curious, because nobody asked.
Margaret helped invent programming. She developed subroutines. She became foundational to an entire industry.
Eleanor remained a human computer for three more years before taking a job at an insurance company. She died in a state she’d never planned to move to.
This is the part they leave out of the “technology creates new jobs” story.
The space between macro truth and individual reality
I spent years researching what happened to skilled workers during technological disruptions. Nineteen stories, from ancient Egypt to 1990s Atlanta. And the part I kept finding, the part no one talks about honestly, is the middle.
The macro narrative is almost always true. Economies adapt. New industries emerge. Productivity goes up. If you zoom out far enough, everything works out.
But if you’re Margaret or Eleanor, you don’t live at the zoom-out level. You live at the level of which shift you’re on, whether the right person can see your work, and whether your mother is sick.
The stories that get told about technological transitions skip the middle because the middle is messy, contingent, and uncomfortable. It doesn’t reduce to advice. It doesn’t make a good TED talk.
What the middle actually looks like
Across nineteen cases, the middle follows a consistent pattern.
First, there’s a bifurcation. The market splits into people who want the new, cheaper version and people who still want the old, premium version. For a while, skilled workers retreat to the premium segment and do fine. Janet, the travel agent in Atlanta, watched Expedia launch and thought it was absurd. Her luxury clients didn’t care about Expedia. Her phone kept ringing.
Then the premium segment shrinks. Not because the cheaper version gets better (though it does), but because market expectations shift. What used to be “basic” becomes “good enough.” Lisa, the travel agent at the next desk, watched her walk-in traffic thin. Families started saying: “Let me check Expedia and get back to you.”
Then there’s a scramble. The workers who waited for the premium market to sustain them discover it can’t. Lisa tried to specialize in 2004. But the niches were already staffed by agents who’d pivoted five years earlier. The luxury market was not waiting for her.
Finally, a shakeout. Janet survived, but lost 500 clients and ended up with 150. That’s not a success story. That’s what survival looks like.
The timeline problem
The part that struck me most was the timeline. The disruption was visible for years before it became urgent. The signs were there. But the window for effective action was much shorter than the window of visibility.
Rachel, a junior accountant in Chicago, taught herself Lotus 1-2-3 at her kitchen table in April 1983 while her firm slept. She had about five years before everyone caught up. The accountants who learned Lotus in 1988 found the advantage had collapsed to zero.
You could see the change coming for a decade. You could act on it effectively for three to five years. After that, the good positions were taken.
What this means
I’m not going to pretend I know exactly how AI will reshape knowledge work. But I can tell you what the pattern looks like, because it’s been the same for 4,000 years.
The people who came through weren’t the ones who correctly predicted the future. They were the ones who moved before they had complete information. Margaret moved toward ENIAC while it was still unreliable. Rachel learned spreadsheets while her colleagues waited. Thomas the blacksmith started selling reaper parts before he fully understood the machine.
None of them had a strategy document. They had a bias toward action when the ground shifted.
And none of them came through unchanged. That’s the honest cost. Adaptation works. But it costs something. The people in my book who navigated displacement successfully still lost the professional identity they’d planned to have.
The stories don’t promise everyone comes through. They show you what the people who did come through actually did. That’s the best I can offer. It’s also more than most of the AI conversation is giving you right now.