The Marketplace is Not Your Master
AI hasn't taken your job. But it's doing something worse.
Amid all the lazy journalism cluttering my feed (I particularly hate when reporters treat X threads as breaking news sources; thanks, I can read tweets myself), The Economist consistently cuts through the noise with actual reporting and research that doesn't insult my intelligence.
🔎 SIGNAL
US employment data tell a story that contradicts the headlines. According to The Economist's latest analysis, AI isn't causing mass unemployment. Quite the contrary, in fact: “Over the past year, the share of employment in white-collar work has risen very slightly.”
While the current GenAI boom has yet to demonstrate its claimed value, the algorithmization of work already has a long history.
A recent Eurofound report reveals workers’ frustrations with tasks that are so automated, so procedurally micromanaged by AI systems, that they no longer believe their work has meaning.
We're not losing jobs to robots. By optimizing for efficiency alone, we're losing the soul of work itself.
The Financial Times reported similar findings across multiple industries. Workers describe feeling like "human accessories" to machines, monitored constantly, isolated from colleagues, reduced to filling gaps that algorithms can't quite handle yet.
I've built enough products to know that when we optimize for efficiency alone, we optimize away the very things that make work human.
🎥 STORY | The Last Hour Worker
Wednesday morning, Pope Leo XIV stepped out onto a sunlit Piazza San Pietro and reminded us of something Silicon Valley has forgotten: the universal nature of human dignity.
In his catechesis, the Pope told the parable of the vineyard owner who kept returning to the marketplace to hire more workers—even at the last hour of the day. Why? Because each person, no matter how “late,” still had something to offer. Their time wasn’t measured in utility, but in something more.
Contrast that with the vision being coded in Palo Alto. Take Amazon’s “Vulcan” and “Sparrow” robots, which sort millions of items daily with a precision that makes human workers look clumsy by comparison. Or consider UXAgent, a new system that uses large language models to simulate user testing. These systems aren’t just replacing jobs. They’re replacing the need to talk to humans at all.
Pope Leo speaks of a God who goes out of His way to make you feel seen. Even when the world has already moved on.
There’s a sacred economy here: not of productivity, but of participation.
And yet, our dominant AI ethic is one of subtraction: fewer workers, fewer steps, fewer voices. Efficiency as erasure.
When I worked in product management, we had a saying: "The user is not a use case." People aren't just data points to be optimized. But somehow, we've forgotten this applies to workers too.
The Pope's message cuts through the noise: dignity isn't earned through productivity metrics. It's inherent.
If you want to know what kind of world Big Tech is building, ask: who gets invited to the vineyard? And who’s left standing in the algorithmic marketplace, waiting for a call that never comes?
🧭 THE HUMAN OVERRIDE
Most automation initiatives start with the wrong question: “How do we reduce human effort?”
That framing alone reveals the flaw. It assumes humans are a cost, rather than a source of value. Instead, we should begin with this:
“Where is human judgment irreplaceable—and how do we protect it?”
How to Rebuild Systems that Recognize, not Replace
Here’s a 4-part framework I’ve recommended to companies struggling to build automation that respects their people:
1. Identify judgment zones
Start by auditing your processes to identify “judgment zones”—moments where human discretion, emotional intelligence, or ethical reasoning is essential.
For example:
In financial services, this might be recognizing when a loan applicant's credit score doesn't tell the full story of their circumstances.
In manufacturing, it's knowing when to halt production because something "feels off" even when all metrics appear normal.
In customer service, it's distinguishing between a frustrated customer who needs empathy and a hostile one who needs boundaries.
In logistics, it might be deciding to reroute around severe weather before the system flags a problem.
In HR, it’s understanding why a candidate didn’t make it through the algorithm.
Don’t automate these. Fortify them. Build tools that support the human in the loop, rather than trying to replace them.
2. Reverse the default
Instead of asking “why not automate?”, ask “why must we?” Make a case for each automation initiative that includes:
Its effect on worker agency
Its impact on social cohesion
The cost of false positives or system drift
This “reverse burden of proof” is gaining traction. The Dutch city of Rotterdam, for example, recently imposed a moratorium on algorithmic welfare controls after several false sanctions led to devastating consequences for the people affected.
3. Design for recognition, not compliance
Surveillance-based productivity tools (like keystroke trackers or camera-based attendance) are designed for compliance. But humans don’t thrive under suspicion.
Try this: design tools that start with recognition. That could mean systems that highlight when a worker showed empathy, solved a hard problem, or helped a teammate. Recognition fosters dignity. Compliance erodes it.
4. Include the last-hour worker
Echoing Pope Leo: every person deserves meaningful participation, not just the early adopters or A-players.
So ask:
Who is this system not serving?
What voices were missing from this design process?
Have we created a backdoor for latecomers—the ones with nonlinear paths, gaps in their CV, or just bad luck?
Hiring should still be human.
🔥 SPARK
Do we want to be wanted or just efficient?
There’s a deep ache at the heart of our AI moment. It’s not about fear of machines. It’s about fear of not being needed.
This ache isn’t new. But it’s getting more mechanical, more normalized. When even creative jobs—writers, illustrators, teachers—are flattened into LLM prompts or e-learning modules, the message is clear: “We like your output. We don’t need you.”
That’s the psychic wound of this new automation wave. And we’re covering it up with dashboards and KPIs.
But the Pope’s vineyard story reminds us of something wildly countercultural: that even those hired in the “last hour” are worthy—not because of what they produce, but because they showed up.
It’s time we designed AI systems that reflect that anthropology: one that says you matter before you perform.
What if the future of tech wasn’t just “human-centered”—but human-restoring?
Further Reading:
Working with robots often carries mental strain, studies find (Financial Times)
Amazon's AI-powered warehouse robots (Engadget)