Why Humans Become the Scarce Resource in a Post-AGI World
Dara Ladjevardian, CEO & Co-Founder at Delphi
When Dara Ladjevardian, CEO of Delphi, projects five to ten years forward, he doesn’t see a world where AI has replaced human judgment. Instead, he sees the opposite: humans as the scarce resource. As AI becomes smarter, faster, and cheaper, the things people actually crave are human experiences, perspectives, and wisdom grounded in real life.
“If AI is so smart, really fast, pretty much free, and there’s an abundance of access and information,” Dara explains, “what ends up happening is humans become the scarce resource. Humans become the things that we actually start to appreciate—things with lived experiences.”
This thesis underpins Delphi’s entire bet. While other AI companies race to make their models faster and more capable, Delphi is placing the human front and center—not as a user interface problem to solve, but as the actual product.
The Novelty Cliff of AI-Generated Everything
The intuition is counterintuitive at first, but simple. For the next five to ten years, there will be intense novelty around AI-generated content, deepfakes, and synthetic media. But novelty has a shelf life. After the initial shock wears off, people hit saturation.
“The content becomes so much that you’re like, I don’t even know what to do anymore,” Dara says. “There’s too many things to consume. There’s too many options. Can someone just tell me what to do? Someone who’s actually lived it before, can they tell me what to do?”
This pattern already exists in domains where machines outperform humans. Chess is instructive: engines have beaten the world's best human players for decades, yet people still watch and study human chess players. Boxing offers another example. Robots can box, but they're not as engaging to watch as humans boxing. The added engagement doesn't come from superior performance; it comes from the knowledge that a real human, with real stakes and real failure modes, achieved the feat.
Why Lived Experience Can’t Be Replicated
The difference runs deeper than mere engagement or nostalgia. There’s an unquantifiable element—something almost spiritual—in how humans connect with other humans who have lived through something.
“There’s an unspoken, almost spiritual connection that we have with other humans,” Dara reflects. “When a real human does something, there’s a different, almost contract that is being made.” This contract, he suggests, is rooted in 300,000 years of human evolution. We’ve been in communities, we’ve cared for family, we’ve made relationships. These patterns are primal, older than any technology, and they don’t change when a new AI model drops.
Your daughter’s piano recital isn’t impressive because of perfect technique. It’s impressive because she’s real, she practiced, she will fail and try again. The same performance from a robot generates a different response entirely—intellectually interesting, maybe, but not meaningful in the way a parent watches their child.
This principle extends to expertise. You follow someone on Twitter, you read their book, you seek their advice—not because they’re the smartest person alive (Wikipedia exists), but because their perspective is grounded in decisions they’ve made, failures they’ve lived through, and a track record you can evaluate for yourself. Their experience isn’t commodity knowledge. It’s proof.
The Practical Implication for Knowledge Work
For founders and professionals, this has immediate implications. The future isn’t about competing with AI on raw capability. It’s about making your lived experience—your unique perspective, your specific failures, your hard-won principles—scalable and accessible.
That’s precisely the problem Delphi tries to solve. As Dara puts it: “You can read books, you can skim a person’s personal website. There’s no way to ask someone a question to truly understand what they know and how they think about things.” A book is passive knowledge transfer. A conversation with the actual person is interactive knowledge transfer, personalized to your specific situation.
When someone learns from you—not from AI trained on your tweets and interviews, but from a representation of how you actually think—they absorb not just the information but the reasoning underneath it. That reasoning is built on your experiences, and it’s the part that won’t be commoditized.
FAQ
What does Dara mean by “humans become the scarce resource”?
As AI becomes ubiquitous and cheap, people will crave interaction with real humans who’ve lived through meaningful experiences. Expertise grounded in lived experience—not just information—becomes the differentiator. Your perspective shaped by real failures and real wins is harder to replicate than any technical skill.
Won’t AI eventually learn to capture lived experience the same way it captures information?
Dara’s point isn’t that AI can’t simulate human reasoning. It’s that people prefer the real thing. To someone seeking authentic connection, a perfect AI simulation of a person and the actual person are not equivalent. The unquantifiable element of knowing someone made real choices with real stakes changes how people relate to that wisdom.
How does this apply if I’m not a celebrity or famous person?
The thesis applies anywhere you have inbound demand—customers asking you questions, people seeking your advice, colleagues wanting to learn from you. A founder might use Delphi to scale customer conversations. A consultant might use it to handle repetitive questions. An investor might use it to let entrepreneurs learn how they think. You don’t need to be Arnold Schwarzenegger.
If humans become scarce, won’t that create inequality?
Potentially. If access to wisdom from respected humans becomes a premium, those with existing platforms have an advantage. Delphi’s approach is to democratize access by letting any expert (not just celebrities) create a digital representation of their knowledge and monetize it directly.
What about the risk of deepfakes and AI-generated clones that impersonate humans?
That’s a real concern, which is why Delphi uses human verification (LinkedIn sign-in, photo ID). The platform’s bet is that people will trust verified humans more than they’ll trust unverified AI. The scarcity value comes from authenticity.
Can I just get this wisdom from a book or podcast?
Books are passive—you consume what the author chose to share. Podcasts are one-way. With Delphi, you ask questions specific to your situation and get answers tailored to your context. That interactive, personalized layer is what Dara argues becomes more valuable as information becomes abundant.
What happens when AI models improve and become indistinguishable from humans?
Even if an AI becomes indistinguishable, people will still prefer interaction with real humans when given the choice. Knowing that it’s a real person with real experience adds a layer of meaning that’s psychological, not just informational.
Is this just nostalgic thinking about a pre-AI world?
No. Dara’s point is that as novelty fades, people naturally gravitate toward authenticity. The pattern already exists in chess, sports, music, and social media. The question isn’t whether this will happen, but how quickly and at what scale.
How do you scale access to your expertise if humans become the valuable resource?
That’s the paradox Delphi solves. A knowledge graph representation of how you think lets you scale one-on-one conversations asynchronously. You’re not there, but your thinking process is accessible. It’s more scalable than in-person mentorship but more personalized than a generic AI.
Does Delphi’s business model depend on this thesis being true?
Yes. If AI fully replaces the value of human wisdom, Delphi’s value proposition collapses. But Dara’s betting that the opposite is true: the better AI gets, the more people will crave authenticity and lived experience. If he’s right, Delphi becomes more valuable over time, not less.
Full episode coming soon
This conversation with Dara Ladjevardian is on its way. Check out other episodes in the meantime.