August 2025 · 5 min read
The Death of Expertise (And What Replaces It)
When anyone can sound like an expert, what happens to actual expertise?
I watched a friend write a detailed blog post about immunology last week. He's a graphic designer. He used ChatGPT to help him draft it, then polished it himself. The result reads like it was written by someone who deeply understands the subject. It cites studies. It uses the right terminology. It explains mechanisms accurately.
My friend didn't lie about being an expert. He didn't claim credentials he doesn't have. He just produced expert-sounding prose about a field he didn't understand a month ago. And when I asked him to explain one of the concepts he'd written about, he couldn't. The understanding was in the text, not in his head.
This is becoming normal.
The Expertise Illusion
Expertise used to have recognizable markers. Years of training. Credentials. A track record of work. If someone wrote confidently about a technical subject, you could assume they probably knew what they were talking about. The barrier to producing expert-sounding content was actually being an expert, or at least spending significant time researching.
AI collapses that barrier. Now the production of expert-seeming content is trivial. Anyone with access to an LLM can generate text that reads like it was written by a specialist. The form of expertise has been democratized. The substance hasn't. This creates a gap between appearance and reality.
The problem isn't that AI makes false claims. The claims in my friend's immunology post were accurate, as far as I could verify. The problem is that the human serving as the "author" doesn't understand what they've written. They can't answer follow-up questions, identify nuances, recognize edge cases, or apply the knowledge in new situations. They're a channel for information, not a knower of it.
What We Lose
Real expertise isn't just knowing facts. It's having a mental model of a field, understanding why things work the way they do, being able to reason about novel situations using deep familiarity with how the domain operates.
When an actual immunologist writes about immune response, they're drawing on thousands of hours of study, research, clinical observation. They can tell you not just what happens but why it happens, what we're not sure about, what the current debates are, how this case differs from the textbook. They have judgment, not just information.
AI-assisted writing can reproduce the information but not the judgment. And readers often can't tell the difference. The writing looks the same. The credentials often aren't checked. We're creating an environment where pseudo-expertise proliferates while remaining indistinguishable from the real thing.
The Trust Problem
Our social systems rely on trust in expertise. You trust that the person who wrote the medical article has medical training. You trust that the financial analyst understands markets. You trust that the technical documentation was written by someone who knows the system.
When that trust erodes, several bad things happen. First, actual experts become harder to identify. Their writing looks the same as everyone else's. The signals we used to rely on (fluency, confidence, technical vocabulary) no longer distinguish the real thing from the imitation.
Second, expertise itself becomes devalued. Why spend years learning a subject when you can produce expert-sounding output in minutes? The incentive to develop deep understanding weakens when shallow understanding plus AI produces similar visible results.
Third, accountability gets murky. When something goes wrong because someone followed AI-assisted advice that missed something an expert would have caught, who's responsible? The person who generated the content might not understand it well enough to know what went wrong.
What Replaces It
I don't think expertise dies. But it changes form.
The new version of expertise might look less like "knowing a lot about X" and more like "being able to critically evaluate claims about X" and "knowing what questions to ask about X" and "understanding what good reasoning about X looks like."
It's the difference between being a library and being a good librarian. The library holds the knowledge. The librarian knows how to find what you need, how to evaluate sources, how to recognize when something is missing. In a world where information production is cheap, curation and judgment become more valuable.
Paradoxically, this might make traditional deep expertise more valuable rather than less. Anyone can produce expert-sounding content, but only someone who actually understands can evaluate whether that content is good, can catch subtle errors, can know when the AI is confidently wrong. The need for real expertise doesn't go away. It just moves up a level.
Living in the Transition
We're in the awkward period where the old signals of expertise have broken down but new ones haven't emerged. You can't trust that expert-sounding writing reflects expert understanding. You can't easily tell who actually knows something versus who just produced fluent prose about it.
Some practical implications: verify more, trust less. Check claims independently when they matter. Pay attention to whether someone can answer follow-up questions, engage with objections, explain their reasoning. Look for track records and demonstrated understanding, not just polished output.
And if you're producing AI-assisted content yourself, be honest about your actual level of understanding. There's nothing wrong with using AI to help you write about something you're learning. There's something wrong with presenting yourself as understanding what you don't.
My friend's immunology post is still up. It's accurate, well-written, genuinely helpful for the people who read it. He was honest about his process. But it exists in a strange space: correct information delivered by someone who doesn't understand it, and indistinguishable from what an expert would produce.
This is the new normal. Expert-sounding content everywhere, from everyone. The question isn't whether this changes how we relate to expertise. It already has. The question is how we build new systems of trust when the old signals have broken down.
I don't have a complete answer. But I suspect it involves caring more about demonstrated understanding and less about polished output. About track records and reputations rather than individual pieces of content. About judgment and curation rather than information production.
Expertise isn't dead. It's just harder to see.