October 2025 · 5 min read

How to Think Clearly in the Age of AI

It's harder than it sounds.

Clear thinking has always been difficult. We're pattern-matching machines running on emotional hardware, prone to confirmation bias and motivated reasoning and all the other cognitive pitfalls that psychologists have catalogued. But the difficulty has increased. The environment has changed. And the old advice about critical thinking doesn't quite fit the new situation.

Here's what I've learned, mostly through failure.

Recognize When You're Outsourcing

The first step is noticing. When you ask AI for an answer, notice that you're outsourcing. When you accept a recommendation without evaluating it, notice that too. When you let autocomplete finish your thought, notice what thought got replaced.

This isn't about refusing to outsource. That's neither possible nor desirable. It's about knowing the difference between your thinking and assisted thinking. The blend is fine. The blur is dangerous.

I keep a rough mental ledger. This idea came from me. This idea came from a conversation with AI. This opinion formed before I checked what others thought. This opinion formed after. The ledger isn't precise, and precision isn't the point. Awareness is the point.

Practice Thinking Before Searching

Here's a habit I've developed: when a question occurs to me, I sit with it before looking up the answer. Not forever. Just a few minutes. I try to reason through what I think the answer might be, or why the question is hard, or what would count as a good answer.

Then I search or ask AI. And I compare. Did I anticipate the answer? Was my reasoning on track? Did the answer change how I think about the question?

The comparison teaches me things about my own thinking. Where am I reliably wrong? Where are my models good? Where do I need to update? You can't learn this if you go straight to the answer every time.

Maintain a Friction Budget

Friction is unfashionable. Every product tries to eliminate it. Every AI assistant is designed to make things effortless. But some friction is valuable. It's what forces you to engage, to think, to develop skill.

I deliberately maintain friction in certain areas. I still do some math by hand because it keeps my number sense sharp. I still write first drafts without AI because it keeps my prose voice alive. I still navigate without GPS sometimes because it keeps my spatial reasoning active.

This isn't about rejecting technology. It's about being selective. Figure out which cognitive capacities matter to you, and protect them by keeping some friction in the system. Let everything else be frictionless.

Question the Premise

AI is very good at answering questions as asked. It's less good at questioning whether the question is the right one. This means that if you ask a poorly framed question, you'll get a confident, well-articulated answer to the wrong thing.

Before accepting any answer, whether from AI or elsewhere, ask: is this question well-framed? What assumptions does it smuggle in? What alternatives am I not considering by asking it this way?

This is basic critical thinking, but it's become more important. The ease of getting answers can distract from the work of formulating good questions.

Notice What You're Feeling

Clear thinking isn't just intellectual. It's emotional. What you feel about an idea affects how you evaluate it. Fear makes threats seem bigger. Desire makes opportunities seem better. And your feelings themselves are being shaped by systems designed to engage you.

Before accepting a strong conviction, ask: what am I feeling right now? Is this conclusion being driven by evidence or by emotion? Would I believe this if I felt differently?

I'm not saying emotions are bad. They carry information. But they can also be manipulated, especially in an environment optimized for engagement. Noticing what you're feeling creates space between the feeling and the action.

Accept Uncertainty

AI sounds confident. Even when it's wrong, it's wrong confidently. This can make you feel like uncertainty is a personal failing, something to overcome rather than accept.

But uncertainty is often the correct epistemic state. Many questions don't have clear answers. Many situations are genuinely ambiguous. Being comfortable with not knowing is a skill, and it's becoming rarer as information becomes more accessible.

I practice saying "I don't know" and "I'm not sure" and "it depends." Not as a cop-out, but as genuine descriptions of my epistemic state. The ability to hold uncertainty, to sit with incomplete information, is part of thinking clearly.


None of this is revolutionary. Most of it was good advice before AI existed. But the environment has changed enough that the old advice needs restating in new terms. The temptations are different. The defaults are different. The skills that atrophy without practice are different.

I don't think clear thinking will become easy. It never has been. But I do think it's becoming a more valuable skill, precisely because it's becoming rarer. When answers are abundant, good questions become precious. When confidence is cheap, calibrated uncertainty becomes valuable. When assistance is everywhere, knowing what to do without it becomes a superpower.

Written by

Javier del Puerto

Founder, Kwalia
