July 2025 · 5 min read
Your Thoughts Aren't Yours Anymore
That opinion you formed this morning? You didn't form it.
Think about something you want right now. Not food or water. Something you've decided you want. A product. An experience. A political position.
Now ask yourself: how did that desire get there?
The standard story is that you formed it. You observed the world, gathered information, weighed your options, and arrived at a preference. Your preference. Yours.
I think that story is increasingly false. Not entirely false, but false enough to matter. Because we now live in an environment specifically engineered to manufacture our desires before we notice they're being manufactured.
The Old Way
Advertising has always tried to influence what people want. This isn't new. What's new is the precision.
In the 1950s, advertisers could put a billboard on a highway and hope the right people saw it. They could run a TV commercial during a popular show and cross their fingers. The targeting was crude. They were throwing messages at populations and hoping some stuck.
Today's system is different in kind, not just degree. The algorithm doesn't advertise to populations. It advertises to you. To the version of you that exists at 11:47 PM when you're tired and emotionally vulnerable. To the version of you that just read a news article about economic uncertainty. To the version of you that paused, for 0.3 seconds longer than usual, on a photo of someone living a life you want. The profiling is more precise than you think.
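To make that concrete, here's a minimal sketch of what such a profile could look like. Everything in it is hypothetical: the signals, the bucketing, the thresholds. Real systems track vastly more. But the shape is the same: a per-user tally of which contexts precede engagement.

```python
from collections import defaultdict

def context_key(hour, recent_topic, dwell_seconds):
    # Bucket a moment into a coarse context. All signals and thresholds
    # here are invented stand-ins for illustration.
    return ("late_night" if hour >= 22 else "daytime",
            recent_topic,
            "lingered" if dwell_seconds > 1.5 else "skimmed")

# Per-user tallies: in which contexts does this user engage?
profile = defaultdict(lambda: [0, 0])  # context -> [engaged, shown]

def record(hour, recent_topic, dwell_seconds, engaged):
    key = context_key(hour, recent_topic, dwell_seconds)
    profile[key][0] += int(engaged)
    profile[key][1] += 1

# 11:47 PM, just read about economic uncertainty, paused a beat too long:
record(23, "economic_uncertainty", 2.1, engaged=True)
record(14, "sports", 0.4, engaged=False)
```

A few thousand rows of this per user, and the system knows when you're reachable without ever knowing why.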
This isn't advertising anymore. It's desire engineering.
The Feedback Loop
Here's what makes this different from old manipulation. The system learns.
Every time you click, scroll, pause, or bounce, you're telling the algorithm what works on you. What triggers your attention. What makes you anxious, envious, or hopeful. The algorithm doesn't know why these triggers work. It doesn't need to. It just knows that showing you certain content in certain contexts changes your behavior in predictable ways.
And then it does more of that.
The result is a system that gets better at influencing you, specifically, every single day. Your personal manipulation engine, trained on years of your own behavioral data, refined continuously. The mechanics are worth understanding.
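Here's a deliberately toy version of those mechanics: a per-user epsilon-greedy bandit. This is my illustration, not any platform's actual code, and the category names are invented, but the dynamic is exactly the one described above: observe a signal, update an estimate, show more of whatever worked.

```python
import random

class PersonalTuner:
    """Toy per-user bandit: learns which content categories hold one user."""

    def __init__(self, categories, explore_rate=0.1):
        self.explore_rate = explore_rate
        self.stats = {c: [0, 0] for c in categories}  # category -> [engaged, shown]

    def pick(self):
        # Occasionally explore; otherwise exploit the best-known trigger.
        if random.random() < self.explore_rate:
            return random.choice(list(self.stats))
        return max(self.stats, key=lambda c: self.stats[c][0] / (self.stats[c][1] or 1))

    def observe(self, category, engaged):
        # Every click, pause, or bounce updates the estimate.
        self.stats[category][0] += int(engaged)
        self.stats[category][1] += 1

# Simulate one user whose (hidden) weakness is envy bait.
tuner = PersonalTuner(["envy_bait", "outrage", "aspiration", "news"])
true_rates = {"envy_bait": 0.50, "outrage": 0.20, "aspiration": 0.10, "news": 0.05}
for _ in range(1000):
    shown = tuner.pick()
    tuner.observe(shown, engaged=random.random() < true_rates[shown])
print(max(tuner.stats, key=lambda c: tuner.stats[c][1]))  # usually: envy_bait
```

Note what's absent: the loop never models why envy bait works on this user. It just converges on it.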
You Don't Feel It
The strange thing is that manufactured desires feel exactly like authentic ones. When I want something, I don't experience that want as coming from outside me. I experience it as me wanting.
This is the trick. The system doesn't make you do things against your will. It changes your will. By the time you feel the desire, it's already yours. The manipulation happened upstream, in the choice of what information reached you, in what order, in what emotional context.
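In feed terms, "upstream" is a ranking function that runs before anything reaches your screen. A hypothetical sketch, with invented trait names and weights:

```python
def predicted_engagement(post, user_weights):
    # Score a post by how well its traits match what has historically
    # held this particular user's attention. Purely illustrative.
    return sum(user_weights.get(trait, 0.0) for trait in post["traits"])

def rank_feed(posts, user_weights):
    # The choice of what you see, and in what order, happens here,
    # before you are aware anything was chosen at all.
    return sorted(posts, key=lambda p: predicted_engagement(p, user_weights),
                  reverse=True)

posts = [
    {"id": 1, "traits": ["news"]},
    {"id": 2, "traits": ["envy_bait"]},
    {"id": 3, "traits": ["outrage"]},
]
weights = {"envy_bait": 0.9, "outrage": 0.7, "news": 0.1}  # learned per user
print([p["id"] for p in rank_feed(posts, weights)])  # -> [2, 3, 1]
```

Nothing here overrides a choice. It just decides which choices you get to make.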
A 2014 study, the Facebook emotional contagion experiment run on nearly 700,000 users, showed that adjusting the emotional valence of posts in people's feeds measurably shifted the emotional tone of what those people then posted themselves. The users had no idea it was happening. They just felt different. Their mood felt like their mood.
Scale that up. What about your beliefs? Your fears? Your sense of what's normal, possible, desirable?
The Political Case
Cambridge Analytica got a lot of attention, and then we all kind of forgot about it. What the company actually did was, if anything, worse than the headlines. But the infrastructure didn't go anywhere. The ability to profile voters psychometrically and target them with emotionally calibrated messages still exists. Companies use it. Campaigns use it.
Here's my uncomfortable thought: if your political opinions were systematically shaped by a targeting system optimized to push your specific psychological buttons, would you know? Would you be able to tell? Or would those opinions simply feel like conclusions you'd reached through reason?
I don't think we'd know. I don't think we can know. Not with certainty. The manipulation operates below the level where we have introspective access to our own processes.
The Uncomfortable Question
We like to believe in a self that exists prior to its influences. A core "you" that takes in information and decides what to think, independent of how that information was selected and presented.
But what if there's no such core? What if the self is, to a significant degree, constructed by its informational environment? What if who you are is largely a function of what you've been shown?
I think this is closer to the truth than the autonomous self we imagine. And if so, then controlling someone's information environment is, in a meaningful sense, controlling them.
Not in a conspiracy theory way. Not some cabal in a room making decisions. Worse, actually: an emergent system where millions of optimization algorithms are all competing to capture attention and shape behavior, with nobody in charge and no master plan. Just evolution. The things that work spread. The things that don't, die. What works, it turns out, is whatever captures and holds human attention, regardless of whether that's good for humans.
I started this essay asking you to think about something you want. I'm ending by asking you to wonder: where did that want come from? Did you choose it? Did you assemble it from raw experience through pure reason?
Or did something else place it there, so gently you didn't notice, so precisely you'd swear it was always yours?