September 2025 · 8 min read

The Attention Economy Ate Your Desires

That thing you wanted to buy? You didn't decide that.

Scroll back through your recent purchases. The skincare product. The kitchen gadget. The online course. You remember wanting those things, feeling the pull of desire, making what felt like a choice. But trace the thread backward and you'll find something else: a video that appeared in your feed, a recommendation that materialized at the right moment, an ad so well-targeted it felt like telepathy.

This is not a conspiracy theory. It is the documented business model of the most valuable companies in human history. They are not selling your attention. They are selling your desires, after manufacturing them.

The Desire Factory

Traditional advertising worked by association. Show a beautiful person enjoying a product, and some of that beauty might transfer in the viewer's mind. It was crude, interruptive, and everyone knew it was happening.

What we have now is something different. Modern recommendation systems don't just show you products you might want. They create the conditions under which new wants emerge. They expose you to lifestyles, problems, and identities you didn't know existed, then provide solutions for sale.

Consider how this works. A recommendation algorithm notices you watched a video about home organization. This single data point triggers a cascade. You start seeing content about minimalism, then capsule wardrobes, then the anxiety that comes from clutter, then products that promise to restore the calm you didn't know you were missing. By the end of the week, you want a new shelving system with a specificity that feels entirely personal.

The desire feels authentic because in some sense it is. You are the one who felt the pull, imagined the organized closet, clicked the purchase button. But the path to that desire was constructed. The algorithm didn't read your mind. It wrote on it.

The Attention Harvest

We call this the attention economy, which is accurate but incomplete. Attention is the raw material. Desire is the product.

The business model works like this: platforms capture your attention with content that triggers engagement. The more time you spend, the more data they collect about what moves you. That data trains models to predict what will move you next. This predictive capacity is then sold to advertisers who want to shape your behavior.

But "shape your behavior" understates what happens. What gets shaped is not just what you do. It's what you want, what you value, what you think would make you happy. The deepest layer of the self becomes a site for commercial intervention.

I am not claiming this is a new phenomenon. Advertising has always tried to create desires. What is new is the precision, the scale, and the intimacy. The algorithms know things about you that you don't know about yourself. They can predict your susceptibilities with eerie accuracy. They operate in the spaces between conscious thought.

The Autonomy Problem

Why should this matter? After all, people have always been influenced. Our desires have never been purely self-generated. Culture, family, friends, and media have always shaped what we want. Why is algorithmic influence special?

The difference is in the asymmetry. When your friend recommends a restaurant, you know they're making a recommendation. You can evaluate it, accept or reject it, consider their taste against your own. The influence is visible and can be negotiated.

Algorithmic influence operates differently. You don't see the inputs. You don't see the model. You don't see the commercial interests shaping what reaches you. You just experience desires that feel like your own.

The philosopher Harry Frankfurt distinguished between first-order desires (wanting something) and second-order desires (wanting to want something). Autonomy, he argued, involves the alignment of these levels. You are free when you want what you want to want.

The attention economy disrupts this alignment. It shapes first-order desires without consulting second-order values. You might value sustainability, but find yourself wanting fast fashion. You might value presence, but find yourself wanting to check your phone. The disconnect between what you want to want and what you actually want is the signature experience of our time.

The Mimetic Machine

The philosopher René Girard argued that desire is mimetic. We don't want things directly; we want what others want. We learn to desire by watching others desire.

Social media is a mimetic amplification system. It shows you what others want, own, experience. It creates an endless parade of desiring others, models for your own wanting. The influencer doesn't just recommend a product. They model a form of life, an identity, a way of being that includes the product.

This mimetic machinery operates at scale. Millions of people watch the same influencers, absorb the same aesthetic sensibilities, develop the same wants. The result is a strange homogenization. People who pride themselves on individuality end up decorating their homes in the same styles, wearing the same "unique" finds, pursuing the same wellness practices.

The algorithm notices this convergence and reinforces it. What's trending gets more attention. What gets more attention trends further. A feedback loop emerges that amplifies certain desires while suppressing others. The space of possible wanting narrows.
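The loop described above is, at its core, a rich-get-richer dynamic that is easy to see in miniature. Here is a toy simulation (an illustrative sketch, not any platform's actual ranking code): items are shown in proportion to the attention they have already received, so small early differences compound into large inequalities.

```python
import random

def simulate_feedback_loop(n_items=100, n_steps=10_000, seed=0):
    """Toy 'what's trending gets more attention' model.

    Each step, one item is shown to a user, chosen with probability
    proportional to its accumulated attention; being shown earns it
    one more unit of attention (preferential attachment).
    """
    rng = random.Random(seed)
    attention = [1] * n_items  # every item starts with equal attention
    for _ in range(n_steps):
        # Sample an item proportionally to its current attention count
        shown = rng.choices(range(n_items), weights=attention)[0]
        attention[shown] += 1
    return sorted(attention, reverse=True)

result = simulate_feedback_loop()
top_share = sum(result[:10]) / sum(result)
print(f"Top 10% of items capture {top_share:.0%} of all attention")
```

Even though every item starts identical, the top few end up with several times their "fair" share, purely because exposure begets exposure. The same mechanism narrows the space of possible wanting: the loop does not need to know anything about quality, only about what was already seen.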

The AI Amplification

Everything I've described was possible with 2020's technology. What happens when desire-manufacturing meets modern AI?

The new generation of AI systems can generate personalized content at scale. Not just ads, but entire aesthetic environments. Imagine a feed where every image, every video, every story is generated specifically for your psychological profile, optimized not just to capture attention but to cultivate specific wants.

This is not hypothetical. It is the obvious next step in a trajectory that has been accelerating for twenty years. The tools exist. The business model demands it. The only question is when.

We are moving from an era of curated desire (showing you existing content that will move you) to generated desire (creating new content specifically designed to move you). The distinction matters. Curation has limits. Generation is potentially boundless.

What Can Be Done?

Some people advocate for digital minimalism. Delete the apps. Reclaim your attention. Escape the desire machine.

This approach has merit for individuals but doesn't address the structural problem. Most people won't log off, and the system will continue shaping the desires of those who remain. Individual exit is not a solution to a collective problem.

Others propose regulation. Restrict surveillance. Require algorithmic transparency. Ban manipulative design patterns.

These interventions could help, but they face the difficulty of defining manipulation in an environment where all content shapes desires. Where is the line between recommendation and manipulation? Between personalization and exploitation?

I don't have a complete answer. But I think clarity about the problem is the starting point. We need a language to describe what is happening to our desires. We need to recognize that the self is not a fortress but a field of forces, and that commercial interests have learned to operate on that field with unprecedented sophistication.


I finished writing this essay and immediately opened Instagram. The irony was not lost on me. The pull is that strong.

Somewhere in a data center, models of my preferences are being updated. The algorithm knows I write about the attention economy. It knows my demographic, my reading habits, my purchase history. It is preparing a feed that will move me, and I will experience that movement as my own desire.

That thing you wanted? Maybe it was yours. Or maybe it was placed, cultivated, grown in the soil of your attention by systems designed for exactly that purpose. The unsettling truth is that from the inside, you cannot always tell the difference.

The attention economy didn't just eat your attention. It ate your desires. And it's still hungry.

Written by

Javier del Puerto

Founder, Kwalia
