August 2025 · 5 min read

You're Training AI Right Now

Every click, every scroll, every pause.

Every time you solve a CAPTCHA, you're doing unpaid labor. Those "select all squares with traffic lights" puzzles aren't just verifying you're human. They're training computer vision systems to recognize objects in images. Your correct answers become labeled training data. Google has reportedly used CAPTCHA answers for everything from digitizing books to training the kind of image recognition self-driving cars depend on. The scale is staggering.
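To make that concrete, here's a toy sketch of how crowd answers might be turned into labels. The tile format and the vote threshold are my illustrative assumptions, not Google's actual pipeline:

    from collections import Counter

    def aggregate_captcha_answers(answers_by_tile):
        """Turn many users' CAPTCHA clicks into labels by majority vote.

        answers_by_tile maps a tile ID to the yes/no answers ("does this
        tile contain a traffic light?") collected from different users.
        """
        labels = {}
        for tile_id, answers in answers_by_tile.items():
            votes = Counter(answers)
            total = sum(votes.values())
            # A tile becomes labeled training data once enough users agree.
            if total >= 5 and votes[True] / total >= 0.8:
                labels[tile_id] = "traffic_light"
            elif total >= 5 and votes[False] / total >= 0.8:
                labels[tile_id] = "no_traffic_light"
            # Ambiguous tiles stay unlabeled and get shown to more users.
        return labels

    print(aggregate_captcha_answers({
        "tile_1": [True, True, True, True, True],
        "tile_2": [False, False, False, False, True],
        "tile_3": [True, False, True, False, True],
    }))
    # {'tile_1': 'traffic_light', 'tile_2': 'no_traffic_light'}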

You're not the customer. You're the worker.

And CAPTCHAs are just the obvious example.

The Hidden Training Loop

Consider what happens when you use a search engine. You type a query. You get results. You click one. You stay, or you bounce back and try another.

That sequence of actions is data. It tells the algorithm that for your query, result #3 was more relevant than result #1. Your behavior is a training signal. Multiply that by billions of users and trillions of queries, and you have an enormous distributed workforce constantly improving the search algorithm. All unpaid.
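In code, that signal looks something like the classic "skip-above" heuristic from click-model research: anything you scrolled past before clicking was implicitly judged less relevant. A minimal sketch, not any search engine's real pipeline:

    def click_preferences(query, results, clicked_index):
        # One search session becomes pairwise training labels: the clicked
        # result is preferred over every result the user skipped above it.
        clicked = results[clicked_index]
        return [(query, clicked, skipped) for skipped in results[:clicked_index]]

    # You typed a query, skipped results #1 and #2, clicked #3.
    print(click_preferences("data labor", ["r1", "r2", "r3", "r4"], 2))
    # [('data labor', 'r3', 'r1'), ('data labor', 'r3', 'r2')]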

The same pattern repeats everywhere. When you scroll past a post, you're voting it down. When you pause on a video, you're voting it up. When you finish reading an article or abandon it halfway through, you're providing feedback on content quality. Every interaction is an implicit rating that gets fed back into the machine.
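Here's roughly what those implicit ratings look like as data. The event types and weights are invented for illustration; real platforms combine far more signals:

    def implicit_rating(event):
        # Map a raw interaction to a feedback score for the ranking model.
        if event["type"] == "scroll_past":
            return -1.0                             # a quiet downvote
        if event["type"] == "pause":
            return min(event["seconds"] / 10, 1.0)  # a quiet upvote
        if event["type"] == "read":
            return event["fraction_read"]           # finished vs. abandoned
        return 0.0

    events = [
        {"type": "scroll_past"},
        {"type": "pause", "seconds": 6},
        {"type": "read", "fraction_read": 0.5},
    ]
    print([implicit_rating(e) for e in events])  # [-1.0, 0.6, 0.5]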


The Invisible Factory

Traditional factories are visible. You can see the building, the workers, the product leaving the gate. The transaction is clear: labor in, wages out, goods produced.

The AI training factory is invisible. There's no building. The workers don't know they're workers. The product (better algorithms, more accurate predictions, smarter AI) leaves through APIs and invisible improvements. And the transaction is hidden: you provide labor, you receive "free" services, the company captures enormous value.

Facebook, Google, TikTok, Amazon. These aren't just platforms. They're factories for converting human behavior into AI capabilities. Every user is an unpaid laborer in a system they can't see. The economics are deliberately obscured.

What They're Actually Building

Here's what your behavior trains:

Recommendation systems. Every video you watch to completion teaches the algorithm what captures attention. Every one you skip teaches it what doesn't. TikTok's algorithm is trained on billions of these micro-decisions, which is why it's so eerily good at predicting what you'll watch next.
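At toy scale, that learning step might look like a single logistic-regression update, with feature names I've made up for illustration:

    import math

    def update_preference(weights, features, completed, lr=0.1):
        # Nudge the model toward videos you finished, away from ones you skipped.
        z = sum(weights.get(f, 0.0) * v for f, v in features.items())
        p = 1 / (1 + math.exp(-z))  # predicted completion probability
        for f, v in features.items():
            weights[f] = weights.get(f, 0.0) + lr * (completed - p) * v
        return weights

    w = {}
    w = update_preference(w, {"dance": 1.0, "short": 1.0}, completed=1)
    w = update_preference(w, {"news": 1.0, "long": 1.0}, completed=0)
    print(w)  # dance/short drift positive, news/long drift negative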

Language models. Every public post you write, every search query you type, every message you exchange with a chatbot can feed the corpus of human language that models learn from. When AI can write text that sounds human, it's because it learned from text that was human. Your text, among billions of others.
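In miniature, "learning from text" can be as simple as counting which word follows which. Real models are incomparably larger, but the dependence on human-written text is the same:

    from collections import defaultdict, Counter

    def train_bigrams(corpus):
        # A toy language model: every message you ever wrote shifts these counts.
        counts = defaultdict(Counter)
        for text in corpus:
            words = text.lower().split()
            for a, b in zip(words, words[1:]):
                counts[a][b] += 1
        return counts

    model = train_bigrams([
        "see you at the meeting",
        "running late to the meeting",
    ])
    print(model["the"].most_common(1))  # [('meeting', 2)]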

Prediction engines. Every purchase you make, every place you go, every person you message. The patterns in your behavior help predict the behavior of people like you. Your life becomes a template for targeting others.
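A minimal sketch of that "people like you" logic, using cosine similarity over behavior vectors I've invented for the example (real targeting systems are vastly more elaborate):

    import math

    def cosine(a, b):
        # Similarity between two behavior vectors (category -> frequency).
        dot = sum(v * b.get(k, 0.0) for k, v in a.items())
        na = math.sqrt(sum(v * v for v in a.values()))
        nb = math.sqrt(sum(v * v for v in b.values()))
        return dot / (na * nb) if na and nb else 0.0

    def lookalike_score(target, known_users, item):
        # Predict interest in `item` from the most similar known user.
        best = max(known_users, key=lambda u: cosine(target, u["behavior"]))
        return best["bought"].get(item, 0)

    users = [
        {"behavior": {"gym": 5, "coffee": 3}, "bought": {"protein_powder": 1}},
        {"behavior": {"books": 4, "coffee": 1}, "bought": {"e_reader": 1}},
    ]
    print(lookalike_score({"gym": 4, "coffee": 2}, users, "protein_powder"))  # 1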

The Compensation Question

Some people argue this is a fair trade. You get free services. The company gets your data. Both sides win.

I'm skeptical. The trade is opaque. Most users don't understand what they're giving up or how valuable it is. And the asymmetry is vast. Your individual contribution is tiny. But aggregated across billions of users, that data creates systems worth hundreds of billions of dollars. There are proposals to fix this.

Imagine if every time you trained an AI system, you got paid. A fraction of a cent, maybe. Micro-payments for micro-labor. Over time, it would add up. And it would make visible the work that's currently invisible.
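Back-of-the-envelope, with numbers that are pure assumptions on my part:

    # Hypothetical micro-payments for micro-labor. Every figure is assumed.
    interactions_per_day = 500        # clicks, scrolls, pauses, queries
    payment_per_interaction = 0.001   # a tenth of a cent, in dollars
    days_per_year = 365

    yearly = interactions_per_day * payment_per_interaction * days_per_year
    print(f"${yearly:.2f} per user per year")  # $182.50

Tiny per person. But multiplied across billions of users, it's the same aggregation that makes the data valuable in the first place.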

This isn't fantasy. Some economists are seriously proposing "data as labor" frameworks (Eric Posner and E. Glen Weyl sketch one in Radical Markets) where users get compensated for the value they create. The technical and logistical challenges are real. But so is the fundamental unfairness of the current system.

Training the Thing That Replaces You

Here's the darker angle: you're training systems that will eventually make human labor less valuable. The content moderators training AI to detect harmful content are teaching systems that will replace content moderators. The translators correcting AI translations are training systems that will replace translators. The artists labeling images are training systems that will replace artists.

This has happened before. Factory workers trained the machines that replaced them. But the scale is different now. The speed is different. And the invisibility of the training process means most people don't realize they're participating in it until the replacement is complete.

So What?

I'm not suggesting you stop using the internet. That's not realistic, and it wouldn't change the system anyway. Individual opt-out doesn't scale.

But awareness matters. When you understand that you're working, you can start asking different questions. Who captures the value of my labor? What am I training? Who benefits? Should there be compensation, or at least consent?

The biggest AI systems in the world weren't built by a few thousand engineers in Silicon Valley. They were built by billions of users, each contributing tiny amounts of behavioral data, each training the systems a little bit more.


As you read this essay, your device tracked whether you scrolled slowly (engaged) or quickly (skimming). It noted where you paused. It recorded how long you stayed. All of that is data. All of that trains something.

You just did unpaid work.

You're welcome.

Written by

Javier del Puerto

Founder, Kwalia
