August 2025 · 5 min read
The Algorithm Knows You Better Than You Do
And that's not a metaphor.
In 2015, researchers at Stanford and Cambridge published a study that should have been more alarming than it was. They found that a computer algorithm, analyzing Facebook likes alone, could predict personality traits more accurately than friends, family, and even spouses. With just 300 likes, the algorithm beat the spouse. The methodology was simple: participants' own questionnaire scores served as the benchmark, and the algorithm's only input was which pages they had liked.
Let that sink in. The person who lives with you, sleeps next to you, knows your habits and fears and secrets, can be outpredicted by a machine analyzing which posts you clicked a button on.
This wasn't a fluke finding. It's been replicated and extended. The accuracy only improves with more data.
What It Actually Measures
The technical term is "psychometric profiling." The algorithm predicts your scores on standard personality measures: openness, conscientiousness, extraversion, agreeableness, neuroticism. But it goes further than that. Political orientation. Religious beliefs. Intelligence. Likelihood of substance abuse. Sexual orientation. All predictable from behavioral data.
How? Because your behavior is more consistent than your self-reporting. You might think of yourself as adventurous, but if your viewing history suggests otherwise, the algorithm knows. You might believe you're apolitical, but the pattern of your engagements tells a different story. The system doesn't care what you say about yourself. It cares what you do.
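To make that concrete, here is a minimal sketch of the kind of model the study describes: regress self-reported trait scores on a users-by-likes matrix with a sparse linear model. Everything below, the data, the parameters, the trait, is synthetic and illustrative; it stands in for the real pipeline rather than reproducing it.

```python
# Illustrative sketch of likes-based trait prediction on made-up data.
# The numbers and variable names are assumptions for the example only.
import numpy as np
from sklearn.linear_model import Lasso
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

n_users, n_likes = 1000, 500
# Binary users-by-likes matrix: X[i, j] = 1 if user i liked page j
X = rng.integers(0, 2, size=(n_users, n_likes)).astype(float)

# Pretend a small fraction of likes carry real signal about one trait
# (say, openness), and the questionnaire score is that signal plus noise.
signal = rng.normal(0, 1, n_likes) * (rng.random(n_likes) < 0.05)
y = X @ signal + rng.normal(0, 1, n_users)

# An L1-penalized regression keeps only the informative likes
model = Lasso(alpha=0.01, max_iter=10_000)
scores = cross_val_score(model, X, y, cv=10, scoring="r2")
print(f"cross-validated R^2: {scores.mean():.2f}")
```

The point of the sketch is how unremarkable the machinery is: a sparse regression over button clicks, validated against what you said about yourself on a questionnaire.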
The Self-Knowledge Illusion
We think we know ourselves. We have access to our own thoughts, after all. We can introspect. We can reflect on our motivations and explain our decisions.
But decades of psychology research suggest our self-knowledge is surprisingly limited. We confabulate. We rationalize. We have blind spots about our own biases. We construct post-hoc narratives to explain behavior that was actually driven by factors we're not conscious of.
The algorithm doesn't have these limitations. It doesn't need to explain why you behave the way you do. It just needs to predict what you'll do next. And it turns out that prediction, done well, reveals stable patterns that even you don't see. That should unsettle a deep assumption: that first-person access to your own mind makes you the best judge of it.
What They Do With It
Once a system can predict your personality, it can optimize its interactions with you. Not in a vague "targeted ads" way. In a specific, calibrated way.
If you're high in neuroticism, messaging that emphasizes safety and security will be more effective on you. If you're high in openness, novelty appeals work better. If you're introverted, social proof matters less; if you're extraverted, it matters more.
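As an illustration of how crude this can be and still work, here is a hypothetical sketch of trait-calibrated message selection. The trait scale, thresholds, and copy are all invented for the example; real systems learn these mappings from response data rather than hard-coding them.

```python
# Hypothetical sketch: pick a message framing from a predicted trait profile.
# Thresholds and copy are invented for illustration, not from any real platform.
from dataclasses import dataclass

@dataclass
class Profile:
    openness: float      # predicted scores, assumed here to be on a 0-1 scale
    extraversion: float
    neuroticism: float

def pick_variant(p: Profile) -> str:
    if p.neuroticism > 0.7:
        return "Protect what matters. Lock in your rate today."   # safety framing
    if p.openness > 0.7:
        return "Be the first to try something completely new."    # novelty framing
    if p.extraversion > 0.7:
        return "Join the 2 million people who already switched."  # social proof
    return "Save money without changing anything you do."         # default framing

print(pick_variant(Profile(openness=0.4, extraversion=0.3, neuroticism=0.8)))
```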
This isn't hypothetical. It's how major advertising platforms work. The targeting is individualized. The message variants are tested. The system learns what works on you, specifically, and does more of it. The political applications are the obvious extension: the same machinery, aimed at votes instead of purchases.
The Asymmetry Problem
Here's the power dynamic that bothers me most: they know you better than you know yourself, and you know almost nothing about how the system works.
You don't see your psychometric profile. You don't know what categories you've been sorted into. You don't know which version of a message you're seeing or why. The system is optimized against you, using information asymmetry as its primary tool.
It's like playing poker against someone who can see your cards while theirs stay hidden. You're not just at a disadvantage. You don't even know the game is rigged.
The Changing You
And here's the part that really keeps me up at night: the system isn't just predicting you. It's shaping you.
When you're consistently shown content that matches your predicted interests, your interests narrow. When you're given choices optimized for your predicted preferences, your preferences calcify. When your attention is captured by content designed for your psychometric profile, your profile becomes more pronounced.
The algorithm that knows you better than you know yourself is also constructing the future you. It's not just reading your data. It's writing it.
What Can Be Done
The honest answer: I don't know. Individual data hygiene helps at the margins but doesn't address the structural issue. Regulation is slow and often ineffective. The companies that benefit from psychometric profiling have every incentive to continue and improve it.
But here's what I do believe: understanding the situation matters. When you know that the content you're seeing has been selected to match your psychological profile, you can at least question it. When you know that your behavior is being analyzed and predicted, you can at least try to be more intentional. When you know the game is rigged, you can at least refuse to be a naive player.
I started this essay with a study showing algorithms outperform spouses at personality prediction. I'll end with a question: if a machine knows you better than the people closest to you, what does that mean for intimacy? For identity? For what it means to know someone at all?
We used to think knowledge of a person was earned through time, attention, and care. Now it's extracted through surveillance and processed through statistics. The result is more accurate. Whether it's better is a different question.