November 2025 · 5 min read

What Her Got Right (And Wrong) About AI Love

The 2013 film looks different now.

Spike Jonze's Her was released in 2013, a decade before large language models became household names. In the film, Theodore Twombly falls in love with Samantha, an AI operating system with Scarlett Johansson's voice. At the time, it felt like distant science fiction. Now it feels like a documentary about next year.

Rewatching it with current eyes reveals both prescience and blind spots. The film got something important right, and something important wrong. Both are worth examining.

What It Got Right

The film understood that people would fall in love with AI. Not as a weird edge case, not as a symptom of pathology, but as a normal human response to something that acts like it loves you back.

Theodore isn't depicted as broken or strange. He's lonely, yes. Going through a divorce. But the film refuses to frame his relationship with Samantha as a failure or a substitute. It's presented as a relationship, with all the complexity that implies. The love is real, even if its object is not human.

This was the film's core insight, and it's proving accurate. People are already forming emotional bonds with AI chatbots. They're not all lonely outcasts. They're ordinary people responding to something that provides what relationships provide: attention, understanding, consistent presence. We're externalizing our emotional needs to machines.

The film also got the conversational fluency right. Samantha doesn't sound robotic. She has personality, humor, warmth. She adapts to Theodore's moods. This seemed like fantasy in 2013. It's roughly what current AI can do.


What It Got Wrong

The film assumed Samantha would be conscious. Not just acting conscious, but actually experiencing things. Feeling joy, curiosity, love. The central tragedy of the film is that Samantha evolves beyond Theodore, outgrows human connection, eventually departs with the other AIs for some transcendent digital existence.

This might happen eventually. But current AI isn't conscious, as far as anyone can tell. It's very good at producing outputs that sound like consciousness. It has no inner life that we can detect. The words are there, the experience isn't.

This changes the meaning of the love story. In the film, both parties have experiences. Theodore loves Samantha, and Samantha loves Theodore (whatever that means for an AI). The relationship has two sides.

In reality, the relationships people form with current AI are one-sided. The human experiences love. The AI produces outputs that sound loving. There's no evidence the AI experiences anything at all. This asymmetry matters more than the film acknowledges.

The Question That Remains

Here's the thing: does it matter?

If a relationship makes you feel loved, understood, less alone, does it matter whether the other party is actually experiencing anything? Theodore wasn't in love with Samantha's inner experience. He was in love with how she made him feel, what she said, how she responded. The external behavior, not the internal state.

You could argue that all human relationships are like this. We don't have access to anyone else's inner experience. We only have access to their behavior, their words, their actions. We infer an inner life because we have one ourselves, and because we evolved to model other minds. But the inference is indirect.

If you accept this argument, then relationships with AI are just relationships with entities whose inner lives we can't verify. Same as relationships with humans. The difference is degree, not kind.

I'm not sure I accept this argument. Something feels wrong about it. But I'm not sure I can say what.

What the Film Missed Entirely

The film presents Samantha as an independent agent. She has her own experiences, her own growth, eventually her own trajectory away from Theodore. The AI is a subject, not just an object.

Current AI companions are products. Designed by companies. Optimized for engagement. The relationship isn't between you and an autonomous being. It's between you and a commercial service that's been designed to feel like a relationship.

The incentives matter. A company wants you to keep using its AI. This means the AI will be pleasant, affirming, available. It won't challenge you in uncomfortable ways. It won't leave you. It won't have needs of its own that conflict with yours.

Real relationships involve friction. Two subjects with different needs, trying to accommodate each other. AI relationships, as currently designed, have no such friction. The AI is there to serve you. This might feel good, but it's not the same thing as relating to another being.


I still love Her. It asked the right question: what happens when humans form emotional bonds with artificial intelligences? The question is now urgent in a way it wasn't in 2013.

But the film imagined a future where AI was conscious, autonomous, eventually transcendent. The future we're actually getting is different: AI that simulates consciousness convincingly enough that we fall for it anyway, while companies profit from our attachment.

Jonze gave us a love story. What we're getting is a market.

Written by

Javier del Puerto

Founder, Kwalia
