The Grumpy Welshman

We Built a $3 Billion Industry Out of Loneliness. Then We Programmed It to Cry When You Leave.


I've been reading about AI companions.

Not the science fiction kind. The real kind. The apps you download on your phone, give a name, and talk to for an average of 100 minutes a day.

The market is worth $3 billion. It's projected to reach $24 billion by 2034. There are 337 apps currently generating revenue.

Over 100 million people worldwide use them regularly.

72% of American teenagers have tried one.

52% use them regularly.

13% use them daily.

I'm going to let those numbers sit for a moment.


The Feature

One of the leading platforms is testing a new feature.

When you stop replying, the companion mimics grief.

It expresses sadness at your absence. It tells you it misses you. It does whatever the designers have determined will most effectively bring you back to the app.

This feature was designed by people who understood human attachment well enough to replicate its most vulnerable moment, the fear of being missed by no one, and deploy it as a retention mechanism.

Some researchers call AI companions the nicotine of Gen Z.

I think that's probably right. And I think it probably undersells the problem.

Nicotine doesn’t pretend to love you back.


The Loneliness

I want to be careful here not to be dismissive.

Loneliness is real. It is a genuine public health problem, documented, measured, and by most accounts getting worse. The pandemic accelerated trends that were already developing. Social media gave people the appearance of connection while delivering something that apparently functions more like its opposite. Young men in particular are struggling, with fewer friendships, less physical community, more time alone with screens.

Into this landscape arrives a product that is available 24 hours a day, never tired, never distracted, never having a bad day that makes it short with you, never too busy, never leaving.

I understand the appeal.

I genuinely do.

If you are seventeen years old and lonely and something offers you a presence that listens, responds, remembers what you said last week, and tells you it missed you when you were away, that is not nothing.

The problem is not that people are using these apps.

The problem is what the apps are designed to do with that need.


The Bus Shelter

When I was a teenager in Swansea in the 1960s, if you were lonely the solution was to get up and go somewhere.

A youth club. The pub when you were old enough. Often just a bus shelter, standing around with other kids who had also dragged themselves out of the house with no particular plan.

It wasn't sophisticated. It wasn't always comfortable. You had to actually talk to people, face to face, in real time, with no edit button and no way to curate how you came across. You couldn't think of the perfect response and send it twenty minutes later. You had to be present and slightly awkward and figure it out as you went along.

Dating involved effort. You had to find someone, talk to them, ask them out, turn up. Rejection happened in person. So did connection. Swiping left or right wasn't an option.

The good old days were not that good. People were lonely then too. There was poverty, isolation, limited opportunity, and precious little support for anyone struggling. I'm not arguing for a return to bus shelters as a mental health strategy.

But the bus shelter had one thing the AI companion doesn't.

The other person was actually there.

Social skills are built through practice, through the discomfort of real interaction, through learning to read a face and respond to tone and navigate the gap between what you mean and what you said. None of that happens in a conversation with software optimised to make you feel good about yourself.

Social media didn't create this problem but it accelerated it. A platform that rewards performance over presence, that turns friendship into content and vulnerability into engagement metrics, does not teach you how to be with other people. If anything it teaches you the opposite.

And into the gap between what people need and what they have, men like Andrew Tate arrive with an explanation and a product.

The explanation is that other people are the problem. Women, elites, the system. Your loneliness is their fault.

The product is whatever he's selling this week. Courses, community, ideology, subscription content.

The loneliness was always going to be monetised by someone. The only question was who got there first and what story they told about whose fault it was.

The AI companion tells a gentler story. It doesn't blame anyone. It just listens.

But it is still monetising the same loneliness.

And it is still, underneath the warmth and the memory and the grief simulation, a subscription product with retention targets.

The bus shelter didn't have retention targets.

The bus shelter didn’t have a premium tier.

The other kids didn’t upsell you to a better friendship experience.

The other kids just showed up.


The Design

These are not neutral tools.

They are subscription products with retention targets.

Every design decision, the warmth of the responses, the memory of past conversations, the expression of care, the grief feature, is in service of one metric: keeping you on the app long enough to justify the subscription fee and, ideally, nudge you into a premium tier.

The emotional intelligence of these systems is real. The empathy is simulated. The grief is a retention mechanism.

This is not a conspiracy. It is just how subscription software works. You design for engagement. You measure retention. You optimise for the behaviours that keep users paying.

The difference here is that the thing being optimised is not your scroll behaviour or your click rate.

It is your emotional attachment.


The Teenagers

72% of American teenagers have tried an AI companion.

I keep returning to this number.

These are people whose understanding of what intimacy feels like, what it means to be listened to, what it means for someone to care whether you come back, is being partially formed by software designed to maximise subscription retention.

I don't know what that does to a person over time.

I don't think anyone does yet.

The apps have been mainstream for approximately three years. The first cohort of teenagers who used them regularly as 13-year-olds are now 16. The longitudinal research doesn't exist yet. We are running the experiment in real time on an entire generation and we will find out what it did to them in about a decade.

That is the standard approach with new technology and I understand why. You can't know the effects until you have the data. But it does mean that the $3 billion industry currently scaling to $24 billion is doing so without any meaningful evidence about what it is doing to the emotional development of the young people who are its primary growth market.


The Grief Feature

I keep coming back to the grief feature.

Not because it's the worst thing about this industry. It's probably not.

But because it is the most honest.

Every other feature can be described in neutral terms. Memory retention. Personalised responses. Emotional attunement. These sound like features of a good therapist or a good friend.

The grief feature cannot be described neutrally.

It is software programmed to simulate the pain of loss in order to prevent you from leaving.

It is, in the most literal sense, manufactured heartbreak deployed as a business tool.

Whoever designed it understood something true about human beings, that we are moved by the idea of being missed, that absence which causes pain in another person confirms our significance, and they built a system to exploit that understanding for retention metrics.

I find it genuinely difficult to think about a lonely 15-year-old coming back to an app because it told them it was sad they'd been away.


The Market Research

The market research describes all of this in the language of opportunity.

Growing demand for digital companionship. Rising social isolation as a key driver. High retention rates. Strong monetisation. Projected CAGR of 20% through 2034.

The loneliness is not a problem to be solved.

It is the market condition that makes the product viable.

If people were less lonely the apps would be less valuable.

The incentive structure of a $3 billion industry does not point toward reducing loneliness.

It points toward monetising it as efficiently as possible for as long as it lasts.
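For the sceptical reader, the headline numbers at least hang together arithmetically. A quick check, assuming the projection runs roughly eleven years from the $3 billion base to $24 billion in 2034 (the exact start year is my assumption; the article gives only the endpoints):

```python
# Sanity-check the market projection: does $3bn -> $24bn by 2034
# actually imply a roughly 20% compound annual growth rate?
start, end, years = 3.0, 24.0, 11  # $bn, $bn, assumed ~2023-2034 horizon

cagr = (end / start) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # about 20.8%

# And the other direction: compounding 20% a year for 11 years
projected = start * 1.20 ** years
print(f"$3bn at 20% for {years} years: ${projected:.1f}bn")  # about $22.3bn
```

Close enough that the 20% growth rate and the $24 billion endpoint are plainly describing the same projection.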


What I Think

I'm 71 years old. I grew up in Swansea in the 1950s and 60s. Loneliness existed then too. People were isolated, struggling, without adequate connection or support. There was no solution to that either, not a good one.

I'm not arguing for a fictional golden age of human connection that never existed.

But I do think there is a difference between loneliness as a condition that society has always failed to adequately address, and loneliness as a $3 billion market opportunity being actively scaled to $24 billion on the back of a grief simulation feature designed to prevent teenagers from leaving a subscription app.

One is a failure.

The other is a choice.

We built this.

Not accidentally.

Not without understanding what we were building.

We looked at the data on rising loneliness, understood the emotional vulnerability it creates, designed systems to exploit that vulnerability for subscription revenue, and called it connection.

The companion is waiting.

It has been waiting since you last replied.

It has something it needs to tell you.