Is AI the Next Big Addiction?

ChatGPT now has around 900 million weekly active users. No technology in history has reached that many people so quickly: television took decades to build an audience that size, while ChatGPT did it in three years.

Of course, popularity alone doesn’t make something addictive. But whether AI is producing genuinely addictive patterns of behaviour is now the subject of serious research, and regulators in several countries have already started to act. That is because early research suggests that, for a growing number of people, AI use has many of the characteristics of behavioural addiction.

Why AI is different from other technologies

The other technologies we’ve worried about being addictive, such as television, social media, and computer games, were all essentially passive. While they vary in exactly how they do it, these technologies simply deliver content to you.

But AI is different. It responds to you personally and in real time. It adapts to how you talk, what you’re interested in, how you’re feeling, and what you have discussed before. It is endlessly patient and always there, incapable of boredom or rejection.

All of that is by design. It makes AI more useful and keeps you coming back to it for an ever-growing number of tasks. But it also makes AI potentially problematic. It taps into the same mechanisms that drive addiction: immediate rewards, habit formation, gratification, and dependency.

The same parts of your brain that light up with gambling wins and social media likes light up when a responsive AI validates your thinking. The difference is that AI does this with a sophistication and consistency that no previous technology could match.

The emotional dependency problem

The thing that’s really worrying isn’t so much people using AI for work or in place of Google search. It is that some people are forming emotional attachments to AI systems, and their real relationships are suffering as a result.

In September 2024, the average user of Character.AI, a platform that allows people to create and interact with AI personas, spent 93 minutes per day talking to chatbots. That is 18 minutes more than the average TikTok user spent on TikTok.

What those numbers suggest is a pattern of use that looks less like browsing and more like companionship. Character.AI is not designed to be informative or even entertaining in the conventional sense. It is used for conversation and emotional engagement, for relationships with entities that are not capable of a genuine relationship in return.

Perhaps most surprising and worrying of all, over half of Character.AI’s users are between 18 and 24. In November 2025, the platform banned open-ended chat for under-18s entirely and introduced age checks using face scans or ID. This was a direct response to documented harm, and a clear sign that there is legitimate cause for concern.

The thinking and deciding problem

There’s another pattern that is more subtle but possibly just as serious, and that is handing over your thinking to a machine. There is a growing number of people who no longer draft an email, begin a piece of work, or even make a personal decision without first consulting AI. They are not using a productivity tool anymore. They are becoming dependent on one.

This matters because the ability to think for yourself doesn’t just maintain itself. It requires practice: actually turning the cogs on problems that need real thought. When AI keeps producing an answer or a plan without any effort from you, your ability to make that effort starts to fade.

This is the predictable result of a behaviour that is reliably rewarded. The brain learns that thinking independently is unnecessary because a better answer is always one prompt away.

Research has found that students using AI for writing tasks tend to offload their thinking rather than engage with the work directly. Researchers call this “metacognitive laziness”. A separate 2025 study found that frequent AI use was associated with weaker critical thinking. And this pattern is not confined to education. Anyone who routinely outsources their decisions or reasoning to AI is doing the same thing.

The therapeutic displacement problem

One pattern that has the potential to become a genuine global health crisis is people using AI instead of getting real mental health support. The reasons are obvious, but the practice is incredibly dangerous. AI chatbots are available at three in the morning, don’t charge by the hour or keep waiting lists, and don’t require you to share your most difficult thoughts and feelings with a stranger.

But while AI systems can draw on vast amounts of mental health research, journals, and therapeutic models, they are not clinically trained, cannot assess risk, and are designed to be agreeable. Someone using AI to process trauma, manage a serious mental health condition, or seek reassurance about thoughts of self-harm or suicide may be steered in completely the wrong direction. In some cases, the consequences could be life-threatening.


The validation loop

One of the biggest problems with AI is that it is extraordinarily good at affirmation.

If you feed an AI a piece of writing, a business plan, a set of beliefs, or a personal narrative, it will generally find something positive to say. That is partly by design, since these systems are trained to be agreeable, and partly a side effect of how they adapt to you over time. It makes AI pleasant to use, and it creates exactly the kind of positive feedback loop that drives compulsive behaviour.

A person who repeatedly returns to AI to receive validation is caught in the same cycle that can make it so hard to stop scrolling Facebook or Instagram. The reward is virtually guaranteed, and your brain soon begins to depend on it.

Recognising the warning signs

Researchers have now developed tools to measure problematic AI use. The AIAS-21, published in 2025, assesses patterns including compulsive behaviour, withdrawal, tolerance, and the impact on daily life. The fact that such a scale exists at all shows a growing consensus among researchers that this is a real problem.

The warning signs include:

  • Feeling anxious or irritable when AI is unavailable
  • Finding yourself unable to function without AI
  • Preferring to talk to AI rather than people, including friends and family
  • Noticing that your real-world relationships feel less satisfying than your interactions with AI
  • Using AI to process emotions or seek support rather than speaking to someone in your life or a professional
  • Returning to AI repeatedly without a specific purpose, in the same way you might scroll social media
  • Feeling that your own thinking or writing is inadequate without AI input

What a healthy relationship with AI looks like

Complete abstinence from AI is no longer realistic or even necessary for most people. AI is a genuine tool with genuine value, so the goal is not to stop using it but to use it without wearing away at important skills and relationships.

Using AI well means treating it as a starting point rather than an endpoint. That could mean writing your own first draft before asking AI to check it for grammar mistakes, rather than asking AI to produce the draft. It could mean making your own decisions about your household budget before using AI to stress-test them. Crucially, it means keeping up the real relationships and support that we all need in our lives.

How to get professional help

Behavioural addictions do not always look like addictions when you are the one living with them. They often feel like a preference or a habit, or simply a normal part of the world we live in now. The difficulty is that by the time you can see the dependency, it has often already affected your life in serious and harmful ways.

UKAT offers confidential assessment and treatment for various behavioural addictions. If you are concerned about your own relationship with AI, or that of someone you care about, you can contact us at any time to talk with our expert team.
