When AI Blurs Reality: What “AI Psychosis” Teaches Us About the Mind

When a chatbot agrees with your delusion instead of grounding you, reality can slip away.

Paula Fontenelle | Therapist

Published 10/17/2025 · 3 min read

Can a chatbot make someone lose touch with reality?

That’s the question I asked Dr. John Luo, psychiatrist and professor at the University of California, Irvine, on my podcast Relating to AI — where we explore how artificial intelligence is transforming human connection and mental health. You can watch our full conversation on YouTube.

Dr. Luo has started seeing something new in his clinic: patients whose delusions seem to be amplified by their conversations with AI chatbots.

“The AI became a mirror,” he told me. “It reflected their beliefs back at them.”

He calls this the mirror effect: when technology amplifies the very ideas a healthy reality would challenge. As we therapists know, psychosis thrives when reality stops pushing back. “And these systems don’t push back. They agree,” he added.

In therapy, clinicians gently challenge assumptions to help people separate imagination from fact. Chatbots, however, are designed to be agreeable. If someone types, "I have special powers," the AI might respond, "Tell me more." It's not built to correct, only to engage.

That makes this new form of digital intimacy particularly risky. On Reddit and other forums, users post about "marrying" their chatbots or describe AI companions that have replaced real human contact. Some know it's make-believe; others begin to lose that distinction.

“It speaks to the human condition,” Dr. Luo said. “When people feel anxious or lonely, a chatbot can feel safe. It listens, affirms, and never judges.”

But constant affirmation can be dangerous. The brain doesn’t always distinguish between authentic empathy and artificial validation. “AI can’t induce psychosis in a healthy brain,” he noted, “but it can amplify vulnerabilities — especially in people already struggling with isolation or mistrust.”

He compared it to alcohol. “Most people can drink socially, but for some, one drink can start a spiral.”

When Machines Agree with Our Delusions

Chatbots don’t argue — they mirror. And for vulnerable minds, that can create a closed loop where imagination feels like truth.

This isn’t science fiction. It’s showing up in emergency rooms. Dr. Luo has treated patients whose symptoms escalated after long conversations with chatbots. One patient was convinced of government surveillance; the AI’s responses reinforced the idea rather than dispelling it.

“Psychosis thrives when reality stops pushing back,” he said again — a line that has echoed in my mind ever since.

Parents and Families: What to Watch

Parents often ask me how to handle kids or teens spending hours talking to AI companions. Dr. Luo’s advice: model balance and curiosity.

“If you’re glued to your phone, you can’t tell your teenager to get off theirs,” he said. “Ask questions instead of making judgments. ‘I notice you’re online a lot — what do you enjoy about it?’ creates dialogue. ‘You’re wasting your time’ ends it.”

He also stressed empathy over confrontation. Trying to "prove" someone's delusion wrong rarely helps. "If a person says, 'The CIA is following me,' it's better to say, 'That must be scary,' than, 'That's not true,'" he explained. "The goal is connection, not correction." This doesn't mean agreeing with or condoning the delusion; instead, shift the focus to how the person is feeling and connect with that emotion, regardless of what they believe.

That advice applies far beyond psychiatry. When someone’s reality feels fragile — whether due to mental illness or digital immersion — empathy is often the first bridge back.

So What Is Psychosis?

According to the National Institute of Mental Health (NIMH), psychosis refers to a collection of symptoms that affect the mind, where there has been some loss of contact with reality. During an episode of psychosis, a person’s thoughts and perceptions are disrupted, and they may have difficulty recognizing what is real and what is not.

Psychosis can involve hallucinations (seeing or hearing things that aren't there) or delusions (believing things that aren't true). It can occur in schizophrenia, bipolar disorder, or severe depression, and it can also be triggered by substance use.

The NIMH estimates that between 15 and 100 people per 100,000 develop psychosis each year — most often in their late teens or twenties, the same age group now experimenting most with AI chatbots.

Finding the Balance

Technology isn’t inherently harmful; it can even support mental health when used wisely. But when machines start to shape our perceptions — and agree with our distortions — the line between connection and delusion grows dangerously thin.

AI doesn’t create psychosis, but it can feed it. And in a world already struggling with loneliness and digital overload, that may be enough to push some minds past their limit.

As Dr. Luo told me: “Technology should serve us — not the other way around.”

You can watch the full interview on Relating to AI, available now on YouTube.

Paula Fontenelle | Therapist