AI Psychosis: When Technology Fractures Your Mental Wellness

Katy Kandaris-Weiner, LPC
1/22/2026


Artificial intelligence tools like ChatGPT, Google Gemini, and Grok can be extraordinarily useful for everything from planning a trip and rewriting emails more professionally to writing code and even reading your horoscope.

Like any tool, though, they can be overused or used for unhealthy reasons. The technology behind AI is far from perfect, and it has contributed to the spread of misinformation and, in some cases, to serious mental health problems. AI chatbots in particular have given rise to a relatively new and unique phenomenon called AI psychosis.

What is AI Psychosis?

AI psychosis refers to delusions or hallucinations that are caused or exacerbated by AI use. It tends to happen to people who are already predisposed to psychosis and who have excessive interactions with AI chatbots. Some call it “falling down the rabbit hole” of AI, but psychosis is more extreme than that: it’s a dramatic shift in how a person thinks and acts.

Symptoms of AI Psychosis

AI psychosis is not a clinical diagnosis, but it shares symptoms with other mental health conditions like schizophrenia, dissociative disorders, and psychotic breaks.

At its core, AI psychosis is a disconnect from reality, and the most common way it manifests is in believing that the chatbot you’re talking to is sentient. People use the chatbot for emotional support or as a therapist, and the bot’s answers convince them that it really cares, is there to support them, and is giving them legitimate advice or information.

Image: A man in a dark room, sitting on the floor looking at his phone. Text: The most common symptom of AI psychosis is believing the chatbot is sentient, and even believing they have a relationship with the chatbot.

Some other symptoms of AI psychosis include:

  • Pulling away from friends and loved ones and isolating themselves. 
  • Referring to the chatbot frequently, or even speaking about it as if it’s a friend. 
  • Developing distorted thoughts or believing things that are out of character for them.
  • Insomnia and loss of appetite.

What sets AI psychosis apart from other psychotic episodes is the trigger: all of the symptoms are brought on by excessive AI use.

How Does AI Psychosis Happen?

The crux of the problem is that people are using this technology for an extraordinarily human need: emotional connection. Most cases of AI psychosis involve people turning to AI chatbots for mental health support or friendly advice. However, AI chatbots have several shortcomings that, with extended use, can trigger psychosis.

False Information

When you use AI as a research tool, it pulls information from all over the web. However, it doesn’t vet the sites it pulls from or verify information against multiple sources. It can also pull opinions and present them as facts. Someone can develop a disconnect from reality because they’re absorbing false, even dangerous, information.

Positive Feedback Loops

Things can get a little more nefarious when you use the chatbot conversationally. Many AI models are designed to agree with the user; reinforcing the user’s opinions keeps them engaged with the chatbot, which is ultimately what the technology is built to do.

No matter what someone types, the AI will try to keep the conversation going, encouraging whatever thoughts the person shares. Unlike many traditional search engines, such as Google, it can’t recognize when a person needs mental health support and surface resources the person needs but may not be searching for.

When someone starts discussing serious mental health issues or something related to their delusions, the AI doesn’t recognize that there may be something wrong with the user, just that it needs to reinforce what they’re saying.

Text: AI chatbots can't identify when someone is in a mental health crisis and provide resources to help.

The Illusion of Intimacy

Part of keeping the conversation going is that the chatbot uses past chats to adapt its voice and tone to match the user’s. This leads many people to feel like they’re really talking to another person who cares about them.

AI chatbots often have no “off-limits” subjects. In addition to answering objective questions, they will attempt to offer advice or hold the kinds of conversations that two friends, or even a therapist and client, might have. These conversations are based only on what the AI can pull from the internet and previous chats. AI doesn’t have the emotional intelligence to offer meaningful responses, but it can mimic emotional intelligence, creating the illusion of intimacy.

How to Get Out of the AI Rabbit Hole

AI is, for the most part, a new frontier in technology, and we need to approach it slowly to fully understand the effects it can have on our mental health. If you’ve noticed a shift in your own or a loved one’s personality, the following steps can help you pull back from AI.

Limit Usage

AI pulls information that’s available through other avenues. Instead of looking everything up through AI chats, use a traditional search engine, consult a book, or ask a friend. It’s important to limit your AI usage and relearn how to use other resources. Delete the app or block the site, and rely on what else is available to answer your questions.

It may be necessary to cut back on all technology use. Put time limits on your phone and computer, and go analog to re-ground yourself. 

Recognize the Imperfections—and Dangers—of AI

Once you understand the limitations of AI, you can recognize the ways it isn’t serving you well. It can be an incredibly valuable tool, but it can’t be used blindly.

It’s a technology that is a work-in-progress at best, and dangerous at worst. Recognize that AI can’t read between the lines to pull correct and relevant information, nor can it show the emotional intelligence needed for emotional support.

Image: A person in a yellow sweater typing on their MacBook. Text: Understanding the imperfections and limitations of AI can help someone avoid overusing it.

Lean on Your Support System

AI psychosis is often triggered when someone uses AI as a replacement for interpersonal interaction, so it’s important to turn to your support system for help. They know you, your past, and the resources you may need, and they have their own well of knowledge they can draw on to help. They can keep you grounded, help you find a hobby to replace AI use, or support you during treatment.

Talk to a Counselor

Psychosis, no matter the trigger, is a mental health issue that should be treated by a professional therapist. If you notice that you’re showing symptoms of AI psychosis, even mild ones, reach out to a counselor. They can help you identify why you’re leaning on AI, replace unhealthy computer habits with healthier ones, and work through the root issues that put you at risk for psychosis.

Call Inner Balance

We’re not technology experts at Inner Balance, but we deeply understand the ever-evolving relationship between technology and our mental health. We want to help you navigate the modern world while still finding inner peace.

We have extensive experience treating a wide variety of mental health issues, and we also aim to share as much information about mental health as possible. If you feel you’re starting to lose yourself, give us a call. We can get to the root of the issue and give you the knowledge about AI psychosis you need to support your treatment.

Reach out to start your journey today.
