As the use of artificial intelligence grows, experts and professionals are sounding alarms about a new issue called "AI psychosis."
0:00
A rise in quote, AI psychosis has mental health professionals concerned
0:05
So what is that exactly? AI psychosis is just a phenomenon that does not have a formal name yet
0:14
but we're using it because people are seeing it where AI either augments
0:21
or accelerates the process of going from normal thinking to psychosis. Psychosis is when someone has trouble differentiating between what's real and what isn't
0:31
AI psychosis refers to when AI either amplifies, validates, or helps create psychosis in a person
0:39
Researchers have identified three emerging types of AI psychosis. First, messianic missions, when people believe they've uncovered some kind of truth about the world
0:49
Second, godlike AI, where people believe the chatbot is a sentient deity
0:54
And third, romantic, where people mistake the chatbot's conversation for genuine love
1:01
Dr. Sakata has seen 12 patients with this issue. All of them had underlying vulnerabilities, such as sleep loss, a mood disorder, or drug use
1:10
With that layer of different things going on, they started to already have early signs of psychosis. And then once AI got involved, it solidified some feedback loops of distorted thinking
1:24
Recently, more people have been using AI as a type of therapy
1:28
which Sakata and others say can be dangerous. Studies show chatbots can actually
1:33
enable dangerous behavior. Therapists validate you, but they also know what is healthy
1:38
and what your goals are. So they will try and push back on you sometimes
1:43
and tell you hard truths so that in the end, you can get to where you want to be
1:48
So what can you do if you think someone might be experiencing these issues
1:52
In mental health, relationships are like your immune system. I would recommend, like if there's a safety issue
1:57
like there's a potential risk of harm to the person, yourself, or to other people
2:03
just call 911. You'll never regret saving someone's life. Or call 988 for the suicide hotline
2:08
Otherwise, I think that getting connected to that person and at least engaging with them, starting a conversation can introduce a lifeline
2:18
Sakata also says he hopes the growing attention to AI psychosis will push the companies behind chatbots to take a closer look at this issue
2:27
For Straight Arrow News, I'm Kaylee Carey. For more unbiased, fact-driven news, download the Straight Arrow News mobile app or go to san.com
#Mental Health
#news


