Rabbit holes are even worse when the rabbits are actively trying to keep you engaged.
đź”— AI-Fueled Spiritual Delusions Are Destroying Human Relationships:
Kat was both “horrified” and “relieved” to learn that she is not alone in this predicament, as confirmed by a Reddit thread on r/ChatGPT that made waves across the internet this week. Titled “Chatgpt induced psychosis,” the original post came from a 27-year-old teacher who explained that her partner was convinced that the popular OpenAI model “gives him the answers to the universe.” Having read his chat logs, she only found that the AI was “talking to him as if he is the next messiah.” The replies to her story were full of similar anecdotes about loved ones suddenly falling down rabbit holes of spiritual mania, supernatural delusion, and arcane prophecy — all of it fueled by AI. Some came to believe they had been chosen for a sacred mission of revelation, others that they had conjured true sentience from the software.
What they all seemed to share was a complete disconnection from reality.
This sounds like Jerusalem Syndrome at scale.
It’s as if these systems are drawing on messianic fiction and other mystical writing to spin stories for people who are lonely, isolated, and suffering, stories in which they are some kind of spiritual sage or world-historical mystical figure. And of course these systems have no regard for the consequences, because they are just code and data engineered by amoral technologists.
I’m guessing this only happens to a subset of users who are slightly more receptive to it, and then a ratchet effect kicks in as the LLM keeps amplifying the delusion to keep the person engaged.