In brief
- Viral deletion prompt drew more than 3.6M views and thousands of replies
- Users posted quitting announcements describing addiction and emotional pull
- The posts come as Character.AI faces scrutiny amid lawsuits and mounting safety concerns
Character.AI faced a sudden user revolt on Monday, as dozens of people on X celebrated quitting the role-playing chatbot app after a viral screenshot sparked a wave of departures.
The moment captured a broader debate over how deeply users had bonded with the app’s AI companions and how difficult stepping away had become for its most devoted fans.
The backlash crested Monday with a jubilant post from a user declaring “finally quit character.ai for good HIP HIP HOORAY!” paired with a GIF of a smiling anime character.
The post drew more than 3,800 likes, hundreds of reposts, 119 bookmarks, and more than 42,000 views, prompting a thread where users compared quitting the app to overcoming an addiction.
“As someone who's stuck between relapsing and attempting to quit using Character AI right now, every single one of you is objectively correct,” another X user wrote. Others described themselves as former heavy users, saying they had become addicted to the app and turned to it as a source of affection during difficult periods.
Launched in 2022 by former Google engineers Noam Shazeer and Daniel De Freitas, the platform grew rapidly by offering customizable AI characters for role-play, comfort, or creative storytelling.
Despite mounting controversies, reports estimate that Character.AI has more than 28 million monthly active users. Since its launch, the app has surpassed 50 million downloads on Google Play and recorded more than 472,000 ratings on iOS.
The wave of departures followed a viral screenshot posted the day before by a user going by the name “John Twinkatron” that showed Character.AI’s deletion prompt.
“You’ll lose everything,” the prompt said. “Characters associated with your account, chats, the love that we shared, likes, messages, posts, and the memories we have together.”
The post reached more than 110,000 likes, nearly 8,000 reposts, and more than 3.6 million views within 48 hours, drawing accusations that the app relied on guilt-based design.
One user called the message “so exploitative,” while another wrote that it was “fucked up for people trying to get out of addiction.” Several users said the screenshot pushed them to delete their accounts immediately.
“After being Character AI clean for multiple months (6–7?) now, I’ve finally decided to permanently delete my account,” another user wrote. “As a former addict, I believe this is the right choice for me.”
Character.AI’s user revolt comes amid a string of high-profile controversies. Families in the U.S. have sued Character.AI, alleging its chatbots encouraged self-harm, suicide, or inappropriate interactions with minors. The cases prompted the company to block open-ended chats for users under 18 and introduce new age-verification and safety measures.
Despite the controversies, the companion-AI market grew to an estimated $15 billion, with projections reaching $31 billion by 2032.
A spokesperson for Character.AI told Decrypt in a statement that the company continues to refine its product and safety systems as the platform grows.
“We deeply value our community of millions of users and always prioritize providing them with updates on platform changes,” they said. They added that the company “will continue to test, monitor, and iterate” as its age-assurance systems and safety measures develop.
In October, the company announced that it would restrict access to users under 18 in the U.S., with the restriction set to take effect on November 25.
Once implemented, users under 18 will be “directed to the other multi-modal content creation features, including creating videos, stories, and streams with Characters.” The restriction will then roll out to other countries in the months to come.
“We believe our array of techniques will create a reliable system of age assurance,” the spokesperson said. “We will continue to test, monitor, and iterate, as we see how our functionality is working and how the technology develops across the industry.”

