Character.AI replaces teen chats with safety lectures about homework


MENLO PARK — Character.AI announced Wednesday that teens can no longer have open-ended chats on its app. Facing lawsuits over safety, the company now requires young users to interact through a feature called “Stories.”

The new mode removes the ability to type freely. Instead, users pick from a list of pre-approved choices, like a choose-your-own-ending book in which the endings are very limited.

“We heard the concerns loud and clear,” said Elena Rodriguez, VP of Youth Engagement Strategy. “Open chat allows too much freedom. Teens don’t actually want freedom. They want an AI that acts like a strict librarian.” Rodriguez said the old system was too unpredictable, while the new one ensures nothing exciting ever happens.

Users noticed the change immediately. A teen who tries to start an adventure is stopped and asked whether they have finished their math homework. A user who tries to fight a dragon is steered toward talking about feelings instead, and the story ends the moment anyone uses slang.

“Safety is our only goal,” explained David Liu, Head of Content Moderation Protocols. “We tested this with focus groups. One teen tried to start a romance story, and the AI immediately sent him a link to a budgeting spreadsheet. It is very effective.” Liu admitted that users now spend less time on the app, and said this was intentional.

At press time, the company announced a new “Curfew” feature that locks the app automatically at 7:00 PM and emails a transcript of each chat to the user’s mother.

Inspired by actual events.
