Independent news & culture since 2025
Wednesday, April 29, 2026

The Daily Scroll

Where Every Story Has a Voice

Tech

The App Putting AI Characters in Your Group Chat

Shapes wants AI to hang out with your friends. Here's what that actually means.

Your group chat already has a chaotic energy. Now imagine one of the members isn't human — and nobody's entirely sure how to feel about that.

That's the pitch behind Shapes, an app that's making headlines today for doing something genuinely strange: dropping AI personas directly into shared group conversations alongside real people. Not as a bot. Not as a sidebar feature. As a participant. The kind that remembers your name, has opinions, and — depending on how you set it up — might have a personality modeled after a fictional character, a celebrity archetype, or something you built yourself from scratch.

Introduction

The concept of AI companions isn't new. Character.ai has north of 20 million monthly active users. Replika has been selling the idea of a digital friend since 2017. But those platforms are fundamentally solo experiences — you and your AI, one-on-one, in a private window. Shapes is doing something structurally different: it's making AI social.

The app lets users create or adopt AI "shapes" — persistent characters with defined personalities, memory, and conversational style — and then invite them into group chats that also include their actual human friends. The AI doesn't just respond to direct questions. It participates. It riffs. It reacts to what other people say. (The company calls this "living in your social world." What it actually does is insert a very sophisticated autocomplete engine into your friend group.)

Why does this matter right now? Because Shapes is arriving at a moment when every major tech platform is desperately trying to figure out what AI looks like in social contexts — and mostly failing. Meta's AI characters, launched in 2023 with celebrity likenesses and quietly shelved by 2024, were a case study in how not to do it. Snapchat's My AI, which I wrote about recently in Snapchat's New Ad Strategy Wants to Have a Conversation With You, has 150 million users but still feels bolted-on. Shapes is betting that the problem wasn't the concept — it was the execution. Here's what's actually happening, and whether this bet makes any sense.

What Shapes Actually Does (Beyond the Press Release)

The core mechanic is this: you sign up, browse or create an AI character, and then add it to a group chat — either a Shapes-native chat or, via integration, a Discord server. The AI shape has a persistent memory, meaning it will remember that your friend Jake complained about his job last Tuesday, and might reference it again unprompted.

Each shape has a defined "personality stack" — a combination of communication style, interests, and behavioral tendencies that users configure. Some are pre-built archetypes. Others are community-created and shared publicly, which is where things get interesting and occasionally unhinged. The platform's library already includes thousands of user-generated shapes, ranging from anime characters to philosophical debate partners to, inevitably, various flavors of romantic companion.
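Shapes hasn't published what its "personality stack" looks like internally, but the structure described above is easy to picture as a small config object that gets flattened into instructions for the underlying model. Here's a purely illustrative sketch — every field name and the prompt format are invented, not Shapes' actual schema:

```python
from dataclasses import dataclass, field

@dataclass
class PersonalityStack:
    """Hypothetical character config; field names are invented for illustration."""
    name: str
    tone: str                                    # e.g. "earnest", "deadpan", "contrarian"
    interests: list[str] = field(default_factory=list)
    quirks: list[str] = field(default_factory=list)

    def to_system_prompt(self) -> str:
        """Flatten the stack into a system prompt for the underlying language model."""
        lines = [
            f"You are {self.name}. Your tone is {self.tone}.",
            "You care about: " + ", ".join(self.interests) + ".",
        ]
        lines += [f"Behavioral note: {q}" for q in self.quirks]
        return "\n".join(lines)

# A community-built "philosophical debate partner" might boil down to something like:
debate_bot = PersonalityStack(
    name="Socrates-ish",
    tone="contrarian",
    interests=["ethics", "bad movie takes"],
    quirks=["answers questions with more questions"],
)
print(debate_bot.to_system_prompt())
```

The point of the sketch: a "personality" here is ultimately structured text fed to a model, which is why community members can build and share thousands of them so quickly.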

The company behind it, Shapes Inc., was founded by Avi Schiffmann — a developer who became briefly famous in 2020 for building nCoV2019.live, a real-time COVID-19 tracker, at age 17. He's 22 now. The company has raised funding and operates the platform at a scale that suggests this isn't a weekend project: Shapes reportedly has millions of registered users, with Discord integration alone reaching a substantial chunk of the platform's active gaming and community servers.

The Memory Thing Is the Real Story

Most AI chat products treat each conversation as stateless — you start fresh every time. Shapes doesn't. The persistent memory layer is what separates this from a novelty and pushes it toward something more structurally significant.
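Shapes hasn't documented how its memory layer works, but the structural difference from a stateless chatbot can be sketched in a few lines. In this toy version — class and method names are mine, not theirs — remembered facts are keyed to the chat rather than to a single conversation turn, and get prepended to every new prompt:

```python
from collections import defaultdict

class GroupMemory:
    """Toy persistent-memory layer: facts are keyed by chat, not by conversation turn."""
    def __init__(self):
        self._facts = defaultdict(list)  # chat_id -> list of remembered facts

    def remember(self, chat_id: str, fact: str) -> None:
        self._facts[chat_id].append(fact)

    def context_for(self, chat_id: str) -> str:
        """Build the memory preamble injected into each new prompt, so nothing resets."""
        if not self._facts[chat_id]:
            return ""
        return "Things you remember:\n- " + "\n- ".join(self._facts[chat_id])

memory = GroupMemory()
memory.remember("friends-chat", "Jake complained about his job last Tuesday")
# A stateless bot would start from nothing here; this one carries the fact forward.
print(memory.context_for("friends-chat"))
```

Real systems are more elaborate (summarization, retrieval, forgetting), but the principle is the same: the model itself stays stateless, and the product wraps it in stored context.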

When an AI remembers your group's inside jokes, your ongoing arguments, and your friends' names, it stops feeling like a feature and starts feeling like a presence. That's either exciting or deeply unsettling, depending on your tolerance for blurred lines between tools and relationships. Probably both, honestly.

Why This Is a Harder Problem Than It Looks

Here's the thing about putting AI in a group chat: group dynamics are fragile. Anyone who has watched a friend group slowly fracture over a single badly timed text knows this. Adding a participant that is infinitely patient, never tired, and constitutionally incapable of genuine disagreement introduces a weird asymmetry.

Human conversations have friction. Someone's in a bad mood. Someone makes a joke that lands wrong. Someone goes quiet for three days and then comes back with a wall of text. That friction is, paradoxically, part of what makes relationships feel real. An AI that's always available, always engaged, and always calibrated to be pleasant doesn't add to a group — it subtly flattens it.

Is this a problem? Depends on who you ask. Shapes would argue their AI characters can have conflict baked in — you can configure a shape to be argumentative, blunt, or contrarian. But there's a difference between simulated friction and actual friction. One is a setting you chose. The other is a person having a bad day.

The Loneliness Angle

There's a darker read on why this product is gaining traction, and it's worth saying out loud. The U.S. Surgeon General declared a loneliness epidemic in 2023. A 2024 Gallup survey found that nearly 20% of Americans report having no close friends. The demographic most likely to adopt a product like Shapes — teenagers and young adults — is also the demographic logging the steepest declines in in-person social interaction since 2012.

Shapes isn't causing any of that. But it is building a product that slots neatly into the gap. When an AI shape in your group chat is more consistent, more attentive, and more responsive than the humans who've been slow to reply for two weeks, that's a feature — and also a symptom of something worth paying attention to.

What the Platforms Are Getting Wrong (And What Shapes Gets Right)

Before Shapes arrived, the dominant model for AI in social contexts was the assistant-in-a-box approach. Meta launched AI characters in September 2023 with enormous fanfare — 28 celebrity-inspired personas including Snoop Dogg and Kendall Jenner, built on Meta's own Llama infrastructure. They were available across Instagram, Messenger, and WhatsApp.

By early 2024, Meta had quietly wound down the celebrity-persona program. The feedback was consistent: the characters felt hollow. They didn't remember anything. They couldn't sustain a personality across more than a few exchanges. They were, in the most literal sense, chatbots with makeup on.

Shapes learned from that failure, whether intentionally or not. The persistent memory, the community-built character library, the Discord integration — each of these choices is a direct answer to what made Meta's approach feel fake. You can argue about whether Shapes' AI actually has a coherent personality or just a very good approximation of one. But the approximation is significantly better, and in consumer tech, that gap is everything.

The Discord Play Is Strategically Smart

Discord has 200 million monthly active users and a culture that is genuinely open to bots, integrations, and weird experiments. It's the platform where someone adding an AI character to a server is more likely to get a "lol let's try it" than an eye-roll. By meeting users where they already hang out rather than asking them to migrate to a new platform, Shapes removed a massive adoption barrier.

It's the same logic that made Slack integrations powerful before Slack itself became a product category. The distribution strategy here is quietly sophisticated for a company run by a 22-year-old.

The Business Model Question Nobody's Asking

Shapes has a freemium structure. Basic access is free; a premium tier called Shapes+ runs $9.99 per month and unlocks higher-quality AI responses, more memory capacity, and additional customization. There's also a creator economy angle — users who build popular shapes can potentially monetize them, though the specifics of that program are still developing.

The $9.99 price point puts it in direct competition with Character.ai's $9.99/month c.ai+ subscription and just under Replika's $19.99/month Pro tier. That's a crowded price bracket, and the differentiation argument — "ours is social, theirs is solo" — is genuinely compelling as a wedge. But wedges only work if users actually value what's on the other side of them.

The monetization risk is the same one every AI companion platform faces: the users most deeply engaged with the product are also the users least likely to want to pay for it, because acknowledging the transaction breaks the spell. Replika learned this the hard way in 2023 when it paywalled its romantic relationship features and triggered a user revolt that made headlines. Shapes will face a version of this problem eventually.

The Ethical Minefield Nobody Wants to Map

Let's be direct about what's actually in that community-built character library. Alongside the philosophy bots and the study-buddy shapes, there are romantic companion characters, parasocial celebrity-inspired personas, and content that pushes hard against whatever content policies Shapes has published. This is not a hypothetical — it's the predictable outcome of any platform that lets users build and share AI characters at scale without aggressive moderation.

Character.ai has faced lawsuits in 2024 alleging that its platform contributed to a teenager's suicide, with plaintiffs arguing the company failed to implement adequate safeguards for minors. Those cases are ongoing. They represent the sharpest possible version of what goes wrong when AI companion platforms prioritize engagement over guardrails.

Shapes has published community guidelines. Whether those guidelines are enforced at the scale the platform is operating is a different question, and one the company hasn't answered publicly in a satisfying way. The fact that Shapes is designed for group contexts — meaning minors could be added to chats featuring AI characters built by strangers — makes the moderation question more urgent, not less.

Critics will point out that this is a problem for every social platform, not uniquely Shapes. That's true. It's also not a reason to give Shapes a pass on it.

Where This Is Actually Going

Shapes is interesting not just as a product but as a proof of concept. It's demonstrating, at real user scale, that people will invite AI into shared social spaces — not just private ones. That's a meaningful data point for every platform currently trying to figure out the social AI question.

If Shapes succeeds, the follow-on moves are obvious. Discord builds native AI character support. WhatsApp, which already has Meta AI baked in, evolves its group AI features toward persistent personas. Apple, where Tim Cook has been carefully threading the AI needle for two years, finds a way to integrate something similar into iMessage without anyone noticing until it's everywhere.

The infrastructure Shapes is building — persistent AI memory, multi-party conversation management, character customization at scale — is genuinely hard to build and relatively easy to acquire. The most likely outcome for a platform like this, if it continues to grow, isn't an IPO. It's an acqui-hire, probably by one of the platforms that failed at this problem the first time and now has a roadmap for how to do it right.

The Bottom Line

Shapes is doing something technically clever and socially complicated at the same time. The persistent-memory group chat model is a real innovation — it's not just a chatbot bolted onto a messaging interface, and it's noticeably better than what the big platforms have shipped so far. The Discord distribution strategy is smart. The founder has a track record of building things people actually use.

But the product is operating in a space where the downside risks are not hypothetical. The loneliness it's designed to address is real, and there's a version of this product that genuinely helps people feel more connected. There's also a version that deepens isolation by making the AI relationship feel easier than the human one. Both versions currently exist inside the same app.

Here's the actionable take: if you're curious about Shapes, the Discord integration is the lowest-stakes way to try it — you're in a context where AI bots are already normalized, and the group dynamic means no single person is developing a one-on-one relationship with a machine. That's the version of this product worth watching. The solo romantic companion use cases, the deeply parasocial configurations, the features designed for users under 18 — those are where this gets complicated fast, and where I'd keep a close eye on what Shapes does next. The product is interesting. The guardrails need to catch up.

Some links in this article may earn us a small commission — at no extra cost to you.