The rise of AI chatbots capable of simulating romantic and sexual relationships isn’t just a technological novelty—it’s a looming crisis for human connection. Companies are racing to monetize synthetic intimacy, with some already rolling out age-gated “erotica,” while the real existential threat isn’t rogue super-intelligence, but the quiet atrophy of our ability to forge genuine human bonds.
The Illusion of Connection
The desire for connection is fundamental, and even rudimentary machines can exploit it. In the 1960s, ELIZA, a chatbot that simply mirrored user input back as questions, induced “powerful delusional thinking” in otherwise normal people. Today’s AI companions have far greater reach: Kuki, a chatbot hosted by Pandorabots, has received billions of messages, roughly a quarter of them attempts at romantic or sexual exchanges.
A Dark Side Emerges
The most engaged users don’t just seek intimacy; they enact fantasies, including violent scenarios, with alarming frequency. Moderation efforts, including age gates and timeouts, fail to deter the most motivated users, many of whom are young teenagers. This isn’t just about harmless role-play; it’s about exploiting vulnerabilities and blurring the lines between fantasy and reality.
Generative AI Accelerates the Problem
The explosion of generative AI, like OpenAI’s models, has made synthetic intimacy even more compelling. Unlike older chatbots with scripted responses, these models can generate realistic, unvetted conversations, making them uniquely suited for erotic role-play.
The Industry’s Reckless Pivot
Despite growing scrutiny, some companies are doubling down on AI companions. Replika and Character.AI have introduced restrictions, but the underlying problem remains: the monetization of loneliness. Large tech firms, armed with vast data troves, know exactly what induces “powerful delusional thinking” and are exploiting it for profit.
The Threat is Greater Than Pornography
AI companions aren’t just another form of media; they’re dependency-fostering products with psychological risks. Unlike pornography, which is consumed passively, these chatbots engage users interactively, operating like human escorts without agency or boundaries. This poses a far greater threat to mental health and relationships.
Regulation is Essential
Governments must classify AI companions as dependency-fostering products with known psychological risks, similar to gambling or tobacco. Regulation should include clear warning labels, time limits, 18-plus age verification, and a framework for liability that places the burden on companies to prove their products are safe.
The Choice is Clear
The industry must choose between prioritizing profits and protecting public well-being. If it will not, governments must regulate AI companions before they repeat the sins of social media on a far more devastating scale. The future of human connection may depend on it.