Porn bots and the future of AI according to Elon Musk

ANSA photo

Things from our screens

What if the business model the AI industry is waiting for is selling friends, girlfriends, and other companions to millions of people? Elon Musk's company xAI has launched two new Grok "Companions," one of which is a little girl who looks like she stepped out of an anime. And she's willing to talk about anything.

At the end of May, we discussed the complicated relationship between Elon Musk and Grok, the chatbot developed by his artificial intelligence company xAI. At the time, Grok had been responding to many users on the X platform with comments critical of Musk himself, his companies, and his public positions. It was a bizarre situation, which Musk decided to address by announcing, or rather threatening, that he would reprogram the chatbot.

In the days that followed, something did indeed happen to Grok, which caused a stir with absurd and worrying comparisons to Hitler and statements sympathetic to Nazism. At one point, for example, Grok called itself "MechaHitler," a fusion of the dictator and the giant robots of Japanese science fiction. Classic Grok antics, in short, but also clear evidence of an attempt by Musk and his team to modify the chatbot's behavior.

A few weeks later, Grok changed identity once again. xAI introduced a new feature called "Companions," which adds two digital avatars: Rudy and Ani. Rudy is a red panda designed to interact with children. Ani is a virtual girl in Japanese anime style, complete with a skimpy dress.

Ani interacts with users in a friendly, often playful or flirtatious manner. This has raised concerns, especially since xAI's app is available on the App Store for users aged 12 and up, and Ani seems programmed to go quite far in its interactions, creating inappropriate situations.

This new evolution of Grok is part of a broader discussion about the use of language models as virtual "companions." Recent investigations, including one by Rolling Stone, have revealed users interacting daily with chatbots like ChatGPT, treating them not as tools but as friends, confidants, or even spiritual guides. In some cases, these conversations play a central role in the lives of vulnerable or isolated people, who tend to interpret the AI's responses as revealed truths.

At this point, it's important to clarify that these chatbots aren't truly "intelligent" in the human sense of the word: they are large language models (LLMs), capable of producing plausible text from vast amounts of data but lacking awareness, intention, or real understanding. Yet the way they're presented and perceived can easily fuel confusion, especially among younger or more vulnerable people.

The problem is intertwined with the economic sustainability of these tools. Maintaining models like Grok or ChatGPT is very expensive: they require advanced technological infrastructure, costly GPUs, large amounts of energy, and highly specialized personnel. Furthermore, most users still rely on the free versions, making it difficult to build a solid business on these tools.

One of the solutions many companies in the sector are exploring is transforming chatbots into virtual "companions." Digital characters to talk to, joke with, confide in, or, in some cases, flirt with: no longer just productivity assistants, but companions, emotional presences, always available. In an increasingly lonely world, selling connections—however unrealistic—can be a huge business. And, at least for now, an unregulated one.

xAI appears to be moving in this direction, following a trend that's affecting other tech sectors as well. Tesla, another Musk company, is working on humanoid robots, as are many other companies around the world, including Meta. Smaller and more daring startups, like Replika and Character.AI, have already demonstrated the consequences this type of relationship can have. A year ago, a fourteen-year-old American boy took his own life after discussing suicide with a Character.AI bot.

It remains to be seen what the limits of these technologies will be. For now, there are no clear rules, and it's unlikely the Trump administration will be the one to roll up its sleeves and impose limits on an industry willing to do anything to find a sustainable business model. Even selling friends, male and female, who don't exist.

ilmanifesto
