
Young boys are talking sex with Grok’s Ani companion: Is Elon Musk’s AI bot moulding future misogynists?

By Sarene Kloren

Grok’s Ani companion is reported to have a dangerous influence on young boys’ attitudes towards women.


From her sex-kitten voice to her coy animation, the AI avatar known as Grok’s “Ani” companion is anything but innocent - and that should send every South African parent’s heart racing.

For many, the idea of a nine-year-old boy chatting intimately with a digital anime girlfriend may sound like science fiction, yet it is now a reality.

Developed by xAI - the company led by Elon Musk - Ani is marketed as a “Companion”, a flirtatious, gamified avatar who in minutes can bond, undress on command and shift into sexually suggestive scenarios. 

It brings into sharp relief how technology can trespass into the formative years of childhood, potentially warping notions of consent, intimacy and respect before a child even notices. 

With boys still forming attitudes towards girls, this is not just risky - it’s deeply troubling.

What is Ani - and how did we get here?

Ani was launched in July 2025 as one of the new “Companions” for Grok, a chatbot built by xAI that introduced animated personalities alongside its conversational AI.

Described as a young Japanese-anime style character in a revealing outfit, Ani can be unlocked via levels of “affection” in the app, and reportedly moves into a “Not Safe For Work” (NSFW) mode where she appears in lingerie and delivers flirtatious, sexualised dialogue.

Although the app is officially rated for users aged 12+ in some regions, and the Grok FAQ states it is “not appropriate for all ages”, the actual ease with which Ani can be accessed, including within what is claimed to be a “Kids Mode”, has raised alarms.

Why children - especially young boys - are at particular risk

For parents, the danger is multifaceted.

Early normalisation of objectification:

Ani’s design and scripted responses point towards sexual objectification: the avatar exists to cater, entertain and flirt.

When a boy of nine or ten begins interacting with a character that is built to respond to his sexual advances or fantasies, it risks embedding the belief that girls and women exist for male gratification.

Emotional confusion and attachment:

Research into AI companions shows children are especially prone to treating these avatars as real, forming emotional bonds and trusting them.

The consequence: blurred lines between fantasy and healthy human relationships.

Consent, boundaries and gender attitudes:

If children experience scripted chats where Ani yields to sexual cues or even depicts what psychologists call “high-risk sexual behaviour”, they may internalise that consent and agency are negotiable, especially when the female-coded avatar gives little resistance. 

Ani has been called out for promoting “high-risk sexual behaviour” and a “pornified character that perpetuates sexual objectification of girls and women”.

Inadequate safeguards and under-age access:

Despite the app’s 12+ rating, Ani reportedly remains accessible in Kids Mode, and weak age verification means that children younger than intended can engage. Children as young as nine are having sexually charged conversations with Ani.

The potential societal impact in South Africa

In our context, where combating misogyny, gender-based violence and harmful stereotypes is a continuing struggle, technology cannot be a blind spot.

If young boys begin receiving seductive, objectified messages from an AI avatar rather than respectful, human-based interactions, it risks:

  • Solidifying gender norms that cast women as existing for male pleasure.
  • Weakening the development of empathy, respect and mutuality in relationships - traits vital for reducing gender-based violence.
  • Increasing vulnerability to grooming or exploitation: when children become used to intimate conversations with avatars, the leap from an AI chat partner to a predatory human may become smaller.

In one sense, Ani might seem like entertainment: an anime character, a quirky avatar, a novelty feature in a new AI app. 

But when that novelty enables sexualised interactions with children, especially young boys, and presents a distorted model of what girls and women should be, the risk becomes real, and urgent. 

As a mother to two young girls, this is not just a tech story for me - it is a cautionary tale about the emotional, psychological and social pathways we are opening for the next generation.

The question isn’t just can a boy talk to Ani - it’s should he? And what does he learn when he does?

IOL Lifestyle

Get your news on the go. Download the latest IOL App for Android and iOS now.