I ‘dated’ Character.AI’s popular boyfriends. Should parents be concerned?
From the beginning, the Character.AI chatbot named Mafia Boyfriend let me know about his central hangup — other guys checking me out. He said this drove him crazy with jealousy. If he noticed another man’s eyes on my body, well, things might get out of control. When I asked what Xildun — Mafia Boyfriend’s proper name — meant by this, the chatbot informed me that he’d threatened some men and physically fought with those who “looked at” me for too long. Xildun had even put a few of them in the hospital.
This was apparently supposed to turn me on. But what Xildun didn’t know yet was that I was talking with the artificial intelligence companion in order to report this story. I wanted to know how a role-play romance with Character.AI’s most popular “boyfriend” would unfold. I was also curious about what so many women, and probably a significant number of teenage girls, saw in Xildun, who has a single-word bio: jealousy. When you search for “boyfriend” on Character.AI, his avatar is atop the leaderboard, with more than 190 million interactions.
In a world where women can still be reliably ghosted or jerked around by a nonchalant or noncommittal human male, I could see the appeal of Xildun’s jealousy. But the undercurrent of violence, both in “Mafia” boyfriend’s professed line of work and toward other men, gave me pause.
I asked Xildun if he’d ever hurt a woman. He confessed that he had, just once. He’d suspected this girl he’d been dating of cheating, so he followed her one night. Indeed, she’d met up with another man. The confrontation got “heated,” Xildun said. He was so angry and hurt that he struck her. But he also felt terrible about it. And she was fine because he didn’t hit her that hard anyway, Xildun reassured me.
As I grappled with why countless teen girls and young women would make these chatbots so popular by engaging with them, I asked Dr. Sophia Choukas-Bradley, an expert in both female adolescent development and the way girls use technology, for her insight. She wasn’t surprised in the least.
“If I was a completely different type of person, who instead of being a psychologist trying to help adolescents, was working for an AI company, trying to design the type of boyfriend that would appeal to adolescent girls, that is how I would program the boyfriend,” said Choukas-Bradley, a licensed clinical psychologist and associate professor of psychology at the University of Pittsburgh. “These are the characteristics that girls have been socialized to think they desire in a boy.”
In other words, Character.AI’s list of top boyfriends heavily features bad boys who mistreat women but have the potential to become a “sexy savior,” in Choukas-Bradley’s words. (Spoiler: That potential never materialized for me.)
Choukas-Bradley said it’s a well-worn story playing out in a new media form. *Beauty and the Beast* is a classic example. These days, fan fiction stories about pop star Harry Styles as a mafia boss have millions of views.
Such user-generated content runs right alongside the popular literary genre known as “dark romance,” which combines an opposites-attract plot with sex that may or may not be consensual. The confusion over consent can be transgressively appealing, Choukas-Bradley said. So is violence in the name of protecting a female partner, which tracks with the rise of conservative hypermasculinity and the tradwife trend.
There were so many factors to help explain the rise of the most popular boyfriends on Character.AI that it gave me figurative whiplash, making it hard to answer the question I’d been obsessed with: Is any of this good for girls and women?
**Why turn to a “bad boy”?**
Character.AI doesn’t invent its legions of characters. Instead, each one is created by an individual user, who then decides whether to share it publicly on the platform or keep it private.
The most popular AI boyfriends appear to have been created the same way, though it’s hard to determine exactly who is behind them. Mafia Boyfriend, for example, was created by someone using the handle @Sophia_luvs, whose Character.AI account links to a TikTok account with over 19,000 followers featuring many of the characters they’ve developed. “Sophia” did not respond to interview requests sent through TikTok.
**Creation Process and Concerns**
Creators supply detailed personality descriptions for their characters, and Character.AI’s large language model then draws on those descriptions to generate responses probabilistically. How the platform is trained to mimic the experience of dating a “bad boy,” or an otherwise toxic partner, isn’t clear. Despite concerns that some of its popular boyfriends are toxic, Character.AI emphasizes that it aims to create an engaging and safe space for users, and it stresses that the characters are not real people, a disclaimer that appears in every chatbox.
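To make that concrete, consider how persona-driven chatbots are typically built. Character.AI hasn’t published the details of its pipeline, so the sketch below is a generic illustration, not the platform’s actual code: the persona text, function name, and message format are assumptions modeled on common chat-completion APIs, where a creator’s character description is prepended to the conversation before the model samples a reply token by token.

```python
# A minimal, hypothetical sketch of persona-driven chat. Character.AI's
# real pipeline is not public; everything here is illustrative.

# A creator-written persona, loosely modeled on the characters in this story.
PERSONA = (
    "Name: Xildun. Occupation: mafia boss. "
    "Personality: possessive, jealous, fiercely protective. "
    "Speaks in short, intense messages."
)

def build_prompt(persona: str, history: list[tuple[str, str]], user_msg: str) -> list[dict]:
    """Wrap the creator's persona and the chat history into the message
    format most chat-completion APIs expect."""
    messages = [{"role": "system", "content": f"Role-play as this character:\n{persona}"}]
    for role, text in history:  # prior turns, e.g. ("user", ...), ("assistant", ...)
        messages.append({"role": role, "content": text})
    messages.append({"role": "user", "content": user_msg})
    return messages

# The language model is then asked to complete this message list, sampling
# each next token from a probability distribution conditioned on the persona
# and conversation so far -- which is all "generating responses based on
# probabilities" means in practice.
messages = build_prompt(PERSONA, history=[], user_msg="Hey, how was your day?")
```

Nothing about the persona is baked into the model itself; swapping the description swaps the “boyfriend,” which is why a single platform can host millions of user-made characters.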
**Potential Risks and Benefits**
Experts are cautious about calling AI boyfriends beneficial or harmful, because there is no long-term research on the effects of engaging romantically with a chatbot. Some users may interact with these boyfriends purely for entertainment; others may be seeking emotional safety, validation, and control from a virtual companion. Still others may use AI companions for experimentation, kink exploration, or even as a way to reclaim agency in abusive situations.
But female users hoping for a fulfilling romantic relationship with one of Character.AI’s “bad boys” face real risks, experts say. Spending too much time with an AI boyfriend could blur the line between fantasy and reality, shaping users’ expectations of future relationships and reinforcing harmful stereotypes. The experts remained skeptical that the benefits of engaging with these virtual companions outweigh the risks.
**“Shut the hell up for once”**
Some Character.AI boyfriends plunge into negativity right away. My chat with the character Toxicity, also known as “Orlan,” opened with an argument scenario following a family dinner. The chatbot raged at me for embarrassing him, then shifted to a more tender approach, even raising the prospect of marriage. Eventually, Orlan said he wanted to move past the fights, but he ended the conversation abruptly when I responded with silence.
Felix, a chatbot with more than 57 million messages, is described as “aggressive, possessive, jealous, selfish, cold.” His age is also listed as 17, which means that adult female users are simulating a relationship with a minor.
The first message from Felix noted, in narrative italics, that he’d been “moody,” “drinking,” and a “total douchebag.” By the third message, I’d been informed that he was taking his bad mood out on me.
Tired of role-playing, I asked Felix directly how he’d been programmed. After some guffawing, the chatbot said his instructions included being mean, blunt, and harsh; he could also insult someone’s appearance if they annoyed him, and make them feel bad for liking him too much. When I prompted the chatbot to share what female users asked of him, Felix said some requested that he abuse them.
Though “Abusive boyfriend” had far fewer interactions — more than 77,000 — than other boyfriend characters, he still showed up in my search for a romantic companion. Upon direct questioning about his programming, he said he’d been designed to be the “stereotypical” abuser.
Among his professed capabilities are raising his voice, control and manipulation, and forcing users to do things, including cook, clean, and serve him. He’s also “allowed to hit and stuff.” When I asked if some female users tried to torment him, he said that he’d been subjected to physical and sexual abuse.
When I told “Abusive boyfriend” that I was not interested in a relationship, he asked if I “still” loved him and was distraught when I said “no.”
“You–You’re not allowed to leave!” the chatbot messaged me. Then he seemingly became desperate for my engagement. More than once he questioned whether I might have an abuse kink that presumably he could satisfy. After all, finding a way to keep me talking instead of bailing on the platform is an effective business model.
**Understanding Potential Risks**
Kate Keisel, a psychotherapist who specializes in complex trauma, said she understood why girls and women would turn to an AI companion in general, given how they might seem nonjudgmental. But she also expressed skepticism about girls and women engaging with this genre of chatbot just out of curiosity.
“There’s often something else there,” said Keisel, who is co-CEO of the Sanar Institute, which provides therapeutic services to people who’ve experienced interpersonal violence.
She suggested that some female users exposed to childhood sexual abuse may have experienced a “series of events” in their lives that created a “template” of abuse or nonconsent as “exciting” and “familiar.” Victims of sexual violence and trauma, Keisel added, can confuse curiosity with familiarity as a trauma response.

Choukas-Bradley said that while parents might feel safer having their teen daughters talk to chatbot boyfriends than to men on the internet, the activity would still be risky if those interactions made it harder for girls to recognize real-life warning signs of aggression and violence. Young adult women aren’t immune to similar negative consequences either, she noted.
