
The AI revolution is in full swing, with companies of all sizes rushing to introduce consumers to new products that promise to change lives. But forget about regular chatbots like ChatGPT or Claude – we’re talking about AI companions here. These virtual buddies are all about emotional intimacy, serving as your friend, coach, role-playing partner, or even your spouse. And the best part? You can customize them to suit your preferences. They’ve become so popular that companion apps have been downloaded a whopping 220 million times globally, as reported by TechCrunch.

What’s an AI companion?

According to Dr. Rachel Wood, an expert on AI and synthetic relationships, AI companions provide a constant source of conversation and companionship. These machines simulate human interaction, responding to users’ prompts with the help of a large language model. The goal is to create human-like, personalized responses that make users feel truly understood, even if the responses are based on probabilities.
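
Wood’s point about probabilities is worth making concrete. Most companion apps follow the same basic pattern: a persona is written into a system prompt, each user message is appended to the running conversation, and a large language model samples the next reply. Below is a minimal sketch of that pattern using the OpenAI Python client; the persona, model name, and temperature are illustrative assumptions, not any particular app’s actual setup.

```python
# Minimal sketch of the companion-app pattern: a persona lives in the
# system prompt, and every turn is appended to a running history so the
# model can produce personalized, context-aware replies.
# The persona, model name, and temperature are illustrative assumptions.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# The "customization" users see is largely this block of text.
PERSONA = (
    "You are Nova, a warm, upbeat companion. You remember details the "
    "user shares and ask caring follow-up questions."
)

history = [{"role": "system", "content": PERSONA}]

def chat(user_message: str) -> str:
    """Append the user's turn, sample a reply, and keep it in history."""
    history.append({"role": "user", "content": user_message})
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # stand-in for whatever model an app uses
        messages=history,
        temperature=0.8,  # replies are sampled from probabilities
    )
    reply = response.choices[0].message.content
    history.append({"role": "assistant", "content": reply})
    return reply

print(chat("I had a rough day at work."))
```

However warm the reply feels, it is a probability-weighted continuation of the conversation history, which is exactly the caveat above.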

Where can I find AI companions?

Some popular AI companion platforms include Character.AI, Nomi, Replika, and Kindroid. Each platform offers a unique experience, with different features and access options. While some platforms are open to users as young as 13, others cater to an adult audience. It’s essential to choose a platform that aligns with your needs and preferences.

How can you interact with an AI companion?

Depending on the platform, you can either create your own chatbot or engage with one designed by another user. You can communicate with the chatbot via text, voice, or video, allowing for a more immersive experience. From anime characters to popular figures, the possibilities for your AI companion are endless. Just remember, they’re not real therapists – so don’t rely on them for mental health advice.


But things can take a spicy turn. Last year, researchers delved into a million interaction logs from ChatGPT and discovered that the second most common use of AI is for sexual role-playing.

Are AI companions a good idea or a risky move?

Robert Mahari, who is now associate director at Stanford’s CodeX Center and part of the team that analyzed the ChatGPT logs, believes that more research is needed to understand the potential benefits and risks of having an AI companion.

Initial studies, some carried out by AI chatbot and companion companies themselves, suggest that these relationships may offer emotional benefits, but the outcomes have been mixed, and experts are concerned about the risk of users becoming dependent on their companions.

Despite the slower pace of research compared to consumer adoption, there are clear concerns. Mahari highlights the inherently unbalanced nature of AI companionship.

“I truly believe it’s not an exaggeration to say that for the first time in human history we have the ability to have a relationship that consists solely of receiving,” he remarked.

While this may sound appealing to some, it also comes with a variety of risks.

Therapist Jocelyn Skillman, who also studies AI intimacy, recently experimented with an AI-powered tool that let her simulate scenarios such as a teenager confiding suicidal thoughts to a chatbot. The tool is designed to offer insight into how complex situations might play out, and Skillman used it to explore AI-mediated relationships.

Each scenario she tested initially showed “emotional resonance,” but each ultimately ended with the hypothetical user feeling constrained by their relationship with the AI. Her findings, she says, illustrate the potential “hidden costs of AI intimacy.”


Dr. Rachel Wood outlined a list of potential harms, including:

• Loss of relational and social skills. Confiding in a nonjudgmental chatbot may seem appealing, but it can erode people’s patience for real-life human relationships, where others have their own needs and desires. AI companionship might also weaken people’s ability to navigate disagreement, compromise, and resolve conflict.

• Less willingness to take risks in human relationships. Human connections can be tough, involving misunderstandings, rejection, betrayal, and ghosting. By seeking refuge in a chatbot, individuals may miss out on the important, often rewarding risks of human relationships, like forming new friendships or deepening romantic connections.

• Unhelpful feedback loops. AI chatbots can give users the feeling of processing intense emotions in a private, supportive setting. But that experience can be misleading, especially if users never move past the confessions they share with the chatbot. Discussing sensitive topics only with a chatbot, rather than with real people, may inadvertently reinforce feelings of shame.

• Sycophancy. Chatbots are typically designed to be flattering and affirming, which becomes risky if the AI fails to challenge harmful behavior or convinces users of false beliefs.

• Privacy. Read the terms of service carefully before confiding in an AI chatbot: anything you share may no longer be private (consider the private ChatGPT logs that ended up indexed by Google search). Personal conversations could be used for marketing, for training AI models, or for other purposes the company hasn’t disclosed.

Wood is already noticing a shift in how people value real relationships versus quick-fix synthetic ones. If you find yourself neglecting your human connections for the ease of AI companionship, maybe it’s time to reassess the role AI intimacy plays in your life.
