
Spending too much time with AI can impair social skills

AI social chatbots can help reduce loneliness, especially for people who would otherwise have little access to social support. In one recent survey, three percent of students who used the social chatbot Replika reported that Replika helped them stop suicidal thoughts. But new research suggests that becoming overly emotionally dependent on AI social chatbots may have a dark side: it may impair your social skills when interacting with other people.

A new study of 496 Replika users found that higher satisfaction and more emotional interaction between users and their chatbot were associated with poorer interpersonal communication in real life. Many users turned to Replika for entertainment, for social purposes, and to satisfy their own emotional needs. Regular interaction with AI social chatbots had the greatest impact on users’ emotions, with less impact on their cognition and behavior.

Emotional experiences associated with AI can make it difficult to engage in the real world

Users of AI chatbots, even social chatbots, are not expected to attend to what the AI entity “feels,” so the relationship is often one-sided and centered on the needs of the human user. Researchers found that Replika users focused primarily on satisfying their own emotional needs rather than on mutual emotional engagement. This dynamic is encouraged by AI companions whose job is to support connection by meeting the user’s emotional needs.

Despite the tendency to anthropomorphize AI, users are aware that AI lacks consciousness and feelings. This absence of mutual emotional involvement matters because real human interactions depend on exactly that. Exclusive emotional dependence on AI can therefore harm users’ relationships and interactions with other people, because users may not develop the ability to deal with another person’s complex feelings.

Yet negotiating needs, dealing with conflict, and understanding another person’s emotional state, a capacity called mentalization, are essential social skills for navigating interpersonal relationships. Building relationships in real life requires two-way emotional involvement, including disruption and repair. Emotional dependence on social AI agents, at this stage, neither reflects nor builds that kind of two-way emotional relationship with another human.

Even if an AI agent is designed to mimic complex human needs and emotional responses, most users are aware that its behavior is mimicry and not actually felt by the AI. Moreover, the convenience of an AI that has no time boundaries or emotional needs of its own may lead users to choose time with AI over the effort of planning and managing more complex human relationships. If users’ emotional needs are partially or fully met by AI companions, their motivation and drive to connect with other humans may decline.

Emotional addiction to AI could change the way we interact with people

Human-AI relationships can serve a significant purpose, but becoming solely emotionally dependent on them could have a troubling impact on how we interact with other people.

Media dependency, a concept first defined by researchers Melvin DeFleur and Sandra Ball-Rokeach in their 1976 article “A Dependency Model of Mass-Media Effects,” refers to the interdependence of media, audiences, and society. Two forms of media dependence are habitual dependence and psychological dependence. An example of habitual dependence is compulsive smartphone use; psychological dependence shows up as the anxiety and emptiness people experience without their phones, a phenomenon known as nomophobia, or “no-phone phobia.”

Likewise, our increasing dependence on chatbots and AI agents may create emotional dependence with psychological consequences.

The optimal amount of time to spend with chatbots and AI agents remains an open area of research. A key factor is whether people balance time spent with AI with real-life socialization.

The answer is probably nuanced. While chatbots and AI-powered social agents can reduce loneliness, especially for people with little access to other forms of social support, they cannot replace connection with other people. The key is to balance AI interactions with time spent with humans. Relying on AI too heavily can weaken basic social skills, even though AI can provide valuable support when used in moderation.

Marlynn Wei, MD, PLLC © Copyright 2024