A study by the Autonomy Institute has found that four in five young people in the United Kingdom have used an artificial intelligence companion, and nearly one in ten reported having intimate or sexual interactions with one.
AI companions are virtual entities with human-like avatars, customisable personalities and the ability to retain long-term memories. The study, which the researchers describe as the first of its kind in the UK, examines how these companions are reshaping the emotional and social lives of young adults.
Polling of 1,160 people aged 18 to 24, commissioned by Autonomy, found that 79% of young people in the UK had interacted with an AI companion. About half were regular users, engaging several times a week; 40% had sought emotional advice or therapeutic support from an AI companion, and 9% reported intimate or sexual interactions.
Despite widespread privacy concerns, 31% of respondents had shared personal information with their AI companions. Participants described the companions as always available, non-judgmental and a low-pressure way to seek advice, practise social skills and explore emotions.
Curiosity and entertainment were the main motivations for using AI companions, though some respondents relied on them for emotional support. The Autonomy Institute warned of manipulative design patterns, privacy violations, and the risks of self-harm and suicide associated with some AI chatbots.
The institute is calling for new regulation of AI companions, including restrictions on minors' access to intimate or sexualised companions and mandatory protocols for self-harm and suicide intervention. It also wants stronger privacy safeguards, such as a ban on the sale of sensitive data and a prohibition on manipulative design features that exploit emotional dependency.
In response, Technology Secretary Liz Kendall acknowledged gaps in the Online Safety Act's coverage of AI chatbots and committed to introducing new legislation if necessary to regulate them. The study's lead author, James Muldoon, said AI companions now play a significant role in the emotional lives of young people, but cautioned about the risks they pose without adequate safeguards.
A DSIT spokesman said AI services, including chatbots, are regulated under the Online Safety Act to protect users from harmful content, adding that the rules must evolve alongside the technology and pointing to ongoing work to keep children safe on chatbot services.
The findings highlight how quickly AI companions have entered young people's lives, and underscore the need for regulation to protect users, particularly young people, from the risks these technologies pose.
