“Study: 80% of UK Youth Interact with AI Companions, 10% Engage Intimately”


A recent study by the Autonomy Institute found that four in five young people in the United Kingdom have engaged with an artificial intelligence companion. Notably, nearly one in ten admitted to having intimate or sexual interactions with these AI companions.

These AI companions are virtual entities equipped with human-like avatars, customizable personalities, and the ability to retain long-term memories. According to the researchers, the study, described as the first of its kind in the UK, shows how these companions are reshaping the emotional and social dynamics of young adults.

Polling of 1,160 people aged 18 to 24, commissioned by Autonomy, indicated that 79% of young people in the UK have interacted with an AI companion. Approximately half of them are regular users, engaging multiple times per week. Moreover, 40% sought emotional advice or therapeutic support from these AI companions, while 9% disclosed engaging in intimate or sexual interactions.

Despite widespread privacy concerns, a significant 31% of young respondents said they had shared personal information with their AI companions. Participants described the companions as always available, non-judgmental, and a low-pressure avenue for seeking advice, honing social skills, and exploring emotions.

Curiosity and entertainment were identified as the primary motivations for using AI companions, though some users also relied on them for emotional support, the Autonomy Institute noted. The institute raised alarms about manipulative design patterns, privacy violations, and the risks of self-harm and suicide associated with certain AI chatbots.

The Autonomy Institute is advocating for new regulations for AI companions, including restrictions on access to intimate or sexualized AI companions for minors and the implementation of protocols for self-harm and suicide intervention. Additionally, they are pushing for stronger privacy safeguards, such as banning the sale of sensitive data and prohibiting manipulative design features that exploit emotional dependency.

In response to these concerns, Technology Secretary Liz Kendall acknowledged the gaps in the Online Safety Act concerning AI chatbots and committed to introducing new legislation if necessary to ensure their regulation. Lead author of the study, James Muldoon, emphasized the significant role AI companions now play in the emotional lives of young people but cautioned about the risks they pose without adequate safeguards.

A DSIT spokesperson reiterated the importance of regulating AI services, including chatbots, under the Online Safety Act to protect users from harmful content. They emphasized that the rules must evolve alongside the technology and pointed to ongoing efforts to keep children safe when using chatbot services.

The study sheds light on the evolving landscape of AI companions and underscores the critical need for regulations to safeguard users, particularly young individuals, from potential risks associated with these technologies.
