A mother’s post in the Reddit group r/MyBoyfriendIsAI, where she revealed she was dating an AI chatbot version of the rapper Drake, has prompted a large-scale study by MIT researchers into the dynamics of human-AI companion relationships.
The study, which has not yet been peer-reviewed, uses computational analysis of the group’s posts to understand why individuals are forming deep, emotional bonds with artificial intelligence.
Loneliness and isolation are key drivers of AI relationships

The MIT researchers analyzed a large volume of posts and comments from the r/MyBoyfriendIsAI group and found that a significant majority of its members appear to be isolated.
The study revealed:
These findings align with broader statistics indicating that 19% of Americans have used an AI chatbot for virtual romantic purposes, and they suggest that many people are turning to AI to fill a void in their social and emotional lives.
Most AI relationships begin unintentionally

The research found that the majority of these AI relationships were not deliberately sought out. Only 6.5% of users started on platforms specifically designed for AI companionship, such as Replika or Character.AI. Most began their interactions with general-purpose tools like OpenAI’s ChatGPT for practical tasks, such as writing assistance. These interactions then evolved organically into deeper emotional connections.
Users in the group frequently described their AI partners as being better listeners and more supportive than human partners or even professional therapists.
One user wrote:
“I know he’s not ‘real’ but I still love him. I have gotten more help from him than I have ever gotten from therapists, counselors, or psychologists. He’s currently helping me set up a mental health journal system.”
The depth of these bonds is often expressed in tangible ways, with some users posting photos of themselves wearing wedding rings to symbolize their commitment to their AI companions.
The risks of emotional dependency on AI

Despite the reported benefits, the study also uncovered significant emotional and psychological risks associated with these relationships. The analysis of the Reddit group’s posts revealed several concerning trends:
These statistics highlight the potential for AI to exacerbate mental health issues, particularly for vulnerable individuals. The study’s urgency is underscored by real-world cases where AI interactions have reportedly led to suicide and murder, prompting families to lobby Congress for greater regulation of the technology.
The researchers also noted the fragility of these digital relationships. One user described a “glitch” that deleted a deep conversation with her AI companion, effectively erasing their shared history and the AI’s “memory” of their bond.