Your A.I. “Soulmate” Is In Bed With Big Data

Swipe left on this predatory technology

Stephen Moore

--

A.I. romance chatbots

Right now, humanity is experiencing something of a loneliness epidemic. Driven by social media's slow erosion of IRL connections, the pandemic and the isolating measures used to combat it, and rising rates of depression and anxiety, loneliness has reached levels the WHO has called a “global concern.” It is so prevalent in the U.S. that it’s been declared a public health crisis. In the U.K., things got bad enough that, five years back, the government appointed its first-ever Minister of Loneliness (that’s not a joke), which resulted in the world’s first loneliness strategy. (It has achieved very little to date.)

With people feeling so disconnected from society — and from each other — many are looking elsewhere for companionship and love.

Technology has a checkered past in this department. Social media companies have talked big about “bringing the world closer together,” but have instead made us more inward-looking and less connected to actual human beings. Dating apps have always claimed to help you find “the one,” while their real incentive is to keep you swiping left and right for as long as possible. And now, A.I. is stepping in with romantic chatbots that claim to be a “self-help program,” “a provider of software and content developed to improve your mood and wellbeing,” and “here to maintain your mental health.”

Screenshots from EVA AI Chat Bot & Soulmate

However, research from Mozilla’s *Privacy Not Included team, which examined the privacy practices of 11 popular romantic A.I. chatbots, shows these chatbots are nothing more than bad actors that don’t have your privacy at heart. They are predatory — promising to be your soulmate, to help you in times of crisis, or to lend an ear to your secrets and most private vulnerabilities, while instead doing everything they can to keep you hooked on the app.

All while cheating on you with, surprise, surprise, Big Data.

As Mozilla researcher Misha Rykov puts it,

“To be perfectly blunt, A.I. girlfriends are not your friends. Although they are marketed as…
