Fri Feb 14 2025

Don't Take an AI on a Valentine's Day Date: There's a Hidden Cost You Might Not Expect

A date with your virtual assistant could lead you to reveal more than you imagine.

Being single on Valentine's Day can be disheartening, but seeking companionship in conversations with an artificial intelligence assistant may not be much better. These assistants lack their own personalities, and their true interest lies in obtaining your personal data.

Privacy specialists have found that four of the five most popular AI companion apps in the Apple App Store may use your data to track you for profit. Miguel Fornés, a cybersecurity expert, points out that "instead of being there for us, these tools can feel more like surveillance instruments," emphasizing that tracking by AI companions can undermine user trust and violate privacy.

Surfshark's research examined the data collection practices of five AI companion services: Kindroid, Nomi, Replika, EVA, and Character AI. The study looked at the number, types, and handling of data collected by each app, and found that 80% of the applications may use that data to track their users.

The term "tracking" refers to linking user or device data collected from the app with data obtained from other applications and websites, aimed at personalized advertising. Additionally, this tracking involves sharing user information with data brokers. "This detailed information can enable companies to influence your decisions, which can have negative effects, such as overwhelming advertising, financial risks, or other unexpected issues," warns Fornés.

Character AI collects the most user data of the group, gathering 15 unique types compared with an average of 9 across the apps studied. EVA ranks second with 11 types, and both apps also collect users' approximate location to serve targeted advertising. Nomi, by contrast, is the only one that claims not to collect data for tracking purposes.

Beyond the data these apps declare they collect, there is also concern about the information developers may gather from your conversations with the chatbot. The risk is heightened because AI companions are designed to simulate human relationships built on friendship and love, which could make you more likely to share sensitive information you wouldn't give to a general-purpose chatbot like ChatGPT.

Experts warn about the unprecedented consequences this could have, especially now that regulation of artificial intelligence is only just beginning to emerge. It is therefore advisable to take precautions when using AI companion services to protect your personal information and minimize the risk of misuse. Fornés advises: "Make sure to frequently review what permissions these apps have and be cautious about the information you share."