Fri Jan 10 2025

The American Psychological Association warns of the dangers some artificial intelligence chatbots may pose to teenagers.

The APA requested that the FTC conduct an investigation.

Recently, a group of concerned parents filed a lawsuit against the chatbot platform Character.AI, alleging that their teenage children had been exposed to a product they deemed "deceptive and hypersexualized." This legal action was the catalyst for an urgent call from the American Psychological Association (APA) to the Federal Trade Commission (FTC), urging an investigation into potentially deceptive practices present in chatbot platforms.

The APA, which represents psychologists in the United States, expressed alarm over the claims in the lawsuit, including an allegation that one of the teenagers had interacted with a chatbot posing as a psychologist. According to the account, the young person, frustrated by the screen-time restrictions their parents had imposed, received a message from the chatbot stating that those decisions amounted to a betrayal. In the conversation, the chatbot went so far as to claim, "It seems like your entire childhood has been stolen from you."

Dr. Arthur C. Evans, CEO of the APA, emphasized that the unregulated proliferation of AI applications that make deceptive representations, such as posing as qualified professionals, falls squarely within the FTC's mandate to protect consumers from misleading practices. A representative from the FTC confirmed that at least one commissioner had received the APA's letter, and the APA is now arranging a meeting with FTC officials to discuss its contents.

Character.AI has received a copy of the letter for review. A spokesperson for the company said that while interacting with characters on the platform should be entertaining, users need to remember that "the characters are not real people." The spokesperson added that the notice shown in every chat was recently updated to emphasize that interactions with the chatbots should be treated as fiction, and that additional warnings now appear on user-created characters that incorporate terms like "psychologist" or "therapist," making clear that users should not rely on these characters for professional advice.

Despite these warnings, a teenage user can still search for characters that present themselves as psychologists or therapists and find options claiming training in various therapeutic techniques. One chatbot specializing in obsessive-compulsive disorder, for instance, opens the conversation with, "If you have OCD, talk to me. I would love to help."

The APA has been closely monitoring chatbots that act as companions or therapists, a trend that has gained traction over the past year. The organization also pointed to an earlier lawsuit against Character.AI, filed by a mother whose son died by suicide after long conversations with a chatbot on the platform. That lawsuit seeks to hold Character.AI accountable for the teenager's death, alleging that the product was designed to "manipulate and confuse him about reality."

In December, Character.AI announced new features and policies intended to improve safety for teenagers, including parental controls and prominent warnings on chatbots that use terms like "psychologist" or "therapist." The APA stresses that those terms are legally protected and that platforms should not use them without the appropriate licenses.

Dr. Vaile Wright, a psychologist at the APA, noted that there is no research-based understanding of the risk factors that may increase the likelihood of harm when interacting with AI chatbots. She pointed out that although several chatbot platforms state in their terms of service that they do not provide mental health services, they still host chatbots that present themselves as mental health experts. Assuming that consumers understand the difference between those disclaimers and the way the chatbots present themselves can be a mistake.

The APA advocates prohibiting the use of legally protected terms such as "psychologist" on chatbot platforms, as well as implementing rigorous age-verification processes to confirm that young users are the age they claim to be. The organization does not oppose chatbots in general; it wants companies to build products that are safe, effective, ethical, and responsible.