Tue Dec 03 2024

"Why the Name 'David Mayer' Causes Failures in ChatGPT? Possible Digital Privacy Issues Could Be the Cause."


Users of the conversational artificial intelligence platform ChatGPT made a curious discovery last weekend: the popular chatbot refuses to answer questions about a "David Mayer." When asked for information about this name, the service simply stops responding. This has fueled a series of conspiracy theories, although there may be a more mundane explanation behind the strange behavior.

Over the weekend, word spread that the name acted as a "poison" for the chatbot, prompting many users to try to trick the system into merely acknowledging it. Every attempt to make ChatGPT produce that specific name ended in an error or an abrupt interruption; when it responds at all, it says something like "I cannot provide an answer."

This phenomenon, which began as a simple curiosity, soon revealed that "David Mayer" was not the only name causing the problem; other names, such as Brian Hood, Jonathan Turley, Jonathan Zittrain, David Faber, and Guido Scorza, also triggered service failures. The question, then, is who these individuals are and why they might appear on such a list.

OpenAI has not responded to inquiries about the matter, leaving users to piece together the clues themselves. Any of these names could belong to ordinary people, but users noticed that each one matches a public or semi-public figure who might have reason to keep certain information under wraps, out of reach of search engines and artificial intelligence models.

For instance, Brian Hood, an Australian mayor, threatened to sue OpenAI last year after ChatGPT falsely linked him to a crime that he had in fact reported. Although his lawyers contacted OpenAI, no lawsuit was ultimately filed. Hood later said that the offending material had been removed and that a new version of the chatbot had been released.

As for the other names: David Faber is a CNBC reporter; Jonathan Turley is a lawyer and Fox News commentator who was the victim of "swatting" (a false 911 call that sent armed police to his home) at the end of 2023; Jonathan Zittrain is a legal expert who has written about the "right to be forgotten"; and Guido Scorza is a member of the Italian Data Protection Authority. What they may have in common is having requested that certain information about them be restricted online.

Returning to "David Mayer," there are no clearly identifiable notable figures with that name, although there was a professor by that name who taught about drama and history, specializing in connections between the Victorian era and film. Mayer passed away in the summer of 2023 but had previously faced legal and online issues by having his name associated with a criminal who used that pseudonym, complicating his travels.

Without an official explanation from OpenAI, the prevailing speculation is that the model has been configured to handle certain names in a special manner, whether for legal, security, or privacy reasons. One of these lists, likely maintained or updated automatically, may have become corrupted with erroneous code or instructions, producing the chatbot's anomalous behavior.
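To make the speculation concrete, a filter like the one described could sit downstream of the model and cut off a streamed reply the moment a blocked name appears. The sketch below is purely illustrative: the blocklist contents, the `filter_stream` helper, and the error message are assumptions, not OpenAI's actual implementation.

```python
# Hypothetical sketch of a hard-coded name filter applied to a streamed
# response. Names and behavior are assumed for illustration only.

BLOCKED_NAMES = {"david mayer", "brian hood", "jonathan turley"}  # assumed list

def filter_stream(tokens):
    """Yield tokens one by one, but abort with an error as soon as the
    accumulated text contains a blocked name (case-insensitive)."""
    emitted = []
    for token in tokens:
        emitted.append(token)
        text = "".join(emitted).lower()
        if any(name in text for name in BLOCKED_NAMES):
            # Mimics the abrupt interruption users reported.
            raise RuntimeError("I cannot provide an answer.")
        yield token

# A harmless reply streams through untouched.
safe = list(filter_stream(["The", " mayor", " spoke", "."]))

# A reply containing a blocked name is cut off mid-stream.
try:
    list(filter_stream(["His", " name", " is", " David", " Mayer", "."]))
    blocked = False
except RuntimeError:
    blocked = True
```

Because the check runs on the accumulated output rather than the user's prompt, even indirect tricks that coax the model toward spelling the name would still trip the filter, which matches the behavior users observed.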

This case serves as a reminder that AI models are not magic: they function as advanced autocomplete systems, monitored and actively adjusted by the companies that develop them. The next time you need information, consider consulting sources directly rather than relying on a chatbot.