EU Rushes to Finalize Investigation into X's Content Moderation Efforts.
Meta has decided to mimic the moderation strategy used by X.
The European Union is in the advanced stages of determining whether X's efforts to moderate illegal or harmful content comply with the bloc's digital services rules. The inquiry into X's risk management and content moderation will conclude "as soon as legally possible," according to a statement addressed to European lawmakers. The news comes a day after Meta announced significant changes to its own moderation practices, inspired by X's approach.
EU Justice Commissioner Michael McGrath and EU tech policy chief Henna Virkkunen committed to actively advancing the investigation. Their pledge came in response to a complaint from center-right German lawmakers who raised concerns about Elon Musk's promotion of far-right party leaders on the platform.
The investigation was opened under the Digital Services Act (DSA) in December 2023, focusing on the "dissemination of illegal content in the context of Hamas's terrorist attacks against Israel" and the effectiveness of X's Community Notes system in countering information manipulation. Preliminary findings also suggest that X may be in violation of the DSA with respect to advertising transparency, deceptive practices, and its "blue check" user verification system. Companies found in violation of the DSA can face fines of up to six percent of their annual global revenue.
Since Elon Musk acquired Twitter, now known as X, in 2022, he has overhauled its verification system, turning it into a paid subscription, and cut back the trust and safety team in favor of community-driven moderation. The EU has expressed concern about the growing level of misinformation on the platform since those changes. Musk has also been tapped for a role in Donald Trump's incoming administration, and Trump has threatened to take action over what he considers bias in content moderation.
Major U.S. tech companies are already responding to complaints from Republicans about online speech standards. Yesterday, Meta announced that, inspired by X, it would eliminate third-party fact-checkers in favor of its own Community Notes program, and would lift restrictions on topics such as immigration and gender identity. While these initial changes will roll out in the U.S., both Meta and X operate globally, so scrutiny from EU regulators remains in force.