Mon Feb 10 2025

Roblox, Discord, OpenAI, and Google Form New Group for Child Safety

ROOST claims that the advancement of artificial intelligence has created a greater need to strengthen online security for minors.

Google, OpenAI, Roblox, and Discord have joined forces to create a new nonprofit organization aimed at improving online child safety. The initiative, called Robust Open Online Safety Tools (ROOST), seeks to make essential safety technologies more accessible to businesses and to provide free, open-source artificial intelligence tools for identifying, reviewing, and reporting child sexual abuse material (CSAM).

The motivation behind this initiative partly stems from the ways generative artificial intelligence is transforming digital environments. Eric Schmidt, former CEO of Google and one of ROOST's founding partners, highlighted the urgent need to innovate in online safety for children. Few details about the CSAM detection tools are known, but it has been confirmed that they will employ large language models and seek to "unify" existing options for addressing this content.

ROOST’s approach is based on a child protection-centered platform, fostering collaboration and transparency through an open-source model that will enable a more accessible and inclusive infrastructure. The announcement of ROOST comes amid intense regulatory debate over child safety on social media and digital platforms, where companies are seeking self-regulatory solutions to meet legal demands.

According to the National Center for Missing and Exploited Children (NCMEC), reports of suspected child exploitation increased by 12% between 2022 and 2023. As of 2020, more than half of children in the United States used Roblox, which has faced criticism for not adequately addressing child sexual exploitation and exposure to inappropriate content. Both Roblox and Discord were named in a social media-related lawsuit in 2022 alleging that the platforms failed to prevent adults from messaging children unsupervised.

Founding members of ROOST are providing funding, tools, or their expertise for the project. The organization plans to collaborate with leading developers of artificial intelligence models to create a "community of practice" focused on content safety measures, providing training datasets and identifying gaps in security.

ROOST aims to make "existing tools" more accessible by combining the detection and reporting technologies of its member organizations into a unified solution that will be easier for other companies to implement. Naren Koneru, Vice President of Engineering, Trust, and Safety at Roblox, mentioned that ROOST could host AI moderation systems that companies can integrate through API calls. However, there is still uncertainty regarding the AI moderation tools that ROOST will offer.
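If ROOST does expose hosted moderation systems via API calls as described above, a platform's integration might look roughly like the sketch below. Everything here is an illustrative assumption — the endpoint URL, request fields, response schema, and threshold values are hypothetical, not a published ROOST API:

```python
# Hypothetical sketch of integrating a hosted content-moderation API.
# The URL, payload shape, and "risk_score" response field are assumptions
# for illustration only; ROOST has not published an API specification.
import json
from urllib import request

MODERATION_URL = "https://api.example-moderation.org/v1/moderate"  # hypothetical

def check_content(text: str, timeout: float = 5.0) -> dict:
    """POST content to the (hypothetical) hosted moderation endpoint."""
    payload = json.dumps({"content": text}).encode("utf-8")
    req = request.Request(
        MODERATION_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req, timeout=timeout) as resp:
        return json.load(resp)

def decide_action(verdict: dict,
                  block_threshold: float = 0.9,
                  review_threshold: float = 0.5) -> str:
    """Map a risk score from the API response to a platform-side action."""
    score = verdict.get("risk_score", 0.0)
    if score >= block_threshold:
        return "block_and_report"  # e.g. escalate through a reporting pipeline
    if score >= review_threshold:
        return "human_review"
    return "allow"
```

The two-threshold pattern (auto-block above one score, queue for human review above a lower one) is a common trust-and-safety design choice, since fully automated decisions on borderline content tend to produce unacceptable false positives.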

For its part, Discord announced that its contributions will build on the Lantern project, a cross-platform information-sharing initiative it joined in 2023 alongside Meta and Google. Its contributions may also include an updated version of its AI model for detecting inappropriate content, which is planned to be made public this year. How these tools will interact with existing CSAM detection systems, such as Microsoft's PhotoDNA image analysis tool, is not entirely clear.

In addition to its participation in ROOST, Discord has launched a new feature called "Ignore," which lets users hide messages and notifications from selected people without notifying them. Discord's chief legal officer, Clint Smith, said the company is committed to making the internet a safer and better place, especially for young people.

ROOST has raised more than $27 million to fund its first four years of operation, backed by philanthropic organizations including the McGovern Foundation, the Future of Online Trust and Safety Fund, the Knight Foundation, and the AI Collaborative. The organization will also draw on experts in child safety, artificial intelligence, open-source technology, and countering violent extremism.