In a significant move to bolster online child safety, tech giants Google, OpenAI, Roblox, and Discord have collaborated to establish the non-profit organization Robust Open Online Safety Tools (ROOST). The initiative aims to make core safety technologies more accessible by providing free, open-source AI tools for identifying, reviewing, and reporting child sexual abuse material (CSAM).
Core Details:
- Founded by: Google, Roblox, Discord, and OpenAI
- Funding: $27 million from the McGovern Foundation, the Knight Foundation, and others
- Announced: AI Action Summit 2025 in Paris
Primary Objectives:
- Develop open-source AI tools for:
  - Identifying CSAM
  - Reviewing suspicious content
  - Reporting harmful material
Key Features:
- Uses large language models (LLMs)
- Makes safety infrastructure transparent
- Provides free access to protection tools
- Unifies existing CSAM detection methods
Significance:
- Addresses growing challenges of AI-generated harmful content
- Creates collaborative approach to child safety
- Makes essential safety tools globally accessible
- Focuses on transparency and accountability
As Eric Schmidt, former Google CEO, put it, the goal is to “accelerate innovation in online child safety.”
UK Leads with New AI Laws:
- First country to criminalize AI-generated sexual abuse images
- Creates specific AI sexual abuse offenses
- Sets global precedent for child protection laws
The initiative represents a unified effort by tech leaders to create a safer digital environment for children through accessible, open-source safety technologies.
Reference: The Verge