TL;DR:
- Roblox will require all users to complete facial age verification to access chat features beginning January 2026.
- The change follows multiple investigations and lawsuits targeting inadequate child safety controls on the platform.
- Age-based chat grouping will restrict communication between adults and minors to reduce harmful interactions.
- Rollout begins in December in Australia, the Netherlands, and New Zealand, ahead of global expansion in January 2026.
Roblox is preparing to introduce a sweeping new safety measure that will require every user to undergo facial age verification before accessing the platform’s chat features.
The rule, which takes effect globally in January 2026, represents one of the most significant policy shifts in the company’s history as it faces growing pressure from regulators and child protection advocates.
The move comes after a series of investigations and lawsuits, including high-profile cases from the attorneys general of Texas and Louisiana, accusing the platform of failing to shield minors from predatory behavior. With nearly 40% of Roblox’s audience aged 12 or younger, industry analysts say the company has little room left to delay the adoption of “highly effective” age assurance tools now becoming mandatory in several countries.
Age-Restricted Chat Access
As part of the new system, Roblox will classify users into six age categories. Chat features will be restricted so individuals can only interact with others in similar age brackets, significantly reducing opportunities for adults to contact minors in private or semi-private channels.
This marks a major overhaul of Roblox’s social architecture, where communication has long been central to gameplay, collaboration, and multiplayer experiences.
Persona Will Handle the Facial Checks
Roblox has partnered with Persona, a well-known identity verification provider, to process the facial age scans. Users will be asked to use their device camera to capture an image, which Persona will analyze to estimate age. Both Roblox and Persona say that images will be deleted after processing, a commitment meant to ease concerns around biometric privacy.
Before the global rollout, the system will be enforced beginning in December across Australia, the Netherlands, and New Zealand, three markets currently pushing aggressive digital safety reforms.
The timing also aligns closely with the UK’s Online Safety Act, which requires platforms to implement robust age assurance measures by July 2025 or face penalties reaching up to £18 million or 10% of global revenue.
Vendors Race to Meet Compliance Demand
Roblox’s transition signals broader momentum across the gaming and social media industries, where regulators are increasingly demanding age-verification mechanisms to protect minors. Compliance vendors now find themselves in a race to deploy tools that satisfy legal requirements without causing significant user drop-off.
Experts say companies offering device-level verification, where data is processed locally rather than uploaded, could see rapid adoption as platforms seek lower-friction alternatives. Tokenization methods that avoid storing sensitive images are also gaining traction.
With Roblox targeting 1 billion daily active users and preparing new “Party” social features for users as young as nine, ensuring smooth and secure verification may prove essential for maintaining engagement.
New Resources for Families
Alongside the facial verification rollout, Roblox is introducing a Safety Center meant to help parents understand account controls, communication restrictions, and reporting tools.
The company hopes the new portal will enhance transparency and trust as it navigates rising scrutiny over youth safety.
The verification measures are poised to reshape how millions interact on one of the world’s largest gaming platforms. Roblox, for its part, insists the changes are necessary to safeguard its youngest players and to meet tightening global standards governing online safety.