The Dutch consumer watchdog just opened a formal investigation into Roblox, raising fresh questions about how the gaming platform protects its youngest players. The Netherlands Authority for Consumers and Markets (ACM) announced the probe on Friday, and it's expected to last about a year.
The ACM is examining whether Roblox does enough to shield children from violent and sexual imagery, and what risks the platform poses to underage users in the EU. Beyond content concerns, reports also flag ill-intentioned adults targeting children on the platform and manipulative design techniques that nudge young users into purchases.
With around 111 million users worldwide, Roblox has made headlines in recent years after users—most of them children—encountered games featuring sex and violence. Criminals have also used the platform to blackmail children into handing over money or performing sexual acts online.
Roblox says it's taking action. In January 2026, the platform began a global rollout of mandatory facial age checks for all users who want to access chat features. Once users complete an age check, they can only chat with others in similar age groups, across six brackets: under 9, 9 to 12, 13 to 15, 16 to 17, 18 to 20, and 21 and over. The company told regulators it's "strongly committed" to complying with the EU's Digital Services Act and to investing in safety systems that protect minors.
The investigation adds to a growing wave of regulatory scrutiny of the gaming industry, as platforms face mounting pressure to prove they're keeping kids safe online.