The popular online platform Roblox, which hosts millions of user-created games and claims over 100 million daily active users, is facing intense scrutiny over its safety for children. In a recent investigation, a reporter logged into the platform posing as a young user, with parental controls activated, and found a disturbing environment in which avatars were subjected to bullying, simulated violence, and sexually explicit behavior.
The experiment began in one of the platform’s most frequented games for young audiences, a fashion-themed experience. Though presented as a creative outlet, the game lets players throw virtual objects such as tomatoes or excrement at other avatars, a mechanic critics argue monetizes bullying. Further exploration uncovered hidden rooms with unsettling themes and narratives, a far cry from the family-friendly image the platform promotes.
Beyond individual games, broader concerns center on the platform’s economic model and its moderation. Experts liken the pervasive in-game purchases and randomized “loot box” rewards to a form of child gambling. The financial ecosystem has also drawn significant criticism: only a small percentage of young developers ever earn money, leading to accusations that the platform profits from what some term “playbour”, the labor of children performed through play. Roblox has denied relying on child labor, stating that most of its developers are adults.
Perhaps most alarming are the persistent safety failures. The investigation encountered games in which avatars were sexually assaulted, harassed with abusive language, and targeted with violent imagery, all while the reporter was posing as a child. These findings align with a wave of serious lawsuits in the United States alleging that the platform’s safety shortcomings have facilitated the grooming and exploitation of minors. One high-profile short-seller report previously labeled the environment an “X-rated paedophile hellscape.”
In response to mounting pressure, including from regulators in Australia, Roblox has pledged new safety measures. These include making accounts for younger users private by default and restricting communication features. A company spokesperson stated that hundreds of safety enhancements have been made this year and emphasized a commitment to protecting users.
However, many observers doubt these steps will be sufficient. The sheer volume of user-generated content makes consistent moderation a monumental challenge. The investigation found that it can be nearly impossible to determine who is responsible for moderating popular games; some are allegedly managed by teams of teenagers who have themselves been embroiled in controversies over inappropriate conduct.
The core dilemma for parents remains unresolved. While the platform offers genuine creative and social opportunities, the investigation suggests that shielding children from its harmful content is exceptionally difficult. One academic specializing in digital games, who acknowledges gaming’s positive potential, stated plainly that he does not allow his own young child to use Roblox, citing risks he considers uncontrollable. As the platform continues to grow, the question of whether it can truly be made safe for its youngest users remains open.