Roblox gives parents the wheel

Gaming platform grants greater adult supervision


News Desk April 06, 2025
Roblox has faced issues with cyberbullying and grooming in the past. Photo: File


Roblox has now made it possible for parents and adult supervisors to limit their children's interaction with select users, as per The Guardian. The online gaming platform, popular amongst children, now also allows parents to block their young ones from playing certain games within the virtual universe.

The changes arrive as part of a suite of safety updates designed to help parents monitor their children's gaming experience and reduce their exposure to harmful content. Starting Wednesday, supervisors who verify their identity with an ID or credit card will gain access to three new tools.

Updated access

As one of the largest gaming platforms in the world, Roblox is home to more monthly users than the Nintendo Switch and the Sony PlayStation combined. In 2024, it recorded more than 80 million players in a day, of which roughly 40 per cent were under the age of 13.

As impressive a feat as that may be, it also calls for greater control and responsibility over the features that the virtual universe has to offer. Roblox has therefore introduced new tools to aid concerned parents.

Parents should be able to access the new controls by linking their Roblox account to their child's account after completing verification procedures. They will then be able to head over to the 'Parental Controls' tab and supervise their child's gaming experience.

The friend management tool allows them to block users on their child's friend list, preventing the child and the blocked user from sending or receiving messages to each other. Parents will also have the option to report accounts that they believe are violating the platform's policies.

Previously, parents could only view the friend list; this update allows them to intervene directly. Child players can also request that their parents unblock users.

Caregivers can also manage the content maturity level and decide which games their children can access. The tools additionally let them keep a check on the player's screen time and review the 20 experiences their child visited most frequently over the past week. Previously, parents could only set limits on their children's daily screen time.

What prompted this?

Since the Online Safety Act came into effect in the UK this year, tech companies have been compelled by law to take measures against malicious content on their platforms. Failure to do so can result in fines of up to £18m or 10 per cent of global revenue. It seems that Roblox has been acting accordingly, given the platform's past issues with grooming and cyberbullying.

The virtual universe, which welcomes players as young as eight years old, has run into problems with users bypassing safety monitoring systems. As per The Guardian, Turkey banned it in August for having "content that could lead to the exploitation of children".

Following this, Roblox issued a statement on their website, promising that they were working with local authorities to tackle the problem. "We're committed to doing everything we can to keep our community safe, and we share global policymakers' commitment to protecting children," read the statement.

Listing all the ways that Roblox works to maintain a user-friendly community, the statement added, "We work tirelessly to be vigilant against attempts to circumvent our safety systems. We continuously innovate to develop the next generation of safety tools and features as we seek to continue leading the future of safety and civility online."

The gaming platform introduced 40 safety updates last year, such as prohibiting players under 13 from sending direct messages. Developers also deployed a machine-learning model to moderate chats between players, which the company says is more accurate than human moderation.

Trust in supervision

Andy Burrows, the chief executive of suicide prevention organisation the Molly Rose Foundation, appreciated the safety improvements but believes that there are problems yet to be tackled.

"Roblox still needs to get to grips with substantial problems with harmful and age-inappropriate content," he said. "Extensive research has shown Roblox is awash with age-inappropriate games and communities, including depression rooms that can compound misery and offer no support to vulnerable children."

Raising concern over these creative choices, he added, "This content raises fundamental questions about Roblox's broader commitment to safety and shows it cannot just rely on parental controls but must take decisive action to make the platform safe for its young users."

In March, Roblox's co-founder and chief executive, David Baszucki, also weighed in on the matter. Vouching for the platform's ability to protect its users from harmful content, he told the BBC, "My first message would be: if you're not comfortable, don't let your kids be on Roblox. That sounds a little counterintuitive, but I would always trust parents to make their own decisions."

In response, Mumsnet founder Justine Roberts said that parents on the forum often struggle with online supervision despite trying their best. She added, "We all know that, with the best will in the world, life sometimes gets in the way."
