Indonesia’s social media ban pushes Roblox to tighten child safety controls

Roblox will introduce additional content and communication controls for users under 16 in Indonesia in response to the country's stricter child safety regulations, though details of the new measures have not yet been disclosed.

Indonesia is moving ahead with stricter online child safety regulations, prompting gaming platform Roblox to introduce new safety controls for younger users in the country.


The new rules, which come into force on March 28, require digital platforms to implement stricter age verification, restrict access for under-16 users, and block accounts identified as high risk. The directive is part of Indonesia’s broader push to strengthen online safety for children and regulate social media and gaming platforms.


In response, Roblox said it will introduce additional content and communication controls for users under the age of 16 in Indonesia. The company has confirmed that new safety measures are in development but has not yet disclosed specific details about the features.


Under the new Indonesian policy introduced by Communications and Digital Minister Meutya Hafid, platforms will also be required to deactivate accounts belonging to under-16 users who are deemed high risk, marking one of the stricter regulatory approaches to child online safety in the region.


The move comes as Roblox faces growing legal scrutiny globally over its child protection practices. Less than three weeks after Indonesia's directive was announced, the company was sued in the United States over child safety concerns on its platform.


The US lawsuit, filed in Nebraska by Attorney General Mike Hilgers, accuses Roblox of failing to adequately protect children from harmful content and predatory online behaviour. The complaint alleges that minors were able to access inappropriate content and were exposed to dangerous online interactions due to gaps in existing safety measures.


The Nebraska case is part of broader legal pressure facing the company in the United States, where more than 130 lawsuits related to child safety concerns have been consolidated into a single federal court case. Some of the allegations claim that children were directed to external platforms such as Discord, where risks increased.


Despite the legal challenges, Roblox has defended its platform and said user safety remains a core priority. The company has pointed to existing safety features including chat filters, parental controls and age verification systems as part of its ongoing efforts to improve child protection and online safety.


Indonesia’s new law is expected to increase compliance pressure on global tech platforms operating in the country, particularly those with large numbers of younger users, as governments worldwide move to tighten digital safety regulations for children and teenagers.
