Roblox Boosts Child-Safety Settings by Auto-Blocking DMs for Users Under 13, Expands Parental Controls Over Violent, Sexually Themed Content


Jennifer Maas, TV Business Writer

Following recent reports criticizing Roblox's child-safety protections, the younger-skewing online gaming platform is implementing new measures, including automatically blocking direct messaging for users under the age of 13 and giving parents the ability to regulate the level of "mature" content available to their kids, regardless of age.

These changes to Roblox — which partners with brands including Disney, Netflix, Nickelodeon, Warner Bros. Discovery, Mattel and more on kid-focused games — will allow parents to see their child's friends, set time limits for their child on the platform and manage how much their child spends on Roblox every month, all from a parent's account that is separate from their child's.

Additionally, Roblox will automatically apply new communication settings for users under 13 unless a parent chooses to override them: these users will no longer be able to use "platform chat," the ability to directly message others on Roblox outside of games or experiences, and will be limited to public broadcast messages within a game or experience.

Outside of messaging, the "maturity" level of content — including what would be considered violent or sexually suggestive content — will now be rated as "minimal," "mild" or "moderate," rather than having a specific age assigned to each game or experience.
