Roblox reaches $12 million settlement with Nevada on child safety

Roblox has agreed to a $12 million settlement with Nevada to resolve claims related to child safety and exploitation on its platform, avoiding a trial. The deal includes funding for local children's programs and new safety measures such as enhanced age verification. Nevada Attorney General Aaron Ford hailed the agreement as a step toward safer online environments for children.

Roblox announced the settlement this week amid ongoing legal challenges from multiple states. The agreement allocates $10 million over three years to local children's programs, such as the Boys and Girls Club, and to non-digital support groups. An additional $2.5 million will fund a law enforcement liaison position and online safety awareness campaigns, according to details of the deal released by Nevada Attorney General Aaron Ford's office. "This settlement will create a safer environment for our children online," Ford stated, "and I hope that it will serve as a bellwether for how online interactive platforms allow our state's youth to use their products."

The company will introduce stricter safety protocols, including an age verification system that combines facial age estimation with government-issued IDs and limits children to chatting only with players of similar ages. Users under 16 cannot message adults unless the adult has been designated a "trusted friend" via QR code, a mechanism intended to confirm an existing real-world relationship. Parental controls now extend to accounts up to age 16, up from 13 previously.

Roblox will also create dedicated children's accounts that restrict adult content and feature vetted games for younger players. These build on recent updates introducing Roblox Kids accounts, aimed at ages five to eight, and Roblox Select accounts, for ages nine to 15, both with content and chat limits. Despite this resolution, Roblox faces ongoing lawsuits from states including Kentucky, Iowa, Louisiana, and Texas over allegations of facilitating child sexual exploitation.

Related news


Discord defaults all users to teen settings with age verification


Discord announced it will default all accounts to a teen-appropriate experience starting in early March, requiring age verification to access adult content and restricted servers. The move aims to enhance child safety but has sparked backlash over privacy concerns following a recent data breach. Verification options include on-device facial estimation or submitting government IDs.

Los Angeles County has filed a lawsuit against Roblox, alleging the gaming platform engages in deceptive practices and fails to adequately protect children from predators and exploitation. The suit claims Roblox markets itself as safe for young users while its design exposes minors to harm. Roblox strongly disputes the allegations, emphasizing its ongoing safety improvements.


Roblox is introducing three age-specific account types to improve safety for younger users amid growing regulatory pressure. The tiers, Roblox Kids for ages 5-8, Roblox Select for 9-15, and standard Roblox for 16 and older, will roll out globally starting mid-May or early June. Restrictions on chat and content access will vary by age group.

Spain's Prime Minister Pedro Sánchez has announced plans to ban children under 16 from using social media, following Australia's lead. The legislation, part of a broader regulatory package, could take effect next week with strict age-verification requirements. Sánchez criticized platforms for exposing children to harm and called for accountability from tech executives.


As countries like Australia and Spain advance bans on social media for children, the Philippines is now considering similar restrictions to protect youth from online risks, though no decision has been reached.

Governments around the world are pushing to restrict children's access to social media, doubting platforms' ability to enforce age limits. TikTok has responded by announcing a new age-detection technology across Europe to prevent users under 13 from joining. This approach aims to balance protection with less drastic measures than outright bans.


Juries in California and New Mexico last week held Meta and Alphabet's YouTube liable for harms to young users, awarding a total of over $381 million in damages. The cases targeted platform features rather than third-party content, challenging long-standing Section 230 protections. Company lawyers have vowed to appeal the rulings.

 

 

 
