Los Angeles County filed a lawsuit against Roblox on Friday, accusing the popular online gaming platform of false advertising, unfair competition, and enabling the systemic sexual exploitation and abuse of children across the United States. The 82-page complaint details how Roblox's platform design, safety architecture, and monetization practices allegedly expose minors to inappropriate content, grooming, and predators. It points to years of reports, research, articles, screenshots of harmful content, and social media posts involving specific games on the platform.
County counsel Dawyn R. Harrison stated in a release, "This is not about a minor lapse in safety. It is about a company that gives pedophiles powerful tools to prey on innocent and unsuspecting children. The trauma that results is horrific, from grooming to exploitation to actual assault. This needs to stop."
The suit follows a series of recent safety changes at Roblox, including mandatory age verification for some features, restrictions on in-game content and messaging for users under 13 introduced in 2024, and a program launched last year that requires tens of millions of children to verify their age with a selfie. Roblox has about 150 million daily active users worldwide, more than 40 percent of them under age 13, and players reportedly spend more time on the platform than on PlayStation Network and Steam combined.
Roblox responded, stating, "We strongly dispute the claims in this lawsuit and will defend against it vigorously. Roblox is built with safety at its core, and we continue to evolve and strengthen our protections every day." The company highlighted advanced safeguards to monitor content and communications, noting that users cannot send or receive images via chat, and directed parents to its Safety Center.
The action is part of broader legal scrutiny of gaming and social media platforms. Similar lawsuits have been filed by states including Florida, Texas, Kentucky, and Louisiana, whose attorney general cited the case of a man arrested last year for using voice-altering technology to exploit young players. Other platforms, including Discord and YouTube, have also introduced age verification tools.