Roblox is implementing a new safety measure requiring facial age verification to access chat features, following lawsuits accusing the gaming platform of causing harm to children.

The decision comes as Roblox faces several lawsuits and other claims accusing it of enabling sexual predators to connect with and abuse children, CNN reports.

The new system, which Roblox called the “gold standard” for safety, is meant to rigorously limit and block inappropriate communication between minors and unknown adults, the company explained in a statement.

“As the internet has matured, technological advancements in safety and security have steadily shifted the industry’s approach to age assurance,” the statement reads. “While initial efforts relied on simple self-declare […]