Snapchat Investigated in Europe Over Child Safety Policies
Regulators in Brussels accused the social media platform of maintaining a weak age-verification system, and steering younger users toward inappropriate experiences.
Original Source
By Adam Satariano, reporting from London
March 26, 2026, 7:30 a.m. ET

European Union regulators on Thursday started an investigation into child protection safeguards at Snap, the latest in a series of cases worldwide challenging social media platforms over safety for younger users.

Officials in Brussels accused Snap of having an ineffective age-verification system to keep children under the age of 13 off the company's social media service, Snapchat. The company's algorithm also often misclassifies users age 13 to 17 as adults, then steers them toward inappropriate experiences, regulators said.

Social media companies are under growing scrutiny amid a wider debate about child protection online. On Wednesday, a California jury found that Meta and YouTube harmed the mental health of a young user with addictive design features, a landmark case that could lead to further lawsuits. Earlier this week, a New Mexico jury found Meta liable for violating state laws by failing to protect users from child predators.

Across Europe, policymakers are examining new laws to limit children's social media use. Governments in France, Denmark and Spain are among those exploring banning young people from the platforms, which many policymakers see as addictive and harmful to mental health.

In February, European Union regulators issued a preliminary decision against TikTok for its "addictive design" that poses potential harm to the "physical and mental well-being" of users, including minors. Meta is also facing an investigation, started in 2024, into the protection of child users on Instagram and Facebook.
On Thursday, European regulators accused Snap of not adequately protec...