The DSA's Blind Spot: Algorithmic Audit of Advertising and Minor Profiling on TikTok
#DSA #AlgorithmicAudit #Advertising #MinorProfiling #TikTok #Regulation #ChildProtection
📌 Key Takeaways
- The DSA may have gaps in regulating algorithmic advertising on TikTok.
- Audits reveal concerns about profiling minors via TikTok's algorithms.
- TikTok's ad targeting practices potentially exploit vulnerable young users.
- Regulatory oversight needs strengthening to protect minors online.
🏷️ Themes
Digital Regulation, Child Safety
📚 Related People & Topics
TikTok
Video-focused social media platform
TikTok, whose mainland Chinese counterpart is Douyin (Chinese: 抖音; pinyin: Dǒuyīn; lit. 'Shaking Sound'), is a social media and short-form online video platform. It hosts user-submitted videos ranging in duration from three seconds to 60 minutes.
Deep Analysis
Why It Matters
This investigation reveals critical gaps in the Digital Services Act's enforcement regarding child protection on social media platforms. It affects millions of young TikTok users across Europe who may be exposed to inappropriate advertising and algorithmic profiling despite regulatory protections. The findings highlight systemic failures in platform compliance that could undermine the DSA's effectiveness in safeguarding minors online. This matters to parents, regulators, and advocacy groups concerned about children's digital wellbeing and data privacy.
Context & Background
- The Digital Services Act (DSA) is the EU's landmark legislation regulating online platforms; it entered into force in 2022 and became fully applicable in February 2024, with stricter obligations for very large online platforms such as TikTok
- TikTok has over 150 million monthly users in Europe and has faced multiple investigations regarding child safety and data practices
- Previous studies have shown social media algorithms can amplify harmful content to minors despite age restrictions and parental controls
- The DSA requires platforms to conduct risk assessments and implement measures to protect minors from harmful content and profiling
What Happens Next
The European Commission will likely launch formal proceedings against TikTok for potential DSA violations, with possible fines up to 6% of global revenue. National data protection authorities may coordinate additional investigations into TikTok's advertising practices. Expect increased pressure for independent algorithmic audits across all major platforms, with potential amendments to strengthen the DSA's minor protection provisions.
Frequently Asked Questions
What did the audit find?
The audit found TikTok's algorithm continues to profile minors for targeted advertising despite DSA prohibitions, and the platform fails to adequately verify users' ages, allowing inappropriate ad targeting to underage users.
How does this affect minors?
Minors may receive advertising for age-inappropriate products or services, and their browsing data could be used to build detailed behavioral profiles that violate EU privacy protections for children.
What penalties does the DSA allow?
The DSA allows the European Commission to impose fines of up to 6% of global turnover, demand algorithmic transparency, and order immediate changes to platform practices that endanger minors.
Do other platforms have similar problems?
Yes, algorithmic audits of Meta, YouTube, and Snapchat would likely reveal comparable gaps in minor protection, suggesting this is an industry-wide compliance problem rather than one isolated to TikTok.
What can parents do in the meantime?
Parents should enable Family Pairing controls, regularly review account settings, report inappropriate content, and consider limiting screen time while regulatory solutions are developed.