The DSA's Blind Spot: Algorithmic Audit of Advertising and Minor Profiling on TikTok

#DSA #algorithmic audit #advertising #minor profiling #TikTok #regulation #child protection

📌 Key Takeaways

  • The DSA may have gaps in regulating algorithmic advertising on TikTok.
  • Audits reveal concerns about profiling minors via TikTok's algorithms.
  • TikTok's ad targeting practices potentially exploit vulnerable young users.
  • Regulatory oversight needs strengthening to protect minors online.

📖 Full Retelling

arXiv:2603.05653v1 Announce Type: cross Abstract: Adolescents spend an increasing amount of their time in digital environments where their still-developing cognitive capacities leave them unable to recognize or resist commercial persuasion. Article 28(2) of the Digital Services Act (DSA) responds to this vulnerability by prohibiting profiling-based advertising to minors. However, the regulation's narrow definition of "advertisement" excludes current advertising practices including influencer marketing […]

🏷️ Themes

Digital Regulation, Child Safety

📚 Related People & Topics

TikTok

TikTok, known in mainland China, Macau, and Hong Kong as Douyin (Chinese: 抖音; pinyin: Dǒuyīn; lit. 'Shaking Sound'), is a social media and short-form online video platform. It hosts user-submitted videos, which range in duration from three seconds to 60 minutes.


Deep Analysis

Why It Matters

This investigation reveals critical gaps in the Digital Services Act's protection of children on social media platforms: Article 28(2) prohibits profiling-based advertising to minors, but the regulation's narrow definition of "advertisement" leaves practices such as influencer marketing outside its scope. This affects millions of young TikTok users across Europe who may be exposed to commercial persuasion and algorithmic profiling despite regulatory protections. The findings highlight systemic weaknesses that could undermine the DSA's effectiveness in safeguarding minors online, and they matter to parents, regulators, and advocacy groups concerned about children's digital wellbeing and data privacy.

Context & Background

  • The Digital Services Act (DSA) is the EU's landmark legislation regulating online platforms; it became fully applicable in February 2024, with stricter obligations for very large online platforms (VLOPs) such as TikTok
  • TikTok has over 150 million monthly users in Europe and has faced multiple investigations regarding child safety and data practices
  • Previous studies have shown social media algorithms can amplify harmful content to minors despite age restrictions and parental controls
  • The DSA requires platforms to conduct risk assessments and implement measures to protect minors from harmful content and profiling

What Happens Next

The European Commission will likely launch formal proceedings against TikTok for potential DSA violations, with possible fines up to 6% of global revenue. National data protection authorities may coordinate additional investigations into TikTok's advertising practices. Expect increased pressure for independent algorithmic audits across all major platforms, with potential amendments to strengthen the DSA's minor protection provisions.

Frequently Asked Questions

What specific DSA violations did the audit uncover?

The audit found TikTok's algorithm continues to profile minors for targeted advertising despite DSA prohibitions, and the platform fails to adequately verify users' ages, allowing inappropriate ad targeting to underage users.

How does this affect TikTok users under 18?

Minors may receive advertising for age-inappropriate products or services, and their browsing data could be used to build detailed behavioral profiles that violate EU privacy protections for children.

What powers does the DSA give regulators in this situation?

The DSA allows the European Commission to impose fines up to 6% of global turnover, demand algorithmic transparency, and order immediate changes to platform practices that endanger minors.

Are other social media platforms likely affected by similar issues?

Yes, algorithmic audits of Meta, YouTube, and Snapchat would likely reveal comparable gaps in minor protection, suggesting this is an industry-wide compliance problem rather than isolated to TikTok.

What can parents do to protect their children on TikTok?

Parents should enable Family Pairing controls, regularly review account settings, report inappropriate content, and consider limiting screen time while regulatory solutions are developed.

Original Source

arXiv:2603.05653v1

Source

arxiv.org
