Meta, Google under attack as court cases bypass 30-year-old legal shield
USA | cnbc.com


📖 Full Retelling

Internet platforms have long been able to rely on special protections to avoid liability for what takes place on their sites. But that may be changing.

📚 Related People & Topics

Google

American multinational technology company

Google LLC (GOO-gəl) is an American multinational technology corporation focused on information technology, online advertising, search engine technology, email, cloud computing, software, quantum computing, e-commerce, consumer electronics, and artificial intelligence (AI).

Meta

American multinational technology conglomerate

Meta Platforms, Inc. is the parent company of Facebook, Instagram and WhatsApp.

Communications Decency Act

1996 attempt by the United States Congress to regulate Internet pornography

The Communications Decency Act of 1996 (CDA) was the United States Congress's first legislative attempt to regulate obscene and indecent material on the Internet. In the 1997 landmark case Reno v. ACLU, the United States Supreme Court unanimously overturned most of the statute due to its restrictions on free speech, though Section 230 survived.



Deep Analysis

Why It Matters

This news matters because it threatens the legal foundation that has enabled the explosive growth of internet platforms for three decades. The erosion of Section 230 protections could fundamentally reshape how social media companies and search engines operate, potentially making them legally liable for user-generated content. This affects billions of users who rely on these platforms for communication, information, and commerce, as well as the companies themselves, which may need to implement stricter content moderation or face significant legal exposure. The outcome could also affect smaller tech startups that lack the resources for extensive legal compliance.

Context & Background

  • Section 230 of the Communications Decency Act was passed in 1996 and provides immunity to online platforms for content posted by third-party users.
  • This legal shield has been credited with enabling the growth of social media, search engines, and user-generated content platforms by protecting them from lawsuits over user posts.
  • Recent years have seen bipartisan criticism of Section 230, with conservatives arguing it enables censorship and liberals arguing it allows harmful content to spread unchecked.
  • The Supreme Court has previously declined to significantly reinterpret Section 230, leaving lower courts to navigate its application in various cases.
  • Previous attempts to reform Section 230 through legislation have stalled in Congress despite multiple proposals from both political parties.

What Happens Next

The court cases will likely proceed through the judicial system, potentially reaching appellate courts and possibly the Supreme Court within the next one to two years. Congress may renew efforts to reform Section 230 in response to judicial developments. Tech companies will likely increase lobbying efforts and prepare contingency plans for operating under reduced legal protections. Regulatory agencies such as the FTC may issue new guidelines for platform liability if legal precedents shift significantly.

Frequently Asked Questions

What is Section 230 and why is it important?

Section 230 is a 1996 law that protects online platforms from being held legally responsible for content posted by their users. It's crucial because it allows platforms like Facebook, YouTube, and Twitter to host user content without facing endless lawsuits over every problematic post.

How could losing Section 230 protection affect social media platforms?

Without Section 230 protection, platforms would likely implement much stricter content moderation and filtering to avoid liability. This could lead to significant censorship, reduced user engagement, and increased operational costs that might disadvantage smaller competitors.

What types of court cases are bypassing Section 230 protections?

Recent cases involve allegations that platforms' recommendation algorithms actively promote harmful content rather than just passively hosting it. Courts are increasingly examining whether platforms lose immunity when their systems amplify or distribute problematic content through personalized feeds.

How might this affect ordinary internet users?

Users could see platforms removing more content preemptively to avoid legal risk, potentially limiting free expression. Services might also become more restrictive about what can be posted, and some platforms might implement paywalls or reduce features to cover increased legal costs.

Are there any alternatives to Section 230 being proposed?

Various proposals include creating exceptions for certain types of harmful content, requiring platforms to meet specific moderation standards to retain immunity, or replacing Section 230 with a new regulatory framework that balances platform responsibility with user protections.

Original Source
For the last three decades, internet giants have been able to avoid legal exposure for content on their platforms, thanks to a law that differentiates the companies from online publishers. But those safeguards appear to be weakening.

Meta and Google, which dominate the U.S. digital ad market, find themselves as defendants in a host of lawsuits that collectively serve to undermine the long-held notion that they have legal protection for what surfaces on their sites, apps and services. Companies like TikTok and Snap are in the same predicament.

The unifying aspect of the recent cases is that they're crafted to circumvent Section 230 of the Communications Decency Act, which Congress passed in 1996 and President Bill Clinton signed into law. Established in the early days of the internet, the law protects websites from being sued over content posted by their users, and allows them to act as moderators without being held liable for what stays up.

Last week, a jury in New Mexico found Meta liable in a case involving child safety, while jurors in Los Angeles held the Facebook parent and Google's YouTube negligent in a personal injury trial. Days after those verdicts were revealed, victims of the notorious sex offender Jeffrey Epstein filed a class action lawsuit against Google and the Trump administration over allegations related to the wrongful disclosure of personal information. In that complaint, the plaintiffs argue that Google's AI Mode, which serves up AI-powered summaries and links, is "not a neutral search index," a clear effort to make the case that Google isn't just a platform sitting between users and the information they seek.

"The plaintiffs' bar is winning the war against section 230 through systematic, relentless litigation that is causing there to be divots and chinks in its protection," said Eric Goldman, a law professor at Santa Clara University School of Law, in an interview.

Source

cnbc.com
