
Supreme Court case could roll back legal immunity for social media companies
Clip: 2/21/2023 | 7m 52s
The Supreme Court heard a case that could radically alter the internet and social media. Section 230 of the Communications Decency Act protects websites from lawsuits over material posted by users. But the family of a victim killed in an ISIS attack says that immunity should not apply to recommendations YouTube’s algorithms make. John Yang discussed the case with Marcia Coyle and Sheera Frenkel.
Major corporate funding for the PBS News Hour is provided by BDO, BNSF, Consumer Cellular, American Cruise Lines, and Raymond James. Funding for the PBS NewsHour Weekend is provided by...

AMNA NAWAZ: Today, the Supreme Court heard arguments in a case that has the potential to radically transform the Internet and social media.
John Yang explores the case and its impact.
JOHN YANG: Amna, this case is about a law that protects Web sites from lawsuits over material posted by users.
The family of a student killed in a 2015 ISIS attack in Paris says that immunity should not apply to the recommendations YouTube's algorithms make based on a user's viewing history.
The family says that, by recommending ISIS-related content, YouTube acted as a recruiting platform for the group.
The law, which is often referred to as Section 230, was written in 1996.
That's before Google, before Twitter and before the concerns about the spread of disinformation and hate speech.
Marcia Coyle is the "NewsHour"'s Supreme Court analyst.
She was in the courtroom for today's two-and-a-half-hours of arguments.
And Sheera Frenkel is a tech reporter for The New York Times.
Marcia, I want to start with you.
When the lawyer for the family was making his case, confusion seemed to be the word of the day from the justices.
I looked at the transcript.
It was used five times by both liberal and conservative justices.
There was a lot of skepticism about his arguments.
This is Justice Elena Kagan questioning the attorney.
ELENA KAGAN, U.S. Supreme Court Associate Justice: Everybody is trying their best to figure out how the statute, which was a pre-algorithm statute, applies in a post-algorithm world.
Every time anybody looks at anything on the Internet, there is an algorithm involved.
Does your position send us down the road such that 230 really can't mean anything at all?
ERIC SCHNAPPER, Attorney For Gonzalez: I don't think so, Your Honor.
The question -- as you say, algorithms are ubiquitous.
But the question is, what does the defendant do with the algorithm?
JOHN YANG: Marcia, what's going on there?
MARCIA COYLE: Well, the family's lawyer was responding to Justice Kagan, trying to explain how he viewed these algorithmic recommendations.
And he also called them thumbnails that YouTube puts on its Web site.
And he said, if these thumbnails encourage access to information or videos, in this case, to ISIS information, and the user has not requested that information, then it falls outside of the statute because it is, in effect, he argues, in service to ISIS.
JOHN YANG: On the other hand, there was some skepticism about Google's argument as well.
This is Justice Ketanji Brown Jackson.
KETANJI BROWN JACKSON, U.S. Supreme Court Associate Justice: Isn't it true that the statute had a more narrow scope of immunity than courts have ultimately interpreted it to have, and than what YouTube is arguing here today?
The question today is, well, can we be sued for making recommendations?
That's just not something the statute was directed to.
LISA BLATT, Attorney For Google: That's death by 1,000 cuts.
And the Internet would have never gotten off the ground if anybody could sue every time.
JOHN YANG: So, she's looking at the text of the law.
MARCIA COYLE: She is.
And she sees a narrower immunity for social media platforms.
The Google attorney today is really arguing for the greatest amount of immunity, obviously, for her client and other social media platforms.
And she disagreed with Justice Jackson about the history here.
She said that, even though Congress may not have referred to algorithmic recommendations at the time it enacted the law, there were analogues to them.
They certainly knew what was coming on the horizon, and were able to address it.
JOHN YANG: Does it seem like the court is not ready to go all the way on either side?
MARCIA COYLE: It did, absolutely.
In fact, I think the bottom line here is, if the court wants to do something with this case, it's going to be drawn into line-drawing.
How far does it go to protect social media platforms, or how far does it have to go to take away some of that immunity?
So it's a line-drawing difficulty.
JOHN YANG: And on the question of where to draw that line, Sheera, a couple of justices asked out loud whether the Supreme Court was the right place to do it.
This is Justice Brett Kavanaugh.
BRETT KAVANAUGH, U.S. Supreme Court Associate Justice: Isn't it better to keep it the way it is, for us, and to put the burden on Congress to change that? And they can consider the implications and make these predictive judgments.
JOHN YANG: What has Congress been trying to do about Section 230?
SHEERA FRENKEL, The New York Times: Well, we have a rare situation where almost everyone in Congress is united in thinking that Section 230 is an old law that needs updating.
The problem is that Republicans and Democrats really disagree about the problems.
Republicans are concerned that Internet companies have too much control, in a sense.
They're worried about something they call conservative bias, which is the idea that tech companies routinely censor conservative voices on their channels more than other voices.
Democrats are worried that the companies aren't doing enough about harmful speech and conspiracy theories, things like misinformation around the elections.
Democrats want to see more done on that.
JOHN YANG: Why does the -- changing this law, the prospect of changing this law, make the Internet companies and social media platforms so nervous?
SHEERA FRENKEL: Well, Section 230 has been what's protected them until now.
It's essentially said that they're not publishers.
They're not responsible for what other people say on their platform.
And so it's let them roll out content moderation policies that they see as appropriate.
They decide what's allowed and what's not allowed.
But, if they miss something, as is alleged in the case heard today, they are not punished for it.
JOHN YANG: We have a new "PBS NewsHour"/NPR/Marist poll set to be released this week, found that only 29 percent of adults questioned said the government should be the ones setting the rules for social media; 67 percent said it should be left to the social media companies themselves.
I would imagine that's music to Silicon Valley's ears.
SHEERA FRENKEL: Yes, I imagine, if you're sitting at Google or Facebook or Twitter, you're happy to hear that.
I think people are worried, rightfully so, about how the government is going to decide what something like hate speech is.
And depending who controls Congress and who sits in the White House, that could mean something very different to different political parties.
So, there is a reason why people are worried that, if the government makes these decisions, they will become overly politicized.
JOHN YANG: Marcia, this was the first big social media case to come before the court, the first time they had looked at Section 230.
MARCIA COYLE: That's right.
JOHN YANG: A lot more to come.
There's a case tomorrow.
What's that about?
MARCIA COYLE: It's sort of a sequel to the one today.
The Gonzalez family is part of tomorrow's case, along with two other families. The underlying claim is that YouTube, Facebook and Twitter recommended content that aided or encouraged an act of terror under the Anti-Terrorism Act.
And so that's what the court is going to look at, the elements of that kind of a claim and whether they can be held liable under that specific statute.
John, I should also add that, on the content moderation Sheera mentioned, there are two big cases pending in the Supreme Court right now.
The justices have yet to decide whether to take them.
They're out of Florida and Texas, with opposite lower court rulings.
Good chance the court might get into this again.
JOHN YANG: And it would probably be in the term beginning in the fall.
MARCIA COYLE: That's right.
JOHN YANG: Marcia Coyle, "NewsHour"'s Supreme Court analyst, and Sheera Frenkel of The New York Times, thank you both very much.
MARCIA COYLE: Pleasure, John.
SHEERA FRENKEL: Thank you.