WASHINGTON — On Tuesday, the Supreme Court is taking up for the first time the question of whether technology companies are always immune from legal liability in disputes arising from problematic user-posted content.
The justices are hearing oral arguments in a case alleging that by recommending videos that spread violent Islamist ideology, YouTube bears some responsibility for the murder of Nohemi González, an American college student, in the 2015 Paris attacks carried out by the terrorist group Islamic State.
At issue is whether there are limits to the liability shield for internet companies that Congress enacted in 1996 as part of the Communications Decency Act. The Supreme Court has never before taken up the issue, even as the power and influence of the internet have skyrocketed.
The case, which tech companies warn could change how the internet currently works, concerns whether Section 230 applies in situations where platforms use algorithms to actively recommend content to users.
The novel legal issue has led to some unusual cross-ideological alliances, with the Biden administration and some high-profile Republican lawmakers, including Sens. Ted Cruz of Texas and Josh Hawley of Missouri, filing briefs endorsing at least some of the González family's legal arguments.
Possible reform of Section 230 is an area in which President Joe Biden and some of his staunchest critics agree, although they disagree on why and how it should be done.
Conservatives generally say companies inappropriately censor content, while liberals say social media companies are spreading dangerous right-wing rhetoric and not doing enough to stop it. Although the Supreme Court has a 6-3 conservative majority, it is unclear how it will address the issue.
González, 23, was studying in France when she was killed while dining at a restaurant during a wave of terrorist attacks carried out by ISIS.
Her family alleges that Google-owned YouTube helped ISIS spread its message. The lawsuit focuses on YouTube's use of algorithms to suggest videos to users based on content they have previously viewed. YouTube's active role goes beyond the kind of conduct Congress intended to protect with Section 230, the family's lawyers allege.
The family filed the lawsuit in 2016 in federal court in Northern California and seeks to pursue claims that YouTube violated the Anti-Terrorism Act, which allows people to sue individuals or entities that "aid and abet" acts of terrorism.
Citing Section 230, a federal judge dismissed the lawsuit. That decision was upheld by the San Francisco-based U.S. Court of Appeals for the Ninth Circuit in a June 2021 ruling that also resolved similar cases that families of other terror attack victims had brought against technology companies.
The eventual Supreme Court ruling could have wide-ranging ramifications because algorithmic recommendations are now the norm for online services, not just YouTube. Platforms like Instagram, TikTok, Facebook, and Twitter long ago began relying on recommendation engines, or algorithms, to decide what people see most of the time, rather than emphasizing chronological feeds.
Tuesday's argument is the first part of a social media doubleheader at the high court. On Wednesday, the justices will hear a related appeal brought by Twitter over whether the company can be held liable under the Anti-Terrorism Act.
The same appeals court that handled the González case revived claims brought by relatives of Nawras Alassaf, a Jordanian national killed in a terrorist attack in Istanbul in 2017. The family accused Twitter, Google, and Facebook of aiding and abetting the spread of militant Islamist ideology, which the companies deny. The issue of Section 230 immunity has not yet been addressed in that case.
The Supreme Court has previously declined to hear cases on Section 230. Conservative Justice Clarence Thomas has criticized the provision, citing the market power and influence of tech giants.
David Ingram contributed.