Washington (EFE). The US Supreme Court is weighing whether Twitter can be held liable for content posted on its platform by the Islamic State (IS).
The social network faces a complaint, along with Facebook and Google (as owner of YouTube), filed by the family of the Jordanian Nawras Alassaf, who died on January 1, 2017 in a nightclub in Istanbul (Turkey) at the hands of Abdulkadir Masharipov, a terrorist who stormed the venue and killed 39 people.
The plaintiffs allege that since the terrorist organization uses these platforms “to recruit members, issue terrorist threats, spread propaganda, instill fear and intimidate the civilian population”, the technology companies can be held responsible for instigating this attack.
In their view, by failing to proactively monitor and remove terrorist content, the companies provided material support to IS, supplying the infrastructure and services that allow it to “promote and carry out its terrorist activities”.
The Anti-Terrorism Act under scrutiny
The plaintiffs invoke the Anti-Terrorism Act (ATA) and the Justice Against Sponsors of Terrorism Act (JASTA), which allow victims of terrorism to bring primary and secondary liability claims against any entity that aids in a terrorist act.
The Supreme Court justices will have to decide whether, under the Anti-Terrorism Act, social media platforms that host user content can be considered to have aided and abetted an act of international terrorism through their alleged failure to sufficiently filter and remove content published by terrorist organizations.
At Wednesday’s hearing, Twitter’s lawyer, Seth Waxman, built his defense on the argument that failing to do everything possible to enforce Twitter’s rules and policies prohibiting this type of content “does not amount to the knowing provision of substantial assistance”.
“If the Istanbul police chief had come to Twitter saying ‘we have been following three accounts and these people appear to be planning some kind of terrorist act’ and Twitter had not investigated it, then we would bear the blame,” he said.
Twitter denies responsibility
Elon Musk’s technology company maintains that the fact that IS has used the platform does not constitute “knowing” assistance, a position shared by the Joe Biden Administration.
According to Assistant Attorney General Edwin Kneedler, representing the government, the company cannot be held liable under the Anti-Terrorism Act because Congress ensured that the law “is not so broad in scope as to inhibit the legitimate and important activities of companies, charitable organizations and others”.
Several of the Supreme Court justices saw it differently, arguing that Twitter “knew all that and did nothing about it,” in the words of liberal Justice Elena Kagan.
“How can you say that Twitter did not provide substantial assistance?” asked the justice, who contended that the social network “is helping by providing a service to those people with the explicit knowledge that those people are using it to promote terrorism.”
Google, in a similar case
Wednesday’s session was held a day after the Supreme Court of the United States put Google in the dock to assess whether it is responsible for the recommendations its algorithms make to users, in a case with implications for freedom of expression.
Google, the owner of YouTube, was sued by the family of Nohemi González, a 23-year-old American of Mexican origin who was murdered in Paris in the November 2015 IS attacks, which killed a total of 130 people.
Tuesday’s session was the first time the highest court in the United States examined Section 230 of the Communications Decency Act, passed in 1996 when the internet was in its infancy, which shields platforms from lawsuits that seek to hold them responsible for information provided by another source.
In both cases, the decision of the Supreme Court, which has a conservative majority, is not expected until the summer, and its ruling could have repercussions for the way the internet works.