Washington (EFE).- The Supreme Court of the United States put Google in the dock on Tuesday to weigh whether the company is liable for the recommendations its algorithms make to users, in a case with implications for freedom of expression.
Google, owner of YouTube, was sued by the family of Nohemi González, a 23-year-old American killed in Paris during the November 2015 attacks by the Islamic State (IS) group, which left a total of 130 people dead.
Her parents argue that Google, through YouTube, allowed IS to post videos inciting violence and recruitment, and that it recommended the jihadists' recordings to users through an algorithm designed to identify potentially interested viewers. In their view, that makes the technology company legally responsible for the harm inflicted.
This Tuesday marked the first time the highest court in the United States has examined Section 230 of the Communications Decency Act, passed in 1996, when the internet was in its infancy.
This provision shields platforms from lawsuits that treat them as responsible for information supplied by another source. Specifically, it establishes that no provider shall be treated “as the publisher or speaker of any information provided by another information content provider.”
Whether platforms can be held liable for recommending third-party material remains an open question.
“Are internet providers responsible for the next video that is served to me?” the liberal Justice Sonia Sotomayor asked at Tuesday’s oral argument.
“I don’t understand how a neutral suggestion of something you’ve shown an interest in is incitement,” added the conservative Justice Clarence Thomas.
“The question is what the defendant does with the algorithm. Does it use it to incite people to watch IS videos? (…) They offer you things you haven’t asked for. It is no different from them sending you an email,” replied attorney Eric Schnapper, who represents Nohemi’s father, Reynaldo González.
The ruling on Google attracts attention for its implications
The decision of the Supreme Court, with a conservative majority, will not arrive until the summer, but different platforms have already warned that the ruling could have repercussions on the way the internet works.
“Interactive computing services must make constant decisions about what information to display and how so that users are not overwhelmed with irrelevant or unwanted information,” Google said in a brief submitted to the court.
“Helping to find the needle in the haystack is an existential necessity of the internet,” said the firm’s lawyer, Lisa Blatt, at the hearing.
Justice Brett Kavanaugh, a conservative, also warned against a hypothetical future without this shield: “Every day billions of queries are made on the internet. Each of them would open up the possibility of a lawsuit.”
The case has offered the opportunity for the promoters of the law in question, current Democratic Senator Ron Wyden and former Republican Congressman Chris Cox, to speak out about their original intentions.
“Section 230 protects specific recommendations to the same extent as other forms of content editing and presentation. Any other interpretation would subvert its purpose of encouraging innovation in presentation and moderation. Given the enormous volume of user-created content today, its protection is even more important now than when it was enacted,” they argued.
It is not an isolated case
The Gonzalezes filed their complaint in 2016, but a California federal court ruled that Google was protected by the law because the videos in question were produced by IS. An appeals court later upheld that ruling, prompting the plaintiffs to take the case to the Supreme Court, which agreed to hear it in October.
The high court is examining a similar case this Wednesday, filed against Twitter by the family of a young Jordanian man who died in January 2017 in an IS attack in Turkey. The family, also represented by Schnapper, accuses the technology company of not having done enough to curb the spread of violent content.