Lara Malvesí
Barcelona (EFE). Viral videos on Twitter and TikTok, such as the one from the Waka nightclub in Sant Quirze del Vallès (Barcelona), in which two minors appeared performing oral sex, or the less explicit images of a group of adolescents aged 14 to 17 twerking ("perreando") with sexual movements at the Pampara venue in Les Corts (Barcelona), expose the limits of content moderation, according to Ferran Lalueza, an expert from the Open University of Catalonia (UOC).
In both cases, because they show explicit sex or a clearly sexual attitude, on the one hand, and because those involved are minors, on the other, the videos should never have been allowed to circulate on social networks such as TikTok. Yet the platform's own moderation system makes it easier for them to remain "out there" and be "captured" by users before a moderator removes them.
The algorithm is not foolproof
Although the social network's algorithm can automatically identify and block prohibited content, most rule violations are flagged by the people who "moderate" the content: they remove it, record which rules were broken and, where appropriate, alert the authorities.
In the meantime, however, seconds or minutes can pass during which the images are viewed and recorded countless times, a TikTok moderator tells EFE.
"One of the biggest practical difficulties is knowing whether the person who appears in an image is a minor or not. Unless it is stated clearly, it is very ambiguous," acknowledges another colleague.
In the most recent viral case, that of the Pampara nightclub, the images of young people twerking were posted, and later deleted, by the venue's own account, which even tagged the party "Pamparatardes+14", the name of its evening session for 14- to 17-year-olds who cannot yet attend the night-time parties where alcohol is sold.
However, by the time it was deleted, the video had already been played more than a million and a half times.
And this happened even though, when the algorithm detects possible sexual content, it flags its moderation as urgent in the shared "chat" where moderators coordinate rapid action against this type of content and others, such as hate messages.

The algorithm is blind
Ferran Lalueza, professor of Information and Communication Sciences at the UOC, explains that social networks unfortunately function as spirals of potential transmission, "both for positive content and for spreading videos whose distribution may in itself be criminal, such as the Waka video."
"When the algorithm detects that something is getting a lot of activity and views in its first few seconds, it shows it to more and more people, driving its own viralization," he explains.
"Although I would like to think that users' degree of digital maturity is high and that they would halt the spread of this type of content and report it where appropriate, the evidence unfortunately shows otherwise," he adds.
The expert says the viralization of this type of content has not "surprised" him, since content moderation "has become much more lax" and "artisanal, or human, control resources have been cut in favor of automated ones."
Shortcomings in the barriers for minors
He also draws attention to the fact that some social networks, such as Twitter (which played a significant role in spreading the criminal Waka video), have reduced the number of people filtering their videos as a result of the staff cuts that followed the arrival of the new leadership under Elon Musk.
"Although networks like TikTok try to improve, the reality is that their artificial intelligence systems are not infallible and can be tricked more or less easily, and by the time human moderators detect the video it is already too late," he reflects.
These networks could also improve, he points out, their systems for preventing children under 14 from opening an account and for ensuring that adolescents who do are not shown content the algorithm has labeled as adult.
Still, as part of its control efforts, the Chinese social network reported in its latest transparency report, covering 2021 data, that it had closed 7.3 million accounts suspected of belonging to minors.
Other networks that have raised "concern" at the Barcelona Prosecutor's Office over the increase in non-consensual content involving minors are Only Fans, Admire Me and My fan page.
These are photos that minors post openly on social networks and that criminals then use without permission, uploading them with the aim of making users pay for that content.
Last September, the Prosecutor's Office noted that it had received some fifty complaints about this "novel" type of crime in the previous six months.