New York (EFE).- Instagram's algorithm helps connect and promote a vast network of accounts dedicated to pedophilia and the purchase of sexual content involving minors, according to a joint investigation by The Wall Street Journal (WSJ) and researchers from Stanford University and the University of Massachusetts Amherst.
Instagram, a social network highly popular among teenagers, not only hosts these activities; its algorithms actively promote them.
A system that pedophiles exploit
The recommendation systems of the Meta-owned social network are designed to connect users who share niche interests, and through them pedophiles easily find sellers of sexual content involving minors.
Researchers found that Instagram, which has more than 1.3 billion users, allowed people to search for explicit hashtags, and sometimes even emojis, that led them to accounts using those terms to advertise the sale of child sexual abuse material and "meetings" with minors.
The promotion of sexual content involving minors violates the rules established by Meta, as well as US law.
Meta acknowledged the problems within its compliance operations, telling the WSJ that it has established an internal working group to address the issues raised.
“Child exploitation is a horrible crime,” the company said, adding that it is “continuing to investigate ways to actively defend against this behavior.”
Instagram says it blocks hashtags
Meta – which has more than 3 billion users on its apps, including Instagram, Facebook and WhatsApp – said it has removed 27 pedophile rings in the past two years.
The platform also said it blocked thousands of hashtags that sexualize children, some with millions of posts, and restricted its systems from recommending that users search for terms known to be associated with sexual abuse.
In its research, the Stanford Internet Observatory used hashtags associated with sex with minors and found 405 sellers.
The Stanford researchers found similar sexual exploitation activity on other, smaller social networks, but said the problem on Instagram is particularly serious.
“The most important platform for these networks of buyers and sellers appears to be Instagram,” the experts wrote in a report published today.
The Stanford team found 128 accounts offering to sell child sexual abuse material on Twitter, less than a third of the number they found on Instagram, which has a much larger user base than Twitter.
According to the study, Twitter did not recommend these types of accounts to the same extent as Instagram, and it removed them much faster.