Social media platforms are microcosms of society. They host all the interesting and fun communities, but also some terribly dark and heinous places. Now, a report has claimed that Instagram's algorithms have actually helped connect paedophile networks. According to the report, vast networks of paedophiles have been operating openly on the platform, and its recommendation algorithm plays a role in connecting those who seek child sexual abuse material with those who supply it.
The report is based on an investigation by The Wall Street Journal (WSJ) and researchers at Stanford University and the University of Massachusetts Amherst. According to their findings, there have also been accounts where the buyers have been able to commission “specific sexual acts” or even arrange “meet-ups”.
It should be noted that the algorithms were not designed specifically to connect these groups. They were designed to help users find relevant content on the platform; if a user searches for niche content or spends time on niche hashtags, they will eventually be shown more of such content, which in this case enabled buyers to connect with those who supply and sell it.
As per the report, explicit and deeply offensive hashtags such as "#pedowhore" and "#preteensex" were active on Instagram, hosting thousands of posts. Paedophile groups frequented such hashtags to connect with sellers of child sexual abuse material. Instagram would also recommend such sellers, helping the entire network thrive.
In fact, the findings suggest that many such seller accounts pretended to be the children themselves and used "overtly sexual handles".
“That a team of three academics with limited access could find such a huge network should set off alarms at
Read more on tech.hindustantimes.com