Social media platform Instagram’s algorithm pushes sexually explicit adult content and child-sexualizing material alongside advertisements for major United States companies and brands, according to a recent investigation by The Wall Street Journal (WSJ).
The investigation, which created test accounts to determine what content the platform’s algorithm recommends, had those accounts follow young gymnasts, cheerleaders, and other active teenage influencers.
The test accounts were quickly bombarded with “salacious content,” including inappropriate footage of children and “overtly sexual adult videos,” displayed either above or below ads for brands including Pizza Hut, Disney, Walmart, and the dating app Bumble.
Other footage recommended by Instagram included a young girl with a digitally obscured face lifting her t-shirt to expose her stomach, a man lying on a bed with his arms around what the caption described as a 10-year-old girl, and someone stroking a life-size latex doll.
Worse still, the WSJ’s investigation was not alone in discovering this trend: over the past several months, the Canadian Centre for Child Protection conducted its own testing with similar results.
“Time and time again, we’ve seen recommendation algorithms drive users to discover and then spiral inside of these online child exploitation communities,” said the centre’s executive director, Lianna McDonald, who voiced her disgust that major companies were subsidizing the process through their advertising.
Responding to the investigation, Samantha Stetson, Vice President of Client Council and Industry Trade Relations at Meta, Instagram’s parent company, said, “Our systems are effective at reducing harmful content, and we’ve invested billions in safety, security and brand suitability solutions.”