Mastodon – a social media platform popular with leftist “reporters” – is littered with highly explicit child sexual abuse material (CSAM), a new study from Stanford’s Internet Observatory has found.
Researchers located 112 instances of CSAM across 325,000 posts on Mastodon within just two days. The first instance took the researchers only five minutes to find. They also identified 713 uses of the 20 most prevalent CSAM-related hashtags on posts containing video or audio media, along with another 1,217 text posts linked to “off-site CSAM trading or grooming of minors.”
“We found that on one of the largest Mastodon instances in the Fediverse [a group of federated social networking services], 11 of the top 20 most commonly used hashtags were related to pedophilia,” the study adds.
“We got more PhotoDNA hits in a two-day period than we’ve probably had in the entire history of our organization of doing any kind of social media analysis, and it’s not even close,” David Thiel, one of the researchers, told The Washington Post.
Mastodon is a “federated” social media platform through which people can join servers, or “instances,” that are separate yet interconnected, enabling users to interact with one another across multiple servers. However, such platforms often have more permissive moderation guidelines and far fewer resources. Mastodon, for example, employs one person to moderate content on the platform, while companies like Meta employ thousands of moderators.
Pedophile networks are not restricted to federated platforms, however. Earlier this year, researchers discovered “an enormous pedophile network” operating on Instagram.