A recent study by Stanford’s Internet Observatory has found that Mastodon, the decentralized network often regarded as an alternative to Twitter, is rife with child sexual abuse material (CSAM). Using Google’s SafeSearch API and PhotoDNA, the researchers identified 112 instances of known CSAM across 325,000 posts in just two days of searching, with the first instance surfacing roughly five minutes in.

To conduct the research, the Internet Observatory scanned the 25 most popular Mastodon instances for CSAM. Using Google’s SafeSearch API and PhotoDNA, the researchers identified explicit images as well as content matching hashtags and keywords frequently used by online child sexual abuse groups. In total, they found 554 pieces of content flagged as explicit by Google SafeSearch at the “highest confidence,” along with 713 uses of the top 20 CSAM-related hashtags on Fediverse posts containing media and 1,217 text-only posts pointing to “off-site CSAM trading or grooming of minors.” The study described open posting of CSAM as “disturbingly prevalent.”
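To make the methodology concrete, below is a minimal, hypothetical sketch of how hashtag-driven media scanning against a single Mastodon instance could look. The instance URL, hashtag list, and flagging threshold are placeholders, and Google Cloud Vision’s SafeSearch detection stands in for “Google’s SafeSearch API”; the researchers’ actual pipeline, including PhotoDNA hash matching (which requires licensed access), is not reproduced here.

```python
# Hypothetical sketch: scan hashtag timelines on one Mastodon instance and
# flag media attachments that SafeSearch rates "adult" at the highest
# likelihood. Instance URL and hashtags are placeholders, not from the study.
import requests
from google.cloud import vision  # pip install google-cloud-vision

INSTANCE = "https://mastodon.example"  # hypothetical instance
HASHTAGS = ["example_tag"]             # placeholder for a keyword/hashtag list


def fetch_tagged_posts(tag: str, limit: int = 40) -> list[dict]:
    """Pull recent public posts for a hashtag via the Mastodon REST API."""
    resp = requests.get(
        f"{INSTANCE}/api/v1/timelines/tag/{tag}",
        params={"limit": limit, "only_media": True},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()


def flag_explicit_media(posts: list[dict]) -> list[tuple[str, str]]:
    """Run Cloud Vision SafeSearch on each attachment URL and collect
    attachments rated 'adult' with VERY_LIKELY (highest) confidence."""
    client = vision.ImageAnnotatorClient()
    flagged = []
    for post in posts:
        for media in post.get("media_attachments", []):
            image = vision.Image()
            image.source.image_uri = media["url"]
            annotation = client.safe_search_detection(image=image).safe_search_annotation
            if annotation.adult == vision.Likelihood.VERY_LIKELY:
                flagged.append((post["id"], media["url"]))
    return flagged


if __name__ == "__main__":
    for tag in HASHTAGS:
        hits = flag_explicit_media(fetch_tagged_posts(tag))
        print(f"#{tag}: {len(hits)} attachments flagged at highest confidence")
```

Note that explicit-content classification and hash matching serve different purposes: a classifier like SafeSearch surfaces candidate material, while PhotoDNA matches previously catalogued CSAM, which is what allows confirmed hits to be reported to the CyberTipline.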

One notable example cited in the study was a recent mastodon.xyz outage triggered by CSAM posted to the instance. The server’s sole maintainer said they were alerted to the material but, with limited time for moderation, took several days to remove it. In the meantime, the mastodon.xyz domain was suspended by its registrar, leaving the server unreachable until the maintainer could get the listing restored. Once the issue was resolved, the registrar added the domain to a “false positive” list to prevent future takedowns; as the researchers point out, however, the content in question was not a false positive.

Growing Concerns over Safety on Decentralized Networks

As decentralized networks like Mastodon gain popularity, concerns about safety have grown alongside them. Unlike mainstream platforms such as Facebook, Instagram, and Reddit, decentralized networks have no uniform approach to moderation: each instance controls its own moderation policies, which can produce inconsistency across the Fediverse. To address this, the researchers recommend that networks like Mastodon adopt more robust moderation tools, integrate PhotoDNA, and enable CyberTipline reporting.

David Thiel, one of the researchers involved in the study, expressed alarm at the findings. In just two days, he said, the team recorded more PhotoDNA hits than in the entire history of their organization’s social media analysis. Thiel attributed much of this to decentralized networks lacking the tooling that centralized social media platforms use to address child safety concerns. The study stresses that decentralized networks need to prioritize building comprehensive tools that let moderators combat the presence of CSAM.

The Way Forward for Mastodon

In light of the study’s findings, Mastodon and similar decentralized networks need to act decisively. Strengthening moderation capabilities, integrating PhotoDNA, and enabling CyberTipline reporting are crucial steps toward protecting users, particularly vulnerable minors. The researchers emphasize the importance of these measures as Mastodon continues to grow: as decentralized networks keep expanding, addressing the challenges they pose and prioritizing user protection, above all against the spread of CSAM, becomes all the more urgent.
