OnlyFans’ paywalls make it hard for police to detect child sexual abuse materials (CSAM) on the platform, Reuters reported—especially new CSAM that can be harder to uncover online.
Because each OnlyFans creator posts their content behind their own paywall, it’s hard to independently verify just how much CSAM is posted, five specialists in online child sexual abuse told Reuters. Cops would seemingly need to subscribe to each account to monitor the entire platform, Trey Amick, an expert who assists police in CSAM investigations, suggested to Reuters.
OnlyFans claims that the amount of CSAM on its platform is extremely low. Out of 3.2 million accounts sharing “hundreds of millions of posts,” OnlyFans only removed 347 posts as suspected CSAM in 2023. Each post was voluntarily reported to the CyberTipline of the National Center for Missing and Exploited Children (NCMEC), which OnlyFans told Reuters has “full access” to monitor content on the platform.
However, that intensified monitoring seems to have only just begun. NCMEC just got access to OnlyFans in late 2023, the child safety group told Reuters. And NCMEC seemingly can’t scan the entire platform at once, telling Reuters that its access was “limited” exclusively “to OnlyFans accounts reported to its CyberTipline or connected to a missing child case.”
Similarly, OnlyFans told Reuters that police do not have to subscribe to investigate a creator’s posts, but the platform only grants free access to accounts when there’s an active investigation. That means once police suspect that CSAM is being exchanged on an account, they get “full access” to review “account details, content, and direct messages,” Reuters reported.
But that access doesn’t aid police hoping to uncover CSAM shared on accounts not yet flagged for investigation. That’s a problem, a Reuters investigation found, because it’s easy for bad actors to create new accounts and mask their identities, evading OnlyFans’ “controls meant to hold account holders responsible for their own content,” detective Edward Scoggins told Reuters.
Evading OnlyFans’ CSAM detection seems easy
OnlyFans told Reuters that “would-be creators must provide at least nine pieces of personally identifying information and documents, including bank details, a selfie while holding a government photo ID, and—in the United States—a Social Security number.”
“All this is verified by human judgment and age-estimation technology that analyzes the selfie,” OnlyFans told Reuters. On OnlyFans’ site, the platform further explained that “we continuously scan our platform to prevent the posting of CSAM. All our content moderators are trained to identify and swiftly report any suspected CSAM.”
However, Reuters found that none of these controls worked 100 percent of the time to stop bad actors from sharing CSAM. And the same seemingly holds true for some minors motivated to post their own explicit content. One girl told Reuters that she evaded age verification first by using an adult’s driver’s license to sign up, then by taking over an adult user’s account.
An OnlyFans spokesperson told Ars that the low amount of CSAM reported to NCMEC is a “testament to the rigorous safety controls OnlyFans has in place.”
“OnlyFans is proud of the work we do to aggressively target, report, and support the investigations and prosecutions of anyone who seeks to abuse our platform in this way,” OnlyFans’ spokesperson told Ars. “Unlike many other platforms, the lack of anonymity and absence of end-to-end encryption on OnlyFans means that reports are actionable by law enforcement and prosecutors.”
But Reuters’ investigation—which uncovered 30 cases of CSAM after submitting requests for “documents mentioning OnlyFans from more than 250 of the largest US law enforcement agencies”—concluded that “the 30 cases almost certainly understate the presence of child sexual abuse material on OnlyFans.”
According to OnlyFans’ transparency reports, though, the already small number of CSAM instances reported by the platform has seemingly decreased substantially. In 2023, OnlyFans reported that “incidents of suspected CSAM make up less than 0.001 percent of all content submitted by creators to be posted (or attempted to be posted) on OnlyFans.” But this year, that figure dropped further, with OnlyFans reporting that suspected CSAM comprises “less than 0.0002 percent of all content.”
OnlyFans declined to comment on this shift in total incidents of suspected CSAM on the platform.
There’s no question that OnlyFans has invested in keeping CSAM off its platform. OnlyFans pays the Internet Watch Foundation, a British nonprofit dedicated to combatting CSAM, about $114,000 annually for services that help the platform detect CSAM, Reuters reported.
But on a page detailing how the platform fights CSAM, OnlyFans said it’s harder to identify “new” CSAM that is not yet “part of databases and tools used by law enforcement.” To do this, OnlyFans said that it “closely” inspects “images, text, and sound files,” reporting any “suspected CSAM which has not previously been identified” and passing that “information to law enforcement and non-governmental organizations to help identify the perpetrators.”
Arguably, it’s even harder for cops to monitor for new CSAM on OnlyFans, according to Reuters’ reporting. And the impact of that on minors is “devastating,” Reuters reported. The father of a 16-year-old boy victimized through CSAM on OnlyFans told Reuters that “there has to be accountability for these platforms,” because his son now suffers from “a wound that will never heal.”
Reuters searched US legal databases and found that OnlyFans has never been sued or held criminally liable for CSAM on its platform. But a question still seemingly remains as to whether OnlyFans prioritizes detecting CSAM over profits. When Reuters asked how minors dodged age verification, how CSAM evaded detection, and whether OnlyFans “kept its revenue from accounts involving minors,” OnlyFans declined to comment.