
Did Elon Musk reduce child exploitation material on Twitter? Here’s what the organization that tracks it says.

The national organization that takes in reports of child sexual abuse material said it has not seen any change in reporting frequency since Musk took over.

But Grid interviews with former Trust and Safety Council members, a group of organizations that advised the company on products and policies, and outside observers suggest that it’s unlikely that Twitter can keep up with the abuse posted on its platform, let alone get more aggressive in identifying it — especially after weeks of company shake-ups.

And the National Center for Missing and Exploited Children (NCMEC), the organization that takes in reports of child sexual abuse material (CSAM) online, says not much has changed since Musk took over in late October, despite his recent statements.

Specifically, that “NCMEC has seen no noticeable changes on the reporting front compared to previous months,” when Musk wasn’t in charge, Gavin Portnoy, vice president of communications and brand at NCMEC, said in a statement.

NCMEC is a private nonprofit, but it serves as the government-appointed clearinghouse for reporting CSAM to law enforcement. So companies like Meta, Google and, yes, Twitter must report any CSAM they find to NCMEC — making the organization the best third-party measure of how often companies are reporting it.

To be fair, Twitter has a long history of cumbersome and unclear reporting processes for CSAM on the platform, according to critics, even in the pre-Musk era.

“Historically, the [NCMEC] has not seen the number of CyberTipline reports from Twitter we would expect given its user base and the well-documented issues of child exploitation that have occurred on the Platform,” said Portnoy.

And Musk did add a specific CSAM indicator to Twitter’s reporting process — a drop-down option that lets users report child sexual exploitation directly, streamlining the process.

But the company’s recent reports to NCMEC still tell a different story than Musk is selling — that the issue isn’t fixed and some worry it will get worse.

Twitter did not respond to a request for comment.

Will fewer staff mean less oversight?

Eirliani Abdul Rahman, co-founder of YAKIN (Youth, Adult survivors and Kin in Need), which seeks to help child victims and adult survivors of child sexual abuse, and who served on the now-dissolved Twitter Trust and Safety Council’s CSE Prevention advisory group, was skeptical that the issue was, indeed, solved.

Given the multifaceted nature of the problem and the need for people on hand to address it, Abdul Rahman questioned whether, amid the staffing cuts, the remaining teams have enough people to keep up with CSAM.

Abdul Rahman also noted that people can easily get around banned hashtags like those Wheeler highlighted when she tweeted her support for Musk: users can add extra letters or numbers, and there is also the issue of non-English content, she said.

“That’s why it’s always good to have people on the ground, who are trusted partners who can tell you locally, OK, this is what’s needed and this is what they’re using, and it’s trending. You can’t just say, oh here are the known hashtags in English and it’s done. It’s not. It’s not that simple,” Abdul Rahman said.

“If you don’t have enough people to actually be listening to people, then I’m afraid the answer [to whether you’re doing so] is no,” she added.
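As a rough illustration of the evasion problem Abdul Rahman describes, the sketch below shows how an exact-match blocklist of English hashtags misses trivial variants. The hashtag and blocklist here are hypothetical placeholders, not terms from any real filter.

```python
# Illustrative sketch only: a naive exact-match blocklist with a hypothetical
# placeholder hashtag, to show why trivial variants slip through.
BANNED_HASHTAGS = {"#bannedterm"}  # hypothetical English-only blocklist


def is_blocked(hashtag: str) -> bool:
    """Exact matching catches only the precise strings on the list."""
    return hashtag.lower() in BANNED_HASHTAGS


print(is_blocked("#bannedterm"))   # True  - exact string is caught
print(is_blocked("#bannedterm1"))  # False - one appended digit evades the list
print(is_blocked("#b4nnedterm"))   # False - a character swap evades it too
# Non-English or transliterated variants would also pass, which is why
# Abdul Rahman argues for trusted local partners who know what terms are
# actually trending.
```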

How the process works … and doesn’t

CSAM is a notoriously difficult issue to address online. It’s a constant game of whack-a-mole with bad actors: electronic service providers report content and take it down, only for it to resurface elsewhere.

In 2021, NCMEC received more than 29 million reports of CSAM, an increase over 2020. The reports cover everything from the possession, manufacture and distribution of CSAM to child sex tourism. NCMEC accepts reports from both the public and electronic service providers like Twitter; currently, 1,800 companies can make reports.

Automation of these processes can work in some cases. Law enforcement inputs content into a database, where it is “hashed,” or fingerprinted. Platforms can then use those hashes to match content uploaded to their services and ensure it is removed swiftly without human intervention, according to Jillian York, the Electronic Frontier Foundation’s director for international freedom of expression, whose work largely focuses on extremism but who has worked on content moderation of CSAM.
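In rough terms, that hash-matching pipeline looks like the minimal sketch below. It is illustrative only: production systems rely on perceptual hashes such as PhotoDNA that survive resizing and re-encoding, and the plain SHA-256 digest and placeholder hash set here are assumptions made for the example.

```python
import hashlib

# Hypothetical set of fingerprints of known abusive images, distributed to
# platforms by a clearinghouse; the value below is a placeholder, not real data.
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}


def fingerprint(image_bytes: bytes) -> str:
    """Compute a fingerprint of an upload. Real deployments use perceptual
    hashes that tolerate resizing and re-encoding; SHA-256 here only matches
    byte-identical copies."""
    return hashlib.sha256(image_bytes).hexdigest()


def screen_upload(image_bytes: bytes) -> bool:
    """Return True if the upload matches a known fingerprint and should be
    removed and reported without waiting for a human reviewer."""
    return fingerprint(image_bytes) in KNOWN_HASHES


# An upload whose fingerprint is not in the set passes this automated check
# and would have to be caught by user reports or human review instead.
print(screen_upload(b"example upload bytes"))  # False
```

The appeal of the lookup is speed: previously identified material can be blocked and reported the moment it is uploaded, leaving human reviewers for content that has never been hashed before.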

But NCMEC’s numbers show no change in reporting since Musk took over, so it’s unclear whether his team is doing more to address the issue than the previous regime or whether the company is still underperforming.
