Has Twitter reduced child exploitation material under Musk?


Did Elon Musk reduce child exploitation material on Twitter? Here’s what the organization that tracks it says.

Elon Musk says preventing child exploitation and child sexual abuse material on Twitter is hugely important for the company. And it is. Legally, Twitter has an obligation to remove and report that material if it finds it, though it doesn’t have to proactively search for it. In a Twitter Spaces session last week, Musk reiterated his intent to solve the issue, saying that tackling child sexual abuse material on Twitter would be one of his top priorities.

To some of his supporters, the issue is seemingly solved. On Nov. 21, Liz Wheeler tweeted out “take a moment & thank @Elonmusk for ridding Twitter of child pornography & child trafficking hashtags,” which was shared widely and which Musk himself responded to, saying “this will forever be our top priority.”

But Grid interviews with outside observers and with former members of the Trust and Safety Council, a group of organizations that advised the company on products and policies, suggest it’s unlikely that Twitter can keep up with the abuse posted on its platform, let alone get more aggressive in identifying it, especially after weeks of company shake-ups.

And the National Center for Missing and Exploited Children (NCMEC), the organization that takes in reports of child sexual abuse material (CSAM) online, says not much has changed since Musk took over in late October, despite his recent statements.


Specifically, “NCMEC has seen no noticeable changes on the reporting front compared to previous months,” when Musk wasn’t in charge, Gavin Portnoy, vice president of communications and brand at NCMEC, said in a statement.

NCMEC is a private nonprofit, but it serves as the government-appointed clearinghouse for reporting CSAM to law enforcement. So companies like Meta, Google and, yes, Twitter must report any CSAM they find to NCMEC, making the organization the best third-party measure of how often companies are reporting CSAM.

To be fair, Twitter has a long history of cumbersome and unclear reporting processes for CSAM on the platform, according to critics, even in the pre-Musk era.

“Historically, the [NCMEC] has not seen the number of CyberTipline reports from Twitter we would expect given its user base and the well-documented issues of child exploitation that have occurred on the Platform,” said Portnoy.

And under Musk, Twitter did add a specific CSAM indicator to its reporting process: a drop-down option that lets users flag content as child sexual exploitation, streamlining the process.


But the company’s recent reports to NCMEC still tell a different story than the one Musk is selling: the issue isn’t fixed, and some worry it will get worse.

Twitter did not respond to a request for comment.

Will fewer staff mean less oversight?

Twitter’s massive layoffs and its dissolution Monday of the Trust and Safety Council, which included a Child Sexual Exploitation (CSE) Prevention advisory group, also have some concerned about Twitter’s capacity to identify and report CSAM.

Eirliani Abdul Rahman, co-founder of YAKIN (Youth, Adult survivors and Kin in Need), which seeks to help child victims and adult survivors of child sexual abuse, served on the now-dissolved Trust and Safety Council’s CSE Prevention advisory group and was skeptical that the issue was, indeed, solved.

Given the multifaceted nature of the issue and the need to have people on hand to help address it, Abdul Rahman said she was skeptical that, amid the staffing cuts, the remaining teams had enough people to address CSAM.

Musk not only laid off many members of the internal team tasked with addressing CSAM; he also attacked the former head of Twitter’s trust and safety team (separate from the council, but which coordinated with it on CSAM issues), who left shortly after Musk took over.

Abdul Rahman also noted that people can easily get around banned hashtags like the ones Wheeler highlighted in her tweet supporting Musk. People can add extra letters or characters, and there is also the issue of non-English content, she said.

“That’s why it’s always good to have people on the ground, who are trusted partners who can tell you locally, OK, this is what’s needed and this is what they’re using, and it’s trending. You can’t just say, oh here are the known hashtags in English and it’s done. It’s not. It’s not that simple,” Abdul Rahman said.

“If you don’t have enough people to actually be listening to people, then I’m afraid the answer [to whether you’re doing so] is no,” she added.

How the process works … and doesn’t

CSAM is a notoriously difficult issue to address online. It’s a constant game of whack-a-mole with bad actors: electronic service providers report content and take it down, only for it to emerge elsewhere.



In 2021, NCMEC received over 29 million reports of CSAM, an increase over 2020. The reports cover everything from the possession, manufacture and distribution of CSAM to child sex tourism. NCMEC allows both the public and electronic service providers, like Twitter, to make reports. Currently, 1,800 companies can make reports.

Automation of these processes can work in some cases. Law enforcement inputs content into a database, where it is “hashed,” or fingerprinted. Platforms can then use those hashes, or fingerprints, to match content that’s uploaded onto their platforms and ensure that it’s removed swiftly without human intervention, according to Jillian York, the Electronic Frontier Foundation’s director for international freedom of expression, whose work largely focuses on extremism but who has worked on content moderation of CSAM.
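To illustrate the hash-matching approach York describes, here is a minimal, hypothetical sketch in Python. It uses plain SHA-256 digests for simplicity, and the database contents, function names and decision labels are invented for illustration; real systems such as PhotoDNA rely on perceptual hashes that still match after resizing or re-encoding, and their details are not public.

```python
import hashlib

# Hypothetical set of fingerprints of known illegal content, supplied by a
# clearinghouse or law enforcement database. In practice these would be
# perceptual hashes, not simple cryptographic digests.
KNOWN_HASHES = {
    "placeholder_hash_value_1",  # illustrative entries only
    "placeholder_hash_value_2",
}


def fingerprint(image_bytes: bytes) -> str:
    """Compute a fingerprint for an uploaded image.

    SHA-256 only matches exact byte-for-byte copies; production systems use
    perceptual hashing so cropped or re-compressed copies still match.
    """
    return hashlib.sha256(image_bytes).hexdigest()


def screen_upload(image_bytes: bytes) -> str:
    """Return a moderation decision for a single upload."""
    if fingerprint(image_bytes) in KNOWN_HASHES:
        # Matches known material: remove automatically and queue a report
        # (e.g., a CyberTipline report to NCMEC) without human review.
        return "block_and_report"
    # No match: the content could still be abusive, so anything novel
    # depends on user reports and human moderators to be caught.
    return "allow_pending_other_signals"
```

The key limitation, as York’s point suggests, is that automated matching only covers material already in the database; newly produced content still requires human judgment.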

But, importantly, not all content is found by law enforcement, and that’s where platforms have to step in. And while automation can catch many things, a human element is needed to contextualize some findings. One of the challenges for Twitter, said York, is that it is one of the few platforms that still allows adult content. That makes catching CSAM more challenging, because companies have to make a call about whether someone is a minor. Twitter considered launching an adult subscription service, akin to OnlyFans, in early 2022 but stopped after work on the project led the company to discover it had issues policing CSAM on the platform.

But NCMEC’s numbers don’t show a change in reports since Musk took over, so it’s unclear whether Musk has been doing more to address the issue than the previous regime or whether Twitter is still underperforming.

Thanks to Dave Tepps for copy editing this article.

  • Benjamin Powers

    Technology Reporter

    Benjamin Powers is a technology reporter for Grid where he explores the interconnection of technology and privacy within major stories.