In the weeks leading up to Twitter’s dissolving of its Trust and Safety Council, communications from the company to the members were sporadic, and multiple groups hadn’t heard from the company at all since Elon Musk purchased it in late October.
Before the council was dissolved and its webpage taken down, Grid reached out to every listed member of the group to see if they’d heard from Twitter since Musk took over. Some had, but others said they’d heard nothing and were looking to this week’s meeting of the trust and safety team, which would have been the first under Musk’s ownership, as an opportunity to evaluate his commitment to their work and whether to remain on the council.
Now, they have their answer.
“Twitter has made it abundantly clear that trust and safety are no longer priorities for the company, given this abrupt dismissal of the Trust & Safety Council as well as its dangerous policy choices over the last month and invitations for suspended accounts to rejoin,” said a spokesperson for GLAAD, the world’s largest Lesbian, Gay, Bisexual, Transgender and Queer (LGBTQ) media advocacy organization. “Until the company returns to following basic industrywide best practices on hateful content, the platform clearly remains unsafe not only for LGBTQ users but for brands and advertisers.”
In an email to members of its Trust and Safety Council, signed simply “Twitter” and sent Monday night before the coalition’s annual meeting, the company said the group would be disbanded because it was no longer the “best structure” to “bring external insights.”
How Musk will handle sensitive moderation issues, from hate speech to abuse and harassment, and what role the council might play have been open questions since he bought the company in October. The group, formed in 2016, was made up of close to 100 organizations that advised Twitter on issues ranging from child exploitation to mental health to human and digital rights.
Through emails, phone calls and social media, Grid independently reached out to the 88 members listed on the now taken-down website to ask whether Twitter or Musk had been in touch with them since he bought the company in late October.
Based on responses from 21 members, it appears that Twitter’s assessment of the council’s contributions, and its decision to dissolve it, involved little or no conversation with its members. The council was created in response to repeated criticism that there was too much toxic content on the platform. When it launched, Twitter called the group a “foundational part of our strategy” to enable “everyone, everywhere to express themselves with confidence on Twitter.”
Of the 21 that responded, five said they had not heard from Twitter since Musk’s purchase, while others said the company had asked for their patience while it reorganized. Multiple organizations said they were waiting to hear what was said at this week’s meeting before deciding whether to continue on the council. Others declined to speak until after the meeting, only to follow up with the news that the council had been disbanded.
“We have had recent constructive engagement and it is our understanding that thus far, the council remains in effect,” said a spokesperson for the Committee to Protect Journalists (CPJ) on Friday. “As we find out more, we will be in a better position to assess the status and real power of the council as a tool to preserve the safety of journalists and other users.”
Now, they’re changing their tune.
“Today’s decision to dissolve the Trust and Safety Council is cause for grave concern, particularly as it is coupled with increasingly hostile statements by Twitter owner Elon Musk about journalists and the media,” said CPJ President Jodie Ginsberg in a statement Monday.
Julie Owono, a member of the Facebook Oversight Board and the executive director of Internet Sans Frontières, which promotes the free circulation of information and knowledge and defends digital freedoms and rights, said the organization had not had any contact with the company since Musk took over in October.
“We continue to flag violating tweets to the company, but we have not met with the trust and safety team yet,” said a spokesperson for the Anti-Defamation League on Friday.
Council membership was voluntary, but groups engaged with Twitter and advised on everything from supporting users’ mental health to cutting down on child exploitation.
Twitter did not cut ties with council members completely. The company noted in the email that it would be moving more aggressively to make Twitter a “safe, informative place” and might explore working in “bilateral or small groups” with members on focused issues.
Ella Irwin, the current vice president of product for trust and safety, who joined Twitter in May, did not respond to Grid’s request for comment. Neither Irwin nor Twitter has commented publicly on the council’s elimination.
But members of the former council were critical of the company’s decision.
“The abrupt disbanding of Twitter’s Trust and Safety Council is yet another arbitrary and irresponsible decision about content moderation processes under Twitter’s new ownership,” the Center for Democracy and Technology, which promotes democratic values by shaping technology policy and was a member of the council, said in a statement.
Writing was on the wall
At least one member, Freedom House — an organization researching and advocating for democracy, political freedom and human rights — said it had not been involved in the council since mid-2022, despite still being listed on the Trust and Safety Council’s webpage before it was taken down Monday night.
Last week, three members of the council resigned over concerns about the direction Musk was taking the platform.
“We know from research by the Anti-Defamation League and the Center for Countering Digital Hate that slurs against Black Americans and gay men have jumped 202 percent and 58 percent respectively since Musk’s takeover,” said Eirliani Abdul Rahman, the first female representative from Asia who served on the Council’s Child Sexual Exploitation (CSE) Prevention advisory group, in a release.
“Antisemitic posts have soared more than 61 percent in the two weeks after Musk’s acquiring of Twitter,” Abdul Rahman added.
Musk and Twitter have disputed these findings.
Abdul Rahman also cited the reinstatement of previously banned accounts of people tied to the Jan. 6, 2021, Capitol riots.
Musk has fueled online attacks against the members who resigned, tweeting that “it is a crime that they refused to take action on child exploitation for years!”
A new approach to content moderation has yet to materialize
Amid concerns from advertisers over how he would moderate content, Musk said in late October that he would form an independent content moderation council with “widely diverse viewpoints,” which was news to some on the Trust and Safety Council. To this day, he has not formed such a board.
Irwin said in early December that Twitter was leaning on automation to handle content moderation, something that has raised red flags for former members of the council.
“The biggest thing that’s changed is the team is fully empowered to move fast and be as aggressive as possible,” Irwin told Reuters.
One former member of the council said it seems like Musk is cleaning house and starting fresh. “It’s almost as if we don’t exist or an advisory [body] doesn’t exist unless he forms it,” said Anne Collier, founder and executive director of the Net Safety Collaborative, which aims to improve young people’s safety and well-being online, who also resigned last week. Taken together with large cuts to Twitter’s trust and safety team, the elimination of the council leaves Musk with little oversight and few guardrails on content moderation as he courts users, advertisers and others in hopes of making the company profitable.
Twitter continues to roll out systems with bugs that enable the spread of misinformation. On Monday, it released the updated version of its verification system, Twitter Blue, which allows users to pay to be verified. It had pulled back the previous version of the program after users abused it to spread misinformation and impersonate companies.
As the company manually verifies government accounts, it has already mislabeled one of Norway’s: rather than affiliating the account with Norway, it labeled it a “Nigeria government organization.”
“They were listening,” said Collier of the previous Twitter regime. “They were listening in past years. So it’s sad for me and a real sign that safety’s not being taken particularly seriously there. They say it’s a top priority. Musk says it’s a top priority. But all indicators suggest that it’s not.”
Thanks to Lillian Barkley for copy editing this article.