The 18-year-old who carried out a mass shooting in Buffalo, New York, on Saturday that left 10 people dead followed a common model for racist attacks — livestreaming it for posterity and leaving instructions for the next attacker, both of which were quickly documented and preserved online.
The shooter said he was radicalized on web forums such as 4chan, which is notorious for its wide variety of extremist content. Through 4chan infographics, memes and posts, he said he was exposed to racist ideas and found additional information that took him to other extremist sites. He also frequented the messaging platform Discord, where users can congregate on public or private servers and communicate through direct messages as well as channels. There he learned more about the equipment he used to execute the attack, according to a 180-page manifesto the shooter posted online.
The shooter claims he was inspired by a livestream of the shooting of 51 worshippers at a mosque in Christchurch, New Zealand, in 2019, among other similarly racist attacks.
In his manifesto, the Buffalo shooter wrote that the 2019 Halle synagogue shooting in Germany was streamed on Twitch for about 35 minutes “which for me shows that there is enough time to capture everything important.”
“I think that livestreaming this attack gives me some motivation in the way that I know that some people will be cheering for me,” the shooter wrote.
Video clips spread — even after they’re officially taken down
Twitch, the Amazon-owned platform popularized by people livestreaming while gaming, removed the stream less than two minutes after the violence started.
Twitch’s Global Head of Trust and Safety Angela Hession said the company uses several mechanisms to rapidly detect, escalate and remove any violence on Twitch, including proactive detection and a robust user reporting system.
“User reports are vital for alerting us to content that violates our rules, and we have a comprehensive 24/7/365 escalation system in place to address urgent reports,” said Hession. “In this case, we identified and removed the stream less than two minutes after the violence began.”
Twitch teams also coordinated with the Global Internet Forum to Counter Terrorism, including sharing footage to be hashed, per their protocol, in order to help prevent the content from being shared on other services.
But that was long enough for it to be recorded elsewhere and linked to across the internet. By Sunday, the video was uploaded to the video-hosting website Streamable, where it was viewed over 3 million times before being taken down, and links to the video surfaced on Facebook, Twitter and Reddit before the sites eventually took them down over a matter of hours.
The shooter said he chose Twitch as the medium for his stream because "it was compatible with livestreaming for free and all people with the internet could watch and record."
It also enabled him to share on Discord, where he posted extensively about his preparation for the attack, according to Discord logs reviewed by Megan Squire, a senior fellow for data and open-source intelligence at the Southern Poverty Law Center.
“They started last November, but it essentially was a Discord channel where he was writing about his plans for an attack. He had originally planned it for March 15, which was the anniversary of the Christchurch shooting,” said Squire.
More than the livestream's wide spread across large social media platforms, Squire said, the question bothering her right now is how no one noticed or reported the Discord logs.
“This case is very different because [the Christchurch shooter] didn’t sit there on Discord or some other public thing and post literally 600 pages, 800 pages worth of text for six months before carrying out his event, it was a lot quieter,” said Squire. “This guy did. I mean, he put all of his daily thoughts for six months out there. So, who’s responsible for that? The audience of people who are reading that never said anything?”
Discord did not respond to a request for comment, but its moderation policy says, “We do not proactively read direct messages between users or conversations in a server.”
In Discord’s most recent transparency report, spanning the second half of 2021, the company said it disabled over 25,000 accounts and over 2,000 servers for violating policies related to violent extremism.
As for sites moderating the video itself, or links to it, Squire said it is difficult for companies to stamp out every copy, but it is doable; ultimately it comes down to whether platforms actually want to do it.
“Where the problem is, though, is companies don’t have the will to do it,” said Squire, given that some sites engage in little moderation and moderation is time-consuming and costly at scale. “So that’s where, ultimately, the file or files will end up somewhere like Telegram or just these lower, lower-end sites with no content moderation.”
Telegram is a social media platform that boasts 500 million users worldwide and has laxer moderation standards than mainstream platforms.
Squire said she had seen Telegram groups where the video was being circulated as of Monday. Grid reviewed one group and confirmed the video was still up.
“Content that promotes violence is expressly forbidden on Telegram,” said a Telegram spokesperson. “We are actively removing footage of the Buffalo mass shooting for this reason.”
Telegram took the video down once Grid identified the content to the company.
The Department of Homeland Security has previously identified a pattern of some mass shooters studying online videos of prior mass shootings.
Extremists exploit platforms for ideas and plans
Shootings like the one in Buffalo follow an unfortunately common playbook.
Oren Segal, vice president at the Anti-Defamation League’s Center on Extremism, said that 4chan is a forum in which extremists congregate and where his team has seen hate and violent language incubated.
“We’ve seen many real-life extremists doing real-life damage being influenced from message boards, like 4chan,” said Segal. “We know that extremists are preparing their social media strategies at the same time that they’re preparing their weapons. And if that’s the case, we know how extremists are exploiting these platforms.”
When asked about other uses of Discord by extremists, Segal noted that much of the planning and discussion ahead of the 2017 rally in Charlottesville, Virginia, by neo-Nazis and white supremacists occurred on Discord servers.
The shooter’s manifesto, the extensive logs of his Discord posts and the livestream itself are all tactics that signal back to the community: “not only should you do this, I’m going to show you how to do this,” according to Segal.
The documentation also serves a secondary purpose: keeping the attackers’ actions alive.
“Clips of the video are being turned into memes and images,” said Segal. “It creates sort of this discussion and subculture, which keeps what they did alive. So you might be arrested, or in some cases, shooters are killed, but overall, their message, their narrative and their will continues to circulate these online spaces.”
Thanks to Lillian Barkley for copy editing this article.