On May 14, 2022, an alleged extremist murdered 10 people and injured three more inside a Tops supermarket in Buffalo, New York, livestreaming the rampage on Twitch, the Amazon-owned streaming platform. Before the attack, the shooter allegedly plotted his assault on the chat platform Discord and spread white supremacist dogma on fringe websites such as 4chan and Kiwi Farms.
After the shooting, unknown users downloaded a recording of the broadcast from Twitch and distributed it across numerous platforms, including Facebook, Instagram, Telegram, TikTok, and Twitter.
In response, the Global Internet Forum to Counter Terrorism (GIFCT), an industry-based anti-terrorism coalition, activated its content incident protocol so that member tech companies could detect and remove the video of the shooting across their platforms. Facebook, Twitter, and YouTube are members of GIFCT, as are Discord and Amazon. Alternative websites such as 4chan and Kiwi Farms are not members, so the video continued, and continues, to spread on them. Websites like 4chan and Kiwi Farms lack any accountability and are notorious for having few, if any, content moderation policies, no matter how violent or inciting the content. What's more, tech infrastructure providers, the companies that enable websites to operate, such as domain hosts, website registrars, payment processors, and cybersecurity outfits, make it possible for these alternative sites to remain active no matter how hateful their content. Some infrastructure providers, such as Epik, actively support alternative websites' resistance to content moderation; others, such as Cloudflare, look the other way and do not vet their clients. Because the alternative sites themselves ignore extreme content, infrastructure providers are one of the few means of holding them accountable and stemming the spread of extremist ideologies.
The events in Buffalo underscore that tech companies bear responsibility for serving as algorithmic megaphones that let extremists promote hateful ideologies and as tools to plan violent attacks. Below, we summarize the roles of the tech companies and platforms implicated in the attack, describe what they have done to address hate and extremism around the shooting, and offer recommendations to improve their responses to the dissemination of extremist content.
Twitch
Role: The accused assailant broadcast the shooting live on Twitch, a streaming service that Amazon has owned since 2014 and that users rely on primarily to broadcast videos of themselves playing video games. The attack in Buffalo was not the first time Twitch was used to livestream an extremist event: in 2019, a white supremacist attacked a synagogue in Halle, Germany, and livestreamed the incident on the platform.
According to Twitch's community guidelines, the platform prohibits "content that depicts, glorifies, encourages, or supports terrorism, or violent extremist actors or acts." Yet the broadcast of the Halle attack remained active on the platform for 35 minutes. The Buffalo broadcast was active for at most seven minutes, with the final two minutes containing direct footage of violence, according to the ADL Center on Extremism. Twitch claims it removed the stream after those two minutes of violence. Regardless, the shooter was able to broadcast his attack, post the recorded video, and make it available for download to spread across the internet. Despite Twitch's statement in its 2021 transparency report that it quadrupled both its law enforcement response capacity, which handles counterterrorism efforts, and its content moderation team, which responds to user reports, extremist content still appears on the platform.
Recommendation: Amazon should commission an experienced, independent third-party researcher from academia or civil society to audit Twitch's policies, products, and procedures, both to evaluate how effectively they disrupt extremist activity and to understand how the shooting was livestreamed on the platform. Amazon should also ensure that Twitch has implemented all the recommended livestreaming patterns in ADL's Social Pattern Library. Congress and other oversight bodies should demand that Amazon and Twitch publicly disclose what they are doing to remedy this situation.
Discord
Role: According to reports and a record of his online logs, the alleged assailant spent over a year on Discord. In December 2021, he allegedly posted to the server log of his Discord channel that he would "carry out an attack against the replacers and will even livestream the attack via discord and twitch." In January 2022, he posted the "14 Words," the world's most popular white supremacist slogan. Discord has attempted to address extremism since the platform was used to organize the white supremacist Unite the Right rally in Charlottesville, Virginia, in 2017, which culminated in the death of one counter-protester and injuries to multiple others. In 2021, Discord made some progress by acquiring and bringing in-house the AI content moderation startup Sentropy.
Recommendation: Given that the shooter plotted his attack on Discord, the platform should audit how it conducts both human review and automated detection of extremist content. Discord can incorporate the findings of ADL's Belfer Fellow Libby Hemphill to inform its automated systems for detecting extremism. Hemphill's research shows how white supremacists hide in plain sight in digital spaces by using "civil," non-profane language. According to Hemphill, platforms must recognize that white nationalist speech is hate speech even when it is not openly toxic or profane, such as posts advocating "Great Replacement" theory; the sketch below illustrates why surface-level filters miss such content. Her report provides detailed suggestions for improving content detection systems. Discord should publish the audit, and Congress and other oversight bodies should press the company to disclose relevant information.
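To make that point concrete, here is a minimal, hypothetical sketch of a keyword-based moderation baseline in Python. The blocklist and sample message are illustrative assumptions, not Discord's actual system; the point is that rhetoric containing no profanity or slurs sails past any filter built on surface-level word matching.

```python
# Hypothetical baseline: flag a message only if it contains a blocklisted term.
# Real moderation systems are more sophisticated, but many still lean on
# surface-level signals such as profanity and slurs.
PROFANITY_BLOCKLIST = {"slur_1", "slur_2", "profanity_1"}  # placeholder tokens

def naive_keyword_filter(message: str) -> bool:
    """Return True if the message contains any blocklisted keyword."""
    words = set(message.lower().split())
    return bool(words & PROFANITY_BLOCKLIST)

# A "civil" white-supremacist talking point, phrased politely and without
# profanity, contains no blocklisted terms and is not flagged.
civil_hate = "our people are being deliberately replaced and must act now"
print(naive_keyword_filter(civil_hate))  # False: the filter misses it
```

Catching this kind of content requires systems that recognize the rhetoric itself rather than profanity alone, which is the kind of improvement Hemphill's report addresses.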
Telegram
Role: Content related to the shooting was shared across known extremist channels on the private messaging app Telegram. Despite having over 500 million users worldwide, Telegram maintains only three stated content policies, far fewer than other platforms of similar size. One of those policies forbids users from sharing content that would "promote violence on publicly viewable Telegram channels."
ADL analysts found that extremists shared footage of the shooting, and memes fashioned from clips of the rampage, across publicly available Telegram channels, some with more than 15,000 members each. It is appalling that Telegram fails to enforce even one of its few rules.
Recommendation: Telegram should remove footage of the Buffalo shooting and implement policies around hateful conduct and extremism. Apple and Google should remove Telegram from their app stores until the platform takes meaningful action.
Mainstream Platforms and the GIFCT
Role: The Buffalo shooting began at 2:30 PM ET. At 4:52 PM ET, the Global Internet Forum to Counter Terrorism (GIFCT) activated its content incident protocol, which it developed after the livestreamed shooting at mosques in Christchurch, New Zealand, in 2019. The protocol enables GIFCT members to remove livestreamed terrorist or violent extremist content by adding hashes of that content, unique numerical fingerprints, to a database shared among member companies.
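As a rough illustration of that mechanism, here is a minimal sketch in Python of a shared hash database. It is a toy, not GIFCT's actual implementation: it uses an exact cryptographic hash (SHA-256) for simplicity, whereas production systems reportedly use perceptual hashes (such as PDQ for images and TMK+PDQF for video) so that re-encoded or slightly altered copies still match.

```python
import hashlib

class SharedHashDatabase:
    """Toy stand-in for a consortium-wide hash-sharing database."""

    def __init__(self) -> None:
        self._flagged: set[str] = set()

    @staticmethod
    def _fingerprint(content: bytes) -> str:
        # Exact hash for illustration only; a perceptual hash would also
        # match copies that have been cropped, re-encoded, or watermarked.
        return hashlib.sha256(content).hexdigest()

    def flag(self, content: bytes) -> None:
        """A member adds the fingerprint of known violent extremist content."""
        self._flagged.add(self._fingerprint(content))

    def is_flagged(self, content: bytes) -> bool:
        """Any member checks an upload against the shared database."""
        return self._fingerprint(content) in self._flagged

# One member flags the footage; every other member can now detect re-uploads.
db = SharedHashDatabase()
attack_clip = b"<raw video bytes>"   # placeholder for file contents
db.flag(attack_clip)                 # member A activates the protocol
print(db.is_flagged(attack_clip))    # member B's upload check -> True
```

The sharing is the point: once one member fingerprints the footage, every other member can block matching uploads without ever exchanging the video itself.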
Unfortunately, the protocol is inadequate. The Buffalo shooting video has been shared across mainstream platforms such as Facebook and Twitter, with some shares garnering up to a million views, speaking either to the weakness of the GIFCT protocol or to the inability of major tech companies to act effectively on the information provided to them. Although Discord and Amazon are members of the GIFCT, the consortium's latest transparency report states that they are not members of the "hash sharing consortium" through which members share information about an attack.
Recommendation: The GIFCT should convene member companies following an incident, determine why platforms fail to remove extremist content after the protocol is activated, and identify additional measures to improve the protocol's efficacy. Members such as Meta and Twitter should assess their ability to act on information received from the GIFCT and increase their capacity to halt the spread of hateful footage.
Websites such as 4chan and Kiwi Farms are unlikely to apply for GIFCT membership, so the protocol has its limits. The GIFCT can, however, establish criteria for identifying websites that allow harmful content to circulate, and member companies can then explore blocking links to content originating from those bad-actor sites.
Problematic Websites and Their Infrastructure Providers
Role: ADL investigators saw video clips related to the shooting, and the shooter's manifesto, spreading on fringe sites. These sites are some of the worst places on the internet: fertile grounds for hate, harassment, and extremism with no robust content moderation policies despite millions of monthly users. Because these platforms refuse to take responsibility, they will only continue to be hotbeds of extremism. One of the few means of addressing hate in these spaces is encouraging infrastructure providers to stop doing business with them.
The challenge is that the leaders of some infrastructure providers believe speech should remain free no matter the dangers it poses. For example, Epik offers services to BitChute and Gab, platforms known to host racist, extremist posts and videos, and commends them for it. On May 13, 2020, Epik CEO Rob Monster tweeted his support for BitChute: "The more heavy-handed the mainstream censorship, the faster the @bitchute audience grows. @EpikDotCom is happy to empower you."
Through "Project Galileo," Cloudflare provides free cybersecurity services to organizations focused on the arts, democracy, and human rights as part of its charitable efforts. Yet it also supports some of the most disturbing sites on the internet. Cloudflare's clients include 4chan and Kiwi Farms, where footage of the Buffalo shooting has been shared and celebrated.
Cloudflare has no formal policy governing how it chooses its clients; it states that it cannot vet every company it serves, contending that doing so would hamper its ability to do business. Yet in August 2019, Cloudflare terminated its relationship with 8chan after a mass shooting in El Paso, Texas, when the assailant's manifesto spread on the platform. That decision, along with other infrastructure providers pulling their services, took the platform offline later that month; it resurfaced later in the year with far less engagement, reach, and impact.
Recommendation: Cloudflare must discontinue its services for 4chan and Kiwi Farms. The company must implement a policy governing whom it will do business with and stop supporting platforms that are indifferent to violence-promoting extremist content and hostile to content moderation.
Left online, extremist content such as manifestos or violent footage can popularize hateful ideologies and radicalize people for years to come. The Buffalo shooter was inspired by the Christchurch shooting, which was also livestreamed. Without stronger policies and protocols to keep violence-promoting extremist content from being posted, broadcast, and shared, what happens online can cause offline devastation. Tech companies cannot continue to abdicate responsibility for their role in fomenting and amplifying hate. Lives are at stake.