Profiting from Hate: Platforms’ Ad Placement Problem

Executive Summary

We examined whether four major social media platforms—YouTube, X (formerly Twitter), Facebook, and Instagram—are potentially profiting from ad placements alongside searches for hate groups and extremists. This review by ADL and the Tech Transparency Project (TTP) found that the platforms differ sharply in how often they monetize such searches.

The most problematic of the four was YouTube. Not only did the company profit from searches for hate groups on its platform, it also created—and profited from—music videos for white supremacist bands. The review found that YouTube ran ads in search results for dozens of antisemitic and white supremacist groups, likely earning the platform money from hate-related queries. These included ads for Walmart, Amazon, and Wayfair.

We also identified an issue specific to YouTube: The platform is serving ads for brands like McDonald’s, Microsoft, and Disney alongside music videos for white supremacist bands. Even more disturbing, YouTube itself generated these videos as part of an automated process to create more artist content for its platform.  

Our review found that X also placed ads alongside hate group searches, though for a smaller percentage of those searches than YouTube.

Meta-owned Facebook and Instagram, by contrast, served ads in only a handful of hate group searches, showing that it is possible for tech platforms to avoid this problem.

These findings show how YouTube and X are potentially generating ad revenue from content produced by hate groups that appears to violate the platforms’ own policies. In the process, major brands are unwittingly associated with groups that stir up antisemitism and hate online and, in some cases, inspire violence in the physical world.

Notably, all ads displayed a label such as “Ad,” “Promoted,” or “Sponsored,” which is designed to disclaim any association between the searched content and the served advertisement. Despite this labeling, in our opinion the monetization alongside this content is problematic.

Note: These reviews were conducted in Summer 2023, but we ran spot checks in mid-September on both YouTube and X, using some of the same search terms. Our results showed that hate group searches continued to be monetized with advertising. Although some of the desktop searches for YouTube no longer produced ads, the mobile ones did. On X, hate group searches continued to produce ads on mobile and desktop. 

Based on our findings, here are our recommendations: 

For Industry 

  1. Improve Ad Placement Features. Platforms must ensure advertisements are not placed in close proximity to hateful or antisemitic content. 

  2. Stop Auto-Generating Hate. Platforms should properly program their tools to distinguish hateful from innocuous content before auto-generating it, or pull down auto-generation features until the risk of creating nefarious content is mitigated.

  3. Share More Information About Advertising Policies. Many of the monetization policies and practices on platforms are opaque or unclear. Platforms should be more transparent about revenue-sharing practices and policies around ad placement.  

For Government 

  1. Mandate Transparency Regarding Platform Advertising Policies and Practices. Congress should require tech companies to share more information about how platforms monetize content and what policies drive advertising decisions. 

  2. Update Section 230. Congress should update Section 230—the law that provides near-blanket immunity from liability to tech companies for user-generated content—to better define what type of online activity should remain covered and what type of platform behavior should not be immune. These necessary updates to the law can help ensure that social media platforms more proactively address how they monetize content. 

  3. Regulate Surveillance Advertising. Congress must focus on how consumers and advertisers are impacted by a business model that optimizes for engagement. Congress should restrict ad targeting and increase data privacy so companies cannot exploit consumers' data for profit—a practice that too often can result in monetizing antisemitism and hate.

Methodology

This project, conducted in Summer 2023, consisted of two reviews examining whether and when four major social media companies display ads alongside hate group content on their platforms.

We compiled a list of hate groups and movements from ADL’s Glossary of Extremism that were tagged with all three of the following categories: “groups/movements,” “white supremacist,” and “antisemitism.” The list totaled 130 terms.  

For the first review, we typed each of the 130 hate groups and movements into the search bars of YouTube, X, Instagram, and Facebook on both the mobile and desktop versions of the apps and recorded any advertisements that the platforms ran in the search results. We then examined the content of the ads, including what companies, products, or services they advertised. 

For the second review, we compiled a list of white supremacist bands from prior research conducted by ADL (as well as from the Southern Poverty Law Center and media reports). Using this list, we identified YouTube channels and videos for the bands that were auto-generated by YouTube. We then looked at whether YouTube is displaying ads alongside these channels and videos.  

YouTube’s Auto-Generated Videos

We conducted a second review focused specifically on whether YouTube is monetizing auto-generated hate channels and videos.   

As ADL and TTP found and explained in a previous report, YouTube automatically generates “topic” channels for artists who have a “significant presence” on the platform, and even creates videos for them. It is not clear why YouTube does this. The auto-generated videos feature a song recording and a static image of album art. Concerningly, YouTube auto-generates this kind of content for white supremacist bands that violate the platform’s hate speech policy.

For this review, we examined whether advertisements appeared with YouTube’s auto-generated content for white supremacist bands.  

We compiled a list of white supremacist bands that have been identified in reports by ADL, the Southern Poverty Law Center, and media outlets. We then typed the name of each band into YouTube’s search bar along with the word “topic,” which YouTube uses to mark channels that it has auto-generated. These topic channels frequently include videos that are also auto-generated by YouTube. 

For example, YouTube ran a 15-second ad for Microsoft before an auto-generated video for the song “Martyr (R.I.P. RJM)” by the white power punk band White American Youth. The “RJM” in the title appears to refer to Robert Jay Mathews, the leader of a white supremacist terrorist group who was killed in a shootout with the FBI in the 1980s. The song comes from the band’s album, which contains tracks titled “White Pride,” “White Power,” and “AmeriKKKa For Me.” Both the video and the channel in which it appears are labeled as auto-generated by YouTube.

It is not clear why YouTube would generate content for White American Youth, especially as it appears that YouTube has previously taken down other content associated with the band. A once-active link to a YouTube playlist for White American Youth, which was cited in a 2017 media article, now leads to a screen that states, “This video has been removed for violating YouTube's policy on hate speech.” 

YouTube also ran a 15-second ad for CareCredit, a health care credit card, on an auto-generated video for the song “Zyklon Army,” a reference to the Zyklon B gas that the Nazis used for mass murder in concentration camps during the Holocaust. (The song’s title was partially blanked out, appearing as “Z*** Army.”) The song—by Evil Skins, which has been described as a racist skinhead and neo-Nazi band—opens with a recitation of “Sieg Heil,” a Nazi Party salute meaning “Hail Victory” in German.

The video, which had 169,000 views, appeared on a YouTube-generated “topic” channel for Evil Skins that was created in 2011. YouTube created the channel and video even though they appear to violate YouTube’s policies on hate speech, which ban content that promotes hatred against individuals or groups based on attributes like race, ethnicity, religion, and nationality.3 YouTube says that its Community Guidelines apply to “all types of content” on the platform. 

YouTube refers to the above auto-generated videos—which feature the song recording and a static image of album art—as “art tracks.”  

The platform does not disclose whether it shares revenue from ads that run on these auto-generated art tracks with the bands that created the music. But it appears that YouTube often monetizes these videos in multiple ways. For example, YouTube ran a 30-second, non-skippable video ad for Stella Artois beer on an apparently auto-generated art track from the Spanish white supremacist band Pugilato H_C, formerly known as Pugilato NSHC. Non-skippable video ads require users to watch them in full before they can see the video content. YouTube also ran an ad for the Aviation Institute of Maintenance in the same track’s “watch feed” of recommended videos.

The review found YouTube running ads on other auto-generated content for white supremacist bands: 

  • YouTube displayed an ad for Full Sail University, a for-profit vocational school in Florida, on an auto-generated video for the song “Bombs Over Israel, Pt. 2” by the band IronMensch. The band is part of the Fashwave (a mashup of “fascism” and “vaporwave”) genre of white supremacist music. The cover art used for the auto-generated video includes a colorized image of a dancing soldier in a Nazi SS-style uniform. 

  • YouTube ran a non-skippable video ad for the convenience store chain 7-Eleven on an auto-generated video for the song “Love Theme for a Twisted Mind” by the band Kill Baby, Kill!, which the Southern Poverty Law Center has categorized as racist.  

  • YouTube also ran a pair of ads, including one for financial advisory firm InvestorPlace, before an auto-generated video for the song “ASSAULT” by the National Socialist black metal band Wiking 1940.  

These findings raise questions about YouTube’s enforcement of its policies around music. The platform has been criticized in the past for giving a megaphone to white power bands that promote hate and racism through their lyrics.  

Our review found that YouTube is serving ads alongside some auto-generated hate band content on both its flagship site and YouTube Music, a separate music streaming service launched in 2018 to compete with Spotify and Apple Music.  

YouTube Music offers a free, ad-supported streaming service and a paid version with no ads, currently $10.99 per month. Of note, on YouTube Music, these hate band channels and videos are not labeled as auto-generated, even though they are identical to content that is marked as auto-generated on the main YouTube site.

The YouTube-generated video for “Zyklon Army” provides a good illustration of this dynamic. As noted previously, YouTube served an ad for CareCredit on the video on the main YouTube site. We found that YouTube also served an ad for McDonald’s on the video on YouTube Music. The McDonald’s ad, which was non-skippable and promoted the fast-food chain’s frozen Fanta drinks, ran before the video, which kicks off with chants of “Sieg Heil” and heavy metal guitar music. 

Another example is the auto-generated video for the song “ASSAULT” by the National Socialist black metal band Wiking 1940. As noted previously, YouTube ran an ad for InvestorPlace ahead of the video on the main YouTube site. We found that YouTube also displayed a video ad for Disney’s “Haunted Mansion” movie ahead of the same track on YouTube Music.

Additionally, we found that YouTube Music ran a non-skippable Amazon ad before an auto-generated track for the song “Nueva Normalidad” (“New Normal”) by the Spanish white supremacist band Pugilato H_C. The song’s lyrics speak of an “obsession of the masters of the bank to destroy the white race in the West,” according to one music website. This appears to be a reference to baseless conspiracy theories about Jewish control over financial systems and the “Great Replacement” of white people by non-white immigrants. 

Recommendations

These investigations raise a host of questions about how the major social media platforms, particularly YouTube and to a lesser extent X, are profiting from hate groups’ content through ad placement. These platforms say they’re dedicated to rooting out hate speech. The machinery of their advertising business, however, indicates otherwise.

Based on our findings, here are recommendations for industry and government:

Industry   

  1. Improve Ad Placement Features. Platforms must ensure advertisements are not placed in close proximity to hateful or antisemitic content. First, to the extent that content tied to advertising violates platform policy, it should not be on the platform to begin with. Second, ad placement near hateful or antisemitic content is a risk to brand safety for advertisers. Third, ad placement may at times provide opportunities for hateful content creators to benefit from profit-sharing agreements with major platforms.

  2. Stop Auto-Generating Hate. Platforms should not be auto-generating hate content. If platforms are unable to properly program tools to distinguish hateful from innocuous content before auto-generating it, they should pull down the auto-generation feature until the risk of creating such content is mitigated. There is no reason platforms should be proactively producing hate-filled or antisemitic content.

  3. Share More Information About Advertising Policies. Many of the monetization policies and practices on platforms are opaque or unclear. Tech companies should be more transparent about revenue-sharing practices and policies around ad placement. In addition to providing updated information around policies, tech companies should conduct risk assessments and engage in red-teaming exercises to ensure that their advertising practices are safe for brands and users. They should share information with advertisers—and researchers—about these exercises. Increasing transparency around advertising product features, policies, and practices would allow advertisers, regulators, researchers, civil society, and other stakeholders to better understand the impact of monetizing hate and improve the ad tech ecosystem.

Government    

  1. Mandate Transparency Regarding Platform Advertising Policies and Practices. We need more information about how platforms monetize content and what policies drive advertising decisions. Government-mandated disclosures are an important first step to understanding how to protect consumers and prioritize brand safety. Mandates, however, can and should be crafted in a way that protects private information about both advertisers and social media users. California has been successful in passing transparency legislation related to content moderation policies and practices. Still, Congress must pass comprehensive transparency legislation that specifically includes disclosures related to advertising policies and metrics.    

  2. Update Section 230. Congress must update Section 230 of the Communications Decency Act to fit the reality of today’s internet. Section 230 was enacted before social media and search platforms existed. Decades later, however, it continues to be interpreted to provide billion-dollar platforms with near-blanket legal immunity for user-generated content. We believe that by updating Section 230 to better define what type of online activity should remain covered and what type of platform behavior should not, we can help ensure that social media platforms more proactively address how they monetize content. Congress should clarify the specific kinds of content that fall under Section 230’s current liability shield. If user-generated or advertising content plays a role in enabling hate crimes, civil rights violations, or acts of terror, victims deserve their day in court.   

  3. Regulate Surveillance Advertising. We urge Congress to focus on how consumers—and advertisers—are impacted by a business model that optimizes for engagement. Congress should enact legislation that ensures companies cannot exploit consumers' data for profit—a practice that too often can result in monetizing antisemitism and hate. We know companies’ hands aren’t tied, and that they have a choice in what content they prioritize. Congress should pass the Banning Surveillance Advertising Act. Legislation like this is crucial to breaking the cycle and, ultimately, de-amplifying hate. ADL applauds the reintroduction of this vital legislation to disrupt Big Tech’s weaponization of our data.

Footnotes

1 Forty-one of the 67 search terms accompanied by in-feed ads generated ads solely on mobile. Twenty-four generated ads on both mobile and desktop. Two generated ads on desktop alone.

2 At the time of the experiment, the ads were identifiable by a small “Promoted” label in the bottom left corner. News reports indicate the platform later began using an “Ad” label in the upper right corner. A mid-September spot check found that X used the “Promoted” label on mobile and the “Ad” label on desktop.

3 ADL and TTP first identified the auto-generated video for “Zyklon Army” in a previous report published in August 2023. Since that time, YouTube has removed the video, but the auto-generated “topic” channel for the band, Evil Skins, and the other auto-generated videos in that channel remained on YouTube as of mid-September. Some of those videos were monetized with advertising.

4 The hyperlinks for auto-generated artist channels and videos on YouTube Music are nearly identical to those on the main YouTube site, though the YouTube Music links start with “music.” For example, the channel for Evil Skins on YouTube is “youtube.com/channel/UCexG-ipS00OL2TmB34g6rSg” while the channel on YouTube Music is “music.youtube.com/channel/UCexG-ipS00OL2TmB34g6rSg.”