House Committee on Energy and Commerce
Subcommittee on Consumer Protection and Commerce
"Holding Big Tech Accountable: Legislation to Build a Safer Internet"
Oral Remarks by Jonathan Greenblatt, ADL CEO
December 9, 2021 | Washington, D.C.
Thank you, Madam Chair Schakowsky, Ranking Member Bilirakis, and members of the subcommittee, and good morning. It is a privilege and an honor to appear before you today.
ADL is the oldest anti-hate group in America. We have been fighting antisemitism and all forms of bigotry for more than a hundred years, and we have been tracking online hate since the days of dial-up.
This work includes partnering with law enforcement to help prevent online threats from mutating into offline incidents. We work with authorities at all levels and, in the past 11 months, we’ve provided the FBI with more than a thousand actionable tips. Our 25 offices across the country engage directly with individuals and institutions affected by hate.
In 2017, ADL launched the Center for Technology and Society to double down on our efforts to fight online hate. We were the first civil rights group with an operation right in the heart of Silicon Valley. And it is staffed, not by longtime nonprofit professionals, but by software engineers, product managers, data scientists and computer experts all hired from industry. We conduct analysis, publish research, build technology, and provide recommendations to policymakers, like yourselves, and industry leaders.
Today, there is no distinction between our online and offline lives. When I say that Facebook is the front line in fighting hate, I mean that literally. We've seen over and over again the way that hateful content online leads to violence in our communities offline.
Poway. El Paso. Pittsburgh. These targeted mass shootings were motivated by extremist conspiracy theories that were spawned and spread on social media.
In addition to these tragedies, online hate affects the everyday lives of millions of Americans. Our research has found that 41 percent of users report experiencing online hate and harassment.
According to ADL's most recent analysis, 75 percent of those harassed report that it happened to them on Facebook. That's nearly three times the percentage on any other platform. And make no mistake, all of them are highly profitable companies. So this isn't a resource problem; it's a responsibility problem.
Just today ADL released new research demonstrating how easy it is to find white supremacist, accelerationist content on Instagram, less than 24 hours after the CEO sat at another table just like this and said they were cleaning up their mess.
But these platforms neglect safety because, first and foremost, they are exempt from liability under the loophole of Section 230. Now, I know that isn't the topic of today's hearing, but make no mistake, Section 230 must be changed to force these companies to play by the same rules that every other media company operates by today. It's not a matter of free speech; it's simply a matter of being held accountable in courts of law when the platforms aid and abet unlawful, even lethal, conduct in service of their growth and revenue.
Tech companies are complicit in the hate and violence on their platforms, because if it bleeds, it leads, and it feeds their business model and their bottom line.
Hate speech, conspiracy theories, they are amplified by the algorithms, nudged to the top of their news feeds, and they addict users like a narcotic, driving engagement, which in turn increases their profits.
With no oversight and no incentives beyond increasing revenue, tech companies will continue to do whatever they can, whatever it takes, to optimize engagement, regardless of the consequences. This just can't continue.
If not for courageous whistleblowers like Frances Haugen, we wouldn't have the hard evidence to prove that Facebook knowingly, knowingly, is mainstreaming extremism, inciting violence through its algorithms and fracturing societies around the world. What if other tech companies’ employees felt empowered and protected to expose wrongdoing when they saw it? That's why the protections, Congresswoman Schakowsky, in your FTC Whistleblower Act are so crucial.
If platforms have no meaningful motivation to fix the harmful algorithms that amplify hate, they won't do it. That's why the Algorithmic Justice and Online Transparency Act, which would protect consumers from harmful and discriminatory AI systems, is long overdue. So we applaud that legislation as well.
Finally, to stay ahead of the curve, we've got to prioritize research. In August, ADL Belfer Fellow and NYU professor Laura Edelson was deplatformed by Facebook hours after the company realized that she and her team were studying the role that Facebook may have played in the lead-up to the January 6th insurrection. Platforms should not be able to thwart important, third-party research at their whim. Bills like the Social Media DATA Act would ensure that academics can study platforms to better inform the public.
Look, there are no silver bullets. There's no one-size-fits-all solution to repairing our internet. But there is a lot you can do – right now – to take action. I’ve highlighted three bills, and I'm happy to talk about them and others in the Q&A.
But members of the committee, let me conclude by urging you to remember that what happens online has a real impact on our lives. The status quo directly threatens our kids, our communities, and our country. Now is the time for you to legislate and act.
Thank you. I look forward to your questions.