House Dems Seek Facebook Info on Disinformation, Extremism
Cite reports Facebook knew impact of recommendations, content policies
House Democrats are accusing Facebook of "allowing extremist content and groups to grow" despite internal reviews that reportedly showed the platform's policies and recommendations were to blame for the rapid dissemination of extremism and disinformation.
The claim came in a letter to Facebook CEO Mark Zuckerberg from the Democratic leadership of the Energy & Commerce Committee, seeking answers to questions and access to internal documents.
The letter came a day before an Energy & Commerce subcommittee is scheduled to hold a hearing on traditional media's role in promoting disinformation and extremism.
Signing on to the letter were Energy and Commerce Committee Chairman Frank Pallone, Jr. (D-N.J.), Oversight and Investigations Subcommittee Chair Diana DeGette (D-Colo.), Communications and Technology Subcommittee Chairman Mike Doyle (D-Pa.), and Consumer Protection and Commerce Subcommittee Chair Jan Schakowsky (D-Ill.).
The legislators cited a May 2020 report in the Wall Street Journal, suggesting the company had been on notice long before the Capitol insurrection.
“[The] deadly attack on the Capitol laid bare the dire consequences of hyperpolarization and extremism in our current political discourse – much of which is occurring on your platform,” the House members told Zuckerberg. “With more than 3 billion monthly users across different services, Facebook must play a leading role in lessening the divide and lowering the temperature. To that end, the Committee is interested in understanding more about Facebook’s research on divisive content and user behavior, the reported presentations and recommendations made to Facebook executives and their actions in response, and the steps Facebook leadership has taken to reduce polarization on its platform.”
The four committee and subcommittee leaders also sought information and answers to the following questions:
- "When and why Facebook first began conducting research into divisive content and behavior on its platform;
- "Whether in the course of any internal studies or analyses Facebook uncovered any evidence or reached any findings that would confirm or suggest its platform, algorithms, or other tools exacerbate divisiveness or polarization;
- "Details on the Common Ground task force and Integrity Teams, as well as the review process that led to the two teams’ recommendations, and whether and how Facebook addressed or adopted such recommendations;
- "Details on the Eat Your Veggies process, including the names and role of each person involved in the process;
- "Copies of the 2016, 2018, and 2020 presentations warning of the rise of extremist content and the platform’s role in promoting it;
- "All documents and communications referring or relating to those presentations, any recommendations, suggestions, or proposals by the Common Ground task force or any Integrity Teams regarding divisiveness, polarization, or user behavior, and the Eat Your Veggies process."
Contributing editor John Eggerton has been an editor and/or writer on media regulation, legislation and policy for over four decades, including covering the FCC, FTC, Congress, the major media trade associations, and the federal courts. In addition to Multichannel News and Broadcasting + Cable, his work has appeared in Radio World, TV Technology, TV Fax, This Week in Consumer Electronics, Variety and the Encyclopedia Britannica.