Tech Group ITIF Offers Plan to Tackle Content Moderation ‘Crisis’


Technology think tank the Information Technology & Innovation Foundation (ITIF) said the United States should lead an international forum of online stakeholders to develop voluntary content-moderation guidelines, rather than push for new rules and regulations on targeted advertising or algorithms.

That point came in a new ITIF-penned report responding to what it calls a “crisis of legitimacy” in social-media content moderation.

ITIF issued its report as both Democrats and Republicans call for new regulations on edge providers over a range of issues, including privacy protections, or the lack of them; targeted advertising; allegations from Republicans of anti-conservative bias; allegations from Democrats of insufficient policing of hate speech; allegations from both sides of insufficient protection of children online; and more.

The report, authored by ITIF senior policy analyst Ashley Johnson and ITIF VP and Center for Data Innovation director Daniel Castro, contends social media companies can’t solve all of these issues by themselves. But with Congress deadlocked, they argued, solutions must come from somewhere other than new rules and regulations on algorithms and targeted advertising.

On the issue of weeding out “harmful” state-sponsored content, ITIF says the U.S. should fund research grants and promote better information-sharing.

And although Congress is deadlocked, the report authors said, the legislature should unlock itself long enough to pass laws “establishing transparency requirements for content moderation decisions of social media platforms and requiring platforms to enforce their content moderation policies consistently.”

That essentially tracks with some of the proposed legislation, but without what tech companies see as a heavy-handed regulatory backstop, such as new Federal Trade Commission rules under its Section 5 authority over unfair and deceptive practices.

“[S]everal of the structural or technical changes that have been proposed for social media would likely make things worse for both content moderation and other issues impacting consumers,” the report said.

John Eggerton

Contributing editor John Eggerton has been an editor and/or writer on media regulation, legislation and policy for over four decades, including covering the FCC, FTC, Congress, the major media trade associations, and the federal courts. In addition to Multichannel News and Broadcasting + Cable, his work has appeared in Radio World, TV Technology, TV Fax, This Week in Consumer Electronics, Variety and the Encyclopedia Britannica.