House Bills Target Online Ad Practices on Child-Directed Sites

Capitol Hill (Image credit: Gary Arlen)

The House Consumer Protection Subcommittee is the latest collection of legislators aiming to rein in Big Tech, and it has amassed a quiverful of regulatory arrows.

In a legislative hearing Thursday (Dec. 9), the subcommittee is getting input from a number of stakeholders on a raft of bills meant to start putting guardrails on social media and the algorithms that power them. Those include the "Social Media Disclosure and Transparency of Advertisements Act," which would require platforms to archive advertising for outside researchers, including copies of the ads and how and to whom they were targeted.

The bills are, in the order listed on the hearing agenda:

1. H.R. 3451, the "Social Media Disclosure and Transparency of Advertisements Act of 2021"

2. H.R. 3611, the "Algorithmic Justice and Online Platform Transparency Act," which would "prohibit the discriminatory use of personal information by online platforms in any algorithmic process, to require transparency in the use of algorithmic processes and content moderation, and for other purposes."

3. H.R. 3991, the "Telling Everyone the Location of data Leaving [TELL] the U.S. Act," which would "require that any person that maintains an internet website or that sells or distributes a mobile application that maintains and stores information collected from such website or application in China to disclose that such information is stored and maintained in the People’s Republic of China and whether the Chinese Communist Party or a Chinese state-owned entity has access to such information."

4. H.R. 4000, the "Internet Application Integrity and Disclosure Act," which would "require any person that maintains an internet website or that sells or distributes a mobile application that is owned, wholly or partially, by the Chinese Communist Party or by a non-state owned entity located in the People’s Republic of China, to disclose that fact to any individual who downloads or otherwise uses such application."

5. H.R. 5439, the "Kids Internet Design and Safety Act," which would prohibit an online platform directed to children from using various algorithmic processes or other means to market to kids, to draw them to the site or to keep them on it, including auto-play, messages or alerts meant to pull them back onto the platform, badges rewarding elevated engagement, and much more.

6. H.R. 6083, the "Deceptive Experiences to Online Users Reduction Act," which would "prohibit the use of exploitative and deceptive practices by large online operators and to promote consumer welfare in the use of behavioral research by such providers."

7. H.R. 6093, the "FTC Whistleblower Act of 2021," which would provide incentives for and protect whistleblowers, like Frances Haugen, formerly of Facebook, whose sharing of internal documents about Instagram has helped power the recent congressional crackdown on social media.

Subcommittee Chairwoman Jan Schakowsky (D-Ill.), who introduced the FTC Whistleblower Act, said that millions of people are powerless against manipulative ads and algorithms. She said in her opening remarks that, "for too long, Big Tech has acted without any real accountability," instead providing only excuses and apologies. "The time for self-regulation is over," she said, echoing the sentiments of the Senate Consumer Protection Subcommittee the day before. She said the subcommittee could and would create a better and safer internet.

The subcommittee lined up plenty of supporters of its aim of regulating Big Tech as witnesses.

"To put it plainly, the unregulated business model for digital media is fundamentally at odds with children’s wellbeing," said Josh Golin, executive director of Fairplay, in his prepared testimony. "Digital platforms are designed to maximize revenue, and design choices that increase engagement and facilitate data collection put children at risk....It is past time for Congress to enact new online protections for children that require online operators to prioritize children’s wellbeing in their design choices," something the subcommittee was clearly trying to do with the bills teed up for the hearing.

Senate Democrats had used almost that exact wording in expressing their disaffection with Instagram at a hearing Wednesday in the Senate Consumer Protection Subcommittee.

"It is long past time to acknowledge the threats social media platforms fuel and the need for increased accountability," said Jonathan Greenblatt, CEO and national director of the Anti-Defamation League, in his prepared testimony. Greenblatt was focused on social media's spreading of hateful messages and said that researching the prevalence of online hate should include looking at the role of ISPs and online funding sources in facilitating the spread of "hate and extremism." ■

Contributing editor John Eggerton has been an editor and/or writer on media regulation, legislation and policy for over four decades, including covering the FCC, FTC, Congress, the major media trade associations, and the federal courts. In addition to Multichannel News and Broadcasting + Cable, his work has appeared in Radio World, TV Technology, TV Fax, This Week in Consumer Electronics, Variety and the Encyclopedia Britannica.