The Senate Consumer Protection Committee lined up three new targets Tuesday (Oct. 26) in its ongoing punishment of social media for its impact on children and teens and signaled they could take a pounding similar to Facebook's recent unpleasantness on the Hill.
That came in a hearing entitled “Protecting Kids Online: Snapchat, TikTok, and YouTube.” Testifying were Jennifer Stout, VP of global public policy at Snapchat owner Snap Inc.; Michael Beckerman, VP and head of public policy, Americas, for TikTok; and Leslie Miller, VP, government affairs and public policy, for YouTube.
To make sure nobody missed the latest potential Big Tech beatdown, Sen. Richard Blumenthal (D-Conn.), chairman of the Subcommittee, and Sen. Marsha Blackburn (R-Tenn.), ranking member, had issued a press release in advance of the hearing pointing out who was testifying and where to stream the proceedings.
In his opening statement, Blumenthal left no doubt about his concerns with edge providers in general. He said Facebook's disclosures had led to a "definite and deafening" drumbeat of revelations and calls for action from Washington, and not just targeting Facebook. He said there is ample evidence to launch investigations into social media platforms.
And while he said this hearing was about continuing to educate the committee about "this crisis," it was also about reading those platforms the riot act. Blumenthal said it was the first time TikTok and Snap had appeared before Congress and he appreciated it, but that was about the last encouraging word from him.
Sen. Blumenthal said that Snapchat, TikTok, and YouTube were part of the crisis, and along with Facebook were sending the message to American parents: "You cannot trust Big Tech with your kids," and Big Tech can't say to parents it is their job to be the gatekeeper.
Like Facebook, he said, those platforms' algorithms exacerbate downward spirals for teens, fuel hate and violence, and amplify depression, anger, and anxiety, and do so because those emotions hook kids on their platforms. He said that was why there was a drumbeat for accountability to "parents, the public, Congress, investors, shareholders, the SEC and other agencies."
Having read the witnesses' testimony, Blumenthal said their argument was that "we're not Facebook." But he said that was not a defense, and that the Facebook bar was "in the gutter," and it should not be a race to the bottom.
Sen. Blumenthal, one of the legislators branding Big Tech with the Big Tobacco label, did say that there was a distinction between the two given that tobacco was inherently dangerous, while social media could be beneficial with proper safeguards.
Blackburn continued the punishment.
She said for too long social media platforms had been allowed to promote and glorify dangerous content and that she had heard from parents, teachers, and mental health professionals all with the same question: "How long are we going to let this continue, and what will it take for platforms to crack down on this dangerous material?"
She said children as young as nine had died after doing viral challenges on TikTok and girls had been lured into sexual relationships on Snapchat. Then there were the videos of people slitting their wrists on YouTube. "You are parents, how can you allow this?" she asked. "Does it matter to you?" She answered her own question, suggesting they loved to attract young audiences with content fed to them by algorithms.
Both Blumenthal and Blackburn said social media were driving kids and teens down dark rabbit holes.
Stout said Snapchat is not a "rabbit hole" because it curates its content, including by choosing trusted content partners.
Miller said that YouTube prohibits content glorifying eating disorders, though eating disorder content could be on the platform if it is people sharing their stories around the issue.
Blackburn also called out TikTok for its Chinese connection and collection of data, from keystrokes to geolocation to facial recognition to audio from smart speakers, that could be used by China to surveil U.S. citizens as the Chinese government does to its own citizens.
The witnesses all said they had protections in place for young people, including tools for parents; that they have age limits and remove too-young user accounts when they find them or inappropriate content when they find it; that the majority of users had positive experiences with their platforms, and that they have all done internal impact research and either had, or would, share it publicly.
Sen. Amy Klobuchar (D-Minn.) was clearly unpersuaded. She said that she did not think that kids and democracy should be collateral damage to the profit-seeking of social media.
Snapchat's Stout said that the platform's architecture "was intentionally designed to empower people to express a full range of experiences and emotions with their real friends, not just the pretty and perfect moments." But she also said the company "takes into account the unique sensitivities and considerations of minors when we design products."
TikTok's Beckerman said that company's goal is "providing an age-appropriate experience for our younger users." When pressed by Blackburn, he also said that the company does not share data with the Chinese government and he had research to prove it.
YouTube's Miller said the company has "clear policies that prohibit content that exploits or endangers minors on YouTube and we have committed significant time and resources toward removing violative content as quickly as possible."
From data privacy failures to allegedly dangerous algorithms, there is bipartisan angst and anger over a Big Tech sector once the bootstrap darlings of Capitol Hill. The latest hearing was billed as examining "how tech companies treat young audiences, including how algorithms and product design choices can amplify harms, addiction, and intrusions into privacy." The goal is to come up with legislation essentially to protect kids and young people from all that bad stuff.
The hearing, the fourth in a series on the Web and young people's safety, came against the backdrop of new criticism from the subcommittee of Facebook, driven by more documents from whistleblower Frances Haugen (the headline of a top-of-fold Washington Post front page story: "Insiders Say Zuckerberg Chose Growth Over Safety").
"If TikTok, Snap and YouTube were as wonderful and responsive to concerns as their representatives asserted, there wouldn't have been any need for hearings," said Josh Golin, executive director of Fairplay (formerly Campaign for a Commercial-Free Childhood). "Unfortunately, these platforms are plagued by the same problems as Facebook and Instagram: privacy abuses, and design features and content that jeopardize young people's wellbeing and safety. The underlying issue is a business model that prioritizes engagement and data collection over the best interests of children. That's why we need a US design code. Legislative proposals like the KIDS Act and the Kids PRIVCY Act will get us there but only if Congress acts with the urgency these issues require."
Contributing editor John Eggerton has been an editor and/or writer on media regulation, legislation and policy for over four decades, including covering the FCC, FTC, Congress, the major media trade associations, and the federal courts. In addition to Multichannel News and Broadcasting + Cable, his work has appeared in Radio World, TV Technology, TV Fax, This Week in Consumer Electronics, Variety and the Encyclopedia Britannica.