YouTube, TikTok and Snapchat are offering only “tweaks and minor changes” in their operations to ensure young users’ safety amid rising concern over the platforms’ potential harm to children, the head of a Senate panel told the companies’ executives Tuesday.
“Everything you do is to add more eyeballs, especially kids’, and keep them on your platforms for longer,” Sen. Richard Blumenthal, D-Conn., said at the start of a hearing by the Senate Commerce subcommittee on consumer protection that he heads.
The panel recently took testimony from a former Facebook data scientist who laid out internal research showing that the company’s Instagram photo-sharing service appears to seriously harm some teens. The subcommittee is now widening its focus to other tech platforms, with user bases in the millions or billions, that also compete for young people’s attention and loyalty.
“We’re hearing the same stories of harm” caused by YouTube, TikTok and Snapchat, Blumenthal said.
“This is for Big Tech a Big Tobacco moment ... It is a moment of reckoning,” he said. “There will be accountability. This time is different.”
The three executives — Michael Beckerman, a TikTok vice president and head of public policy for the Americas; Leslie Miller, vice president for government affairs and public policy of YouTube’s owner Google; and Jennifer Stout, vice president for global public policy of Snapchat parent Snap Inc. — testified at the subcommittee hearing.
“We took action on more than 7 million accounts in the first three quarters of 2021 when we learned they may belong to a user under the age of 13 — 3 million of those in the third quarter alone — as we have ramped up our automated removal efforts,” Miller said.
TikTok has tools in place, such as screen time management, to help young people and parents moderate how long children spend on the app and what they see, Beckerman said. “We are determined to work hard and keep the platform safe,” he said.
The company says it focuses on age-appropriate experiences, noting that some features, such as direct messaging, are not available to younger users. The video platform, wildly popular with teens and younger children, is owned by the Chinese company ByteDance. In the five years since it launched, TikTok has gained an estimated 1 billion monthly users.
The three platforms are woven into the fabric of young people’s lives, often influencing their dress, dance moves and diet, potentially to the point of obsession. Peer pressure to get on the apps is strong. Social media can offer entertainment and education, but platforms have been misused to harm children and promote bullying, vandalism in schools, eating disorders and manipulative marketing, lawmakers say.
The panel wants to learn how algorithms and product designs can magnify harm to children, foster addiction and intrude on privacy. The aim is to develop legislation that protects young people and gives parents tools to safeguard their children.
TikTok says it stores all U.S. user data in the United States. The company also rejects criticism that it promotes harmful content to children.
Early this year, after federal regulators ordered TikTok to disclose how its practices affect children and teenagers, the platform tightened its privacy practices for the under-18 crowd.
A separate House committee has investigated video service YouTube Kids this year. Lawmakers said the YouTube offshoot feeds children inappropriate material in a “wasteland of vapid, consumerist content” so it can serve ads to them. The app, with both video hosting and original shows, is available in about 70 countries.
A panel of the House Oversight and Reform Committee told YouTube CEO Susan Wojcicki that the service doesn’t do enough to protect children from potentially harmful material. Instead it relies on artificial intelligence and self-policing by content creators to decide which videos make it onto the platform, the panel’s chairman said in a letter to Wojcicki.
Parent company Google agreed to pay $170 million in 2019 settlements with the Federal Trade Commission and New York state over allegations that YouTube collected personal data on children without their parents’ consent.
Despite changes made after the settlements, the lawmaker’s letter said, YouTube Kids still shows ads to children.
YouTube says it has worked to provide children and families with protections and parental controls, such as time limits, that restrict viewing to age-appropriate content. It emphasizes that the 2019 settlements involved the main YouTube platform, not the kids’ version.
Snap Inc.’s Snapchat service allows people to send photos, videos and messages that are meant to quickly disappear, an enticement to its young users seeking to avoid snooping parents and teachers. Hence its “Ghostface Chillah” faceless (and word-less) white logo.
Snapchat, only 10 years old, says an eye-popping 90% of 13- to 24-year-olds in the U.S. use its service. It reported 306 million daily users in the July-September quarter.
The company agreed in 2014 to settle the FTC’s allegations that it deceived users about how effectively the shared material vanished and that it collected users’ contacts without telling them or asking permission. The messages, known as “snaps,” could be saved using third-party apps or other methods, the regulators said.
Snapchat wasn’t fined but agreed to establish a privacy program to be monitored by an outside expert for the next 20 years — similar to oversight imposed on Facebook, Google and Myspace in privacy settlements in recent years.