The chief executives of social media companies Meta, X, TikTok, Snap, and Discord faced tough questions on their efforts to combat online child sexual exploitation at a Senate hearing on Wednesday.
Senator Dick Durbin, the Judiciary Committee’s Democratic chairman, cited statistics from the National Center for Missing and Exploited Children nonprofit group that showed financial “sextortion,” in which a predator tricks a minor into sending explicit photos and videos, had skyrocketed last year.
“This disturbing growth in child sexual exploitation is driven by one thing: changes in technology,” Durbin said during the hearing.
As the hearing kicked off on Wednesday, the committee played a video in which children spoke about being victimized on the social media platforms.
“I was sexually exploited on Facebook,” said one child in the video, who appeared in shadow.
In the hearing room, dozens of parents stood waiting for the CEOs to enter, holding pictures of their children.
“Mr. Zuckerberg, you and the companies before us, I know you don’t mean it to be so, but you have blood on your hands,” said Senator Lindsey Graham, referring to Meta CEO Mark Zuckerberg. “You have a product that’s killing people.”
“We make careful product design choices to help make our app inhospitable to those seeking to harm teens,” TikTok CEO Shou Zi Chew said in written testimony, adding that TikTok’s community guidelines strictly prohibit anything that puts “teenagers at risk of exploitation or other harm — and we vigorously enforce them.”
Chew disclosed that more than 170 million Americans use TikTok monthly — 20 million more than the company reported last year. Durbin said the platforms are being used by offenders to target children or to trade child sexual abuse material.
Zuckerberg, whose Meta owns Facebook and Instagram, testified alongside X CEO Linda Yaccarino, Snap CEO Evan Spiegel, and Discord CEO Jason Citron.
“We’re committed to protecting young people from abuse on our services, but this is an ongoing challenge,” Zuckerberg’s written testimony says. “As we improve defenses in one area, criminals shift their tactics, and we have to come up with new responses.”
Spiegel said Snap’s parental controls resemble “how we believe parents monitor their teens’ activity in the real world – where parents want to know who their teens are spending time with but don’t need to listen in on every private conversation.”
The committee last year approved several bills, including one, first proposed in 2020, that would remove tech firms’ immunity from civil and criminal liability under child sexual abuse material laws. None have become law.
Senator Amy Klobuchar told Reuters it is time for legislative action. “For too long social media companies have turned a blind eye when young children joined the platforms, increased the risk of sexual exploitation, used algorithms that push harmful content, and provided a venue for dealers to sell deadly drugs like fentanyl,” she said.