Parents and lawmakers say executives are not doing enough to stop dangers such as sexual exploitation and bullying.
CEOs of companies like Meta, TikTok and X have come under fire from U.S. lawmakers over the dangers children and teens face using social media platforms.
On Wednesday, amid outrage from parents and lawmakers, executives told the U.S. Senate Judiciary Committee about steps their companies are taking to protect children, including blocking sex offenders and preventing teen suicide. Lawmakers countered that the companies were not doing enough to stop online dangers.
“These companies are responsible for many of the dangers our children face online,” U.S. Senate Majority Whip Dick Durbin, the committee's chairman, said in his opening remarks. “Their design choices, failure to properly invest in trust and safety, and constant pursuit of engagement and profit over basic safety are all putting our children and grandchildren at risk.”
Durbin cited statistics from the nonprofit National Center for Missing & Exploited Children showing that financial “sextortion,” in which criminals trick minors into sending explicit photos and videos, has skyrocketed in the last year.
The committee also played videos of children speaking about their victimization on social media platforms. “I was sexually exploited on Facebook,” said one child, shown in shadow in the video.
“Mr. Zuckerberg, I know that's not what you and the companies before us meant, but you have blood on your hands,” Senator Lindsey Graham told Mark Zuckerberg, CEO of Meta, which owns Facebook and Instagram. “You have a product that is killing and injuring people.”
Zuckerberg testified along with X CEO Linda Yaccarino, Snap CEO Evan Spiegel, TikTok CEO Shou Zi Chew, and Discord CEO Jason Citron.
X's Yaccarino said her company supports the STOP CSAM Act. The bill, introduced by Durbin, would hold tech companies accountable for child sexual abuse content and allow victims to sue tech platforms and app stores. It is one of several bills he said are aimed at addressing child safety; none has become law.
X (formerly Twitter) has come under intense criticism since Tesla and SpaceX CEO Elon Musk acquired the platform and relaxed its moderation policies. The company blocked searches for pop singer Taylor Swift this week after sexually explicit fake images of her appeared on the platform.
On Wednesday, TikTok CEO Chew appeared before U.S. lawmakers for the first time since March, when the Chinese-owned short-video app company faced tough questions, with some lawmakers suggesting the app was harming children's mental health.
“We make careful product design choices to make our app a deterrent to people who would do harm to teens,” Chew said, adding that TikTok's community guidelines strictly prohibit “anything that puts young people at risk of exploitation or other harm” and that the company vigorously enforces them.
At the hearing, executives touted their platforms' existing safety tools and the work they have done in collaboration with nonprofits and law enforcement to protect minors.
Ahead of their testimony, Meta and X announced new safety measures in anticipation of a heated session.
But child health advocates say social media companies repeatedly fail to protect minors.
“The bottom line shouldn't be the first factor these companies consider when they have to make really important decisions about safety and privacy,” said Zaman Qureshi, co-chair of Design It For Us, a youth-led coalition advocating for safer social media.
“These companies have had the opportunity to do this before. They have failed to do so, and independent regulation needs to step in.”