In the final minutes of Wednesday's congressional hearing, in which top executives of technology companies were criticized for failing to protect children online, Sen. Richard J. Durbin, D-Ill., urged lawmakers to act to protect the internet's youngest users.
“There are no excuses,” he said.
Lawmakers have long made similar statements about holding tech companies accountable, but with little action to back them up. Republicans and Democrats alike have at various points declared it is time to regulate Big Tech on issues such as privacy and antitrust. But for years, no new federal regulations for the companies followed.
The question is whether this time will be different. And there are already signs that online child safety may be the issue that gets further in Congress.
At least six legislative proposals pending in Congress target the spread of child sexual abuse content online and would force platforms like Instagram, Snapchat and TikTok to do more to protect minors. The push has been fueled by emotional testimony about children who were victimized online or died by suicide.
The only major federal internet law passed in recent years, the 2018 measure known as SESTA-FOSTA, makes it easier for victims of sex trafficking to sue websites and online platforms. It, too, was approved only after heartbreaking testimony from victims' mothers.
Online safety experts and lawmakers say child safety is a personally relatable and visceral topic that is easier to sell politically than other tech issues. At Wednesday's hearing, confronted with stories of children who died after being sexually exploited, Meta's Mark Zuckerberg said he was sorry for what the families had suffered.
“Like the tobacco industry, it took a series of embarrassing public hearings, but Congress finally took action,” said Jim Steyer, president of Common Sense Media, a nonprofit children's advocacy group. “The dam has finally burst.”
Legal progress on child safety online would provide a counterbalance to the gridlock that has gripped Congress on other technology issues in recent years. Proposed rules to govern big tech companies like Google and Meta have repeatedly failed to become law.
In 2018, for example, Congress grilled Mr. Zuckerberg after Facebook user data leaked to Cambridge Analytica, a consulting firm that created voter profiles. Outrage over the incident led to calls for Congress to pass new rules to protect people's online privacy. But while California and other states ultimately approved online privacy laws, Congress did not.
Lawmakers have also attacked a law known as Section 230 of the Communications Decency Act, which shields online platforms like Instagram and TikTok from many lawsuits over content posted by their users. Congress has made no substantive changes to the law other than a carve-out that makes it harder for platforms to invoke the shield when accused of knowingly facilitating sex trafficking.
Lawmakers also proposed legislation that would make some of the companies' business practices illegal after companies like Amazon and Apple were accused of operating monopolies and abusing their power over smaller rivals. Efforts to push those bills across the finish line in 2022 failed.
Sens. Amy Klobuchar, D-Minn., and Josh Hawley, R-Mo., and other lawmakers blame the power of tech lobbyists for killing the proposed rules. Others say tech regulation has simply not been a priority for congressional leaders, who have focused on spending bills and on measures to subsidize U.S. companies that make critical computer chips and renewable energy technology.
The Senate Judiciary Committee, which hosted Wednesday's hearing, considered five child safety bills targeting technology platforms ahead of the session. The committee passed the bills last year, but none have become law.
Among the proposals were the STOP CSAM Act (the Strengthening Transparency and Obligations to Protect Children Suffering from Abuse and Mistreatment Act), which would give victims new tools to report child sexual abuse material to internet companies, and the REPORT Act, which amends existing reporting procedures and expands the types of potential crimes that online platforms are required to report to the National Center for Missing and Exploited Children.
Other proposals would make it a crime to distribute intimate images of a person without that person's consent and encourage law enforcement to coordinate investigations of crimes against children.
Another proposal, the Kids Online Safety Act, which the Senate Commerce Committee passed last year, would impose a legal duty on certain online platforms to protect children. Parts of the bill have been criticized by digital rights groups such as the Electronic Frontier Foundation, which say it could prompt platforms to remove legitimate content as companies try to comply with the law.
Ms. Klobuchar, who questioned the tech executives at Wednesday's hearing, said in an interview that the session “felt like a breakthrough.” She added, “As someone who has worked on these issues with these companies for years, this was the first time I felt hopeful.”
Some remain skeptical. Any proposal would need support from congressional leaders to pass, and bills that cleared committee last year would have to be reintroduced and go through the process again.
Hany Farid, a professor at the University of California, Berkeley, who helped develop technology that platforms use to detect child sexual abuse material, has watched congressional hearings on child online safety for years.
“This is one thing we should all agree on: We have a responsibility to protect our children,” he said. “If we can’t get this right, what other hope is there?”