Key Takeaways

  • Georgia lawmakers passed a law regulating social media companies, but an industry lawsuit has, for now, sidelined its prohibition on advertising to minors.
  • The law’s age verification mandate was blocked in federal court, prompting an appeal from Georgia’s Attorney General.
  • Lawmakers across the states are pushing for stricter regulation as parents’ concerns about social media’s impact on children grow.
  • Internal documents suggest social media platforms prioritize engagement over safety, exacerbating concerns about their algorithms.
  • As the legal landscape evolves, the movement for parental rights and accountability in social media persists amid broader national debates.

In a rare bipartisan act to protect children, Georgia legislators adopted a law last year that regulated social media companies.

But the industry sued and, for now, has sidelined the prohibition on advertising to children. The law also required platforms to obtain parental consent when minors sign up for service.

The age verification mandate forces everyone to share identifying information to prove their age, placing what a federal judge called “severe burdens” on adults. In June, she issued a preliminary injunction against enforcement.

Georgia Attorney General Chris Carr has appealed the decision. His office said he is helping to defend similar laws in other states, including Texas, Florida and Ohio.

The platforms are fighting a tide of legislation as states react to congressional inaction. Stories about heedless harm in the pursuit of advertising dollars have galvanized parents and politicians.

Georgia lawmakers have not given up. They have been meeting over the summer to consider other ways to rein in social media.

“I don’t think we can wait for the federal government to do this for us,” said Sen. Sally Harrell, D-Atlanta, who has been leading a study committee looking for another approach. “So, we’re going to do it.”

The industry has made itself a target for lawmakers across the political spectrum.

The state law that was sidelined in federal court was a top priority for Lt. Gov. Burt Jones, the Republican who leads the GOP-dominated Senate and, like Carr, is running for governor in next year’s GOP primary.

Jones and his allies picked Democrat Harrell to lead the bipartisan study committee on the impact of social media on children.

She and fellow senators recently heard from parents whose children had died by suicide after social media exposure.

Industry insiders have been educating Georgia lawmakers, among them Ravi Iyer, a technologist and academic psychologist at the University of Southern California who previously led data science, research, and product teams at Facebook.

At a hearing this week, he described internal company documents that suggest platform designers knew their algorithms were harmful but deployed them anyway.

The documents, referenced in a lawsuit against Meta by New Mexico Attorney General Raúl Torrez, describe how Meta undercounted harmful content and experiences reported on its platforms, including bullying and harassment.

Iyer did the math using the leaked statistics. “That is millions and millions of kids,” he said.

Pete Furlong, a researcher with the Center for Humane Technology, said in the hearing that numerous whistleblower complaints indicate Meta has “abundant” research showing its algorithmic feeds harm kids and adults but that the company is not taking clear steps to address it.

“They could make it better,” he said. “It’s their business incentives that drive them to design these products in this way.”

After conservative activist Charlie Kirk was shot dead in Utah last week, the state’s Republican governor said social media companies were addicting Americans to outrage and hate. Gov. Spencer Cox called technology companies “conflict entrepreneurs.”

It may be a cultural turning point that opens a window for successful legislation. But Georgia’s senators are wading into a legal thicket. Social media companies are sheltered by a foundational 1996 federal law that established the legal framework for the internet. Known as Section 230, the law immunizes platforms from liability for hosting users’ content.

Instead of choosing which content to promote like a typical publisher, internet platforms use algorithms to drive feeds that are tailored to each user to increase their engagement — and their value to advertisers.

The companies monitor users’ behavior on their platforms to inform the kind of custom results that frustrate critics such as Cox.
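
The mechanics that frustrate those critics can be sketched in a few lines. What follows is a minimal, purely illustrative model of engagement-based ranking, not any platform’s actual code; every class, function, and number in it is hypothetical. It shows how a feed ordered solely by predicted engagement surfaces whatever content a user’s monitored behavior suggests they will react to most.

```python
# Toy illustration of engagement-based feed ranking. This is NOT any
# platform's real system; all names and weights here are hypothetical.

from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    topic: str

@dataclass
class UserHistory:
    # Per-topic engagement rates inferred from monitored behavior
    # (clicks, watch time, reactions) on past sessions.
    topic_engagement: dict[str, float]

def predicted_engagement(post: Post, history: UserHistory) -> float:
    """Estimate how likely this user is to engage with this post."""
    return history.topic_engagement.get(post.topic, 0.01)

def rank_feed(posts: list[Post], history: UserHistory) -> list[Post]:
    # The feed is ordered purely by predicted engagement: content the
    # user is most likely to interact with rises to the top, regardless
    # of whether that content is healthy or harmful.
    return sorted(posts, key=lambda p: predicted_engagement(p, history),
                  reverse=True)

# Example: a user whose monitored behavior shows heavy engagement with
# outrage-themed content will see that content ranked first.
history = UserHistory(topic_engagement={"outrage": 0.9, "news": 0.4, "pets": 0.2})
posts = [Post("a", "pets"), Post("b", "outrage"), Post("c", "news")]
for post in rank_feed(posts, history):
    print(post.post_id, post.topic)
```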

Lawmakers in California targeted this practice.

The Age-Appropriate Design Code Act regulates how platforms allow minors to use personalized recommendation algorithms.

NetChoice, the same industry group that sidelined Georgia’s law on First Amendment grounds, also filed a federal suit in California. A district judge ruled against the law, but the state appealed and earlier this month the federal appeals court there ruled partially in the state’s favor.

“It upheld the addictive feeds regulation,” said Matt Lawrence, a law professor at Emory University who is an expert on addiction law and has been following tech industry lawsuits.

It remains unclear, though, to what extent courts elsewhere in the country will define algorithms as protected speech.

Other states are targeting age verification. The U.S. Supreme Court recently upheld a Texas law that does just that.

The high court’s decision in June was about minors accessing websites with sexually explicit content rather than social media platforms.

Even so, Eric Segall, a federal courts and constitutional law professor at Georgia State University, said it suggests a shift in the court’s thinking.

Under Chief Justice John Roberts, the nation’s top court has prioritized the First Amendment, Segall said.

But in recent years, parental rights have become a growing priority for conservatives, and the Texas decision may reflect that, he said. If there is a recalibration, it could create an opening for laws like Georgia’s that target social media.

“There’s a big movement out there for parents’ rights to control their children and that’s going to affect how judges view these laws,” Segall said. “As long as it’s a parent consent law, not a ban law, it’s much more likely to be upheld.”

But pendulums swing and the window for regulation could close. After all, the platforms have made themselves indispensable to people across the globe.

Last week, a government crackdown on social media in Nepal sparked deadly riots led by young people.

Any attempt to blunt the power of algorithms by making them less invasive will also dull their utility, said Noah Giansiracusa, a mathematician and author of the book “Robin Hood Math: Take Control of the Algorithms That Run Your Life.”

“We don’t want to make these powerful systems that help us worse,” he said. Like many who have watched the attempts to regulate social media unfold, he drew an analogy to the battles against the automobile industry for seatbelts, airbags and cleaner emissions.

Cars pollute and kill, he said, yet we need them.

“We want to make them better, but we don’t want to ban them,” he said. “So, that’s where we are with social media, except we don’t really know how to make them safer, and we have all these free speech issues.”

Lawrence, the Emory professor, said lawmakers will have to keep trying and see what sticks.

Harrell said she and her committee will focus on algorithm design and age verification. They are also turning to future tech.

An explosive report by Reuters in August said an internal Meta policy document permitted provocative chatbot behavior on topics including sex, race and celebrities.

Harrell’s committee will hear about artificial intelligence when it meets Oct. 8.

“Be prepared for an intense meeting,” Harrell said, “because what’s happening with AI is a little overwhelming.”

This story is available through a news partnership with Capitol Beat News Service, a project of the Georgia Press Association Educational Foundation.

Ty Tagami | Capitol Beat

Ty Tagami is a staff writer for Capitol Beat News Service. He is a journalist with more than 20 years of experience.