COLUMBIA — Days after a bill requiring social media sites to limit certain features for children became law, a trade association representing major technology companies sued the state, claiming the new law is unconstitutional.
Gov. Henry McMaster signed the bill, dubbed the Age-Appropriate Design Code Act, into law Feb. 5, following approvals from the House and Senate.
In an effort to protect children from online predators and harmful mental health effects, the law requires social media apps to let users under the age of 18, or their parents, turn off certain features, such as messages, comments and the algorithms many sites use to show users posts similar to those they’ve engaged with in the past.
Parents must be allowed to restrict how much time their child can spend on an app and to monitor their child’s activity. Under the law, apps also can’t collect data, such as location, from users under 18 or show them targeted advertisements.
The sites must protect underage users from harm, including “compulsive usage,” emotional distress, identity theft and scams, the law reads.
Those requirements impose “sweeping restrictions on free speech,” D.C.-based trade association NetChoice claimed in a federal lawsuit filed Monday, just two days after the law took effect.
NetChoice’s members include dozens of popular platforms, among them the social media site X, formerly known as Twitter; TikTok; and Meta, which owns Facebook and Instagram.
“South Carolina’s speech code is a disaster,” said Paul Taske, co-director of the association’s legal arm, in a statement. “It imposes a sweeping censorship regime fundamentally aimed at controlling how speech is presented, what speech users can see, and places new roadblocks in the way of accessing that speech.”
The association is asking a federal judge to halt enforcement of the law and overturn it.
Courts have paused similar laws regulating social media use following challenges from NetChoice in California, Louisiana, Colorado, Arkansas, Georgia and Ohio, according to the lawsuit.
The lawsuit argues that the content those sites and others show is protected under the U.S. Constitution’s First Amendment, which guarantees freedom of speech and expression. The law “regulates how covered services select, rank, recommend, and display speech to their users — all protected expression,” the lawsuit reads.
Social media sites must then “act as the government’s speech police,” removing posts that might become a liability under state law, attorneys for NetChoice wrote.
Keeping certain groups of users from seeing specific content also violates users’ First Amendment rights to find and interact with certain information, the lawsuit claims.
Many major social media sites already offer options for children and their parents to curate their experience.
Parents can toggle how long their children are allowed to spend on apps, when they receive notifications, who can message them and what sort of content they see. But because of the law’s vagueness about what is and isn’t allowed, sites may not know whether their existing controls are enough to satisfy the requirements, the lawsuit argued.
Take, for example, the requirement to protect users from emotional distress. Different content might cause different users emotional distress, and what upsets a young child may not upset a 17-year-old, despite the two both being considered minors under the law, lawyers for NetChoice wrote.
Wanting to avoid lawsuits, social media companies will likely err on the side of caution, removing certain content for all users, not just children, the lawsuit claimed.
The same issue applies to protections against “compulsive usage,” which the law defines as using an online service to the extent that it affects other parts of a user’s life. At what point, the lawsuit asked, does a child’s time on their phone go from a hobby to a compulsion?
“It is entirely vague when a computer game becomes ‘too fun’ such that a website offering that game violates a law against ‘compulsive usage,’” the lawsuit reads.
Federal communications law also protects websites from liability for what’s posted on them, meaning the sites can’t be held responsible for what a user sees on their feed, the lawsuit argues. Nor can sites be held responsible for the ads third-party companies pay to place, the companies claimed.
Even if social media companies felt the law was constitutional, putting in place the controls required could take months of work, the companies argued. The law went into effect as soon as McMaster signed it, immediately putting any sites that didn’t already offer certain controls out of compliance.
That violates the companies’ right to due process under the Constitution, they argued. Enforcement is up to the attorney general, who can sue for damages any sites that break the law.
“These services now face immediate liability tied to vague, complex, service-wide design changes that require services to rebuild from the ground up,” the lawsuit reads. “That rebuilding cannot occur overnight — or even in weeks.”
If that’s the case, technology companies should spend more time and money making their websites safer for children instead of filing lawsuits, said Rep. Brandon Guffey, a Rock Hill Republican who co-sponsored the bill.
Regulating social media is a priority for Guffey, whose son died by suicide after a scammer posing as a young woman on Instagram attempted to use nude photos the 17-year-old had sent to extort him for money.
The law’s provision allowing parents or children to turn off messaging features could prevent strangers from soliciting incriminating photographs from underage users or threatening them.
Without regulation, technology companies “continue to take (children’s) data, market to them and brainwash them,” Guffey told the SC Daily Gazette.
“I don’t want to take away innovation from companies,” Guffey said. “I just want to protect our children.”
Recent lawsuits have claimed social media platforms are addictive to children, causing harm in ways similar to gambling and smoking. Opening statements began this week in a Los Angeles trial claiming major social media companies caused mental health issues for one teen.
Before passing the law, legislators removed a requirement that children get permission from their parents to use social media sites. Parents would have had to call a hotline, attend a video conference or give government-issued identification to allow their child onto the app.
Also gone from the bill is a requirement that the Department of Education teach students how to safely use social media.
Although it’s not legally required, the agency is asking for $18 million to explain to students what happens in their brains when they spend too much time scrolling on their cellphones.
Students in K-12 public schools aren’t allowed to use their phones during school hours. That rule has been in effect for the past year under a clause that has been in the state budget since 2024.
SC Daily Gazette is part of States Newsroom, a nonprofit news network supported by grants and a coalition of donors as a 501c(3) public charity. SC Daily Gazette maintains editorial independence. Contact Editor Seanna Adcox for questions: info@scdailygazette.com.
