Inside the Legal War Over Arkansas’s Social Media Laws

Arkansas’s Social Media Safety Act is back in court as NetChoice challenges the law in Fayetteville, arguing it violates free speech. Families share tragic stories while Governor Sanders and Attorney General Griffin defend the legislation.


Amid rising concerns from parents about the dangers social media poses to children, Arkansas moved quickly, becoming one of the first states to act. Lawmakers introduced the Social Media Safety Act, a measure designed to make platforms act with greater responsibility and take the safety of young users seriously.

The tech industry responded immediately. Companies challenged the law in court, and in March, a federal judge struck it down. But the dispute did not end there. What began as a single legal challenge has grown into a broader confrontation between Arkansas and powerful technology companies. Governor Sarah Huckabee Sanders has remained steadfast, standing firmly behind the legislation, while tech companies, wary that similar rules could spread nationwide, continue to mount a determined defense.

On Friday, June 27, NetChoice, a trade group that counts Meta and X among its members, filed a new lawsuit in federal court in Fayetteville. The organization is contesting Arkansas’s renewed attempt to impose stricter regulations, arguing that the updated law could be even more harmful and unconstitutional.

What Is the Social Media Safety Act?

Act 689 of 2023, widely known as the Social Media Safety Act, required social media companies to verify the age of new users in Arkansas and allowed minors under 18 to create accounts only with parental permission. As the first law of its kind in the United States, it immediately faced a legal challenge and was blocked before it could take effect.

Lawmakers later amended the legislation and reintroduced it with broader provisions. Act 900, which updates the Social Media Safety Act, includes a key section prohibiting platforms from using algorithms, designs, or features that companies “know or should have known through the exercise of reasonable care” could lead a user to die by suicide, purchase controlled substances, develop an eating disorder, or become addicted to the platform.

Act 901, another amendment, allows families to pursue legal action. Parents whose children die by suicide or attempt suicide could sue social media companies if their children were exposed to content that promoted or encouraged self-harm. Platforms could face civil penalties of up to $10,000 per violation.

Urgent Need, but Free Speech Questions

The urgency of the issue came into sharp focus last month in an Arkansas courtroom. Jennie Deserio, a mother from the state, shared the story of her 16-year-old son, Mason Edens, who died by suicide just days before his high school graduation.

Deserio believes harmful content on social media, particularly TikTok, played a role in her son's death. After examining Mason's phone, she discovered a disturbing pattern: what began as searches for motivational quotes and positive affirmations eventually shifted into a stream of graphic videos that romanticized self-harm and suicide.

She explained that she had relied on parental controls and closely monitored his activity but still felt powerless against the influence of the platform’s algorithm.

Supporters of the laws argue that such incidents are far from rare and say the measures are a necessary response.

For technology companies, however, the Arkansas law raises serious constitutional issues. NetChoice argues that the provisions are unconstitutionally vague and fail to provide clear guidance on what content would be prohibited. The group warns that the restrictions could limit lawful expression for both minors and adults, creating significant First Amendment concerns.

The lawsuit even points to cultural examples. It questions whether songs referencing drugs, such as Afroman’s “Because I Got High,” would be restricted under the law’s language. For NetChoice, that lack of clarity is unconstitutional in itself.

Additional Restrictions Under Fire

NetChoice’s lawsuit also challenges a separate Arkansas law recently passed by legislators. That measure would require platforms to block notifications to minors between 10 p.m. and 6 a.m. It would also mandate that companies ensure their platforms do not “engage in practices to evoke any addiction or compulsive behavior.”

The trade group argues that these requirements are just as vague as the others. The complaint notes that addiction is difficult to define or measure. “What is addictive to some minors may not be addictive to others. Does allowing teens to share photos with each other evoke addiction?” the lawsuit asks.

Industry leaders fear that if Arkansas succeeds, similar restrictions could spread to other states, setting off a chain reaction that could reshape how social media is regulated nationwide.

What Comes Next

Governor Sanders, who enjoys strong public support on the issue, remains committed to defending the law. Arkansas Attorney General Tim Griffin’s office has said it is reviewing the latest complaint and looks forward to making its case in court.

For families like Deserio’s, the fight is intimate and urgent, bound up with grief and the hope that change might protect others. For technology companies, it is a constitutional test that could shape the future of their platforms far beyond Arkansas.

For now, the courtrooms of Fayetteville have become a stage for a national reckoning, where the clash between safeguarding children and preserving free speech is unfolding in real time.