Social media laws should focus on social media

In California, jurors heard testimony that echoes far beyond the courtroom as a warning for the digital age. In a major social media liability case that concluded last week, families described how their children slipped into patterns of compulsive use that preceded a serious mental health crisis. In one case, a teenage girl spent hours each night scrolling algorithmically curated feeds, pulled back again and again by notifications and social validation. Over time, her parents said, that behavior led to isolation, anxiety, and depression—outcomes that mirror a growing body of research on social media’s impact on adolescents. The plaintiffs argue that these outcomes are not accidental. They are the predictable result of features designed to maximize engagement.

These stories are increasingly common. They are the reason dozens of lawsuits, investigations, and public health warnings have converged on a troubling conclusion: social media companies have built platforms deliberately designed to capture and hold the attention of young users at any cost in order to maximize profit.

Engineered for attention, but at what cost?

Features such as likes, notifications, and algorithmic feeds create feedback loops that keep users coming back. They deliver small bursts of social validation that can make it difficult, especially for younger users, to step away. As Sean Parker, Facebook’s founding president, famously acknowledged, these platforms were engineered to provide users with “a little dopamine hit every once in a while” so they keep coming back.

For teenagers whose brains and social identities are still developing, these design choices can have profound consequences. Studies increasingly link heavy social media use among adolescents to anxiety, depression, and body image concerns. Filters, curated images, and constant comparisons can intensify feelings of inadequacy, while endless scrolling and late-night notifications disrupt sleep and emotional well-being. Many of these lawsuits allege that Meta, owner of Facebook and Instagram, knew the most about these impacts but chose to suppress that knowledge. That’s one reason courtroom arguments draw a compelling comparison to Big Tobacco.

Lawmakers need to act, and to get social media regulation right

It’s no surprise lawmakers are looking for ways to respond. Across the country, proposals in California, Texas, Utah, and Alabama, such as the Age Appropriate Design Code and the App Store Accountability Act, aim to address harms to children online. While the motivation is understandable, many of these bills cast too wide a net and, as a result, risk creating new problems while trying to solve existing ones.

Instead of narrowly targeting the design features and business models driving these harms, they often sweep in the broader digital ecosystem. As a result, the regulation reaches not just social media platforms but also news organizations, educational services, and nonprofits. That raises serious First Amendment concerns. Laws that affect speech must be narrowly tailored, and courts have already shown skepticism toward broad, vague attempts to regulate online content.

Measures that require broad content restrictions or impose vague compliance obligations on publishers are also particularly vulnerable to legal challenges. And, if struck down, these efforts could further delay meaningful progress in addressing the real harms associated with social media.

A complex ecosystem requires precise solutions

There’s a better path: Instead of regulating the entire internet, lawmakers should focus their attention on large social media platforms whose business models depend heavily on algorithmic amplification of user-generated content. These companies derive enormous revenue from engagement-driven advertising models that reward keeping users on their platforms for as long as possible.

Policies aimed at limiting manipulative design features for minors, increasing transparency around algorithms, and establishing reasonable duties of care could address these issues without sweeping in good faith actors.

Congress has already begun exploring this more targeted approach. The Kids Online Safety Act (KOSA), which has attracted overwhelming bipartisan support, focuses specifically on platforms that rely predominantly on user-generated content and algorithmic recommendation systems. By concentrating on the companies whose products create the greatest risks for young users, KOSA offers a more precise model for addressing online safety concerns.

Overbroad regulation and unintended consequences

That narrower focus is critical not only for constitutional durability but also for avoiding unintended consequences.

If new legislation imposes sweeping compliance obligations—such as complex age verification systems, extensive data governance requirements, or new liability frameworks—many news organizations could struggle to meet them. Smaller publishers in particular lack the legal and technical resources required to implement costly regulatory regimes designed with massive social media companies in mind.

Possibly even more troubling, some proposals could inadvertently restrict teenagers’ access to credible, fact-checked journalism. If platforms respond to regulatory risk by broadly limiting content for minors, young people could find themselves cut off from some of the most reliable sources of information available online. Teenagers (and everyone else) need more access to reliable journalism from publishers who take responsibility for it. Policies designed to protect young people should not inadvertently make it harder for them to find credible reporting.

Mounting evidence of social media’s impact on youth mental health demands a serious policy response. But effective regulation must be precise. Broad legislation may seem decisive, but it risks violating constitutional protections, burdening responsible publishers, and limiting access to reliable information.

A more focused approach that targets the design practices and business incentives of the largest platforms offers a better path forward. It should hold the most powerful platforms accountable for the environment they create and the choices they make about how young users interact with their services. If policymakers maintain that focus, they can address a generational public health challenge while preserving an open and diverse online ecosystem.

Safe, for now: What Moody v. NetChoice LLC means for media companies

On the last opinion day of the 2023-2024 term, the Supreme Court issued its decision in Moody v. NetChoice LLC. The case involved the constitutionality of two laws that sought to limit social media companies’ freedom to moderate content. What was decided (as well as what wasn’t) in the Moody case will have important implications for digital media companies and will set the stage for the constitutional future of content moderation.

Moody v. NetChoice

In 2021, Florida and Texas enacted laws banning certain content moderation practices for covered platforms. The laws also require individualized explanations for users whose posts have been altered or removed.

Although the laws are similar, there are several key differences between them. Most notably for digital media companies, the Texas law largely excludes traditional media players, including news, sports, and entertainment platforms. The Florida law contains no such exception.

NetChoice and the Computer & Communications Industry Association, two trade associations that represent industry leaders such as Meta, Google, and X, challenged both laws. Both laws were initially put on hold by lower courts, but the Texas law was allowed to take effect following an appeal. The Supreme Court then ordered the Texas law put on hold once again.

In September 2023, the Supreme Court agreed to jointly hear these cases.

The Supreme Court decision  

In a somewhat unexpected move, the Supreme Court voided both judgments, sending the cases back to the lower courts. In a 9-0 decision, the Court found that neither court of appeals had properly analyzed the First Amendment challenges presented by the laws.

The Court took issue with NetChoice’s legal approach, which forced the association to prove that nearly every application of these laws would be unconstitutional. The nature of these laws and the industry they attempt to regulate undoubtedly makes this a monumental task for NetChoice. Given the Court’s criticism of NetChoice’s approach, organizations seeking to challenge similar laws will likely concentrate on challenging specific applications, a strategy that might bring more distinct victories to social media companies.

The Court also noted its dissatisfaction with the analyses carried out by the lower courts. To conduct a proper assessment, the lower courts will first have to determine the scope of these laws (which activities and which actors the laws regulate). They will also need to determine which applications of these laws violate the First Amendment, and then weigh the number of unconstitutional applications against the number of applications that do not violate the First Amendment.

The Supreme Court also directed the lower courts to evaluate whether the laws’ content moderation provisions intrude upon protected editorial discretion and whether individualized explanation provisions disproportionately burden freedom of expression.

Implications for media companies

Although the Court did not decide the constitutionality of these laws, there is still much for digital media companies to take away. Throughout the decision, the Court repeatedly highlighted the importance of editorial discretion. Writing for the majority, Justice Elena Kagan noted that companies engaging in expressive activity are protected by the First Amendment. Because content curation and moderation have been found to fall within the scope of expressive activity, governmental interference with editorial choices would implicate the First Amendment.

The Court’s clear support of editorial discretion and freedom provides digital media companies necessary cover not only from laws of this nature but from broader governmental infringement upon editorial rights. The Court was also clear in its judgment that “the choice of material” constitutes the exercise of editorial control not only for traditional newspapers but for online platforms as well.

However, while the Court was clear in stating that editorial discretion (including content moderation practices) is subject to First Amendment protections, it was not clear whether all content moderation practices are worthy of these protections. This is because, as technology has evolved, the lack of human involvement in content moderation raises questions as to how “expressive” this conduct really is.

In her concurring opinion, Justice Amy Coney Barrett questioned whether an algorithm that “presents automatically to each user whatever the algorithm thinks the user will like” or AI tools meant to remove hateful content would constitute expressive activity. Delving further into this point, Justice Coney Barrett added that technological advancements will increasingly shine a light on the overlap between content moderation and consumers’ rights to decide for themselves what they wish to post and view on social media. This overlap, which is certain to have some degree of constitutional implications, should be top of mind for tech and media companies as they continue to develop and fine-tune their content moderation practices.

In his concurring opinion, Justice Samuel Alito was even more emphatic in raising concerns over the extent to which content moderation constitutes expressive activity. Justice Alito cast content moderation practices in a dubious light, arguing that it is not known how platforms moderate content.

Notwithstanding the accuracy of this assertion, media companies should be prepared for calls for increased transparency around their policies. Justice Alito also criticized the majority for assuming that “secret” algorithms are just as expressive as print news editors. This delineation between the era of print news and today’s digital landscape will surely be echoed by the attorneys general of Florida and Texas, as well as other proponents of government interference in content moderation.

What digital media companies can expect going forward

The majority’s reading of the First Amendment is a positive outcome for digital publishers. In reiterating that editorial discretion, including content moderation, warrants First Amendment protections, the Court enabled media companies to continue curating and moderating content at their discretion. Had the Court ruled adversely, this freedom would have been placed on the chopping block, and media companies would soon be battling a slew of unfavorable laws seeking to manipulate their editorial discretion.

It is difficult, however, to predict to what extent the Court’s opinion will inform the lower courts’ analyses. Regardless of how the lower courts rule on these laws, the Moody decision serves as a warning that while the Supreme Court remains committed to protecting the right to editorial discretion, there is also a lack of clarity, as well as a sense of skepticism, surrounding the scope of these protections over emerging forms of content moderation.

As algorithms and AI continue to grow even more prevalent and sophisticated, the Supreme Court will eventually have to clarify, and possibly draw new constitutional lines around, First Amendment precedents that originated at a time when computers were just arriving in newsrooms.
