section 230 Archives - Digital Content Next

Accountability is not censorship
September 18, 2025 | https://digitalcontentnext.org/blog/2025/09/18/accountability-is-not-censorship/

The recent killing of Charlie Kirk—regardless of one’s political alignment—has intensified national reflection on the state of our political discourse. Violence against anyone for their beliefs is an assault on democratic values. This moment has sparked rare bipartisan calls to reject incendiary rhetoric and recommit to civil engagement. 

Radicalization in politics is not new. But today, it is amplified, monetized, and normalized through the very platforms where our public discourse now lives. It feels more commonplace today than at any point in American history. That’s not necessarily because it happens more often, but because it’s more visible, more immediate, and more inescapable in an era of social media and live-streamed video and audio. We’re seeing and hearing things we might never have been exposed to in the past. 

Algorithmic amplification and indefensible immunity 

This is no accident. Social media algorithms are explicitly designed to maximize user engagement—not accuracy, civility, or truth. The most inflammatory content is rewarded with amplification, regardless of whether it’s true, defamatory, or dangerous. This creates a system where extremism is not just tolerated but incentivized. The result is an environment that’s not just toxic — it’s legally unaccountable. 

Section 230 of the Communications Decency Act was originally intended to protect online platforms from liability for user posts. But today, it provides near-total immunity to the largest tech companies—even when their own algorithms actively promote harmful, illegal, or even deadly content.  

For example, in Gonzalez v. Google, the family of a U.S. citizen killed in a Paris terrorist attack argued that YouTube’s algorithm actively recommended ISIS content. And yet, courts shielded Google from liability under Section 230. When a multibillion-dollar company can engineer its systems in a way that results in the promotion of extremist propaganda and then disclaim all responsibility, we must ask: What is the purpose of a liability shield that protects this behavior? 

Section 230 protections have shielded platforms from accountability even in tragic and preventable cases: 

  • In Doe v. MySpace, courts dismissed the claims of families whose children were sexually exploited, ruling that platforms aren’t responsible for foreseeable harm arising from user interactions. 
  • In Doe v. Snap Inc., parents whose children died from fentanyl-laced drugs bought via Snapchat were similarly blocked from pursuing legal remedies—even though Snapchat’s disappearing message design arguably enabled the illegal activity. 

These are not edge cases. They reveal a systemic failure: social media companies face no consequences for design choices that would be unacceptable for other types of companies. Section 230 has become a legal firewall for product decisions that would not pass muster in any other industry.  

Congress has an urgent responsibility to reform this law. At minimum, companies should be required to: 

  • Take reasonable steps to prevent foreseeable harm enabled by their platforms, 
  • Be transparent about how algorithms influence outcomes, 
  • Be held liable when platform features contribute to illegal activity. 

Tech companies argue that any effort to place guardrails on social media amounts to censorship. In reality, they are protecting their bottom line. Reform would threaten the low-cost, high-profit business model that relies on unfettered data extraction and behavioral manipulation. 

In every other industry, companies are held accountable for the products they design—especially when harm to children is involved.  

News organizations are held to account for what they publish—often in court. Whether it’s a private citizen or a powerful public figure, individuals have legal recourse when they believe they’ve been wronged by the press. Consider the high-profile case of Hulk Hogan, who sued Gawker Media for invasion of privacy and won $140 million in damages—a verdict that ultimately forced the company into bankruptcy. That case underscores a fundamental principle: when media companies cause harm, they can be held liable. 

In other industries, many major companies have been held liable for selling defective or unsafe products that led to the deaths of children, resulting in multimillion-dollar verdicts and settlements. IKEA settled for $46 million over dressers that tipped over; Fisher-Price paid a $13 million penalty and undisclosed settlement amounts and was forced to recall Rock ‘n Play sleepers tied to over 100 infant deaths; and Evenflo is currently facing multiple lawsuits and investigations for marketing “safe” booster seats despite allegedly having internal data showing a high risk of injury or death.

It is outrageous that parents who have lost a child to suicide because of social media algorithms don’t have the same opportunity for justice. 

Considering the corrosive impact of online extremism, we must expect more—from platforms, from policymakers, and from ourselves. The best way to restore a healthier political discourse, a safer digital environment—and a safer world—is to make social media companies legally responsible for the products they design and the harm those products cause. This is not a debate about censorship. It’s about accountability. If your business model profits from harvesting Americans’ most personal data, you must also bear responsibility when that model causes real-world harm. 

Accountability and opportunity go hand in hand
April 8, 2021 | https://digitalcontentnext.org/blog/2021/04/08/accountability-and-opportunity-go-hand-in-hand/

Relationships matter. As humans, we diverge from acting out of self-interest to accommodate the people with whom we have relationships. This might mean little things like saying “thank you” or holding the door open for the person behind you. It could be bigger things like buying a birthday gift for a friend or helping a neighbor in need.

People make sacrifices every day for those they care about. And, in any kind of relationship, there is some level of accountability. If I am a jerk to a grocery store clerk, the five minutes during checkout could be really awkward or that person might decide to double-charge me for an item. Being rude to my server is not likely to speed up my dinner order. If I’m inconsiderate of my wife, I will probably be miserable until I make amends. These sorts of simple relationship dynamics play out hundreds of times every day.

The relationship business

Commercial relationships have similar dynamics. From my perch at DCN, I see premium publishers working hard every day to earn the trust and loyalty of consumers. News organizations employ journalists, who investigate and check facts, and editors, who vet content and ensure rigorous standards are followed. If they mislead, they can be held accountable under libel laws. If they fail to engage and inform, they lose traffic and advertisers or subscribers.

Movie companies hire directors and actors to create humor, drama, or horror to entertain consumers. If they don’t do so, their movies fail to draw at the box office and they command lower fees in other distribution channels. They lose money.

Whether it’s weather, health, sports, or financial information, publishers in every vertical and across every medium work hard to create quality, compelling consumer experiences. In all of these cases, the publisher’s brand is closely tied to the content because the publisher is trying to build a relationship. And, as with any successful relationship, trust and accountability are key to developing a deeper commercial relationship with people as well.

Responsibility issues

Some of the currently pressing public policy issues have arisen in areas where there is little accountability to consumers. One big example is Section 230. It was enacted into law in 1996, as part of the Communications Decency Act, when the burgeoning tech industry was a darling of all politicians. Things have changed dramatically since those halcyon days, with multiple members on both sides of the aisle introducing legislation to overhaul or eliminate Section 230.

Ironically, Section 230 was intended to empower platform companies to take responsibility. Instead, this liability shield tends to be used mostly by companies that can’t or won’t take full responsibility for their services. Tech companies tend to use Section 230 to avoid taking action. Backpage was one of the highest-profile examples. But Facebook, too, regularly invokes the legal protections to avoid responsibility for the toxic content flowing across its services.

It’s not a coincidence that news organizations are far less reliant on Section 230 than platforms, because they stand behind their content. Content is their calling card and if customers reject it, that relationship is over.

Accountability issues

Consumer privacy is a hot topic these days because there are big tech companies building profiles about consumers behind the scenes with little transparency or accountability. From hyper-targeted advertising to potential discriminatory offerings, consumers are increasingly aware that they are being manipulated and their data is being used for myriad unexpected purposes.

Consumers have generally felt fine about their data being used within the context of a relationship with a company – e.g. ensuring the site or app loads properly on their device, remembering log-in information, or recommending new content. However, when data is used outside of that relationship, consumers react negatively. Hence, the blowback for Facebook around Cambridge Analytica. These public policy spats underscore a key difference between companies that have direct relationships with consumers versus those that are intermediaries. Direct relationships create accountability.

Building business with relationships

Accountability is an inherent part of direct relationships. That said, solid relationships provide opportunity as well.

The most visible sign of the power of direct consumer relationships is the growth of subscriptions. The New York Times and Netflix are notable success stories. However, hundreds of other media brands are finding loyal audiences that are more than willing to pay for premium content.

In addition, publishers with trusted brands are well-positioned to thrive in a world where privacy laws and tech controls increasingly restrict web-wide data surveillance. Whether it’s GDPR in Europe, CCPA and CPRA in California, or the handful of other states that are actively considering privacy laws, policymakers are trying to give consumers greater control. They seek to prevent the kind of unexpected data harvesting that happens outside of a consumer’s relationship with a company.

At the same time, key companies are rolling out privacy-friendly features. Apple built Intelligent Tracking Prevention (ITP) into Safari and is preparing to unveil App Tracking Transparency (ATT). Both are designed to crack down on companies following consumers everywhere they go online. However, they do allow for tracking within a consumer’s direct relationship with a company.

Google announced that Chrome would follow the lead of every other major browser in blocking third-party cookies. To be clear, there are a lot of suspicions about whether Google might try to give itself preferential treatment here. But, at its core, it looks like a positive move toward consumer privacy.

Real relationships

Companies with direct, trusted relationships have an opportunity. This window of opportunity, especially for news providers, could not come at a more important time for publishers — and for our society. The news industry has taken a beating in the last decade or so as intermediaries aggregated publishers’ content and retargeted audiences. Big tech platforms incentivized scale over trust. On top of that, there has been a raging debate about the impact of platform-driven disinformation and algorithmic bias on our democracy.

Well-paid lobbyists for some big tech companies are actively working to deflect accountability. However, publishers are embracing the direct, trusted relationships (and the ensuing accountability) they enjoy with consumers. News organizations continue to produce and stand behind quality journalism – researched, fact-checked and vetted. Local news publishers are leaning into what has always made them unique – critical context and deep understanding – to serve their communities.

Strong relationships are built on open, honest accountability. They are built on trust. For quality media brands, this is nothing but good news.

Confused about section 230? You’re not alone.
August 22, 2019 | https://digitalcontentnext.org/blog/2019/08/22/confused-about-section-230-youre-not-alone/


Hint: Try thinking about it in terms of user-submitted content


Passed in 1996, the Communications Decency Act includes a line under the Section 230 heading which reads: “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”

Written by (then) United States Representatives Chris Cox and Ron Wyden, that line has been the shield online outlets have used to protect themselves from liability for certain kinds of content posted by users to their websites. It is important to note that Section 230 has nothing to do with content composed by publishers themselves (user comments, however, certainly count). Because of this, the law has become mired in the debate about online speech and the role moderation plays on online content platforms.

The statute has become a focal point for both sides of the American political spectrum, for almost diametrically opposite reasons. Republicans have claimed that it enables the suppression of conservative voices on popular social media platforms. Democrats, on the other hand, question why racist or violent content remains online despite calls for increased moderation.

Resolving these different perspectives won’t be easy. But what certainly isn’t helping is the recurring problem of Section 230 being misinterpreted and misrepresented, which only compounds the confusion over an already-complex issue. Corrections have been issued: Vox needed to rewrite a piece back in May with a mea culpa, and more recently The New York Times walked back a claim that Section 230 protects online publishers of hate speech (that protection comes from the First Amendment).

Conspiracy theories

Some have accused legacy media of being behind some sort of coordinated attack to undermine the power of big tech. However, Jeff Kosseff, an expert on the subject, offers another explanation.

“I think the problem, really, is just that it’s a complex issue,” he says. “And people don’t think it’s a complex issue because it’s a short law. I mean, the core of it is 26 words. But the problem here is that it’s not intuitive.”

Kosseff is a cybersecurity law professor at the United States Naval Academy and the author of The Twenty-Six Words That Created the Internet, the 328-page book that tells the story of Section 230. One thing Kosseff is pretty sure about: there’s no conspiracy. “I seriously doubt that the Sulzberger family is telling the New York Times business section to write that Section 230 protects hate speech. That seems a bit too far-fetched for me,” he says.

Historical perspective

That’s not to say all of the media coverage has been wrong, either. Kosseff, who worked for over seven years as a reporter at The Oregonian, points to pieces in Wired and the MIT Technology Review (both of which cite him) as successfully communicating Section 230’s intricacies. Going forward, though, Kosseff recommends that reporters covering the online speech beat develop a better understanding of the statute and its history.

Although passed in 1996, Section 230 originally came about as a way to carry forward a legal principle decided in the 1950s and apply it to the online medium. Since Smith v. California in 1959, distributors of content have enjoyed protection against liability for the nature of content they do not produce. (At that time, “distributor” meant an outlet like a bookstore.)

The need for further clarity emerged after legal battles in the early 1990s. Two internet companies were targeted by people who felt they had been harmed by content found on the companies’ respective services, and the cases resulted in two different legal outcomes. The first company, CompuServe, was spared liability for content on its platform because it had decided not to perform moderation. However, Prodigy, which did edit what was found on its service, was deemed liable for the content on its site.

Net not neutral

These two distinct outcomes are likely at the root of a common misinterpretation of the law: that a content platform needs to be a neutral party to defend itself from any legal claims. That is not the case. Section 230 was written so that content providers could decide what was added to their sites by third parties without the risk of legal action.

According to Kosseff, this misinterpretation could be connected with the way the legal departments of news sites have historically navigated the law in relation to examples like crowdsourced projects, user-submitted reviews, and comment forums. “A huge misconception that even lawyers at news organizations have is that if you make any edits or moderate user content, you lose your Section 230 protections,” says Kosseff.

This point of confusion has manifested itself in the debate over the roles of publishers versus platforms. A May op-ed authored by GOP Senator Ted Cruz argued that Facebook risks losing its protection as a neutral platform and instead taking on the liability of a publisher. Except Section 230 specifically means that if you allow third-party posting, you do not count as the publisher of that content (with some exceptions, such as content that breaks federal criminal or intellectual property law). The difference between publishers and platforms is “not really a distinction under the law for Section 230,” says Kosseff.

In addition to encouraging reporters on the topic to do their due diligence, Kosseff recommends that publishers retain legal counsel with the most up-to-date knowledge. That’s the key not only to reporting correctly on Section 230, but also to managing news and other media websites with user-generated content, because the most common misunderstanding of the law can have negative consequences.

“It’s terrible for business reasons,” says Kosseff. “Because if you’re just sort of voluntarily keeping up all the worst content, you’re going to be driving readers away. And there’s no reason for you to do that.”
