Three principles in CDA 230 that make Wikipedia possible

Photo by James Petts, CC BY-SA 2.0.

Editor’s note: On March 21, 2018, the United States Congress passed an amended version of SESTA. We have sent a statement on it to our public policy mailing list.

Imagine an internet without social media, conversations, or the rich stores of free knowledge created by Wikipedia editors. An internet without content created and shared by anyone. That’s the internet we’d have without Section 230 of the U.S. Communications Decency Act, which provides important legal protections for websites that host user-generated content.
The U.S. Senate Commerce Committee recently approved the Stop Enabling Sex Traffickers Act (S. 1693, or SESTA), a bill intended to address online sex trafficking that would nonetheless threaten core protections granted by Section 230. The House of Representatives has considered similar changes in recent months. As lawmakers reexamine parts of Section 230, it’s important to remember the law’s goal and essential elements.
The Wikipedia we know today simply would not exist without Section 230. User-driven projects could not thrive if websites were subject to greater liability for user content, and certainly could not be supported by a small nonprofit organization like the Wikimedia Foundation. For that reason, we have some serious concerns about the potential impact of SESTA and other amendments to Section 230. That’s why our Executive Director emphasized Section 230’s importance for Wikipedia’s hundreds of thousands of volunteer contributors in a recent campaign by the Electronic Frontier Foundation, and why we submitted a letter to the Senate Commerce Committee expressing the importance of Section 230 for the Wikimedia projects. The current bill does not reflect the careful balance that preserves small, nonprofit community projects like Wikipedia.
Here is how this balance works.

  1. Website operators need freedom to review content without legal risks

The fundamental goal of Section 230 is to keep the internet free and safe by encouraging operators to host free expression and remove problematic content without the disincentive of possible lawsuits.
SESTA introduces a vague standard for website operators that expands liability for “knowing” support of certain criminal activity. This will encourage websites to avoid gaining knowledge about content (to avoid liability) instead of actively engaging in content moderation.
As currently drafted, SESTA would amend the federal sex trafficking statute (18 U.S.C. § 1591) to state that participation in a sex trafficking venture occurs when a party, such as a website, is “knowingly assisting, supporting, or facilitating” a sex trafficking crime. While clearer than the broad “knowing conduct” standard that appeared in earlier versions of SESTA, this language could still make websites unintentionally liable for facilitating criminal activity if they engage in proactive, yet imperfect, monitoring efforts.
The Wikimedia projects are maintained by the collective monitoring efforts of thousands of volunteers worldwide, who promptly remove vandalism and other content that does not follow the community policies. The Wikimedia Foundation Terms of Use provide another baseline for the Wikimedia projects, and violations of those Terms are often removed by volunteer users, or may be reported to the Foundation for removal. The Wikimedia Foundation is able to rely on this community self-governance model, in large part, due to the existing clear protections provided by Section 230.
The ambiguity of the “knowledge” standard in SESTA poses problems for any website that welcomes user-generated content, including those operating on a community self-governance model. Any new laws affecting websites must be drafted carefully to state website operators’ obligations and responsibilities when they are alerted to unlawful content. There must be clear methods for reporting such content to the appropriate law enforcement authorities, without accidentally triggering liability. Without these clear guidelines, website operators will be unable to review new content without risking significant new liability.

  2. The internet needs consistent national standards, not state-specific rules

The internet is built on connectedness, and its advantages come from the ability to share information across borders. Section 230 is a federal law, providing one single, clear standard across all 50 states for when websites can and cannot be held liable for the speech of their users. Website operators in the United States should not have to navigate 50 different, potentially incompatible state rules.
SESTA would amend Section 230 to allow, for the first time, civil and criminal liability for websites under state law as well as federal law in cases where the federal sex trafficking law has also been broken. This improves upon an earlier version of the bill, which would have allowed for much broader liability under state law. Website operators should not have to monitor and attempt to comply with differing laws in all 50 states. Doing so would require substantial time and resources just to stay aware of new laws and ensure compliance, which would be particularly difficult for a small company or nonprofit like the Wikimedia Foundation. It also would put operators in an impossible bind if two states passed laws with contradictory requirements.
The latest version of SESTA avoids the most troubling consequences that could result from competing state standards. Amendments to Section 230 must not upend the balance and predictability of a single national standard that websites have relied on for over 20 years.

  3. The law should not create barriers to smaller website operators and new innovation

The original goal of Section 230 was to provide legal protection for website operators and create room for new forms of innovation. Over 20 years later, these protections remain most crucial for small and emerging platforms.
When plaintiffs target online speech, they often go after the website, not the speaker. It can be difficult to track down individual users, and suing a website may appear to be more lucrative. For two decades, Section 230 has shielded websites from civil liability for user-created content. Critically, Section 230 does not prevent websites from being held responsible for their own actions—websites that are directly involved in illegal activities can already be prosecuted by the Department of Justice. However, SESTA would open up websites to more liability under federal and state law, likely resulting in increased litigation. Some of these lawsuits will be legitimate responses to improper conduct by websites; others may simply target the website over the speaker as an easier way to attack online speech. Even if these lawsuits are meritless, getting them dismissed demands significant time and resources.
Small internet companies, startups, and nonprofit websites like the Wikimedia projects lack the resources to defend against a flood of lawsuits. Websites shouldn’t be sued into the ground, or afraid to even launch, simply because of holes in Section 230’s protections. Any amendments to Section 230 must take into account their effects not just on large, well-funded tech companies, but on startups and nonprofit organizations as well.

———

We believe that Congress got the balance right when it passed Section 230 back in 1996. When users go online, they are responsible for their own words and actions. When websites act in good faith to keep their communities healthy and free of toxic content, they know that they can do so without undertaking new risks. When websites engage in unlawful conduct themselves, Section 230 provides no shield from prosecution. For over two decades, Section 230 has encouraged good-faith content moderation, under a single federal standard, and protected not only large websites, but also small startups and nonprofits. We urge Congress to avoid disrupting the balance that has made projects like Wikipedia possible.
Leighanna Mixter, Technology Law and Policy Fellow, Legal
Wikimedia Foundation

This post is also available on Medium

Archive notice: This is an archived post from blog.wikimedia.org, which operated under different editorial and content guidelines than Diff.
