Complexity theory: Facebook’s newfound censorship ethics

Opinion by Juddson Taube
May 13, 2020, 7:15 p.m.

Part of “Complexity Theory,” a column on the tangled questions of our technological age.

Facebook says that organizing protests against stay-at-home orders qualifies as “harmful misinformation” and that such posts will therefore be removed. This latest move joins other recent efforts showing that the company is becoming more willing to take on the responsibility of content moderation, at least in the midst of a worldwide pandemic.

The company has long leveraged its protected legal status as a “platform” (as opposed to a media empire), courtesy of Section 230 of the Communications Decency Act of 1996, which allows it to escape liability for the content on its website. Section 230 has been a faithful shield for Facebook, so for the company to step out from behind it of its own accord demonstrates either a shift in its self-conception or a recognition of changing times.

Reactionaries have cried foul: Facebook, they argue, is abridging the freedom of speech in favor of one side of a public “debate,” in violation of constitutional protections. Is it constitutional to prohibit this kind of organizing or its associated misinformation? Is it reasonable? Our democracy depends on finding the answers.

Let’s move past the facile argument that Facebook is a private organization and can therefore do whatever it wants to moderate its forums. It is more important to view Facebook as what it is: the de facto public sphere for the majority of Americans. According to Pew Research Center, nearly half of us use the platform to get our news and debate it. That reach is why this newfound editorial discretion makes Facebook, a private company providing a public good, reasonably subject to conversations about constitutionality.

That is what makes this action by Facebook such a seismic shift: by sticking its neck out, the company is exposing itself to regulation. It also explains the recent ad campaign for Facebook Groups, which signals a push for problematic content to be shared privately, where it is less subject to being reported.

Free speech advocates are correct about one thing: The government restricting speech solely because of its content is unconstitutional. The Supreme Court has affirmed this time and time again: Any restriction on speech by the government has to be content neutral. But the legal question the courts face is rarely about content alone; it is often about prohibiting the incitement of imminent lawless action. In Brandenburg v. Ohio (1969), the Supreme Court found that speech can be limited if it is “directed to inciting or producing imminent lawless action” and is “likely to incite or produce such action.”

The content currently being removed from Facebook comes from those organizing anti-lockdown, in-person protests. Such posts do not simply voice an anti-lockdown viewpoint; they organize gatherings that would directly violate governors’ executive orders, which places them in a different legal category. Facebook told Politico that it will only “remove the posts when gatherings do not follow the health parameters established by the government and are therefore unlawful.”

It is promising that Facebook is making this change, even if only as a response to pressure and not out of some altruistic effort. Electing to remove this content will no doubt have political fallout for Facebook, even when protected by an indirect precedent. And while private adjudication of all of this content isn’t a perfect solution, because it is too difficult and important a task to trust to a private organization, it is a step in the right direction. The less Section 230 applies to Facebook, the less power it will have as it becomes more accountable to the public. Right now, Facebook enjoys what Tarleton Gillespie has described as both “sides of safe harbor: the right, but not the responsibility, to police their sites as they see fit.”

Constitutionality aside, we must account for the costs of protecting every viewpoint online and of “balancing” every debate in the digital era. Expression has long been treated neutrally by the docents of the new public spheres like Twitter, Facebook and YouTube, despite the extensive community standards they author and fail to enforce with any consistency. But in reality, extreme speech and misinformation are tacitly and algorithmically endorsed, because having a venue for extreme or controversial ideas is highly profitable.

Over the past few decades, we have borne witness to the free exercise of the Second Amendment and to the tens of thousands of lives that serve as annual payments for this freedom. Now we are starting to see the costs of the First Amendment in the form of misinformation: the loss of any shared notion of truth, expertise or authority. The price so far in the case of COVID-19 appears to be thousands of preventable deaths, a figure that may climb much higher. In 1918, the second wave of the Spanish Flu contributed to five times as many deaths as the first.

In the United States, we privilege speech liberalism more than any other nation on the planet does. There are undeniable societal and democratic benefits to legally encoding this particular cultural value. But what do we do when the exercise of rights, and this right in particular, interferes with our neighbors’ right to breathe? I urge us to hold those voices accountable, both publicly and legally. When the death totals start climbing again, do not let them forget.

Contact Juddson Taube at taube ‘at’ stanford.edu.

