Scholars say social media platforms’ freedom to moderate content has important tradeoffs

May 18, 2021, 9:57 p.m.

Leading scholars on the intersection of social media platforms and democracy say that while content moderation by social media platforms can be beneficial, it carries drawbacks: platforms can remove valuable content, and their decision-making processes often lack transparency.

The Tuesday event, hosted by the Stanford Constitutional Law Center, took place amid ongoing debates about the power of social media platforms to regulate online speech and recent accusations that Instagram intentionally removed posts related to the potential eviction of Palestinians from the Sheikh Jarrah neighborhood in East Jerusalem.

Eric Goldman, professor of law and co-director of the High Tech Law Institute at Santa Clara University School of Law, said that social media platform discretion — the ability to terminate accounts and remove content — is a positive aspect of platforms that should not be taken away. 

Many governments, in an effort to limit platform discretion, go directly to companies to demand the takedown of content, according to David Kaye, clinical professor of law at the University of California, Irvine. Kaye cited India, where the government demanded the removal of Twitter and Facebook accounts belonging to individuals critical of the government and its COVID-19 response.

However, according to Goldman, the freedom of platforms to enforce their own editorial policies about what to remove or restrict is important: without this discretion, platforms' servers would be overwhelmed with "terrible content," which he defined as anti-social content that hurts communities and creates barriers to participation in and formation of a community. "Terrible content" is not necessarily illegal, he added.

He added that the loss of platform discretion would harm platforms' business models, as companies would not pay for ads and users would not pay subscription fees to access platforms rife with uncurated, "terrible content."

Goldman also contended that "terrible content" has implications beyond the digital sphere. Allowing all content to live online establishes "a rough-and-tumble norm about how we expect to interact with each other in our entire lives and our entire society," he said. If a norm were established that anything goes online, Goldman argued, people would take that to mean all behavior is acceptable in the offline world as well, despite the harm to society.

While Goldman favors platform discretion, Evelyn Douek, a lecturer on law and an S.J.D. candidate at Harvard Law School, cautioned about its tradeoffs. Though she said she is also wary of an intense regulatory model in which the state controls social media platforms, she warned that platforms with complete discretion are dangerous. She pointed to social media platforms that she said are currently removing "very valuable human rights evidence" in Palestine without transparency about how the decisions are made and what material is being lost.

Kaye added that while he is worried about the government getting involved in editorial content decisions, he sees a significant problem with the lack of transparency in various platforms’ decision-making processes.

“Given the impact that the platforms do have on public life and on public institutions, there’s a real argument for regulation to encourage more transparency,” he said. 

Kaye suggested exploring alternative oversight models beyond government regulation and self-regulation. Specifically, he recommended multi-stakeholder and cross-industry models that can provide insight and a form of non-governmental regulation of the companies. These "can be very successful in providing an avenue for grievance, an avenue for transparency into how companies behave and respond to some significant public problems," he said.

While content moderation and platform discretion are difficult problems with complex proposed solutions, Douek stressed the importance of continued conversations about the topics.

“We’re never going to solve content moderation. We’re never going to get to a point where we have a great set of house rules,” she said. “What we need to do is find a way to continue arguing about this and argue about it more productively for the rest of time, and having a common language and framework in which to do that is a really good start.”

Malaysia Atwater '23 is a senior staff writer and former Vol. 260/261 managing editor in the News section. She is a political science major from Centennial, Colorado, and she enjoys dancing and re-watching Grey's Anatomy in her free time. Contact her at matwater 'at' stanforddaily.com.
