Mistreated moderators and the pervasive violence of the internet

Recently, The Verge published a look inside one of Facebook’s deals with a content moderation contractor. Facebook hires these moderators to screen posts that users have reported for violating its community standards; the moderators review each reported post and decide whether to delete or allow it. Author Casey Newton was able to convince some former Facebook moderators, who are generally prohibited by NDAs from discussing their work, to tell him about their experiences.

Their stories are deeply upsetting: the moderators are routinely forced to witness extreme violence, constantly monitored and held to incredibly high standards for speed and accuracy. Accuracy is determined by how often a moderator’s decisions agree with those of slightly more senior moderators, who are given a random sample of a regular moderator’s processed posts and asked to make their own judgments. At Cognizant, for example, moderators must be “accurate” at least 95% of the time. Within the Cognizant work site Newton examines, some moderators have responded to constant exposure to the worst of Facebook by buying into conspiracy theories. One person genuinely believes the Earth is flat, another has become convinced that 9/11 was not a legitimate terrorist attack, and a third denies that the Holocaust took place.