Recently, The Verge published a look inside one of Facebook’s deals with a content-moderation contractor. Facebook hires these moderators to screen posts that users report for violating its community standards; moderators review the reported posts and decide whether to delete or allow them. Author Casey Newton convinced several former Facebook moderators, who are generally prohibited by NDAs from discussing their work, to tell him about their experiences. Their stories are deeply upsetting: they are routinely forced to witness extreme violence, constantly monitored, and held to incredibly high standards for speed and accuracy. Accuracy is determined by how often a moderator’s decisions agree with those of slightly more senior moderators, who are given a random sample of a regular moderator’s processed posts and asked to make their own judgments. At Cognizant, for example, moderators must be “accurate” at least 95 percent of the time. Within the Cognizant work site Newton examines, some moderators have responded to constant exposure to the worst of Facebook by buying into the conspiracy theories they screen: one genuinely believes the Earth is flat, another has become convinced that 9/11 was not a legitimate terrorist attack, and another denies that the Holocaust took place.
Ever been frustrated by how long your Netflix episode takes to buffer, or by how blurry your YouTube video looks? A team of computer science researchers at Stanford thinks it can improve video-streaming performance.
Let’s face it: Sometimes you go to sleep after you’ve told yourself a couple of white lies to ease the way. 1. I’m definitely going to sleep at midnight. I … have unhealthy sleep habits. My parents know, my roommate definitely knows, and I very much do know how late I stay up at night. But…
As a parody of a passage in the “Good Book”: “What profiteth a man when he gaineth access to social media but loseth his privacy?”
On Wednesday evening, members of the Kofi Annan Commission on Elections and Democracy in the Digital Age gave a panel discussion on the opportunities for and challenges of electoral integrity created by technological innovations.
On Tuesday, Alex Stamos, former Chief Security Officer (CSO) of Yahoo and Facebook, spoke at the Hoover Institution about cybersecurity’s effect on society and the accountability of technology platforms for protecting their users.
On Tuesday, the Center for Advanced Study in the Behavioral Sciences (CASBS) and the Stanford Cyber Initiative hosted a discussion on the governmental consequences of technological developments with two CASBS fellows, Stanford Law Professor Nate Persily and Carrie Cihak, Chief of Policy in King County, Washington. A recurring theme throughout the lecture was the difficulty…
A place where individuals can “live in an alternate reality” and a “weapon of mass destruction” were among the ways in which Anne Applebaum, Ted Koppel, and Jessica Lessin described the internet’s role in the changing landscape of journalism at Monday evening’s installment of Cardinal Conversations, a recently launched speaker series intended to engage speakers from both sides of the aisle in open political discourse.