CS + Ethics

Opinion by Anna-Sofia Lesiv
May 6, 2018, 9:36 p.m.

Early this year, research fellow Hilary Cohen and professors Jeremy Weinstein, Mehran Sahami and Rob Reich were pictured in The New York Times. They stood together in the atrium of the Gates Computer Science Building, a determined look on each of their faces. “On Campus, Computer Science Departments Find a Blind Spot: Ethics,” the headline read.

Pre-empting the “techlash” that would soon engulf Silicon Valley with reawakened concerns over tech industry transgressions, the article identified “a common Silicon Valley attitude that has generally dismissed ethics as a hindrance.” Academics like Stanford’s Cohen, Weinstein, Sahami and Reich, however, were finally trying to change that.

“At Stanford, the computer science department will offer the new ethics course, tentatively titled, ‘Ethics, Public Policy and Computer Science,’” the article read. This news seemed exciting, if a little strange. After all, a class called “Computers, Ethics and Public Policy” already existed. Though the course was perceived to be new, Stanford’s computer science department had in fact been offering one like it for the past 30 years. Celebrated, forgotten and soon to be redeployed once again, that class tells the story of ethics education in America’s leading computer science program. This is the history and legacy of CS 181.

***

When the computer science department packed its things from the School of Humanities and Sciences and moved into the School of Engineering in 1985, computer science was the only engineering major without an ethics in society requirement. The department was small and wasn’t graduating many students. Computer science was still a niche area, a field that nobody bragged about at cocktail parties, much less one that attracted throngs of impressionable youth to its ranks.

It was the height of the Cold War, and the United States was still smarting from being beaten into space by the Soviets, believing that a flood of military money into science and engineering labs would somehow compensate for a bruised national ego. At the time, computer science was a new field with unexplored boundaries, but its researchers were already highly sought after.

Terry Winograd joined Stanford’s faculty in 1973, having just finished a Ph.D. in artificial intelligence and natural language at MIT. While there, Winograd took note of the military funding that powered some of the university’s labs. A staunch opponent of the Vietnam War, he was made uncomfortable by the Department of Defense (DoD) funding. After all, in Cambridge, Winograd had been a member of “Computer People for Peace,” a group that staged protests outside of Honeywell, the manufacturer of thermostats, computer monitors and cluster bombs. The cluster bombs, of course, were the true source of the group’s disapproval. And while Terry protested, his wife Carol, a physician who would later teach at the Stanford Medical School, supported the protests by caring for the injured.

When the couple first arrived at Stanford, Terry Winograd was merely filling in for a faculty member who happened to be conducting research in Winograd’s exact field of expertise. Winograd thought the position would be temporary, but when the faculty member never returned, he stayed on the research and teaching staff, never having received a formal interview for the job.

Within a few years, Winograd joined the faculty committee for Symbolic Systems, a major that combined the study of philosophy, linguistics and computer science. It was there that he met Helen Nissenbaum, a recent Ph.D. graduate and a philosopher in a non-tenure-track position at the university. Their mutual interest in computing sparked a unique partnership, and they endeavored to bring the interdisciplinary spirit of Symbolic Systems into the insular, highly specialized department of computer science.

Winograd was intimately acquainted with the dangerous applications of computers. He was aware of the strong presence the DoD had within departments like Stanford’s and MIT’s. He and Nissenbaum realized something was missing from the equation; they seemed to have “[found] a blind spot — ethics.”

The class they introduced together, merely a few years after Stanford began graduating computer science majors in 1986, was called “Computers, Ethics and Social Responsibility.” The inclusion of “social responsibility” was a nod to a larger movement among professionals who decided to leverage the authority of their disciplines to steer society away from dangerous paths. It started in the sixties with physicians who announced their opposition to nuclear weapons, hoping their medical degrees would convince people of the health risks associated with nuclear war. By the time President Ronald Reagan announced the Strategic Defense Initiative (SDI) in 1983, the movement had spread to computer scientists.

The SDI was meant to be a missile-defense system whose software could identify and destroy any incoming nuclear missile, effectively securing the U.S. from nuclear attack. Computer scientists referred to it as “Star Wars.” The fact that the program was infeasible, and certainly untestable, didn’t stop it from recruiting top talent from academic institutions and industry, lending its mission a dangerous veneer of legitimacy.

“Computers had a role in warfare,” Winograd tells me. He made it his responsibility to ensure that computer professionals were aware of this and intentional about the projects they worked on. In the same year that Reagan announced the SDI, Winograd followed the physicians’ example and established Computer Professionals for Social Responsibility (CPSR). The group drew together researchers at Xerox PARC, where Winograd worked part time, and a few Stanford academics concerned about the program. They put out newsletters, opened chapters across the country and even published a book, all to raise awareness of the responsibility computer professionals bore for their field.

This was the environment in which Winograd and Nissenbaum taught “Computers, Ethics and Social Responsibility.” It was a time when the stakes of computer science research were higher than they had ever been, when the consequences were existential and broadcast daily. It was a time when faculty members suddenly became engineers-cum-activists.

The class was discussion-based and populated by students who self-selected to hear Nissenbaum talk utilitarianism and Kant, and to listen to Winograd lecture on professional responsibility and computers. It would be the first and only ethics class offered within Stanford’s department of computer science.

Within a few years, Nissenbaum would be offered a tenure-track position at NYU and move on from Stanford, leaving Winograd without a philosopher. Feeling that he was not sufficiently expert to cover all the material on his own (“I’m not a philosopher,” he tells me today), Winograd could not continue the class. Luckily, a newly hired colleague was eager to take over.

***

Eric Roberts had already heard of Terry Winograd before arriving at Stanford in 1990. After all, Roberts was a member of CPSR’s Boston chapter. Frustrated and concerned by the developments around “Star Wars,” he submitted articles to CPSR’s newsletter, even contributing a chapter to “Computers in Battle,” a book edited by CPSR’s inaugural president. “Can we develop software for SDI that takes into account the inherent complexity of the problem and yet allows us to trust that software to perform correctly the first time it is used under realistic conditions?” Roberts wrote. In the following years, Roberts would serve as CPSR’s national secretary for three years and its president for six.

Roberts and Winograd shared very similar sensibilities. Both were deeply troubled by the outsized role the military played in the development of computer science. At a time when a majority of research at Stanford was funded by the Department of Defense, Roberts and Winograd decided categorically that this was money they would not touch. Memories of the U.S. war in Vietnam were still fresh in their minds, and the consequences of conducting research for the military were too great.

After 28 years of teaching at Stanford, an illustrious career as a computer scientist and multiple publications, including “The Art and Science of Java” and “Programming Abstractions in C++,” Roberts is now retired. He’s slowly moving boxes out of his office to free the space for a future faculty member. But he has a long way to go.

In front of his desk sits a copy of “Utopia” by Thomas More, among a series of other philosophical texts — course material from a class Roberts taught on techno-utopias. Apart from teaching the 106s and other technical courses, Roberts has also taught Introduction to the Humanities, a class on C.P. Snow’s essay “The Two Cultures” and, of course, Winograd’s class, the one that became known as “Computers, Ethics and Public Policy.”

Roberts picked up where Nissenbaum had left off, injecting healthy doses of Kant, Mill and Bentham into the syllabus alongside topics like intellectual property, hacking, civil rights, the right to privacy and monopolies. The course also satisfied the Writing in the Major requirement, and the writing ramped up quickly. In the first weeks of the course, Roberts asked students to compose essays comparing Kantian ethics to utilitarianism. By the end of the quarter, students were expected to submit final projects along with 10- to 15-page papers.

By the time Roberts took over, ethics in society had become a requirement for the major, and eventually the tone of the class began to change. In a report, Roberts wrote that his first year of teaching the course proved a “negative” experience. Students no longer self-selected to study ethics. Instead, as enrollment grew, it increasingly seemed to Roberts that, for many students, the class was merely “time taken away from their start-up.”

***

Three decades after its inception, “Computers, Ethics and Public Policy,” or CS 181, continues to be the only ethics class offered within the computer science department at Stanford. Though its place in the curriculum has stayed constant, everything from its name to its subject matter to its pedagogical approach has varied.

What has changed the most, however, is the environment in which the class is taught. Today, computer scientists hold an unprecedented amount of sway over the most important functions of our lives and our society. In the past few years, we have witnessed internet companies started in America’s dorm rooms become powerful enough to influence elections anywhere in the world. We’ve read increasingly alarming stories detailing how a handful of technology companies have become proprietors of personal information so intimate that we’d be surprised if even our closest friends and family knew it. The stakes were high when Winograd and Nissenbaum, both experts in their fields, started their class. Those stakes are even higher today.

Teaching CS 181 today has meant staying on top of the new controversies, new technologies and new ethical issues that arise within the field of computer science. It has meant trying to hit a moving target rather than follow step-by-step instructions. This has made the class particularly challenging to teach, and even more important to get right. Unfortunately, for many coming out of the computer science major today, ethics once again seems to be merely “dismissed as a hindrance.”

Winograd tells me that with the increasing pace of progress, and with research that is fundamentally low-level and difficult to extrapolate to precise applications, ethics remains a peripheral concern for many computer scientists and researchers today. In Winograd’s words, it’s an attitude of “Oh yeah, I guess I should be thinking about that.”

That’s certainly not the case for Dev Bhargava, the current instructor of “Computers, Ethics and Public Policy.” He thinks the lack of ethics education in the computer science curriculum is a serious problem, and he’s eager to address it. Bhargava has TA’d CS 181 three times and considers it one of the most influential courses he took at Stanford. And though even Winograd, Ph.D. in hand, felt unprepared to teach the class on his own, Bhargava finds himself having to do so with, so far, only a bachelor’s degree in CS under his belt. Bhargava is a coterm, but if it weren’t for him, CS 181 would not even be offered this quarter.

As the number of requirements satisfied by CS 181 has grown, the class has become a crucial nexus for many in the computer science major. Not only does CS 181 satisfy undergraduates’ Ethical Reasoning requirement, it also fulfills Technology in Society and Writing in the Major for CS students. Required classes are notoriously difficult to teach at Stanford, as the effort it takes to motivate students eager to check off boxes can overwhelm instructors. Nonetheless, Bhargava has tried to take after Keith Winstein, who taught the course in the winter, and adopt a case-based approach to ethics, assigning students contemporary readings and short memo responses, a significant departure from the class’s early days, when students wrote essays comparing the categorical imperative and utilitarianism. He tries to foster meaningful discussion and treatment of the topics, but leading such dialogue is an art in itself. Bhargava realizes how fragile the knowledge drawn from the course can be; he tells me that the slightest change in instruction can significantly affect how students perceive the material.

Lecturers in CS 181 are given maximal latitude in course preparation so that they can address the most contemporary ethical questions. However, what might be seen as latitude could also be construed as disinterest. In the first four weeks of the quarter, no one from the department has come to observe Bhargava lecture to his class.

Enrollment in “Computers, Ethics and Public Policy” is capped at 50 this quarter, providing a classroom dynamic somewhat more intimate than the class of 150 that Winstein taught in the winter. The class to be offered by Hilary Cohen, Rob Reich, Mehran Sahami and Jeremy Weinstein is expecting many more students in the fall. They’re planning for hundreds. And unlike CS 181 this year, the new class will not be offered in multiple quarters. It will be offered only once.

Presently, the instructors and their teaching staff are busy constructing a syllabus that will provide an integrated, interdisciplinary perspective on ethics in computer science, treating the question of ethics through the disciplines of philosophy and political science. The case-based approach to ethics will remain. After all, “you can’t get students engaged by asking them to write an essay comparing Kant and Mill’s utilitarianism,” Reich tells me. However, it’s likely that students will be asked to solve technical problems as well. Debugging a biased neural network is one example of an assignment that might feature in the class come fall.
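To give a rough sense of what such an assignment might involve (a minimal sketch; the data, the numbers and the simulated model below are invented for illustration, not drawn from the planned syllabus), one common first step in debugging a biased classifier is simply measuring its accuracy separately for each demographic group:

```python
# Hypothetical illustration: auditing a classifier's error rates by group.
# All data here is simulated; no real model or dataset is assumed.
import numpy as np

rng = np.random.default_rng(0)

# Simulated group membership (0 or 1) and true labels for 1,000 people.
groups = rng.integers(0, 2, size=1000)
labels = rng.integers(0, 2, size=1000)

# Simulate a "biased" model's predictions: it misclassifies 5% of
# group 0 but 30% of group 1.
flip = rng.random(1000) < np.where(groups == 0, 0.05, 0.30)
preds = np.where(flip, 1 - labels, labels)

# The audit itself: report accuracy per group.
for g in (0, 1):
    mask = groups == g
    acc = (preds[mask] == labels[mask]).mean()
    print(f"group {g}: accuracy = {acc:.2f}")
```

Spotting a gap like that is the easy part; deciding what caused it and what counts as a fair fix is presumably where the ethics discussion would begin.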

It remains a question, however, whether the class will succeed in being as impactful an experience for students as possible. “An ethics class really should be discussion based,” Roberts said when reflecting upon his experience. Indeed, it’s unclear whether the sheer size of this class will prevent the kind of discussion required in a course meant to teach critical thinking and good decision making. Reich, however, tells me that the physical architecture of the space in which a class is taught can matter more for meaningful discussion than class size alone. He envisions teaching in a space where students can break into smaller groups to discuss topics raised in lecture.

A great deal of effort and energy is being poured into making the new form of CS 181 as rich and educational a class as possible. It is a difficult job, however. As the only ethics course in the department, CS 181 must make up for all the ethical material not covered by other classes, placing a heavy burden on the instructors and the syllabus. “I think technical classes should incorporate units on ethics too,” says Bhargava. That way, students would be exposed to ethics in more than just one CS class. In fact, as it stands today, CS majors can satisfy the Ethical Reasoning requirement by taking any other class that treats ethics broadly, without particular application to computer science. They can complete their degrees without ever really having to confront computer ethics issues at all.

The secondary status of ethics in computer science is not only a sense many students get from the composition of the curriculum. It’s also the sense they get from the department itself. “It seems as though some professors are unaware of the conflicts of interest that may arise when they accept money from industry to conduct research,” says Noah Arthurs, a CS major who started EthiCS, a Stanford student group eager to spark more discussion of technology and ethics.

Indeed, faculty members in the department of computer science are granted one day a week on which they are free to work part time for, or consult with, tech companies, an arrangement without which it would be difficult to incentivize qualified CS faculty to remain in academia. Winograd himself went on to consult at Google for a few years. After advising Larry Page at Stanford, he worked on the early stages of the company alongside his former student.

It goes without saying that the porous boundary between the tech industry and Stanford’s CS department raises a host of ethical issues that demand further examination.

Reich agreed that there are ethical questions surrounding the current arrangement but said it is unclear how they should be addressed. He did note, however, that “Stanford’s reputation has been bolstered by its proximity to Silicon Valley,” and that “continuing to maintain this proximity means that should Silicon Valley’s reputation decline, Stanford’s reputation could follow.”

The field of computer science today is home to far more gray areas than ever before. Without firm guidance from Stanford faculty or a rigorous ethical education, students may graduate with outstanding technical skills but be unprepared to respond to the moral dilemmas they will confront later in their careers.

It’s remarkable how much of the onus to acknowledge the importance of ethics in this field rests on students. Ever more student groups like EthiCS and CS + Social Good are forming to address the issue, but without strong signals from the department, it’s unlikely that ethics will remain at the forefront of students’ minds. Cameron Ramos, a senior who took CS 181 in the winter, recalls Keith Winstein asking the class on the first day, “How many of you think you will write software that will affect people’s safety?”

Out of a class of 150, only two hands went up.

Contact Anna-Sofia Lesiv at alesiv ‘at’ stanford.edu.
