Magazine: Your philosophers were so preoccupied with whether or not they should, they didn’t even stop to see if they could

Opinion by Nick Pether
June 6, 2017, 12:27 a.m.

In the original “Jurassic Park” movie, Jeff Goldblum’s character, Ian Malcolm, admonishes the park’s owner, John Hammond: “Your scientists were so preoccupied with whether or not they could, they didn’t stop to think if they should.” It’s a not-entirely-subtle appropriation of Mary Shelley’s “Frankenstein,” a parable about science and technology pursued recklessly by a naive scientific genius who hadn’t properly considered the moral implications of his work, with dire consequences for all. In this version, creating a bunch of dinosaurs predictably results in the dinosaurs eating everyone. As I said, it’s not a subtle movie.

Flash forward to Stanford University, present day, and you will hear Ian’s concern echoed a lot. STEM majors must fulfill a Technology in Society requirement as well as numerous humanities requirements. I myself am part of EthiCS, a student group recently formed to promote discussions about responsibility in tech. Emma Pierson’s recent piece in WIRED, “Hey Computer Scientists, Stop Hating On The Humanities,” argues that an understanding of the humanities is necessary to help technologists understand the societal implications of what they do. For example, she describes one well-publicized instance where “an algorithm that fulfills basic statistical desiderata is also a lot more likely to rate black defendants as high-risk even when they will not go on to commit another crime,” arguing that Michelle Alexander’s works on mass incarceration might serve as a more useful guide to navigating this problem than an algorithms textbook. The gist is that we need to ensure scientists and technologists have the values to avoid doing harm deliberately and the humility and understanding to avoid doing harm accidentally. STEM education alone doesn’t provide that.
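The effect Pierson describes is worth making concrete: when two groups have different underlying re-offense rates, a risk score that is well calibrated within each group will generally flag the higher-base-rate group’s non-reoffenders as “high risk” more often. The toy simulation below is a minimal sketch of that trade-off, not the actual algorithm at issue; the base rates and distributions are illustrative assumptions I’ve made up for the example.

```python
# Minimal sketch of the calibration vs. false-positive-rate trade-off.
# Base rates and Beta distributions below are illustrative assumptions,
# not real data and not the algorithm discussed in Pierson's piece.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

def simulate(mean_risk):
    a = mean_risk * 10
    b = (1 - mean_risk) * 10
    p = rng.beta(a, b, size=n)             # each person's true re-offense probability
    reoffends = rng.random(n) < p          # outcomes drawn from those probabilities
    high_risk = p >= 0.5                   # the score reports p exactly, so it is calibrated
    return np.mean(high_risk[~reoffends])  # false-positive rate: flagged but never reoffended

print("FPR, lower-base-rate group :", round(simulate(0.3), 3))
print("FPR, higher-base-rate group:", round(simulate(0.5), 3))
```

With these made-up numbers, non-reoffenders in the higher-base-rate group get flagged several times more often, even though the score is equally well calibrated for both groups.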

I want to make it absolutely clear that I am on board with this sentiment. Technological prowess is not the same as wisdom, and I find it terrifying that people are building potentially world-changing technologies without a sophisticated grounding in ethical theory or the ability to anticipate and reason about the consequences of technological change. I agree that the humanities can help provide this, by informing people’s values and worldview and by allowing scientists and technologists to better empathize with the people their work might affect.

I still feel the need to echo Amy Shen of the Stanford Review in saying that this sort of thing needs to cut both ways if those who study the humanities are to fulfill their duty of helping humanity navigate the many thorny ethical dilemmas ahead.

First, I think it is the duty of academics and philosophers to figure out what really matters and what we ought to care about. But even some of the most fundamental ethical questions cannot be answered without answering scientific questions first.

Take, for instance, the question of who gets to be counted as a person and therefore included in our circle of compassion. Modern neuroscience has displaced Cartesian dualism, the notion that the mind is independent of the physical brain, as the dominant account of how consciousness works. That shift has lent support to the belief that nonhuman animals are conscious, and that their thrashing around in pain is a product of actual suffering rather than the meaningless reactions of an unfeeling and unthinking machine. This has some pretty profound and obvious implications for the moral acceptability of the ways we treat animals. The moral status of infants, coma patients, insects and emulated people cannot be properly addressed without first furthering our understanding of cognition and neuroscience.

Then there’s the responsibility of the humanities intelligentsia to anticipate the challenges ahead of society and convince us to act on them. If, as I believe, many of the most important problems ahead of us will come from technology, we need our skeptical, naysaying journalists and public intellectuals to actually know what they’re talking about. You can’t assess the severity of threats from climate change, racist machine learning systems or bioengineered pandemics unless you really understand how climate change, machine learning and bioengineering work. Without that understanding, you can’t effectively communicate these threats to the public, and you certainly won’t convince gung-ho Silicon Valley technologists that they ought to apply the brakes.

Scientifically ignorant intellectuals run the risk of failing to spot important problems, which is a pity, because they might be the only people who would act on those problems if they saw them. Scientists and technologists have a vested interest in their work, so they can’t necessarily be trusted to stop when they ought to, even when they know they should.

On the other hand, commenting on new discoveries and innovations without a proper understanding of the principles involved risks misinforming the public about what a discovery or innovation really entails. Journalists who don’t really know what they’re talking about risk downplaying the potential benefits of a new technology, or creating unnecessary alarm about a threat that doesn’t actually exist. It’s all too easy to imagine wild sci-fi fear-mongering about the future implications of some technology leading to widespread hysteria, or to unnecessarily cumbersome regulation of beneficial fields and industries unlikely to produce anything dangerous.

Finally, I’d like to point out that just as it’s incredibly annoying for humanities majors to be told their work basically amounts to pointless navel gazing, it is every bit as annoying for STEM people to read that the current state of their field basically amounts to Juicero (the stupid, expensive juice machine all the cool kids are writing about). I wish every journalist writing about how Silicon Valley is “a stupid libertarian dystopia where investor-class vampires are the consumers and a regular person’s money is what they go shopping for” would at least acknowledge that companies like Wave (which helps immigrants send remittances to their families in East Africa) and Memphis Meats (which creates cultured meat with the potential to spare millions of animals a lifetime of torment every year) are also part of the Silicon Valley ecosystem they dismiss so haughtily.

Overall, I agree that an either/or approach to science and the humanities is a mistake. STEM people who disregard the humanities are annoying, might not be that useful and may end up blowing us all up. Humanities majors who don’t pay more than superficial attention to what’s happening in STEM fields can be equally insufferable, and just as likely to waste everyone’s time or do something harmful or stupid. Holistic education and respect for all disciplines or bust!

Contact Nick Pether at npether ‘at’ stanford.edu.
