By Sarah Myers
Since 2014, citizens of the European Union have had the right to be forgotten: to request that search engines remove certain results about them if the information provided by those results is “inadequate, irrelevant or excessive in relation to the purposes of the processing.” The internet has made our pasts permanent, and certain citizens of the EU wanted to be able to forget (or at least distance themselves from) parts of their pasts.
The EU court uses a restrictive definition of the right to be forgotten and chose an odd group of actors to enact this right. Under the EU’s ruling, people have the right to make information about themselves (for instance, about past court cases in which they were involved) difficult to find by asking that search engines remove certain results. The individual is required to provide justification for removal of the results, and the search engine (not a court of law or a state bureaucracy) determines whether that justification meets the court’s criteria. However, the websites that actually published the information are neither asked nor required to remove it.
So the EU’s right to be forgotten isn’t really a right to be forgotten — it’s a right to be difficult to find. Even that is doubtful — any particularly persistent person can continue publishing the information you’d like to suppress under different domain names in order to keep the information on Google (it’s easier and faster to buy a new website domain name than it is to submit right to be forgotten requests and have them enacted).
The right to be forgotten also empowers tech companies — something that people outside Silicon Valley are increasingly leery about. This is a sticky situation, to be fair. The EU doesn’t want to process requests to be forgotten because this would require funding and employees, and because the optics of a state telling private companies what to put on their websites are awful. Instead, the EU chooses to prioritize the right private companies have to free speech, invariably placing an individual’s right to be forgotten at the mercy of a private company.
The EU’s ruling represents the first recognition of the right to be forgotten by a state (or, more accurately, set of states). It’s a weird sort of right. It didn’t need to be discussed until recently because anyone truly desperate to forget part of their past or prevent new acquaintances from learning about their past could simply move. Without the internet and digital records, moving to the next state over, or perhaps the next country over, was enough to be forgotten.
But that ability to be forgotten isn’t necessarily a right to be forgotten — the fact that people in the past were able to escape their past doesn’t necessarily mean that they deserved to do so. Let’s say that I take a forensic science class and somehow make the professor absolutely hate me. In a fit of rage, my professor uses his expertise to frame me for a gruesome murder. America’s lamestream media jumps on the story of a murderous Stanford student. Six months later, I use my new forensic science knowledge to prove my innocence, but by that time, the story is old news. Going into my senior year, I realize that any potential future employers are going to see stories about my alleged murderous rampage on page 1 of Google. Logically, I submit a request for those results to be removed.
That seems pretty reasonable. Let’s try another example. Various college students (Stanford’s own Brock Turner, who was protected by the administration and swimming team!) have been accused of sexual assault. I’m going to go out on a limb here and guess that at least some of them actually did sexually assault someone — but some didn’t. Let’s say one such student, 20 years after allegedly sexually assaulting someone else, decides to run for Senate. Their obvious first move is to make any incriminating information from college as hard to find as possible. Should Google remove results about their alleged crime?
On the surface, this question could seem easy — if the Senate hopeful was found guilty of sexual assault, Google should laugh in their face. If they were exonerated, Google should remove results about the case. But many cases like this don’t make it to court. They are often adjudicated only by the university where the perpetrator is currently enrolled. Universities are notoriously bad at determining guilt or innocence in this type of case and may never reach a verdict at all. Should search engines be responsible for investigating decades-old alleged crimes? Should they assume innocence and allow a possible rapist to keep their crime a secret?
You and I probably aren’t going to end up in either of those situations. But there is a spectrum of difficult questions between them. All of us are increasingly likely to end up in that spectrum. For instance, the majority of American adults and teenagers use social media. People my age post selfies and vacation photos — and the occasional photo from a party. Some parents post potentially embarrassing photos of their children, ranging from goofy to legitimately shaming. Unless Americans have some kind of heretofore undiscovered supply of foresight, most of us are going to end up regretting some of our social media use. Should Google sweep our indiscretions and oversharing under the carpet?
This very article, just like my other articles for the Daily and the articles I wrote for my high school newspaper, will become a permanent part of my digital identity as soon as it is published. I’ll be the first to admit that that can be a daunting thought. It’s fun to laugh at Ted Cruz’s college theatrical performances until you realize that, to some people, your actions will be just as worthy of ridicule. It feels dishonest to censor myself for the sake of a faceless future employer (and foolhardy, given that I have no idea what future-me will find embarrassing), but it’s also impossible to ignore the permanency of my words. Will I someday find myself asking Google to remove my Daily articles from search results about me?
There are limits on our right to suppress the past, and there are limits on the right of our past to determine our future. The EU has gone further than the U.S. by establishing the right to be forgotten, but its policy has flaws. Limits on the right to be forgotten should probably be determined and applied by all of us, not tech companies. We, as individuals, should probably accept some responsibility for moderating our tendency to overshare, and perhaps we should consider the long-term impact of our online activity. Between those two ideas lie a host of difficult questions, and it’s time to stop ignoring them.
Contact Sarah Myers at smyers3 ‘at’ stanford.edu.