By Claire Zabel
Last Friday at Stanford, President Obama declared the cyberworld a modern-day “Wild, Wild West,” a place where hackers are desperadoes and laptops are their weapons of choice. Cybercrime is on the rise and, although tech companies have taken serious steps to quell this threat when it affects them directly, they have been less enthusiastic when it doesn’t. Intel’s security division has estimated the annual global cost of cybercrime at more than $400 billion. This puts tech and other companies on the defensive against hackers. Meanwhile, law enforcement plays offense by investigating and prosecuting cybercrime. But they are cautious allies, not teammates, and they don’t always follow the same playbook.
Technology is developing quickly, and more secure ways of protecting our private data are being released every day. But these protections can put information beyond the reach of law enforcement, causing it to “go dark.” Although technology companies have served critical roles in assisting the efforts of law enforcement (often designing their products with built-in channels that allow law enforcement to access crucial data), this is no longer the norm.
Apple CEO Tim Cook recently talked about developing systems (such as iOS 8) that “don’t store the details of [your] transactions,” making data inaccessible even to the companies that build them. This means the data on these devices, some of which may be or may become critical to law enforcement investigations, are irretrievable.
Apple washes its hands of how its devices are used, advertising on its information page that it cannot respond to government warrants for data extraction from iOS 8 devices. Good for Apple; not necessarily for us. Now, companies reply the way Tim Cook did, saying, “If law enforcement wants something, they should go to the user and get it. It’s not for me to do that.” But Cook’s advice is empty—once you go to the user, covert investigation becomes impossible. Criminals, especially the most dangerous and sophisticated ones, are often the relevant users and are generally disinclined to cooperate.
It’s not surprising that tech companies from Apple to WhatsApp have raced to see who can develop the best data security. The publicity surrounding WikiLeaks and Edward Snowden’s whistleblowing, among other incidents, has fueled public mistrust of government surveillance, which has extended to law enforcement agencies. Consumers want more robust privacy protections, and tech companies have responded. In some cases this initiative may spring from genuine concern for user privacy, but much of it seems to be driven by the desire to grab market share from competitors.
Free competition is the cornerstone of our economy, but we rightly regulate socially costly practices. The government must address negative externalities, since industries have little incentive to do so. Regulations can reduce a company’s profits and competitive edge, but they also protect public goods like the environment and national security. It is not unreasonable to impose limits on a company’s ability to create systems so secure that they put our safety at risk by preventing law enforcement from ever accessing those protected data.
Criminals gravitate toward secrecy; thus, encrypted systems have become a salt lick for those who mean us harm. Robert Hannigan, the Director of GCHQ (Britain’s intelligence and security organization), has described US tech companies as “…the command-and-control networks of choice for terrorists and criminals…[even if they]…are in denial about [this].” His statements are not unfounded. Criminal organizations from ISIS to Silk Road have increasingly relied on encrypted technology systems to propagate their messages, capitalizing on the impermeable protection afforded by this extreme privacy.
We have dealt with a similar problem before. In 1994, Congress passed the Communications Assistance for Law Enforcement Act (CALEA), requiring that all digital telephone networks be wiretap-enabled. This proved to be profoundly helpful, even critical, to law enforcement investigations of some of the most serious crimes and criminal organizations in the past twenty years. However, this law is now outdated; we need new regulations to govern new technology.
Twenty years ago, the U.S. Congress struck a privacy balance that protected innocent citizens while allowing law enforcement officials to do their jobs, as long as they obeyed the law and obtained warrants for their investigations. The Fourth Amendment to the U.S. Constitution protects the privacy of its citizens, but it has never completely shielded them from law enforcement. Even our homes, long considered to be our most private domain, can be searched if a judge grants a warrant. Saying we want to protect the content of our smartphones above that of our homes may demonstrate an important shift in privacy values. But has the will of the people really changed? Or have people not yet understood the link between inaccessible data and the flourishing of criminal organizations?
People are rightly concerned about the data that technology enables to be collected and preserved. In some situations, backdoors intended for law enforcement use may become a source of vulnerability that allows criminals to access data, while “going dark” can protect information from criminals and law enforcement officials alike. However, we cannot rely on tech companies to make these tradeoffs appropriately, since they may have little understanding of national security concerns and little incentive to interfere with criminal organizations that don’t threaten them or their consumers. Likewise, policymakers may not fully understand the technological constraints involved in safeguarding data from criminals while making it available to law enforcement. That’s why communication and cooperation between government officials, security experts, and tech companies are essential to managing these complex situations.
The United States tech industry has made great strides in protecting our private data, and there have been instances of cooperation between tech companies and law enforcement. But a dangerous rift is forming between what technology enables and what law enforcement needs. Thus, we must reconcile our protective instincts towards our personal data with our protective instincts towards ourselves and our country. Law enforcement and tech companies like Apple need to work together. Otherwise, we may trade one insecurity for another.
Contact Claire Zabel at email@example.com and Joseph (Joey) Zabel at firstname.lastname@example.org.