Going Dark: Who Is Our Enemy?

If you leave your backdoor open, you’re letting anyone in. Now, the United States government is asking technology companies to build it a backdoor into American communication technologies. Such a door would allow the government to bypass the security of American devices and access all of our data. With the click of a button, our messages and phone calls would be in the hands of the government. And given the track record of U.S. government agencies, American citizens wouldn’t just be opening their homes to the U.S. government but to any bad actor. This past year, American government agencies ranked worst in cybersecurity when compared with 17 major private industries such as transportation and healthcare. From April 2015 to April 2016, there were 35 major government data breaches, including the July 2015 Office of Personnel Management hack, which affected 21.5 million people, and the February 2016 Internal Revenue Service hack.

To the government, a backdoor is the fail-safe against going dark. “Going dark” once referred to radio silence in the intelligence community. Today, the FBI uses the term to describe law enforcement’s waning technical ability to “intercept and access communications and information pursuant to court orders” as technologies shift. In a press release, FBI Director James Comey emphasized the threat of going dark: “Armed with lawful authority, we increasingly find ourselves simply unable to do that which the courts have authorized us to do, and that is to collect information being transmitted by terrorists, by criminals, by pedophiles, by bad people of all sorts.”

But, we’re not going dark. We’re far from it.

The Wrong Case

It’s over. A deep sigh of relief resounded from iPhone users on March 28, 2016, when the Justice Department dropped its suit against Apple. But is it over? In the wake of the terrorist attack in San Bernardino, California, the Federal Bureau of Investigation was determined to unlock the iPhone used by gunman Syed Rizwan Farook but lacked the technical capability to do so without Apple’s assistance. When Apple refused to comply with the FBI’s requests, the FBI took to the courts to force compliance.

This, unfortunately, was the wrong case for the federal government to use to champion the Going Dark issue. For one, the public, high-profile dispute generated bad press for both Apple and the FBI: to some, Apple appeared unwilling to aid American national security objectives; to others, the FBI looked like a relentless enabler of the surveillance state. In an interview with the HPR, Martin Hellman, a Stanford professor and co-inventor of public-key cryptography, said, “We need to sit down. Stop fighting this out in the media. We are all here to do what’s best for the country. Let’s look at the trade-offs. It’s not like the government is wrong in wanting access. We have to look at the cost of getting information and the precedent of doing so.” In this case, asking a company that has sold 700 million mobile devices to change its software infrastructure seemed unreasonable.

It soon became clear that not all means of obtaining the information had been exhausted. The suit against Apple was dropped because the FBI found an alternative way to get the data off the phone; forcing Apple to change its operating system was never strictly necessary. Nor was there any real urgency to recover the data: law enforcement agencies already knew who had perpetrated the attack in San Bernardino.

The public was divided on the Apple v. FBI case. According to a Pew Research Center poll, 51 percent of the public said that Apple should unlock the iPhone to assist the ongoing FBI investigation, 38 percent said that Apple should not unlock the phone in order to protect the security of its other users’ information, and 11 percent did not know. Slight majorities of both Republicans and Democrats supported the Justice Department’s suit. The battle between Apple and the federal government may have lasted only 43 days, but the war has been going on for the past two decades.

Fighting a Crypto War

The tussle between Apple and the FBI isn’t America’s first Crypto War. (The phrase “Crypto War” describes the ongoing struggle between the United States government and strong encryption technologies.) The first backdoor to a communication technology introduced by the federal government was the Clipper Chip in 1993, a cryptographic chipset intended to be installed in every secure phone in the United States. Proponents of the chip believed that the new encryption methods taking hold in phone technology fostered an environment in which criminals could thrive, because secure communication would allow them to evade law enforcement. The public pushed back hard against the Clipper Chip, which would have let the federal government surveil any phone call. According to the New York Times in 1994, the Clipper Chip was seen as a “Big Brother” enabling a “cyberspace police state.” The chip was declared defunct by 1996.

During the Crypto Wars of the 1990s, many studies assessed what would happen if the Clipper Chip was not adopted. According to the Open Technology Institute, many of the predictions made at the time proved accurate: strong encryption would “benefit the economy, strengthen Internet security, and protect civil liberties.” Without the Clipper Chip, strong encryption permeated the marketplace of new technologies. Former Secretary of Homeland Security Michael Chertoff told the HPR that “About 20 years ago, there was a similar fear around telephone communications and going dark. Ultimately, the government was able to find a way around it.” Even in this environment of strong encryption, the government didn’t go dark. Instead, it found ways to surveil and to protect the interests of national security. History often repeats itself, and from the telephone to the iPhone, this fear of Going Dark is recurring.

How Dark?

Surveillance has become an integral part of citizens’ lives, whether we like it or not. To some extent, all communications companies receive some form of payment from the National Security Agency in return for access. A leaked 2013 inspector general’s report revealed that the NSA pays AT&T, Verizon, and Sprint several hundred million dollars annually for access to 81 percent of international phone calls made into the United States. The telecommunications companies have been known to cooperate fully with the government, selling data from their users. Reform Government Surveillance, an advocacy group for technology companies that aims to limit government surveillance, has pushed back against the government’s authority to collect user information. RGS companies include Apple, Dropbox, Facebook, Google, Microsoft, LinkedIn, and others. Although RGS companies have complied with many federal requests, much as the telecommunications companies have, the group has refused to build so-called “backdoors” into their operating systems for the government to surveil their users. A backdoor is a built-in security vulnerability: although it is intended only for the government’s use, it could allow any hacker access to these systems. A deep fissure has emerged between the companies that fully cooperate with the federal government and those that don’t.
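To see why security experts describe any backdoor as a built-in vulnerability, consider a minimal, purely illustrative sketch in Python. The “escrow” key and the scheme below are invented for illustration and do not reflect any real company’s design: the system simply encrypts each message twice, once for the recipient and once for a government-held key, so a single stolen key unlocks every message ever sent.

```python
# Purely illustrative sketch of a backdoored messaging scheme -- not any real
# system. Requires the third-party "cryptography" package (pip install cryptography).
from cryptography.fernet import Fernet

ESCROW_KEY = Fernet.generate_key()  # the hypothetical government backdoor key

def send(message: bytes, recipient_key: bytes) -> dict:
    """Encrypt the same message for the recipient and for the escrow holder."""
    return {
        "for_recipient": Fernet(recipient_key).encrypt(message),
        "for_escrow": Fernet(ESCROW_KEY).encrypt(message),
    }

alice_key = Fernet.generate_key()
packet = send(b"meet at noon", alice_key)

# The intended recipient can read the message...
assert Fernet(alice_key).decrypt(packet["for_recipient"]) == b"meet at noon"

# ...but so can anyone who obtains the escrow key: an agency with a court
# order, a rogue insider, or a hacker who breaches whoever holds it.
assert Fernet(ESCROW_KEY).decrypt(packet["for_escrow"]) == b"meet at noon"
```

The design choice that makes the backdoor convenient, one key that opens everything, is exactly what makes it catastrophic if that key ever leaks.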

With the data from the companies that do cooperate, there is enough information for law enforcement to pursue its investigations. Metadata, the information that describes data, can provide a great deal of context about a communication without revealing its content. Secretary Chertoff points out the value of metadata, stating, “In general, if you are looking at the most important ways to get information, it is not by breaking encrypted communications. By the time that you run across the encrypted, you have the sender and recipient. The metadata, who calls who and what people’s travel patterns are, gives you a comprehensive picture of the haystack.” In other words, the United States government doesn’t need to listen in on a phone call or read an email as long as it has data that describes those interactions. The phone number of the recipient or the duration of the call can provide useful information to surveillance agencies. Former CIA and NSA Director Michael Hayden has testified to the power of metadata, famously saying in 2014, “We kill people based on metadata.”
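As a purely illustrative sketch (the phone numbers and records below are invented), a few lines of Python over nothing but call metadata can already surface a person’s dominant contact and a recurring late-night calling pattern, with no access to what was said:

```python
from collections import Counter
from datetime import datetime

# Hypothetical call-detail records: caller, recipient, start time, duration (seconds).
# Only metadata is stored here -- no call content at all.
calls = [
    ("+1-555-0100", "+1-555-0199", "2016-03-01 23:40", 380),
    ("+1-555-0100", "+1-555-0199", "2016-03-02 23:55", 412),
    ("+1-555-0100", "+1-555-0142", "2016-03-02 09:15", 35),
    ("+1-555-0100", "+1-555-0199", "2016-03-03 23:48", 365),
]

# Who does this number talk to most often?
contacts = Counter(recipient for _, recipient, _, _ in calls)

# When do the calls happen? Flag anything after 10 p.m.
late_night = [
    (recipient, start) for _, recipient, start, _ in calls
    if datetime.strptime(start, "%Y-%m-%d %H:%M").hour >= 22
]

print(contacts.most_common(1))  # [('+1-555-0199', 3)] -- the dominant contact
print(late_night)               # three late-night calls to the same number
```

Scaled from four records to billions, this same kind of aggregation is what lets analysts map social networks and travel patterns without ever breaking encryption.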

The most critical element stoking the fear of Going Dark is the absence of the technical ability to “stay in the loop” with new information technologies. The shortfall of technical talent at the FBI and other government agencies is due to the flight of that talent to the private sector. According to Code.org, a nonprofit focused on expanding access to computer science, there are currently 607,708 open computing jobs nationwide. In 2015, only 42,969 computer science students graduated into the workforce, adding to the existing shortage of graduates who can fill computing jobs. Furthermore, these new graduates are drawn more to the private sector than to the public sector because of higher salaries. In an interview with the HPR, Susan Landau, a professor of cybersecurity policy at Worcester Polytechnic Institute, posited that “Law enforcement needs to develop better skills in this area. It has not kept pace with the remarkable advancements of the current technological age. We are using 20th century tools and thinking to solve our 21st century problems.” In her most recent testimony before the House Judiciary Committee, she recommended that the FBI develop its own expertise in modern computing research.

Lawful Hacking

In the meantime, the law is being loosely interpreted to fuel both sides of the debate. Academics point to the Communications Assistance for Law Enforcement Act (CALEA), a U.S. wiretapping law passed in 1994, in their arguments against backdoors. Harvard Law Professor Susan Crawford wrote on Medium, “The law is clear. FBI cannot make Apple rewrite its [operating system].” Crawford was referring to Section 1002 of CALEA, which specifies that the federal government cannot “require any specific design of equipment, facilities, services, features or system configurations” from phone manufacturers. However, the FBI contends that CALEA applies only to traditional telecommunications carriers, providers of interconnected Voice over Internet Protocol services, and providers of broadband access services; no provision in CALEA protects technology companies that specialize in telecommunications hardware. Instead, the FBI maintains that the 227-year-old All Writs Act, a federal statute, gives it sufficient legal authority to do what is “necessary or appropriate in aid of their respective jurisdictions and agreeable to the usages and principles of law.” Both pieces of legislation are outdated for current technological standards.

Since the Justice Department dropped its suit against Apple, Apple has not been forced to rewrite its operating system. Instead, a new discussion on lawful hacking has taken root. Lawful hacking is the lawful exploitation of existing vulnerabilities in end-user software and platforms, and it is becoming an increasingly viable option as centralized wiretapping becomes less feasible. This method is rumored to have been used in the San Bernardino case; NBC and other news sources report that an Israeli firm unlocked the iPhone and sold its hack to the FBI. Professor Hellman has cited such outside expertise as necessary: “In the case of the FBI, unless they develop a lot of in-house talent, they will probably need to rely on external talent.” Technology companies like Google, Facebook, and Tesla already hire hackers to probe their own technology. Like bounty hunters, these hackers are paid to find zero-day bugs: security flaws unknown to the software’s maker, which can be exploited before they are detected and patched. In the Going Dark debate, lawful hacking is our night light.

According to cybersecurity analyst Marshall Erwin, relying on lawful hacking raises significant concerns. First, formally employing lawful hacking entails an inherent lack of transparency in law enforcement activities: the ability to hack into different systems would never be documented in company transparency reports. Furthermore, that opacity opens the door to abuse, since technology companies would have fewer checks on the data that law enforcement agencies are able to obtain. To institute a successful framework for lawful hacking, a more nuanced system of compliance must be established.

At this point, with enough metadata to fill a couple hundred data warehouses, surveillance agencies are no longer asking “Who is our enemy?” but rather, “What are they saying?” They’re not completely in the dark, yet.


Image Source: Wikimedia
