Apple’s battle with the US government over the unlocking of a terrorist’s iPhone is causing heated debate everywhere. Its outcome may affect how much protection individuals can have over their personal information. Apple is fighting a US Magistrate judge’s court order that would require it to hack the iPhone 5C that belonged to Syed Rizwan Farook, the San Bernardino shooter who, along with his wife, killed 14 and injured 22 at a holiday party. The couple had pledged loyalty to ISIS, the Islamic terrorist group. As the deadliest terrorist attack in the US since 9/11, the stakes are high in the investigation. The FBI is under pressure to better understand the terrorists’ connections and enablers. But it says the iPhone’s encryption and security features are a major stumbling block in the investigation.
In 2014, Apple made a dramatic improvement in iPhone security with the release of iOS 8. For the first time, all the data on the iPhone could be encrypted with the user’s personal passcode. With a strong enough passcode, the encryption would be nearly impossible to break. On top of that, Apple built in a feature that would erase all the data on the phone if the passcode was entered incorrectly 10 times. This effectively prevents a brute force attack, in which passcode guesses are entered in rapid succession.

Why did Apple do this? One can argue that it came in the wake of a couple of well-publicized events. One was the iCloud leak of compromising celebrity photos, due largely to the cloud service’s then relatively lax security model. The other was the Edward Snowden leaks, which revealed just how extensive the NSA’s spying was in the U.S. and around the world. Prior to iOS 8, encryption on smartphones was typically used for email and e-commerce transactions, while the rest of your data (like text messages) might be in the clear. At the time of the iOS 8 release, law enforcement agencies decried the use of full encryption and warned that the iPhone might become the communication tool of choice for criminals and terrorists.

In the San Bernardino case, there’s some irony in that the iPhone 5C was an employer-issued phone, from the county health agency. With iOS 8, Apple has no master key to unlock a phone’s data: if it’s locked by the passcode, it can only be unlocked with that passcode, and the passcode is not stored anywhere.
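The interplay between the passcode check and the wipe-after-ten-failures feature can be illustrated with a short simulation. This is a hypothetical Python sketch of the general lockout pattern described above, not Apple’s actual implementation; the class and names are invented for illustration:

```python
# Hypothetical sketch of a wipe-after-N-failures lockout policy,
# similar in spirit to the iOS 8 feature described above.
# This is NOT Apple's implementation.

class PasscodeLock:
    MAX_ATTEMPTS = 10  # after this many failures, the data is wiped

    def __init__(self, passcode: str):
        self._passcode = passcode
        self._failures = 0
        self.wiped = False

    def try_unlock(self, guess: str) -> bool:
        if self.wiped:
            return False  # data is gone; nothing left to unlock
        if guess == self._passcode:
            self._failures = 0
            return True
        self._failures += 1
        if self._failures >= self.MAX_ATTEMPTS:
            self.wiped = True  # simulate erasing the encryption key
        return False

# A brute force attack on a 4-digit passcode has 10,000 candidates,
# but the wipe triggers after only 10 wrong guesses.
lock = PasscodeLock("7391")
for candidate in (f"{n:04d}" for n in range(10_000)):
    if lock.try_unlock(candidate):
        break
print(lock.wiped)  # True: the attack destroys the data long before it can succeed
```

The point of the sketch is that the attacker’s odds are bounded by the attempt limit, not the passcode space: ten guesses against ten thousand candidates.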
What All The Fuss is About
At the heart of the case is the FBI’s demand that Apple help the agency unlock the contents of the phone. Apple has stated that it has no way of doing that, as the encrypted data can only be unlocked with the personal passcode, and the terrorist Farook is dead. What the court order seeks to compel Apple to do is build a way around the data-wiping feature. In the iPhone 5C model in question, the feature that wipes the data after 10 failed passcode attempts is implemented in the iOS software; interestingly, on the 5S and successor phones, it is enforced in hardware. Essentially, the FBI wants Apple to write a custom version of iOS that disables the feature, which would then allow a brute force passcode attack to break into the iPhone. The method it wants to use is to attach a cable to the phone’s Lightning charging port, through which a computer could feed unlimited passcode attempts to unlock the phone.
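Some rough arithmetic shows why removing the wipe limit matters so much. Once unlimited guesses are possible, only the size of the passcode space and the guess rate stand in the way. The guess rate below is an assumption chosen purely for illustration, not a figure from the case:

```python
# Illustrative brute force arithmetic: how large is the passcode space,
# and how long would exhausting it take at a given guess rate?
# The 12.5 guesses/second rate is an assumption for illustration only.

def worst_case_seconds(keyspace: int, guesses_per_second: float) -> float:
    """Time to try every candidate in the keyspace."""
    return keyspace / guesses_per_second

RATE = 12.5  # assumed guesses per second once software limits are removed

four_digit = 10 ** 4   # 10,000 combinations
six_digit = 10 ** 6    # 1,000,000 combinations
six_alnum = 36 ** 6    # ~2.2 billion (lowercase letters + digits)

for name, space in [("4-digit", four_digit),
                    ("6-digit", six_digit),
                    ("6-char alphanumeric", six_alnum)]:
    hours = worst_case_seconds(space, RATE) / 3600
    print(f"{name}: {space:,} combinations, ~{hours:,.1f} hours worst case")
```

A simple 4-digit passcode falls in well under an hour at this assumed rate, while a longer alphanumeric passcode pushes the worst case out by years, which is why the strength of the user’s passcode matters even if the wipe feature is defeated.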
Apple CEO Tim Cook has said that the technology is something the company doesn’t have and does not want to create, as it would constitute a “back door” into iOS system security. In an open letter that threw the matter into the court of public opinion, Apple takes the position that creating this custom iOS version would compromise years of improvements to the system’s security, and that once created, the technology could fall into the wrong hands and give hackers a back door into millions of individuals’ data. Due to the lack of legislation in this area (technology is typically decades ahead of the law, as we discussed in our last blog post), the government has invoked an obscure law from 1789 called the All Writs Act. The law, as applied in previous modern cases, can compel Apple to help the government in the investigation as long as the help is necessary, directly material to the case, and doesn’t pose an undue burden on the company. Apple has gone further and called the government’s request an “overreach” with “chilling effects”. FBI Director James Comey has countered that the agency only wants the data from this one terrorist’s iPhone, and that it does not seek the technology from Apple, nor for it to be built into every iPhone.
Tech Players in Apple’s Corner
Other major technology companies and leaders have voiced their support for Apple, though not necessarily loudly. Speaking this week at Mobile World Congress in Barcelona, Facebook’s Mark Zuckerberg said, “I don’t think building back doors is the way to go, so we’re pretty sympathetic to Tim and Apple.” Google’s Sundar Pichai tweeted that “forcing companies to enable hacking could compromise users’ privacy”, and Twitter CEO Jack Dorsey said, “We stand with @tim_cook and Apple”. Amazon and Microsoft have not commented publicly, but some have surmised that both companies have significant revenue from business with government agencies and may not want to become directly involved. There’s some irony in Google and Facebook publicly supporting Apple, as they are perhaps the two companies that collect the most information on users, and that information drives their advertising business models. Apple itself may also hold much of that information, but its business model depends on device sales and content services.
Implications For The Future
Apple’s stance on the matter is also in line with what security experts say: that weakening encryption or building back doors weakens security for everyone. But the company may have staked out its position because of the legal precedents the case could set. If it complies with the use of the law in this case, it may be compelled to use the same technique in many other cases, which may not involve national security. Manhattan District Attorney Cyrus Vance Jr. says there are 175 iPhones that the city has been unable to unlock in significant cases ranging from murder to sex crimes. He thinks Apple should be forced to comply with the order, and that ultimately Congress will need to weigh in on privacy versus security. The FBI insists it is making a narrow request, to unlock only Farook’s iPhone. But the law being what it is, other agencies, national and local, will likely seize on the precedent to make related demands in other cases. Apple is a global company, and it may also be warily eyeing potential mandates from other governments to use the same techniques to unlock iPhone data in compliance with local laws and orders. The security Apple has built into the iPhone burnishes its consumer brand and its image with customers; if that security is weakened, it may affect Apple’s differentiation and ultimately its business. Corporations may also be looking at the precedents in this case: do they want back doors that could potentially compromise sensitive information?
It’s not an easy issue. We can all appreciate that law enforcement agencies need to do their job but may be stymied by powerful technology, like strong encryption, that is as easily accessible to criminals as to anyone else. On the other hand, the Patriot Act, and the revelations by Edward Snowden (harmful as they were in many other ways), also show how civil liberties we take for granted have been abridged in recent years in the name of security. Apple may simply be protecting its business interests here, or perhaps it is staking out a position it believes is right regardless, looking to bring the discussion to the public so that the right laws can be enacted. This is a key case, and it certainly won’t be the last.
What do you think about the stance Apple has taken against the FBI’s request? Please comment below to let us know your thoughts.