Apple vs. FBI – a curious case of Privacy, National Security and Public Safety
- In Mathematics, Science & Technology
- 09:46 PM, Mar 12, 2016
- Suresh S Murthy
For the last few weeks, an interesting legal battle between Apple and the FBI has gathered a lot of press. The Department of Justice (DOJ) filed a motion to compel Apple to build software that would help the FBI unlock an iPhone 5c and gain access to the data inside it. On February 16, 2016, a federal judge in the US District Court for the Central District of California ordered the tech giant to provide the FBI with ‘reasonable technical assistance’. The DOJ motion rests primarily on the All Writs Act and seeks to have Apple comply fully by assisting the FBI in accessing the data on the iPhone.
The motion filed by DOJ states the following:
· The court doesn’t want Apple to ‘hack’ or ‘decrypt’ its own phones.
· It doesn’t give the government ‘the power to reach into anyone’s phone’ without a warrant or authorization.
· It doesn’t order Apple to create a ‘back door’ to all phones.
· Apple will retain custody of any software it builds and has flexibility in how it chooses to provide assistance.
The FBI, which has been investigating the San Bernardino terror attack of December 2, 2015, has in its possession an iPhone 5c running iOS 9. The iPhone belonged to one of the accused, Syed Rizwan Farook. The FBI believes the iPhone in question could help it establish links between the perpetrators of this heinous terror attack and their accomplices.
Why does FBI need Apple’s help?
The FBI’s investigation hit a dead end when it was unable to unlock the iPhone and tap into the data on it. A security feature in the iPhone erases all data on the device after 10 failed attempts to unlock it with an incorrect passcode. More about the security features of the iPhone here.
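The auto-erase behaviour described above can be modelled as a simple counter: after a fixed number of wrong guesses, the device destroys its encryption keys and the data becomes unrecoverable. The sketch below is an illustrative Python model under that assumption, not Apple’s actual implementation:

```python
# Illustrative model of the iOS "erase data after 10 failed attempts"
# setting. This is a hypothetical sketch, not Apple's implementation.

class PasscodeLock:
    MAX_ATTEMPTS = 10

    def __init__(self, passcode: str):
        self._passcode = passcode
        self._failed = 0
        self.wiped = False

    def try_unlock(self, guess: str) -> bool:
        if self.wiped:
            return False          # keys destroyed; nothing left to unlock
        if guess == self._passcode:
            self._failed = 0      # correct guess resets the counter
            return True
        self._failed += 1
        if self._failed >= self.MAX_ATTEMPTS:
            self.wiped = True     # data is now unrecoverable
        return False

lock = PasscodeLock("1234")
for _ in range(10):               # ten wrong guesses in a row
    lock.try_unlock("0000")
print(lock.wiped)                 # True
print(lock.try_unlock("1234"))    # False: correct code, but too late
```

This is why the FBI cannot simply guess passcodes at will: one guess too many and the evidence itself is destroyed.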
To make things worse, the Apple ID password associated with the user of this iPhone appears to have been reset after the iPhone was taken into custody. This eliminated the option of having the data on the phone backed up to the user’s iCloud account when the device connected to a previously known network. This pretty much left the FBI with no option but to approach Apple for help.
What’s FBI really asking Apple to do?
The FBI is asking Apple to compromise the security features of iOS and add a new ability to the operating system to attack iPhone encryption, allowing a passcode to be input electronically. This would make it easier to unlock an iPhone by brute force, trying thousands or millions of combinations at the speed of a modern computer. Because the FBI wants to try passcodes as quickly as possible, it also wants Apple to disable the delay between passcode attempts and to allow passcodes to be entered by a computer, either through the iPhone’s Lightning port or wirelessly, a feature that has never existed in a publicly shipping version of iOS.
These requests amount to asking Apple to deliberately compromise the security features of its own product. No such software exists today, and by asking Apple to develop it, the FBI is setting a dangerous precedent.
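To see why electronic input and the removal of inter-attempt delays matter so much, here is a rough back-of-the-envelope calculation in Python. The per-attempt timings are assumed figures for illustration only, not Apple-published numbers:

```python
# Back-of-the-envelope estimate of worst-case brute-force time for a
# numeric passcode. Timings are illustrative assumptions.

def brute_force_time(digits: int, seconds_per_attempt: float) -> float:
    """Worst-case time, in hours, to try every passcode of this length."""
    combinations = 10 ** digits
    return combinations * seconds_per_attempt / 3600.0

# With electronic input and no artificial delay (assume ~80 ms/attempt):
fast = brute_force_time(4, 0.08)
# With escalating lockout delays (assume an average of 1 hour/attempt):
slow = brute_force_time(4, 3600)

print(f"4-digit PIN, 80 ms/attempt: {fast:.2f} hours")
print(f"4-digit PIN, 1 hour/attempt: {slow:.0f} hours (~{slow/24/365:.1f} years)")
```

Under these assumptions, removing the delays turns an effort of roughly a year into one of minutes, which is exactly what the FBI’s request would enable.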
How did Apple respond?
Even though FBI director James Comey wrote in a recent blog post that the litigation isn’t about trying to set a precedent or send any kind of message, that is not how Apple sees it.
Apple responded to the court order legally by filing a motion to vacate it on February 25, 2016. In addition, Apple CEO Tim Cook wrote a letter addressed to customers explaining Apple’s stance on what it considers a dangerous precedent. In his 30-minute interview with ABC’s David Muir here, Tim Cook outlines his position in great detail. Cook described this request from the FBI as the ‘software equivalent of cancer’! It is something Apple claims it has never written, nor ever wants to write.
As Tim Cook says here, “The government suggests this tool could only be used once, on one phone. But that’s simply not true. Once created, the technique could be used over and over again, on any number of devices. In the physical world, it would be the equivalent of a master key, capable of opening hundreds of millions of locks — from restaurants and banks to stores and homes. No reasonable person would find that acceptable.”
The million dollar question – Can Apple hack into its own iPhone?
On all Apple devices running iOS 8 and later, personal data such as photos, messages (including attachments), email, contacts, call history, iTunes content, notes, and reminders is placed under the protection of the passcode. At first glance, Apple hacking into its own iPhone seems really difficult.
However, Will Strafach, CEO of the mobile security firm Sudo Security Group and one of the most widely known and respected iOS hackers in the world, writes here about Apple’s options. In his article, he describes how Apple could help the FBI with what might truly be a ‘one off’ solution.
According to him, “Apple could carry out the order by creating a custom RAM disk signed by the company’s production certificate for the specific ECID of the suspect’s iPhone. This solution would allow Apple to use existing technologies in the firmware file format to grant access to the phone ensuring that there is no possible way the same solution would work on another device.”
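Conceptually, binding a firmware image to a single device works because the signed image embeds that device’s unique chip ID (ECID), and the boot chain refuses any image whose embedded ID does not match the hardware it is running on. The sketch below illustrates that check in Python; the names, structure, and toy verifier are hypothetical, and Apple’s real personalization process is considerably more involved:

```python
# Simplified illustration of ECID-binding: a signed RAM disk embeds the
# unique chip ID of one device, so the same image cannot be replayed on
# any other device. All names here are hypothetical, not Apple's.

from dataclasses import dataclass
from typing import Callable

@dataclass(frozen=True)
class SignedRamdisk:
    ecid: int          # unique chip ID the image was personalized for
    payload: bytes     # the RAM disk contents
    signature: bytes   # assumed to cover both ecid and payload

def boot_allows(image: SignedRamdisk, device_ecid: int,
                verify_signature: Callable[[SignedRamdisk], bool]) -> bool:
    """Accept the image only if the signature is valid AND the embedded
    ECID matches this exact device."""
    if not verify_signature(image):
        return False
    return image.ecid == device_ecid

# Toy verifier that trusts every signature, for demonstration only:
trust_all = lambda image: True

img = SignedRamdisk(ecid=0xABCDEF, payload=b"ramdisk", signature=b"sig")
print(boot_allows(img, 0xABCDEF, trust_all))  # True: the targeted device
print(boot_allows(img, 0x123456, trust_all))  # False: any other device
```

Because the signature covers the ECID, the image cannot be edited to target a different phone without re-signing it with Apple’s production key, which is what makes Strafach’s proposal a plausibly ‘one off’ solution.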
The only issue he cites with the above approach is that, once Apple does it, it will establish two things:
1. That breaking into an iPhone is ‘possible’.
2. That the FBI can use this case as leverage to make more unreasonable demands in future cases.
One can read here some other options the author suggests are possible, but which could cause problems for Apple in foreign countries where it operates, if the governments of those countries make similar demands. Apple surely doesn’t want to go down the path of RIM (maker of BlackBerry phones), which knuckled under pressure from the Indian government and gave security forces access to private instant messages.
Another excellent piece here by Matthew Green, cryptographer and professor at Johns Hopkins University, covers the details of how encryption really works on Apple iPhones and what the way forward might be for Apple. It also answers a common question about why Apple has supported law enforcement in the past in getting data from iPhones, but is refusing to do so this time around.
For those who would like to dive into the technicalities of full disk encryption, please read this brilliant piece here by David Scheutz. Below is a really concise summary from that article.
“The bottom line, the real “too long, didn’t read”:
- Apple does seem to have made it much more difficult for anyone to get at data on a locked phone
- Some of these protections are reduced once you’ve unlocked the phone once
- Many of these protections are aimed at attacks requiring a reboot of the device
- Apple may be able to access the filesystem on a device, but this would require a reboot
- So unless they crack your passcode, they won’t be able to read the protected files.
- Apple may be able to crack a passcode
- But each attempt takes at least 80 ms, and as much as 5 seconds on newer devices
- A strong passcode (6 or more letters or 8 or more numbers) can take years to break
To sum it up in one sentence:
- Use a strong passcode, and power off your device whenever it’s out of your control.”
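The ‘years to break’ figure in the summary above is simple arithmetic: the search space grows exponentially with passcode length and alphabet size, multiplied by the per-attempt cost. A quick sanity check in Python, using the per-attempt timings quoted in the summary:

```python
# Worst-case brute-force time using the per-attempt costs quoted above
# (~80 ms on older hardware, ~5 s on newer devices). Alphabet sizes are
# illustrative assumptions.

SECONDS_PER_YEAR = 365 * 24 * 3600

def worst_case_years(alphabet_size: int, length: int,
                     seconds_per_attempt: float) -> float:
    """Years needed to exhaust every passcode of the given shape."""
    return (alphabet_size ** length) * seconds_per_attempt / SECONDS_PER_YEAR

# 8-digit numeric passcode at 80 ms per attempt: 10^8 combinations
print(f"{worst_case_years(10, 8, 0.08):.2f} years")   # roughly a quarter-year
# 6 lowercase letters at 5 s per attempt: 26^6 combinations
print(f"{worst_case_years(26, 6, 5.0):.1f} years")    # decades
```

The numbers line up with the summary’s advice: on newer hardware, even a modest six-letter passcode pushes an exhaustive search into decades.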
Is there a middle ground?
This case has brought up some very interesting questions which need fresh debate in the public domain, involving experts from multiple fields. Let’s look at some important challenges this case has raised.
1. Can the government legally force a technology company to compromise the security features offered in its products?
2. The device in the San Bernardino terror attack belonged to the employer of the accused. Why did the employer not implement basic measures like mobile device management? Is this a failure on the part of the employer? Is this an issue of creating awareness among employers?
3. As Apple said, people shouldn’t have to choose between ‘security’ and ‘privacy’. There has to be a smarter way to solve this problem and achieve both. Is encryption on the iPhone, in its current form, proving to be a double-edged sword? On one hand, it addresses concerns about security and privacy; on the other, it leaves no room for law enforcement to ‘get into’ devices when they need to investigate.
4. If companies really do have back doors into their products, how can they protect their customers’ critical data by keeping those back doors from the ‘bad guys’? And what exactly is the definition of a ‘bad guy’ anyway? Could a government with knowledge of such back doors start sniffing into the private lives of people?
5. Will compromising security features in products for the purpose of helping law enforcement pose a greater threat to customer data?
For now Apple seems to enjoy huge support from other technology companies like Google, Facebook, Microsoft, Intel, Twitter, LinkedIn, ThoughtWorks, etc. All of these companies have openly expressed support for Apple in this case. With this overwhelming support from its peers, Apple seems to have temporarily won the PR battle with the FBI. However, a case this crucial shouldn’t really be reduced to a PR battle.
One can read here the open letter from Roy Singham, Chairman of ThoughtWorks, expressing his support for Apple. The letter brings to light the significance of Apple’s position and the precedent it would set for governments in other countries where Apple does business. Any policy decision by Apple will have ramifications for all its users worldwide.
There is a dire need for comprehensive legislation that provides guidelines to technology companies on how they can assist law enforcement agencies without having to compromise the security and privacy of their customers’ data. A bipartisan encryption bill has been tabled in Congress for discussion. Hopefully this will help avert such standoffs between technology companies and law enforcement agencies in the future.