Rhizomes

The Most Important Tech Case in a Decade

· Ronen Lahat


In the wake of the Apple-FBI dispute and the prominence of privacy issues in the digital age, I have gathered pieces of relevant articles for those who want to better understand the case that Edward Snowden called on Twitter “the most important tech case in a decade.” Most notable are the official order compelling Apple to assist agents in the search, and Apple’s official letter to customers rejecting the request.

The FBI demands Apple’s technical assistance to accomplish the following:

The FBI wants a Software Image File that can be loaded into the subject device’s RAM without modifying any data, coded by Apple with a unique identifier so that it presumably would not work on any other phone. The phone belongs to Syed Rizwan Farook, who, with his wife, carried out the terrorist attack in San Bernardino, Calif., that left 14 dead and 22 wounded.

“Building a version of iOS that bypasses security in this way would undeniably create a backdoor”

Apple writes, “the FBI wants us to make a new version of the iPhone operating system, circumventing several important security features, and install it on an iPhone recovered during the investigation. In the wrong hands, this software — which does not exist today — would have the potential to unlock any iPhone in someone’s physical possession. The FBI may use different words to describe this tool, but make no mistake: Building a version of iOS that bypasses security in this way would undeniably create a backdoor. And while the government may argue that its use would be limited to this case, there is no way to guarantee such control. (…) Doing so would hurt only the well-meaning and law-abiding citizens who rely on companies like Apple to protect their data. Criminals and bad actors will still encrypt, using tools that are readily available to them.”

Apple has cooperated in the past to unlock dozens of phones in other cases. According to prosecutors, Apple has unlocked phones for authorities at least 70 times since 2008. (Apple doesn’t dispute this figure.) And the Justice Department is demanding Apple’s help in unlocking at least nine more iPhones. Since Edward J. Snowden’s allegations of mass surveillance by the NSA, Apple announced with the release of iOS 8 in September 2014 that it “will not perform iOS data extractions in response to government search warrants.” The phone used by the deceased San Bernardino shooter was an iPhone 5C running version 9 of the operating system.

“We have even put that data out of our own reach, because we believe the contents of your iPhone are none of our business.”

But it turns out Apple overlooked a loophole in doing this, which the government is now trying to exploit. The loophole is not about Apple unlocking the phone but about making it easier for the FBI to attempt to unlock it on its own. Rather than ask Apple to unlock the phone, the FBI wants Apple to develop a way to “brute-force” the passcode (guess until it finds a match). Currently, 10 wrong passcode attempts trigger a mechanism that deletes the key that decrypts the data, making it inaccessible forever. (Even then, if Farook chose a complex six-character alphanumeric passcode, Apple said that brute-force attempts could take five and a half years or more to crack, if ever.) Apple itself cannot deactivate this feature. “We have even put that data out of our own reach, because we believe the contents of your iPhone are none of our business.” Apple would have to build a new version of its iOS smartphone software that allows the F.B.I. to bypass these restrictions.
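Apple’s five-and-a-half-year figure is easy to sanity-check. Assuming (as was widely reported at the time) a hardware-enforced delay of roughly 80 milliseconds per passcode attempt, a six-character passcode drawn from lowercase letters and digits gives:

```python
# Back-of-the-envelope brute-force estimate. The ~80 ms per-attempt
# hardware delay is an assumption based on contemporary reporting.
ALPHABET = 36          # 26 lowercase letters + 10 digits
LENGTH = 6             # six-character passcode
DELAY_S = 0.08         # ~80 ms per attempt (assumed)

attempts = ALPHABET ** LENGTH                 # total combinations
worst_case_seconds = attempts * DELAY_S
years = worst_case_seconds / (365 * 24 * 3600)

print(f"{attempts:,} combinations")           # 2,176,782,336 combinations
print(f"worst case: {years:.1f} years")       # worst case: 5.5 years
```

The arithmetic lands almost exactly on the figure Apple cited, which suggests this is roughly the model behind it.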

Tim Cook said that requiring Apple to create software to bypass this feature would set a dangerous precedent. In a note on its website, Apple said law enforcement agencies nationwide “have hundreds of iPhones they want Apple to unlock if the F.B.I. wins this case.” The Chinese government and the European Union have made similar demands, so the implications are international in scope. Furthermore, once you open a door for somebody, you unwittingly grant access to anybody clever enough to find it, including malicious hackers and repressive governments.

The F.B.I. cannot build the software itself. The iPhone is designed to run only iOS software created by Apple. For the phone to recognize that the software was made by Apple, the company must cryptographically sign each piece so the device can verify it. Even if the F.B.I. tried to build a new version of iOS, it would not carry Apple’s signature.
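The signing check can be sketched in a few lines. Real iOS uses asymmetric (public-key) signatures rooted in Apple’s certificate chain; as a deliberately simplified stand-in, the sketch below uses a symmetric HMAC from Python’s standard library, where only the holder of a hypothetical “Apple signing secret” can produce a tag the device will accept:

```python
import hmac
import hashlib

# Hypothetical secret standing in for Apple's private signing key.
# In reality this material never leaves Apple, and the scheme is
# public-key, not HMAC; the point illustrated is the same.
APPLE_SECRET = b"apple-private-signing-key"

def sign(firmware: bytes, secret: bytes) -> bytes:
    """Produce a tag over the firmware image using the signing secret."""
    return hmac.new(secret, firmware, hashlib.sha256).digest()

def device_accepts(firmware: bytes, tag: bytes) -> bool:
    """The device recomputes the tag and compares in constant time."""
    expected = sign(firmware, APPLE_SECRET)
    return hmac.compare_digest(expected, tag)

# An Apple-signed image verifies:
official = b"iOS 9 firmware image"
assert device_accepts(official, sign(official, APPLE_SECRET))

# An image signed with any other key is rejected:
forged = b"third-party firmware image"
assert not device_accepts(forged, sign(forged, b"not-apples-key"))
```

This is why the court order targets Apple specifically: without the signing key, no one else can produce an image the phone will boot.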

Authorities already have the power to get data stored on online services like iCloud and Google’s Gmail through search warrants, and they can get records of phone calls and text messages from companies like Verizon and AT&T. Before the issue was publicized, Apple gave the FBI the iPhone backup stored on iCloud, though the backups stopped on October 19, a month and a half before the attack. “When the FBI has requested data that’s in our possession, we have provided it. Apple complies with valid subpoenas and search warrants, as we have in the San Bernardino case. We have also made Apple engineers available to advise the FBI, and we’ve offered our best ideas on a number of investigative options at their disposal.” Apple guided the FBI toward having the phone sync automatically to iCloud over Farook’s home Wi-Fi through his paired devices, but the iPhone’s owner (the county, since Farook was a county employee) had already tried to gain access by resetting the Apple ID password on the device, which eliminated the possibility of an automatic backup to iCloud.

“Compromising the security of our personal information can ultimately put our personal safety at risk.”

Apple wrote in their letter to customers: “Smartphones, led by iPhone, have become an essential part of our lives. People use them to store an incredible amount of personal information, from our private conversations to our photos, our music, our notes, our calendars and contacts, our financial information and health data, even where we have been and where we are going. All that information needs to be protected from hackers and criminals who want to access it, steal it, and use it without our knowledge or permission. (…) Compromising the security of our personal information can ultimately put our personal safety at risk.”

It will give law enforcement agencies the power to put demands on technology companies to send their users malicious code that is signed to look as if it’s coming from Apple, Google or Microsoft as a regular product update.

This case is not about this one phone, and in fact it’s not just about phones. Given the way US courts work, if a US court rules that the government can compel Apple to write malicious code to run on its devices in a criminal matter, it can do so in every kind of criminal investigation. That is not a matter of debate, since law enforcement agents acknowledge that they are seeking this precedent. If the FBI succeeds, it will give law enforcement agencies the power to demand that technology companies send their users malicious code, signed to look as if it is coming from Apple, Google or Microsoft as a regular product update. This applies to all devices, even devices that have not yet been invented. The effect would harm not only privacy but the actual physical security of millions of people around the globe. It would also harm technology companies by putting them at a disadvantage, enabling foreign competitors to create secure products that companies, manufacturers and ordinary people would prefer to use.

This case is not just about the backdoors that would arise, but also about the legal precedent for “frontdoors”: government agencies ordering software updates that downgrade the privacy and security of devices.

I believe it is very difficult to find the balance between security and privacy. In a real-world analogy: could we, under a federal warrant, entrust someone with a master key to one apartment, knowing it could be tweaked to open the doors of every home? Yet no physical door is unbreachable, and no terrorist can safely hide behind one once a warrant has been granted. In the age of constant tracking and exposure, Apple’s engineers have made a supposedly unbreachable device, and that too can be dangerous. This case is not just about the backdoors that would arise, but also about the legal precedent for “frontdoors”: government agencies ordering software updates that downgrade the privacy and security of devices.

References:

In the Matter of the Search of an Apple iPhone Seized during the Execution of a Search Warrant on a Black Lexus IS300. Courtroom of the Hon. Sheri Pym, 16 Feb. 2016.

“Customer Letter.” Apple, 16 Feb. 2016.

In the Matter of the Search of an Apple iPhone Seized during the Execution of a Search Warrant on a Black Lexus IS300. Courtroom of the Hon. Sheri Pym, 19 Feb. 2016. Wired.com, Conde Nast Digital, 19 Feb. 2016.

“Fight over Gunman’s Locked iPhone Could Have Big Impact.” AP, 17 Feb. 2016.

“Apple Says the Government Bungled Its Chance to Get That iPhone’s Data.” Wired.com, Conde Nast Digital, 19 Feb. 2016.

“Apple’s FBI Battle Is Complicated. Here’s What’s Really Going On.” Wired.com, Conde Nast Digital, 18 Feb. 2016.

Tucker, Eric. “Apple: Congress, Not Courts, Must Decide.” The New York Times, 24 Feb. 2016.

“Why Apple Is Right to Challenge an Order to Help the F.B.I.” The New York Times, 18 Feb. 2016.

Bratton, William J., and John J. Miller. “Seeking iPhone Data, Through the Front Door.” The New York Times, 22 Feb. 2016.

Barrett, Brian. “Tim Cook Says Apple Will Fight Order to Help Unlock iPhone.” Wired.com, Conde Nast Digital, 17 Feb. 2016.

Isaac, Mike. “Explaining Apple’s Fight With the F.B.I.” The New York Times, 17 Feb. 2016.

Crockford, Kade. “Apple vs. the State.” BBC Newsnight, 25 Feb. 2016.


The Most Important Tech Case in a Decade was originally published on LinkedIn.

#Privacy #Apple #FBI #Encryption #Security