BankThink

Mobile Security Is at Stake in Apple-FBI Case

I am glad the Department of Justice picked a fight with Apple. Let me explain why.

For a long time, Apple has taken a distinctive stance on how its security choices protect the privacy of user data, one the tech giant has often contrasted with that of its competitors, especially Google. Where Apple's rigid stance on privacy and encryption once seemed to hobble its ability to deliver a seamless user experience across its devices and services, that position has now equipped the company to take a principled stand against the DOJ on behalf of user privacy.

The careful engineering Apple has applied to encryption and device security throughout its device lineup, such as that detailed in the iOS security white paper, rests on a core premise: Apple has sufficiently removed itself from how data on the device is encrypted. Device encryption uses both the user-supplied PIN code and a globally unique device ID that cannot be queried by anyone and never leaves the device. In other words, no third party can compel Apple to decrypt data on the device. That leaves iCloud backups, which Apple encrypts with its own keys, as the only data Apple itself can decrypt. This is bound to change in favor of better privacy and security as Apple pivots slowly from a device-centric company to a services-oriented one.
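
To make that premise concrete, here is a minimal sketch of the general technique: deriving the data-protection key by entangling the user's PIN with a hardware-bound device ID, so that no one without the physical device can reconstruct the key. The names and parameters below are illustrative assumptions, not Apple's actual key-derivation scheme.

```python
import hashlib
import os

# Hypothetical stand-in for a fused hardware ID that software can feed
# into key derivation but can never read out or export.
device_uid = os.urandom(32)

pin = b"123456"  # user-supplied PIN code

# Slow key derivation: the PIN and device ID are entangled, so guessing
# must happen on the device itself, and neither secret alone yields the key.
key = hashlib.pbkdf2_hmac(
    "sha256",
    password=pin,
    salt=device_uid,
    iterations=200_000,
)
```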

Since Apple won't easily be intimidated, the company can mount a stronger challenge than others the DOJ has battled. Take the encrypted email service Lavabit, for example. As has been chronicled, Lavabit went from being told to install surveillance equipment capable of sniffing its traffic, to being told to hand over its encryption keys so that every single email could be read, to ultimately deciding to shut down its service, all in under six weeks. Something tells me this fight, however, won't be so asymmetrical or shrouded in secrecy.

What is being asked of Apple and why should it concern us?

The FBI has asked Apple to provide custom firmware, to be installed on the device in question at either an Apple or an FBI facility, that would: a) remove the rate-limiting restrictions that exist to discourage endless attempts at entering a device PIN code; and b) disable the security measure that automatically wipes the data on the device after 10 incorrect PIN code attempts.
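
For illustration, here is a minimal sketch of the two protections that firmware would strip out: escalating delays between attempts, and a wipe after 10 failures. The delay schedule and the wipe stub are assumptions for the example, not Apple's implementation.

```python
import time

MAX_ATTEMPTS = 10  # the auto-wipe threshold described above
DELAYS = [0, 0, 0, 0, 60, 300, 900, 3600, 3600]  # seconds; hypothetical schedule

failed_attempts = 0

def wipe_device():
    # A real design would destroy the key material, instantly rendering
    # the encrypted data unreadable, rather than erasing all storage.
    raise SystemExit("device wiped")

def try_pin(entered: bytes, correct: bytes) -> bool:
    """One PIN attempt, with rate limiting and the wipe-after-10 policy."""
    global failed_attempts
    time.sleep(DELAYS[min(failed_attempts, len(DELAYS) - 1)])
    if entered == correct:
        failed_attempts = 0
        return True
    failed_attempts += 1
    if failed_attempts >= MAX_ATTEMPTS:
        wipe_device()
    return False
```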

Lacking the private keys Apple uses to sign its firmware, the FBI cannot build such firmware itself. While Apple has stayed silent on the technical feasibility of the request to break into the iPhone 5c, there has been much industry debate over whether such an attempt would work on newer iOS devices. The FBI contends that Apple is not only able to comply but can also ensure the custom firmware it creates is limited to this specific device.
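
That signing requirement is the crux: a device accepts firmware only if its signature verifies against a key the device already trusts. Below is a generic sketch of the pattern using Ed25519 signatures from the Python cryptography library; Apple's actual signing chain is different and not public in this form.

```python
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# The vendor's private key never leaves the vendor; devices ship with
# the corresponding public key baked in.
vendor_key = Ed25519PrivateKey.generate()
device_trusted_key = vendor_key.public_key()

firmware_image = b"...firmware bytes..."
signature = vendor_key.sign(firmware_image)

def install(image: bytes, sig: bytes) -> None:
    try:
        # Raises InvalidSignature if the image was modified or unsigned.
        device_trusted_key.verify(sig, image)
    except InvalidSignature:
        raise RuntimeError("unsigned or modified firmware refused")
    # ...proceed with installation...
```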

Irrespective of whether Apple can limit the firmware's applicability to the device in question, the risk is that the legal precedent may guide future adjudication in matters relating to encryption, data security and privacy.

It could also mean that other tech and security companies face similar pressure, and the outcome may not stay scoped to a specific device, instead producing poorly conceived and badly implemented back doors or master keys. Stuxnet is the perfect example of such an asymmetrical weapon: engineered to sabotage Iran's uranium enrichment program, it escaped its creators' control and spread far beyond its originally intended scope.

Regardless of whether Apple custom-designs firmware specifically for the iPhone 5c in question, the larger concern is that once such firmware exists, it amounts to a back door that cannot be unlearned. If Apple's own private keys are what give such firmware the legitimacy and trust to be installed on a device, what happens if the federal government asks Apple to surrender those keys, as it did with Lavabit under a gag order? Apple is in an unenviable position: any decision the company makes for a single device can constrain the security decisions it is allowed to make for everything it hasn't yet designed.

What does this impact?

Apple has commendably abstracted away the complexity of pervasive, strong security in its devices so that user data remains secure, private and conveniently accessible. By encrypting the data on a device with both the PIN code and the unique device ID, Apple has ensured that no one can break into your device without authorization. You can also back up your data to a computer through iTunes, where the backup can be protected with a user-chosen password, again keeping the data out of reach of malicious or unauthorized access. The iCloud backup remains the third route for either you or law enforcement to access your files: the former using your Apple ID, the latter with a subpoena.

The question then becomes: How will Apple respond in the future, especially in how it designs security into its devices, to counter what it sees as an overreach by the U.S. government?

It has already removed itself from the encryption of data on the device by using a combination of user-supplied PIN codes and unique device IDs that Apple does not know. But will it go further? Will Apple discourage future law enforcement attempts to weaken device security by wiping a device's data if new firmware is forced onto it without the correct PIN code? Will Apple stop using its own encryption keys for iCloud backups and instead bring backups in line with how user data is encrypted in the other two scenarios: a two-factor method built on a user-supplied PIN code? If Apple chooses to implement these changes, they could cost the company the careful balance it has struck between user experience and pervasive, strong security.
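
For a sense of what PIN-keyed iCloud backups could look like in principle, here is a rough sketch in which the backup is encrypted on the device with a key derived from the user's PIN and device ID, so the cloud provider stores only ciphertext it cannot read. This is an assumption-laden illustration, not Apple's design.

```python
import hashlib
import os

from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def encrypt_backup(plaintext: bytes, pin: bytes, device_uid: bytes) -> bytes:
    # The key exists only where the PIN and device ID meet: on the device.
    key = hashlib.pbkdf2_hmac("sha256", pin, device_uid, 200_000)
    nonce = os.urandom(12)
    # The provider receives nonce + ciphertext and holds no decryption key.
    return nonce + AESGCM(key).encrypt(nonce, plaintext, None)
```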

And what if the government or the courts begin to bar companies like Apple from engineering stronger security? Instead of recommending minimum thresholds for system security to protect consumer data and commerce, will we begin to dictate how tall the ceiling can be? For instance, could Apple be stopped from changing how iCloud backups are encrypted, a change that would moot any and all future subpoenas for access to that data? Should judges, who are woefully behind in understanding technology, be drawing the boundaries of cryptography?

San Bernardino won't be the last time we must explain the importance and role of cryptography in securing everything from private messages and pictures to banking and global commerce. However much the DOJ makes this out to be about a single, specific device, outcomes rarely play by such rules.

What is at risk demands strong articulation in a climate of fear, and Apple and Tim Cook deserve all the credit for trying their best. Choices in security affect us all. Cryptography is not a tool for criminals, and math has no agenda.

Cherian Abraham is a mobile security and payments consultant. He writes regularly on topics surrounding fraud, identity and payments on his blog Drop Labs. He can be reached on Twitter @cherian_abraham.
