Apple shot down a request from U.S. Attorney General William Barr this week, saying it will not help unlock two iPhones used by a terrorist suspect last month in the deadly shooting at the Naval Air Station in Pensacola, Florida.
Barr said the shooter, 21-year-old Mohammed Saeed Alshamrani, acted alone when he shot and killed three service members and wounded several others, including two sheriff's deputies responding to the attack.
Alshamrani, a member of the Saudi Air Force and an aviation student at the base, was shot dead on the scene by police.
The back-and-forth between Apple and the Department of Justice is the latest scuffle involving the company's privacy stance and government efforts to get around that stance.
Beyond the immediate dispute over the current investigation, the standoff has implications for the safety of corporate data on personal devices.
If the government succeeds in forcing Apple to subvert iPhone security, corporate IT managers will be put in a sticky situation, said Alan Butler, general counsel for the Electronic Privacy Information Center (EPIC).
That's because most employees either use smartphones under "bring your own device" (BYOD) policies or rely on company devices to conduct business and transfer sensitive information, whether it’s communications or data schematics.
"That's a lot of sensitive information that may be privileged, that may be trade secrets or covered by [International Traffic in Arms Regulations (ITAR)] – so [it's] things you have a legal obligation to protect," Butler said.
"So companies also need assurances that the hardware they’re deploying is secure. If the government is ordering the company to introduce flaws into the security of the hardware or software ... it could compromise that corporate data."
In a statement posted on Twitter, Apple disputed Barr's claim that it hasn’t given “substantive assistance” in the investigation, noting it provided access to the cloud service used to back up data on Alshamrani’s phones.
“Our responses to their many requests since the attack have been timely, thorough and are ongoing,” Apple said.
“Within hours of the FBI's first request on December 6th, we produced a wide variety of information associated with the investigation. From December 7th through the 14th, we received six additional legal requests and in response provided information including iCloud backups, account information and transactional data for multiple accounts.”
Barr, however, apparently wants a more permanent method of access, known as a “backdoor,” to be installed in iOS software to allow law enforcement access to encrypted devices in the future.
“This situation perfectly illustrates why it is critical that the public be able to get access to digital evidence,” Barr said during a news conference Monday. He called on Apple and other tech firms to find a permanent solution to help in this and future investigations.
Reiterating its past stance, Apple said: "We have always maintained there is no such thing as a backdoor just for the good guys. Backdoors can also be exploited by those who threaten our national security and the data security of our customers.
"Today, law enforcement has access to more data than ever before in history, so Americans do not have to choose between weakening encryption and solving investigations. We feel strongly encryption is vital to protecting our country and our users' data."
Kurt Opsahl, deputy executive director of the non-profit digital rights advocacy group Electronic Frontier Foundation, said Apple is right to provide strong security to its users, requiring a passcode or biometrics to unlock their smartphones.
“The Attorney General’s request that Apple re-engineer its phones to break that security imperils millions of innocent Americans and others around the globe, and is a poor trade-off for security policy,” Opsahl said.
Vladimir Katalov, CEO of Russian forensic tech provider ElcomSoft, called Barr's request unrealistic because Apple can’t “technically” unlock iPhones, thanks to file-based encryption and the Secure Enclave, a coprocessor that boots separately from iOS and runs its own microkernel that the iPhone's operating system cannot directly access.
“Of course, it is technically possible to add backdoors, implement escrow keys and things like that. But first, one cannot legally regulate the technology - secure communication channels and [secure data storage] will still remain… and no government can force any person not to use the encryption, or use only a ‘certified’ one,” Katalov said via email. “Second, such backdoors will be exploited by criminals now or later. The consequences may be catastrophic.”
By including a method to unlock smartphones at will, Apple would be opening up all of its smartphones to potential attacks by any bad actor, Katalov said.
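Katalov's point about why the device itself is the bottleneck can be illustrated with a toy key-derivation sketch. This is not Apple's actual scheme: `DEVICE_UID`, the iteration count, and the use of PBKDF2 here are stand-ins for the enclave's hardware-entangled derivation, chosen only to show why copying a phone's encrypted storage doesn't let an attacker guess passcodes offline.

```python
# Illustrative only -- NOT Apple's real scheme. The idea: the encryption
# key is derived from the passcode *and* a secret fused into the device's
# hardware, so key derivation can only happen on the device itself.
import hashlib
import os

# Stand-in for the enclave's fused, non-extractable unique ID.
DEVICE_UID = os.urandom(32)

def derive_key(passcode: str) -> bytes:
    """Derive a 32-byte key from the passcode entangled with the device secret.

    Because DEVICE_UID never leaves the hardware, an attacker who images
    the encrypted storage cannot iterate passcodes on their own machines.
    """
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), DEVICE_UID, 100_000)

# Same passcode, same device -> same key; any other passcode -> different key.
assert derive_key("1234") == derive_key("1234")
assert derive_key("1234") != derive_key("1235")
```

The deliberately slow derivation (many PBKDF2 iterations here; hardware-enforced delays and attempt limits on a real iPhone) is what makes even on-device guessing expensive.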
The latest request by the Justice Department is part of an ongoing struggle between law enforcement and Apple.
In 2016, the Justice Department, backed by a federal court order, directed Apple to unlock the iPhone of Syed Rizwan Farook, a suspect in the December 2015 terrorist attack in San Bernardino, California.
At the time, Apple CEO Tim Cook said his company couldn’t give the FBI any more help, called the use of the law underlying the court’s order “unprecedented,” and again refused to help unlock the iPhone.
In 2018, two companies claimed they could unlock any iPhone using blackbox technology, which regional law enforcement officials acquired and accessed through contracts with Immigration and Customs Enforcement (ICE) and the U.S. Secret Service.
Atlanta-based Grayshift and Israel-based Cellebrite claimed they could thwart iPhone passcode security through brute-force attacks and full file-system extraction on any iOS device, as well as physical or full file-system (file-based encryption) extraction on many high-end Android devices.
Cellebrite’s UFED Cloud Analyzer tool can purportedly unlock, decrypt and extract phone data, including “real-time mobile data … call logs, contacts, calendar, SMS, MMS, media files, apps data, chats, passwords,” according to a document obtained by a Freedom of Information Act request filed by EPIC.
Grayshift’s GrayKey blackbox could apparently unlock an iPhone in about two hours if the owner used a four-digit passcode and in about three days or longer if a six-digit passcode was used.
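The gap between those two figures follows directly from the size of the passcode space. Here is a back-of-the-envelope sketch; the fixed attempt rate is an assumption calibrated to the reported two-hour figure for four digits, not a documented GrayKey specification.

```python
# Rough brute-force timing for numeric passcodes.
# ASSUMPTION: a constant attempt rate, chosen so that exhausting all
# 10,000 four-digit codes takes the ~2 hours reported for GrayKey.
FOUR_DIGIT_SPACE = 10 ** 4
SECONDS_FOR_FOUR_DIGITS = 2 * 60 * 60  # ~2 hours (reported)

rate = FOUR_DIGIT_SPACE / SECONDS_FOR_FOUR_DIGITS  # attempts per second

def worst_case_seconds(digits: int) -> float:
    """Time to try every code of the given length at the assumed rate."""
    return (10 ** digits) / rate

for digits in (4, 6):
    days = worst_case_seconds(digits) / 86400
    print(f"{digits}-digit passcode: worst case about {days:.1f} days")
```

At this assumed rate, six digits take roughly eight days in the worst case and around four on average, consistent with the “three days or longer” figure, since each extra digit multiplies the search space by ten.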
Apple later announced it had found a way to block Grayshift’s GrayKey iPhone hacking tool.