“It just really shocked me, because I came into this project thinking that these phones are really protecting user data well,” says Johns Hopkins cryptographer Matthew Green, who oversaw the research. “Now I’ve come out of the project thinking almost nothing is protected as much as it could be. So why do we need a backdoor for law enforcement when the protections that these phones actually offer are so bad?”
Before you delete all your data and throw your phone out the window, though, it's important to understand the types of privacy and security violations the researchers were specifically looking at. When you lock your phone with a passcode, fingerprint lock, or face recognition lock, it encrypts the contents of the device. Even if someone stole your phone and pulled the data off it, they would only see gibberish. Decoding all the data would require a key that only regenerates when you unlock your phone with a passcode, or face or finger recognition. And smartphones today offer multiple layers of these protections and different encryption keys for different levels of sensitive data. Many keys are tied to unlocking the device, but the most sensitive require additional authentication. The operating system and some special hardware are in charge of managing all of those keys and access levels so that, for the most part, you never even have to think about it.
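To make the idea of layered keys concrete, here is a minimal Python sketch of the general pattern the article describes. It is not Apple's actual scheme (the real key management happens in dedicated hardware, and the names below are hypothetical): a slow key derivation mixes your passcode with a device-bound secret, and separate "class" keys for different sensitivity levels are derived from the result.

```python
import hashlib
import hmac

def derive_master_key(passcode: str, hardware_uid: bytes) -> bytes:
    # Slow key derivation (PBKDF2) mixes the passcode with a secret
    # fused into the device, so guesses must run on the device itself.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), hardware_uid, 100_000)

def derive_class_key(master_key: bytes, protection_class: str) -> bytes:
    # Each sensitivity level ("protection class") gets its own key,
    # so different data can be unlocked at different times.
    return hmac.new(master_key, protection_class.encode(), hashlib.sha256).digest()

hardware_uid = b"\x01" * 32  # stand-in for the hardware-bound secret
master = derive_master_key("1234", hardware_uid)
mail_key = derive_class_key(master, "complete-protection")
photos_key = derive_class_key(master, "after-first-unlock")

# A wrong passcode yields an entirely different master key, so every
# class key (and all the data encrypted under it) stays gibberish.
wrong = derive_master_key("0000", hardware_uid)
assert derive_class_key(wrong, "complete-protection") != mail_key
```

The point of the two-step derivation is exactly what the paragraph above describes: many keys hang off the one unlocked by your passcode, while the most sensitive classes can require extra steps.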
With all of that in mind, the researchers assumed it would be extremely difficult for an attacker to unearth any of those keys and unlock some amount of data. But that's not what they found. "On iOS in particular, the infrastructure is in place for this hierarchical encryption that sounds really good," says Maximilian Zinkus, a PhD student at Johns Hopkins who led the analysis of iOS. "But I was definitely surprised to see then how much of it is unused." Zinkus says that the potential is there, but the operating systems don't extend encryption protections as far as they could.
When an iPhone has been off and boots up, all the data is in a state Apple calls “Complete Protection.” The user must unlock the device before anything else can really happen, and the device's privacy protections are very high. You could still be forced to unlock your phone, of course, but existing forensic tools would have a difficult time pulling any readable data off it. Once you've unlocked your phone that first time after reboot, though, a lot of data moves into a different mode—Apple calls it “Protected Until First User Authentication,” but researchers often simply call it “After First Unlock.”
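The two states above can be sketched as a simple state machine. This is an illustrative model, not Apple's implementation (the real key cache lives in protected hardware and memory, and every name here is a hypothetical stand-in): after a reboot no class keys exist in memory, and the first successful unlock decrypts the AFU keys and leaves them cached even after the screen locks again.

```python
from dataclasses import dataclass, field

@dataclass
class Device:
    # Toy model of "Complete Protection" vs. "After First Unlock".
    unlocked_once: bool = False
    keys_in_memory: dict = field(default_factory=dict)

    def boot(self):
        # After a reboot, every key is evicted: Complete Protection.
        self.unlocked_once = False
        self.keys_in_memory.clear()

    def unlock(self, passcode_ok: bool):
        if passcode_ok and not self.unlocked_once:
            # The first unlock decrypts the AFU class keys and, crucially,
            # leaves them cached even once the screen re-locks.
            self.unlocked_once = True
            self.keys_in_memory["after-first-unlock"] = b"decrypted-class-key"

    def readable_classes(self) -> set:
        # What a forensic tool with memory access could hope to reach.
        return set(self.keys_in_memory)

phone = Device()
phone.boot()
assert phone.readable_classes() == set()   # freshly booted: nothing readable
phone.unlock(passcode_ok=True)             # first unlock after reboot
assert "after-first-unlock" in phone.readable_classes()
```

The model captures why the distinction matters: locking the screen again does not return the device to the Complete Protection state; only a reboot does.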
If you think about it, your phone is almost always in the AFU state. You probably don't restart your smartphone for days or weeks at a time, and most people certainly don't power it down after each use. (For most, that would mean hundreds of times a day.) So how effective is AFU security? That's where the researchers started to have concerns.