Is Apple too secure?
The iPhone maker wants to protect its products from prying eyes, even its own
The public discussion surrounding the FBI vs. Apple fight over iPhone encryption seems to be missing some relevant information. Most of what I have seen and read centers on privacy, with some commentators expressing concern about Apple providing law enforcement with “backdoor” access to people’s data. That concern seems rooted largely in the NSA spying revelations of several years ago.
As someone who works in a digital crime lab for law enforcement, I can attest: There is no way to get around a fully encrypted iPhone other than with the passcode. A forensic examination can recover some data through other avenues (iCloud backups, for example), but that information is sparse compared with what is stored on the phone itself, and Apple is rumored to be moving to block even those avenues in the name of privacy.
I agree that privacy is important to maintain, but I believe trust in law enforcement is important as well.
Each month, the lab I work in is forced to reject several requests for digital analysis because we cannot get past the encryption. In most cases, these investigations involve rape, child pornography and sexual assault. In each of these cases, legal search authority, signed off on by a judge, has been provided to examine the phone. When a request is rejected because of an encrypted phone, the investigation usually ends for lack of evidence. That means each time, a federal agent or police officer has to go to the victim and say, “I’m sorry. We understand something terrible happened to you, but the suspect didn’t provide the passcode to their phone, and we have no further evidence. There’s nothing more we can do to help you.” When that happens, trust in the legal system erodes. I can’t imagine being a victim (or the parent of a victim) of an awful crime and having the police respond with “There’s nothing more we can do.”
In a perfect world, we wouldn’t have to choose between privacy and security. In a perfect world, we also wouldn’t have to choose between buying expensive things we want and paying our bills. Unfortunately, most of us have to make choices about what matters more. The same goes for the reality of encryption. Do you want privacy, or do you want the police to be able to help you if a crime happens to you? Currently, only a small percentage of victims hear “there’s nothing more we can do.” If Apple prevails, and its style of encryption is adopted more widely, many more victims will.
Apple officials have been clear about their position. They want to encrypt the user’s data to the point that even Apple cannot access it. To put it in practical terms, imagine a car with the world’s best locks. What happens when the owner forgets the keys (or locks them inside)? How do you get into the car you own? What happens when police have legitimate reason to believe evidence of a crime exists inside the car? Nobody, not even the manufacturer, can open it. That is not a good thing.
Apple has raised some good points about the need to maintain privacy. It has also raised a fair point that if a vulnerability is created, it may be exploited by others. However, Apple neglects to mention that no single person on the planet (not even Tim Cook, Apple’s CEO) keeps all of their data on Apple devices. That means every person’s data can be compromised somewhere, regardless of what Apple does. Apple also has no control over how people use its products. It isn’t Apple’s fault if people use those products to commit crimes. It is Apple’s fault, though, if it chooses to do nothing to assist law enforcement in protecting the victims of those crimes.