The announcement by Apple (and later Google’s Android) that the contents of users’ phones would be encrypted by default was met with predictable outrage by some representatives of the law enforcement community.  This outrage is fundamentally misplaced, and shows just how “out of the loop” the law enforcement and intelligence communities are when it comes to privacy and security.

It reflects the position that privacy and security are only for those with something to hide, and that if you want privacy, you must be a criminal.  This attitude fundamentally inhibits and restricts new privacy-enhancing and security-enhancing technologies.

Entities are forced to make their products or services vulnerable to hacking (as if they weren’t already) at great cost to themselves and to their users.  Security companies lose credibility in international markets, and law enforcement gains only a tiny incremental capability, and that only with respect to bad guys who are sloppy, lazy or uneducated.  We all become more vulnerable.  A lot.

Let’s start with an admission.  The threat is real.  Bad guys use the Internet.  They use cell phones, e-mail, and TOR.  The 2014 iOCTA (Internet Organised Crime Threat Assessment), published by Europol’s European Cybercrime Centre (EC3), notes that “Cybercriminals also abuse legitimate services and tools such as anonymisation, encryption and virtual currencies.”

And they do so to commit all kinds of horrendous crimes.  And their use of these technologies can often not only facilitate the crimes, but also make them more difficult to investigate and prosecute.  But that doesn’t mean that we destroy the technology.

Years ago, in the early days of cell tower communications, when Congress was debating a law called CALEA (the Communications Assistance for Law Enforcement Act) that required phone companies to deliberately spend millions of dollars to render their systems LESS secure, representatives of the law enforcement community bemoaned the fact that, with the new “handoff” from tower to tower, cops might not be able to intercept communications travelling on cell phones.

“Make the cell phones like corded phones,” they complained.  Make them so we can intercept the communications seamlessly.  And they did.  But when the phone companies wanted to make cell phones like corded phones in the sense that they didn’t retain the physical location of the callers as they travelled, the cops objected.

Thus, it seems that law enforcement will adopt any technology that invades consumer privacy (e.g., stingray, GPS monitoring, trap and trace, pen registers, etc.) but will object to any that enhances privacy.

When Apple announced that it would be encrypting the contents of cell phones by default, and would not retain a key to these phones, FBI Director James Comey and others expressed outrage.  According to the Washington Post, Comey told reporters, “There will come a day when it will matter a great deal to the lives of people . . . that we will be able to gain access” to such devices.  “I want to have that conversation [with companies responsible] before that day comes.”

The Post went on to note that “Comey added that FBI officials already have made initial contact with the two companies, which announced their new smartphone encryption initiatives last week. He said he could not understand why companies would ‘market something expressly to allow people to place themselves beyond the law.’”

Now let’s be clear on what the technology does and does not do.  If it works (which it probably doesn’t) it gives the consumer the ability to encrypt the contents of their own data on their own device with their own password, which they control.
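
As a rough sketch of what “encrypting their own data with their own password” can mean in practice, consider password-based key derivation.  This is illustrative only, not Apple’s or Google’s actual scheme (which also binds keys to hardware and layers on much more); the function names and parameters below are my own.  The point is that the encryption key only exists when the user supplies the passcode, so there is no vendor-held copy to hand over:

```python
import hashlib
import os

def derive_key(passcode: str, salt: bytes) -> bytes:
    # PBKDF2 stretches a short passcode into a 256-bit key; the high
    # iteration count deliberately slows down offline guessing.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), salt, 200_000)

salt = os.urandom(16)               # stored on the device alongside the ciphertext
key = derive_key("correct-horse-battery", salt)
print(key.hex())                    # same passcode + salt always yields the same key
```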

If the government wants access to that data, the government must serve a warrant on the owner of the data, and force them (consistent with the obligations of the Fifth Amendment) to decrypt the data.  That’s hardly placing them beyond the law.  If anything, it prevents law enforcement from placing themselves beyond the law.

Comey isn’t the only one objecting to putting control over people’s data in the hands of the people who own that data.  According to the Post, the Chief of Detectives for the Chicago Police Department, John J. Escalante noted that “Apple will become the phone of choice for the pedophile,” and that “The average pedophile at this point is probably thinking, I’ve got to get an Apple phone.”  Really?

It’s funny, because BlackBerry has offered whole-disk encryption, as have both Apple’s OS X operating system (e.g., via TrueCrypt) and the Windows OS (BitLocker), which permit the user to lock the contents of their own machines.  I assume that pedophiles would pick phones for the same reasons non-pedophiles pick phones.

To understand how narrow-minded these comments are, you have to understand how limited the proposed actions of Apple and Google actually are.  All they propose to do is to not retain a key to the encryption used by the consumer ON THEIR OWN DEVICE.

The emails, photographs, SMS, postings, Facebook pages, chats, and everything else that travels through the network remains available to law enforcement (or intelligence agencies) as does GPS tracking, triangulation and cell tower spoofing.  The contents of email, messages and telephone calls remain available to the cops.

The only thing that is limited is the ability of the seller of the phone to unlock the contents of the physical phone itself at the behest of law enforcement.  It would be like the cops bemoaning the fact that your mortgage company didn’t keep a spare key to your house to allow the cops to search it when you weren’t home (with or without a warrant).  Damn that mortgage company: enabling pedophiles!

The security proposals also don’t mean that the cops can’t read the contents of your phone.  Recently the Supreme Court ruled that cops can’t just examine the contents of a person’s cell phone without a warrant as a “search incident to a lawful arrest.”

So if the cops grab the cell phone of a pedophile and want to search it for evidence, they can likely get a warrant PERMITTING the search.  Permitting the search and making the search effective are different things.

The police can get a warrant to search your house.  This doesn’t mean they will find what they are looking for.  If your phone is encrypted, the cops would have to either get another warrant ordering you to give them the PIN or decrypt the contents, or else crack the encryption or guess the PIN themselves.  That’s what they object to.  Empowering the data owner.

Whether police can legally force someone to decrypt encrypted files is currently up in the air, with some courts saying yes, others saying no.

But actually, they don’t even have to go that far.  The contents of the phone are encrypted with a key derived from a PIN, usually a four-digit number, occasionally a longer passphrase (it’s an option, clearly only used by pedophiles, right?).  The longer the passphrase, the harder it is to guess or brute-force.
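
To put rough numbers on that, here is a back-of-the-envelope comparison.  The guess rate is purely an assumption for illustration, and it ignores the hardware-backed delays and escalating lockouts real phones impose:

```python
import string

# Assumed offline guess rate -- an arbitrary figure for illustration only.
GUESSES_PER_SECOND = 10_000

def keyspace(alphabet_size: int, length: int) -> int:
    """Number of possible codes of the given length over the alphabet."""
    return alphabet_size ** length

def worst_case_seconds(space: int) -> float:
    """Time to exhaust the whole keyspace at the assumed guess rate."""
    return space / GUESSES_PER_SECOND

pin_space = keyspace(10, 4)                                        # 10,000 codes
phrase_space = keyspace(len(string.ascii_lowercase + string.digits), 10)

print(f"4-digit PIN:        {pin_space:,} codes, "
      f"{worst_case_seconds(pin_space):,.1f} seconds to exhaust")
print(f"10-char passphrase: {phrase_space:,} codes, "
      f"{worst_case_seconds(phrase_space) / (3600 * 24 * 365):,.0f} years to exhaust")
```

The absolute numbers depend entirely on the assumed guess rate; the point is the gulf between the two keyspaces.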

Applying Director Comey’s logic, only criminals have a need to protect their data.  Strong passwords, encryption, tokens, biometrics and the like are all signs of criminal activity.  In fact, the NSA went so far as to assert that the use of TOR routers per se established a reason to believe that someone was engaged in criminal or intelligence activities!

If cops want to decrypt the contents of a phone, there’s one other thing they can do (besides getting a warrant, asking nicely, or brute-forcing the PIN).  They can demand that Apple or Google simply reset the user’s password, assuming that the providers retain that capability.

If a password can be reset, it doesn’t have to be guessed.  The ability to do remote password resets has been used by hackers for generations to bypass security, most recently in the more than 1 billion user IDs and passwords found on Russian hacker boards.  If the crackers can do it, so can the FBI.

We have to get beyond the mindset that surveillance is a right, and that the default position should be that everything should be capable of surveillance.  When a medical device manufacturer is making a new pacemaker, they should be concerned about privacy and security — not with “how do I make sure that the FBI can tap this device?”

We see the FBI and NSA attempting to secure their own information, so why should this ability be denied to everyone else on the chance that someone is using the technology for evil?  Moreover, even without Apple and Google’s help, clever criminals and pedophiles can already use tools and technologies to attempt to hide their activities.

What Comey complains of is a classic logical fallacy: some criminals encrypt, therefore all people who encrypt are potential criminals.  Surveillance should not be overwhelming and ubiquitous.  It should be narrow, limited and approved.  And difficult.  So it is not abused.  And that’s what the cops object to.  But in the end, it’s the difficulty of surveillance, not its ease, that protects privacy and liberty.
