Saturday, February 20, 2016

Digital Privacy

A story has recently broken about a fight between Apple and the FBI.  The context is the San Bernardino massacre, which resulted in 14 deaths and many injuries.  The perpetrators, Syed Farook and Tashfeen Malik, were dead within hours.  So the "who" in "whodunit" has been known for some time now.  The only open questions have to do with how much help they got and from whom.  There has been a lot of progress on that front too.

Enrique Marquez, a friend and neighbor, has been arrested.  Among other things, he purchased some of the guns that were used by the perps.  Literally hundreds of searches have been done and mountains of evidence have been seized.  Online accounts of all kinds have been scrutinized.  Even after all this effort there is more to be learned.  A few days before I wrote this, Syed's brother's house was searched.  This was only the latest in a series of searches of his house.

As a result of all this effort the story is pretty much known.  All that is left is to fill in some details.  It is possible that a new major development could be unearthed in the future, say substantial participation by overseas terrorist groups.  But the chances are small.  And that brings us to the phone.

A tiny part of the mountain of seized evidence is a smartphone that belonged to one of the perps.  It has been in FBI custody for some time now.  But possession has not done the FBI much good.  The phone is encrypted.  The FBI has not been able to break or get around the encryption, so it has not been able to access the contents of the phone.  This frustration has not been for want of trying.  At least that's the story from both the FBI and Apple.  The FBI has asked Apple for assistance and Apple has provided it.  But the FBI now says Apple must take that assistance to a new level.  And that's what the fight is about.

Before proceeding let me stop to make what I believe is an observation of monumental importance.

ENCRYPTION WORKS

Why is this so important?  Have you seen an "action" movie or TV show any time in say the last 50 years?  These shows often feature a scenario where encrypted data is critical, frequently a matter of life and death.  Sometimes a good guy is trying to decrypt the bad guy's secret plan.  Sometimes it is a bad guy trying to decrypt the good guy's security system so he can steal the secret formula or the invasion plans or whatever.  Regardless, the scene is always handled in the same way.

A geek types away furiously while "action" visuals play out on screen and dramatic music (cue the "Mission: Impossible" theme) plays underneath so we will know that important things are happening.  This goes on for about 20 seconds of screen time, which may represent perhaps a few hours or days of "mission" time.  But we always have the "Aha" moment when the geek announces that the encryption has been cracked.  And it never takes the geek more than a week to crack it.  In fact it is common for the geek to only need a few minutes.

This is the pop culture foundation for a belief that is widespread and grounded in things that seem a lot more solid than a TV script.  We've all seen it over and over, so it must be true.  Any encryption system can be broken.  All you need is the genius geek and perhaps a bunch of really cool looking equipment.  People in the real world support this idea often enough, for one reason or another, that most people have no reason to doubt its veracity.  But it is not true.  And we know it is not true because the FBI has just told us.  Let's look at why.

It starts with the fact that FBI has publicly said that it has been unsuccessful in breaking Apple's encryption.  This is in spite of the fact that they have had weeks in which to try and they have had a considerable amount of cooperation from Apple.  But wait, there's more.  Which government agency is the one with the most skill, equipment, and experience with encryption?  The NSA (National Security Agency).  It's literally what they do. 

Before 9/11 it was possible to believe that the FBI and the NSA did not talk to each other.  It was possible to believe this because it was true.  But in the post-9/11 era those communications barriers were broken down and there is now close cooperation between the two agencies, especially on terrorism cases like this one.  It is simply unbelievable that the FBI has not consulted with the NSA on this problem.  And that means the NSA has not been able to crack Apple's encryption either.

Let's say they had.  Then the FBI could easily have covered this up by claiming that their own people had cracked the phone.  Even if this was not believed it provides the standard "plausible deniability" that is commonly used in these situations.  It doesn't matter if the official line is credible.  It only matters that there is an official line that officials can pretend to believe.  This is why I believe the NSA failed too.  (For a counter-argument see below).

There is actually a lot of evidence that encryption works but it is the boring stuff that the media ignores.  It gets dismissed as a "dog bites man" story.  I worked in the computer department of a bank for a long time.  They treated computer problems that could screw up data very seriously.  "We are messing with people's money and people take their money very seriously."  I then worked for a company that ran hospitals and clinics.  After observing the culture there I remarked "If you want to see people who treat computer problems seriously, talk to bankers.  They deal with money.  Around here we only deal with life and death and that's not as serious."  That's a cute way of highlighting that people take money very seriously.  And every aspect of handling money now depends critically on encryption.

If even one of the common encryption systems used in the money arena could be cracked there is a lot of money to be made.  Look at the amount of noise generated by people stealing credit card information.  It has finally caused the credit card industry in the US to move from '60s-style magnetic stripe technology to modern chip-based (EMV) cards.  The important takeaway is that the hackers have never broken into a system by breaking the encryption.  They have used what is generally referred to as a "crib".  One of the most successful cribs goes by the name of Social Engineering.  You call someone up (or email them, or whatever) and talk them out of information you are not entitled to, like say a high-powered user ID and password.  You use this information to break into the system.

Important data has been encrypted for many decades now.  The DES standard was developed and implemented in the '70s.  The algorithm itself has held up remarkably well, but its 56-bit key is considered far too short by modern standards; by the late '90s a DES key could be publicly cracked by brute force in a matter of days.  That was enough to cause everybody to move on.  Something called triple-DES took over after double-DES was shown to provide essentially no improvement (a "meet in the middle" attack reduces it to little better than single DES).  We have since moved on to other encryption standards.
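The double-DES result mentioned above can be made concrete.  Below is a minimal sketch of a "meet in the middle" attack against a toy 8-bit cipher I made up purely for illustration (it is emphatically not DES): instead of searching all 2^16 key pairs, we tabulate the forward half of the encryption and meet it with the inverted backward half, for roughly 2 × 2^8 work.

```python
INV17 = 241  # multiplicative inverse of 17 mod 256 (17 * 241 = 4097, which is 1 mod 256)

def enc(block, key):
    """Toy 8-bit 'cipher' (illustrative only): multiply, then XOR with the key."""
    return ((block * 17) % 256) ^ key

def dec(block, key):
    """Inverse of enc: undo the XOR, then the multiplication."""
    return ((block ^ key) * INV17) % 256

plaintext = 0x5A
k1, k2 = 0x3C, 0xA7
ciphertext = enc(enc(plaintext, k1), k2)  # "double" encryption

# Meet in the middle: about 2 * 256 operations instead of 256 * 256.
forward = {enc(plaintext, k): k for k in range(256)}  # all possible midpoints
candidates = [(forward[dec(ciphertext, k)], k)
              for k in range(256) if dec(ciphertext, k) in forward]
assert (k1, k2) in candidates  # the real key pair is among the candidates
```

A second known plaintext/ciphertext pair narrows the candidate list down to the true key pair; the same idea, scaled up, is why doubling DES does not deliver double the security.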

A common one in the computer business is "Secure Sockets" (SSL, since superseded by TLS).  Any web address with an HTTPS prefix uses it.  It is now recommended for general use instead of being restricted to "important" situations.  The transition has resulted in some variation of a "show all content" warning popping up with annoying frequency.  That's because the web page is linking to a combination of secure (HTTPS) and insecure (HTTP) content.
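That "show all content" warning boils down to a simple check: does a page served over HTTPS reference any plain-HTTP resources?  A rough sketch of that check (the regex heuristic and the example URLs are mine, not how any real browser implements it):

```python
import re

def mixed_content_links(html):
    """Return src/href attribute values that point at insecure http:// URLs."""
    return re.findall(r'(?:src|href)="(http://[^"]+)"', html)

# A page served over HTTPS that pulls one image over plain HTTP:
page = '<img src="http://cdn.example.com/logo.png"><a href="https://example.com">ok</a>'
print(mixed_content_links(page))  # the http:// image is what triggers the warning
```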

If the basic algorithm (computer formula) is sound, the common trick is to make the key bigger.  DES used a 56-bit key.  The triple-DES algorithm can be used with keys that are as long as 168 bits.  Behind the scenes, HTTPS has been doing the same thing.  Over time the keys it uses have gotten longer and longer.  And a little additional length makes a big difference.  Every additional bit doubles the number of keys that need to be tested in a "brute force" (try all the possible combinations) attack.
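That doubling is easy to check with a little arithmetic.  A quick sketch (the billion-guesses-per-second rate is an arbitrary assumption for illustration, not a measured benchmark):

```python
def worst_case_keys(bits):
    """Number of keys a brute-force attack must try in the worst case."""
    return 2 ** bits

def worst_case_years(bits, tries_per_second=10**9):
    """Worst-case search time in years at an assumed guessing rate."""
    return worst_case_keys(bits) / tries_per_second / (365 * 24 * 3600)

assert worst_case_keys(57) == 2 * worst_case_keys(56)  # one extra bit doubles the work
# At a hypothetical billion guesses per second, a 56-bit key falls in a few
# years, while a 168-bit key is astronomically out of reach.
```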

So piling on the bits fixes everything, right?  No!  It gets back to that crib thing.  Let's say I have somehow gotten hold of your locked cell phone.  What if I call you and say "I'm your mother and I need the key for your phone"?  Being a dutiful child, you always do what your mother says, so you give me the key.  At that point it doesn't matter how long your key is.  Actually, no one would fall for so transparent a ploy, but it illustrates the basic idea of Social Engineering.  It boils down to tricking people into giving you information that you can use to get around their security.

If I can get your key I have effectively reduced your key length to zero.  Cribs can be very complex and sophisticated but a good way to think of them is in terms of ways to reduce the effective key length.  If I can find a crib that reduces the effective key length to ten bits that means a brute force attack only needs to try a little over a thousand keys to be guaranteed success.  I once used a brute force approach to figure out the combination of a bicycle lock.  The lock could be set to a thousand different numbers but only one opened it.  It took a couple of hours of trying each possibility in turn but I eventually succeeded in finding that one number.  Under ideal circumstances a computer can try a thousand possibilities in less than a second.
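That bicycle-lock exercise is brute force in its purest form, and it is trivial to express in code.  A sketch (the lock and its secret setting are of course hypothetical):

```python
def crack(lock_opens):
    """Try every 3-digit combination (000-999) until the lock opens."""
    for combo in range(1000):
        if lock_opens(combo):
            return combo
    return None  # no combination worked (shouldn't happen with a real lock)

SECRET = 417  # the one setting this hypothetical lock was set to
assert crack(lambda combo: combo == SECRET) == SECRET
```

What took me a couple of hours by hand is well under a second for a computer, which is exactly why the effective key length matters so much.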

And Apple is well aware of this.  So they added a delay to the process.  It takes about a twelfth of a second to process a key.  This means that no more than a dozen keys can be tried in a second.  And the Apple key is more than ten bits in length.  But wait.  There's more.  After entering a certain number of wrong keys in a row (the number varies with iPhone model and iOS version) the phone locks up.  Under some circumstances the phone will even go so far as to wipe everything clean if too many wrong keys are tried in a row.

The FBI is not releasing the details of what they have tried so far.  And Apple has not released the details of what assistance they have rendered so far.  But this particular iPhone, as currently configured, is apparently impervious to a brute force attack.  Whatever else the FBI has tried is currently a secret.  So what the FBI is asking from Apple is for changes to the configuration.  Specifically, they want the twelfth-of-a-second delay removed and they want the "lock up" and "wipe after a number of failed keys" features disabled.  That, according to the FBI, would allow a medium-speed brute force attack to be applied.  Many iPhones are protected by short numeric passcodes (a 4-digit passcode allows only 10,000 possibilities), so this would be an effective approach if the phone in question is one of them.
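If those protections were removed, the arithmetic behind the FBI's request is straightforward.  A sketch, assuming the roughly-a-dozen-guesses-per-second rate described above:

```python
def worst_case_hours(digits, tries_per_second=12):
    """Worst-case time to brute-force an n-digit numeric passcode,
    with the per-guess delay intact but lockout and wipe disabled."""
    return 10 ** digits / tries_per_second / 3600

# A 4-digit passcode (10,000 possibilities) falls in under 15 minutes;
# a 6-digit passcode takes about a day in the worst case.
```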

But Apple rightly characterizes this as a request by the FBI to build a crib into their phones.  Another name for this sort of thing is a "back door".  And we have been down this path before.  In the '90s the NSA released specifications for something called the "Clipper chip".  It was an encryption/decryption chip that appeared to provide a high level of security.  It used an 80-bit key.  That's a lot bigger than the 56-bit key used by DES, so that's good, right?  The problem is that the Clipper chip contained a back door that was supposed to allow "authorized security agencies" like the NSA to crack it fairly easily.  The NSA requested that a law be passed mandating exclusive use of the Clipper chip.  After vigorous push-back on many fronts the whole thing was dropped a couple of years later without being implemented broadly.

We can also look to various statements made by current and former heads of various intelligence and law enforcement agencies.  The list includes James Clapper (while he was Director of National Intelligence and since), former NSA director Keith Alexander, and others.  They have all railed against encryption unless agencies like theirs are allowed back doors.  Supposedly all kinds of horrible things will happen if these agencies can't read everything terrorists are saying.  But so far there is no hard evidence that these back doors would be very helpful in the fight against terrorism.  What they would be very helpful for is making it easy to invade the privacy of everybody.  Pretty much nothing on the Internet was encrypted in the immediate post-9/11 period.  Reading messages was helpful in some cases but the bad guys quickly learned how to make their messages hard to find and hard to read.

These agencies have swept up massive amounts of cell phone data.  Again, mass data collection has not been shown to be important to thwarting terrorist plots.  Once they are on to a specific terrorist, going back and retrospectively reviewing who he has been in contact with has been helpful.  And, by the way, that has already been done in the San Bernardino massacre case.  But the FBI argues that even after all these other things have been done, they still desperately need to read the contents of this one cell phone.  We have been told for more than a decade that the "collect everything" programs are desperately needed and are tremendously effective.  The FBI's current request indicates that they are not all that effective, and that means they were never needed as badly as we were told they were.

The FBI also argues that this will be a "one off" situation.  Apple argues that once the tool exists its use will soon become routine.  If cracking a phone is difficult, time consuming, and expensive even after the tool exists, then the FBI may have a case.  But if it is, what's to stop the FBI from demanding that Apple build a new tool that is easier, quicker, and cheaper to use?  Once the first tool has been created the precedent has been set.

The fundamental question here is whether a right to privacy exists.  The Fourth Amendment states:

The right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated, and no Warrants shall issue, but upon probable cause, supported by Oath or affirmation, and particularly describing the place to be searched, and the persons or things to be seized.

A plain reading of the language supports the idea that a privacy right exists and that the mass collection of phone records, whether "metadata" or the contents of the actual conversation, is unconstitutional.  The Supreme Court has so far dodged its responsibility by falling back on a "standing" argument.  I think the standing argument (which I am not going to get into) is bogus but I am not a Supreme Court justice.  And the case we are focusing on is clearly covered by the "probable cause . . ." language.  The FBI can and has obtained a search warrant.  The only problem they are having is the practical one of making sense of the data they have seized.

The problem is not with this specific case.  It is with what other use the capability in question might be put to.  We have seen our privacy rights almost completely obliterated in the past couple of decades.   Technology has enabled an unprecedented and overwhelming intrusion into our privacy.  It is possible to listen in on conversations in your home by bouncing a laser off a window.  A small GPS tracking device can be attached to your car in such a way that it is nearly undetectable.  CCTV cameras are popping up everywhere allowing your public movements to be tracked.  Thermal imaging cameras and other technology can tell a lot about what is going on inside your house even if you are not making any noise and they can do this from outside your property line.

And that ignores the fact that we now live in a highly computerized world.  Records of your checking, credit card, and debit card activity, all maintained by computer systems, make your life pretty much an open book.  Google knows where you go on the Internet (and probably what you say in your emails).  And more and more of us run more and more of our lives from our smartphones.  Imagine comparing what you can find out from a smartphone with what you could have found out 200 years ago by rifling through someone's desk (their "papers").  Back then a lot of people couldn't read, so things were done orally.  And financial activity was done in cash, so no paper record of most transactions existed.  The idea that the contents of a smartphone should not be covered under "persons, papers, and effects" is ridiculous.  Yet key loggers and other spyware are available for any and all models of smartphone.

Apple was one of the first companies to recognize this.  They were helped along by several high profile cases where location data, financial data, and other kinds of private data were easily extracted from iPhones.  They decided correctly that the only solution that would be effective would be to encrypt everything and to do so with enough protections that the encryption could not be easily avoided.  The FBI has validated the robustness of their design.

Technology companies have been repeatedly embarrassed in the last few years by revelations that "confidential" data was easily being swept up by security agencies and others.  They too decided that encryption was the way to cut this kind of activity off.  Hence we see the move to secure (HTTPS) web sites and to companies encrypting data as is moves across the Internet from one facility to another.

Security agencies and others don't like this.  It makes it at least hard and possibly impossible to tap into these data streams.  And, according to agency heads this is very dangerous.  But these people are known and documented liars.  And they have a lot of incentive to lie.  It makes the job of their agency easier and it makes it easier for them to amass bureaucratic power.  Finally, given that lying does not put them at risk for criminal sanctions (none of them have even been charged) and can actually enhance their political standing, why wouldn't they?

Here's a theory for the paranoid.  Maybe the FBI/NSA successfully cracked the phone.  But they decided that they could use this case to leverage companies like Apple into building trap doors into their encryption technology.  The Clipper case shows that this sort of thinking exists within these agencies.  And agency heads are known to be liars.  So this theory could be true.  I don't think it is true but I can't prove that I am right.  (I could if agency heads could actually be compelled to tell the truth when testifying under oath to Congress but I don't see that happening any time soon.)  

The issue is at bottom about a trade-off.  The idea is that we can have more privacy but be less secure, or we can have less privacy but be more secure.  In my opinion, however, the case that we are more secure is weak to nonexistent, and the case that we have lost a lot of valuable privacy and are in serious danger of losing even more is strong.  I see the trade-off in theory.  But I don't see much evidence that, as a practical matter, the trade-off actually exists in the real world.  Instead I see us giving up privacy and getting nothing, as in no increase in security, back.  In fact, I think our security is diminished as others see us behaving in a sneaky and underhanded way.  That causes good people in the rest of the world to be reluctant to cooperate with us.  That reduction in cooperation reduces our security.  So I come down on the side of privacy and support Apple's actions.

In the end I expect some sort of deal will be worked out between the FBI and Apple.   It will probably not be one that I approve of.  It will erode our privacy a little or a lot and I predict that whatever information is eventually extracted from the phone will turn out to be of little or no value. And, as Tim Cook, the CEO of Apple, has stated, once the tool is built it will always exist for the next time and the time after that, ad infinitum.  That is too high a cost.
