Thursday, 18 February 2016

Apple

As I am sure you all know, Apple have taken a stand against a recent court order requiring them to make a back-door version of iOS so the FBI can try to unlock the phone of a known terrorist. Their customer letter makes their position very clear.

I know some people do not like Apple, and there are plenty of issues with the way they do business, but in this case I am very pleased with the stand they are taking. I, and anyone else with any understanding of the technology, have been saying the same all along. This is partly why I have started yet another petition (please sign).

There is, however, a big problem with explaining this to the public - because TERRORIST! I mean TERRORIST FFS!!!

I asked my wife if she had seen this in the news and her reaction was along the lines of "well, if he is a terrorist then they should unlock the phone". I do think I have convinced her that this is a really bad idea and a hugely bad precedent to be set.

The fact that they have had to use an ancient law to force this order is a clue to how underhand it is, and if allowed it could open the door to all sorts of orders.

It is also crucial to realise that this is theatre. Criminals can encrypt things - the "secret" of encryption is well and truly out of the bag. I can encrypt things and store them on my phone, and the FBI would not be able to decrypt them even with Apple's help. This order may help one investigation with one phone now, but it is no help in general, and it is a serious risk to the normal day-to-day security that we all expect and deserve. It is really about control of the largely innocent population. It puts the government on the same side as the criminals in the security "battle", which is just silly.
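To illustrate the point that encryption is just maths anyone can do, here is a minimal sketch using nothing but the Python standard library - a one-time pad, which is information-theoretically unbreakable if the key is random, as long as the message, used once, and kept secret. The function names are mine, purely for illustration:

```python
import secrets

def otp_encrypt(plaintext: bytes) -> tuple[bytes, bytes]:
    # Generate a random key as long as the message and XOR byte by byte.
    key = secrets.token_bytes(len(plaintext))
    ciphertext = bytes(p ^ k for p, k in zip(plaintext, key))
    return ciphertext, key

def otp_decrypt(ciphertext: bytes, key: bytes) -> bytes:
    # XOR with the same key recovers the original message.
    return bytes(c ^ k for c, k in zip(ciphertext, key))

msg = b"meet at dawn"
ct, key = otp_encrypt(msg)
assert otp_decrypt(ct, key) == msg  # only the key holder can recover it
```

No back door in iOS, or anywhere else, helps against this: without the key, the ciphertext on the phone is just random bytes.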

Of course, one of the issues, if this is allowed, is that every other country's law enforcement would ask Apple for the same under their own laws - whether that is the UK, France, Russia, China, or North Korea - and how would Apple have any argument against them? Indeed, once the magic version of iOS is made, Apple cannot even argue that making it for other countries would cost them anything.

But what could Apple do if they fail to defeat this order? Well, one possible move would be to put the keys in a separate tamper-proof module in future hardware designs, much as SIM cards and bank cards work. A separate bit of hardware could impose retry timeouts, fail counts, and erasure of keys on repeated failures. If that was in the hardware design then it could not be bypassed in the firmware of the phone. Would they be ordered to change the hardware design? It clearly would not make sense to order the decoding of a phone in future if it had such hardware...
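The logic such a module would enforce can be sketched in a few lines. This is a toy model under my own assumptions (the class name, the limit of 10, and the delay schedule are all illustrative, not Apple's actual design) - the point is that the fail count, delay, and key erasure live in the hardware, not in firmware that can be replaced:

```python
class SecureElementSketch:
    """Toy model of a tamper-proof key store: enforces a fail count,
    an escalating retry delay, and key erasure after repeated failures,
    independently of whatever firmware runs on the phone's main CPU."""

    MAX_FAILS = 10  # illustrative limit

    def __init__(self, passcode: str):
        self._passcode = passcode  # real hardware stores derived key material
        self._fails = 0
        self._wiped = False

    def retry_delay(self) -> int:
        # Escalating delay (in seconds) between attempts - a hypothetical
        # schedule; the hardware, not the OS, would enforce the wait.
        return 2 ** self._fails

    def try_unlock(self, guess: str) -> bool:
        if self._wiped:
            return False  # keys destroyed: no firmware change helps now
        if guess == self._passcode:
            self._fails = 0
            return True
        self._fails += 1
        if self._fails >= self.MAX_FAILS:
            self._passcode = None  # erase key material on repeated failure
            self._wiped = True
        return False
```

Because the magic FBI firmware runs on the application processor, it could never reach inside such a module to reset the counter or skip the delay.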

Another simple idea, which they may even be able to do now with a new s/w release, is to make the firmware refuse to load new (signed) firmware on a locked phone. That would mean the magic firmware the FBI get would work until the next iOS release and never again after that!

Really, we need governments to understand that encryption exists, and that if you make any part of it illegal or weakened you only do so for those who obey such laws - actual criminals will be unaffected by such rules, and you make their life so much easier when they are hacking us.

Indeed, part of my reasoning when explaining this to my wife was another news article about an LA hospital being held to ransom by computer hackers. That is quite serious, and it is vulnerabilities and back doors in s/w that allow such things - the very sort of thing the FBI are asking for.

P.S. It seems later models already have a separate hardware security module! See here for a good explanation.

P.P.S. Reading more details, the order is very specific to one phone and the work could even be done on Apple's premises, but the bigger concern for Apple is the use of this old law to make such an order - if allowed, it could mean any number of more intrusive orders. This is a "foot in the door" situation that needs to be stopped.

4 comments:

  1. Re: the hardware changes, that's effectively what they've done beginning with the A7 chip - which among other things enforces the time delay for failed password attempts. But unfortunately they only introduced it with the iPhone 5S and this is an iPhone 5C (a good reason to upgrade, perhaps.) There's also the question of whether Apple have protected themselves from being forced to apply some sort of firmware update to the secure enclave; it strikes me that if you're the designer of a piece of software, it's easy to require updates to be digitally signed by you, but much harder to prevent yourself from writing one that would weaken security.

    I agree with you, I think they should ask for the passcode not just on successful boot, but before you try to install a software update, either within iOS or using DFU mode (not sure if the latter is possible.)

    Good discussion about this on IRC last night, including the benefits of not using a fixed-length (4 or 6 digit) passcode (you can change this in your device settings.)

    http://www.apple.com/business/docs/iOS_Security_Guide.pdf

    Replies
    1. Indeed, the SE could have perhaps a rule that you can only update its firmware when "unlocked" maybe.

  2. What ancient law? I've seen lots of talk about this, but no one else is mentioning ancient laws.

    Replies
    1. The Apple letter mentions The All Writs Act of 1789
